Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist: a highly skilled and experienced developer with expertise in Java, Apache Spark, Spring Boot, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Airflow/Composer.
In this role, you will:
- Stream and Batch Processing:
- Design, develop, and maintain real-time and batch data pipelines using Apache Spark or Apache Beam.
- Implement stateful stream processing, event-time handling, and windowing with Spark (a minimal sketch follows this list).
- Optimize Spark jobs for performance, scalability, and fault tolerance.
- Build scalable, high-performance applications using Java.
- Write clean, maintainable, and efficient code following best practices.
- Integrate Spark pipelines with external systems such as Elasticsearch and relational databases.
- Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution (see the DAG sketch after this list).
- Monitor and troubleshoot Airflow DAGs to ensure smooth operations.
- Design Kubernetes manifests and Helm charts for deploying scalable and fault-tolerant applications.
- Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions.
- Participate in code reviews, design discussions, and technical decision-making.
- Monitor production systems using tools like ELK or Grafana.
- Optimize resource usage and costs for GCP services and Kubernetes clusters.
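By way of illustration only (not HSBC code), here is a minimal sketch of event-time windowing with a watermark in Spark Structured Streaming, written in PySpark against the built-in rate source so it runs locally; all names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("windowing-sketch").getOrCreate()

# The built-in "rate" source generates (timestamp, value) rows locally,
# standing in for a real stream such as Kafka or Pub/Sub.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Tumbling 10-second event-time windows; the watermark bounds state by
# discarding data that arrives more than 30 seconds late.
counts = (
    events
    .withWatermark("timestamp", "30 seconds")
    .groupBy(window(col("timestamp"), "10 seconds"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("update")
    .format("console")
    .start()
)
query.awaitTermination()
```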
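Similarly, a minimal hypothetical Airflow DAG of the kind this role would orchestrate might look like the following (task names and commands are placeholders; assumes Airflow 2.4+):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_daily_pipeline",  # illustrative pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # older Airflow 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract_events", bash_command="echo extract")
    load = BashOperator(task_id="load_warehouse", bash_command="echo load")

    # Run the load step only after extraction succeeds.
    extract >> load
```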
To be successful in this role, you should meet the following requirements:
- Strong proficiency in Java with experience in building scalable and high-performance applications.
- Basic to intermediate knowledge of Python for scripting and automation.
- Hands-on experience with Apache Spark for real-time stream processing and batch processing.
- Proficiency in Apache Airflow for building and managing workflows.
- Experience with Composer on GCP is a plus.
- Strong experience with Google Cloud Platform (GCP) services: Dataflow, BigQuery, Pub/Sub, GCS, and Composer.
- Hands-on experience with Docker for containerization.
- Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE).
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Familiarity with Helm charts for Kubernetes deployments.
- Experience with monitoring tools such as ELK and Grafana.
You’ll achieve more when you join HSBC.
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSDI