Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply in an exciting new direction, HSBC offers opportunities, support and rewards that will carry you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
- Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam.
- Implement stateful stream processing, event-time handling, and windowing with Flink.
- Optimize Flink jobs for performance, scalability, and fault tolerance.
- Build scalable, high-performance applications using Java.
- Write clean, maintainable, and efficient code following best practices.
- Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases.
- Use Apache Airflow (or Cloud Composer on GCP) to orchestrate complex workflows and automate data pipeline execution.
- Monitor and troubleshoot Airflow DAGs to ensure smooth operations.
- Leverage GCP services to build and deploy cloud-native solutions:
  - Dataflow: Design and deploy real-time and batch data processing pipelines.
  - BigQuery: Perform data analysis and optimize queries for large datasets.
  - Pub/Sub: Implement messaging and event-driven architectures.
  - GCS: Manage and optimize cloud storage for data pipelines.
  - Composer: Orchestrate workflows using Apache Airflow on GCP.
To be successful in this role, you should meet the following requirements:
- Strong proficiency in Java with experience in building scalable and high-performance applications.
- Basic to intermediate knowledge of Python for scripting and automation.
- Hands-on experience with Apache Flink for real-time stream processing and batch processing.
- Knowledge of Flink’s state management, windowing, and event-time processing.
- Experience with Flink’s integration with GCP services.
- Knowledge of Apache Beam for unified batch and stream data processing.
- Proficiency in Apache Airflow for building and managing workflows.
- Experience with Composer on GCP is a plus.
- Strong experience with Google Cloud Platform (GCP) services:
  - Dataflow, BigQuery, Pub/Sub, GCS, and Composer.
  - Familiarity with GCP IAM, networking, and cost optimization.
- Hands-on experience with Docker for containerization.
- Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE).
You’ll achieve more when you join HSBC.
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued and respected and where their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India