Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist / Consultant Specialist / Senior Software Engineer / Software Engineer (based on years of experience and the role).

We are seeking a highly skilled and experienced Senior Data Engineer with expertise in Java (including Java 8+), microservices, Spring Boot 3.0.0, PostgreSQL, JPA, a React/TypeScript/JavaScript UI stack, Apache Flink, Apache Beam, MongoDB, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate will also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting and automation. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink. Your responsibilities will include:
  • Stream and batch processing:
    • Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam.
    • Implement stateful stream processing, event-time handling, and windowing with Flink (a minimal sketch follows this group).
    • Optimize Flink jobs for performance, scalability, and fault tolerance.
    • Build scalable, high-performance applications in Java, writing clean, maintainable, and efficient code that follows best practices.
    • Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases.
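
For illustration, here is a minimal sketch of the kind of Flink job this group describes: a keyed, event-time, tumbling-window count over a stream of (user, click) events. It is a sketch under assumptions rather than production code: the inline source and names are hypothetical, the five-second out-of-orderness bound is arbitrary, and a real pipeline would read from Kafka and extract each event's own timestamp field.

    import java.time.Duration;

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class ClickCountJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Hypothetical inline source; a real job would use a KafkaSource.
            env.fromElements(
                    Tuple2.of("user-a", 1L),
                    Tuple2.of("user-b", 1L),
                    Tuple2.of("user-a", 1L))
                // Event time: assign timestamps and tolerate 5s of out-of-order events.
                // Real events would carry their own timestamp field; the wall clock
                // is used here purely so the sketch runs end to end.
                .assignTimestampsAndWatermarks(
                    WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                        .withTimestampAssigner((event, ts) -> System.currentTimeMillis()))
                .keyBy(event -> event.f0)                              // keyed (per-user) state
                .window(TumblingEventTimeWindows.of(Time.minutes(1)))  // 1-minute event-time windows
                .sum(1)                                                // count per user per window
                .print();

            env.execute("click-count-sketch");
        }
    }

The same keyed-and-windowed shape extends naturally to Flink's richer state, trigger, and process-function APIs.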
  • Workflow orchestration:
    • Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution.
    • Monitor and troubleshoot Airflow DAGs to ensure smooth operations.
  • Leverage GCP services to build and deploy cloud-native solutions:
    • Dataflow: design and deploy real-time and batch data processing pipelines.
    • BigQuery: perform data analysis and optimize queries over large datasets.
    • Pub/Sub: implement messaging and event-driven architectures (see the subscriber sketch after this group).
    • GCS: manage and optimize cloud storage for data pipelines.
    • Composer: orchestrate workflows using Apache Airflow on GCP.
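
As a concrete illustration of the Pub/Sub item, here is a minimal streaming-pull subscriber built with the google-cloud-pubsub Java client. The project and subscription IDs are placeholders, and running it assumes valid GCP credentials and an existing subscription.

    import com.google.cloud.pubsub.v1.AckReplyConsumer;
    import com.google.cloud.pubsub.v1.MessageReceiver;
    import com.google.cloud.pubsub.v1.Subscriber;
    import com.google.pubsub.v1.ProjectSubscriptionName;
    import com.google.pubsub.v1.PubsubMessage;

    public class PubSubSubscriberSketch {
        public static void main(String[] args) {
            // Placeholder identifiers; substitute a real project and subscription.
            ProjectSubscriptionName subscription =
                    ProjectSubscriptionName.of("my-project", "my-subscription");

            // Callback invoked per message: process, then ack to stop redelivery.
            MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
                System.out.println("Received: " + message.getData().toStringUtf8());
                consumer.ack();
            };

            Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
            subscriber.startAsync().awaitRunning();  // streaming pull starts in the background
            subscriber.awaitTerminated();            // keep the main thread alive
        }
    }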
  • Containers and Kubernetes:
    • Deploy and manage containerized applications on Google Kubernetes Engine (GKE).
    • Design Kubernetes manifests and Helm charts for deploying scalable, fault-tolerant applications.
  • NoSQL data management:
    • Design and manage MongoDB databases, including schema design, indexing, and query optimization (a short driver sketch follows this group).
    • Ensure data consistency and performance for high-throughput applications.
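
Finally, a short sketch of the indexing and query-shape work described above, using the MongoDB Java sync driver. The connection string, namespace, and field names are hypothetical.

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import com.mongodb.client.model.Indexes;
    import com.mongodb.client.model.Sorts;
    import org.bson.Document;

    public class MongoIndexSketch {
        public static void main(String[] args) {
            // Placeholder connection string and namespace.
            try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
                MongoCollection<Document> orders =
                        client.getDatabase("shop").getCollection("orders");

                // Compound index shaped for the query below:
                // equality field first, then the sort field.
                orders.createIndex(Indexes.compoundIndex(
                        Indexes.ascending("customerId"),
                        Indexes.descending("createdAt")));

                // Filter + sort + limit, all served by the index above.
                for (Document doc : orders.find(Filters.eq("customerId", "c-123"))
                                          .sort(Sorts.descending("createdAt"))
                                          .limit(10)) {
                    System.out.println(doc.toJson());
                }
            }
        }
    }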
  • Scripting and automation:
    • Use Python for scripting, automation, and building utility tools.
    • Write Python scripts to interact with APIs, process data, and manage workflows.
  • System design:
    • Architect distributed systems with a focus on scalability, reliability, and performance.
    • Design fault-tolerant, highly available systems using established best practices.
  • Collaboration:
    • Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions.
    • Participate in code reviews, design discussions, and technical decision-making.
  • Monitoring and cost optimization:
    • Monitor production systems using tools such as Stackdriver, Prometheus, or Grafana.
    • Optimize resource usage and costs for GCP services and Kubernetes clusters.
Requirements

To be successful in this role, you should meet the following requirements:

  • Strong proficiency in Java, as outlined above, with experience building scalable, high-performance applications.
  • Basic to intermediate knowledge of Python for scripting and automation.
  • Hands-on experience with Apache Flink for real-time stream processing and batch processing.
  • Knowledge of Flink’s state management, windowing, and event-time processing.
  • Experience with Flink’s integration with GCP services.
  • Knowledge of Apache Beam for unified batch and stream data processing.
  • Proficiency in Apache Airflow for building and managing workflows.
  • Experience with Composer on GCP is a plus.
  • Strong experience with Google Cloud Platform (GCP) services: Dataflow, BigQuery, Pub/Sub, GCS, and Composer.
  • Familiarity with GCP IAM, networking, and cost optimization.
  • Hands-on experience with Docker for containerization.
  • Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE).
  • Expertise in MongoDB, including schema design, indexing, and query optimization.
  • Familiarity with other NoSQL or relational databases is a plus.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Ability to work in an agile environment and adapt to changing requirements.
  • Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming.
  • Knowledge of other cloud platforms (AWS, Azure) is a plus.
  • Familiarity with Helm charts for Kubernetes deployments.
  • Experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
  • Knowledge of security best practices for cloud and Kubernetes environments.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

What this role offers:

  • The opportunity to work on cutting-edge technologies and large-scale data processing systems.
  • A collaborative and innovative work environment.
  • Continuous learning and professional development opportunities.
  • Impactful projects that solve real-world problems.

You’ll achieve more when you join HSBC.
www.hsbc.com/careers 

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India