Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. 

Department: Enterprise Technology

We are seeking a highly skilled and experienced Senior Data Engineer with expertise in Java, Python, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.

In this role you will:

  • Stream and Batch Processing: Design, develop, and maintain real-time and batch data pipelines using Apache Flink (a minimal stream-processing sketch follows this list).
  • Implement stateful stream processing and event-time handling.
  • Optimize Flink jobs for performance, scalability, and fault tolerance.
  • Application Development: Build scalable, high-performance applications using Java and Apache Flink.
  • Write clean, maintainable, and efficient code following best practices.
  • Workflow Orchestration: Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution (a sample DAG sketch follows this list).
  • Monitor and troubleshoot Airflow DAGs to ensure smooth operations.
  • Cloud-Native Solutions: Leverage GCP services to build and deploy cloud-native solutions (a Dataflow pipeline sketch, using the Apache Beam SDK, follows this list):
  • Dataflow: Design and deploy real-time and batch data processing pipelines.
  • BigQuery: Perform data analysis and optimize queries for large datasets.
  • Pub/Sub: Implement messaging and event-driven architectures.
  • GCS: Manage and optimize cloud storage for data pipelines.
  • Composer: Orchestrate workflows using Apache Airflow on GCP.
  • Python Scripting and Automation: Use Python for scripting, automation, and building utility tools (a short GCS automation sketch follows this list).
  • Write Python scripts to interact with APIs, process data, and manage workflows.
  • System Design and Architecture: Architect distributed systems with a focus on scalability, reliability, and performance.
  • Design fault-tolerant systems with high availability using best practices.
  • Collaboration: Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions.
  • Participate in code reviews, design discussions, and technical decision-making.
  • Monitoring and Optimization: Monitor production systems using tools like Stackdriver, Prometheus, or Grafana.
  • Optimize resource usage and costs for GCP services and Kubernetes clusters.
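
To illustrate the stream-processing side of the role, here is a minimal sketch of keyed, event-time windowing using the PyFlink DataStream API (the Java DataStream API is equivalent). The sensor events, watermark bound, and window size below are illustrative assumptions, not part of the role description:

```python
from pyflink.common import Duration, WatermarkStrategy
from pyflink.common.time import Time
from pyflink.common.typeinfo import Types
from pyflink.common.watermark_strategy import TimestampAssigner
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.window import TumblingEventTimeWindows


class SensorTimestampAssigner(TimestampAssigner):
    """Pull the event-time timestamp (ms) out of each (sensor_id, value, ts) tuple."""

    def extract_timestamp(self, value, record_timestamp):
        return value[2]


def main():
    env = StreamExecutionEnvironment.get_execution_environment()
    env.set_parallelism(1)

    # Hypothetical in-memory source; in production this would be Kafka or Pub/Sub.
    events = env.from_collection(
        [("sensor-1", 10, 1_000), ("sensor-1", 20, 4_000), ("sensor-2", 7, 2_500)],
        type_info=Types.TUPLE([Types.STRING(), Types.INT(), Types.LONG()]),
    )

    # Event-time semantics: bounded out-of-orderness watermarks of 5 seconds.
    watermarked = events.assign_timestamps_and_watermarks(
        WatermarkStrategy
        .for_bounded_out_of_orderness(Duration.of_seconds(5))
        .with_timestamp_assigner(SensorTimestampAssigner())
    )

    # Keyed, stateful aggregation over 10-second tumbling event-time windows.
    (watermarked
        .key_by(lambda e: e[0])
        .window(TumblingEventTimeWindows.of(Time.seconds(10)))
        .reduce(lambda a, b: (a[0], a[1] + b[1], max(a[2], b[2])))
        .print())

    env.execute("sensor-window-demo")


if __name__ == "__main__":
    main()
```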
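
For the orchestration responsibilities, a minimal Airflow DAG of the kind run on Composer. The DAG id, schedule, and task commands are hypothetical placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",        # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_sales_pipeline",      # hypothetical DAG id
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    extract = BashOperator(
        task_id="extract_from_gcs",
        bash_command="echo 'copy raw files from GCS'",
    )

    transform = BashOperator(
        task_id="run_dataflow_job",
        bash_command="echo 'launch Dataflow/Beam job'",
    )

    load = BashOperator(
        task_id="load_to_bigquery",
        bash_command="echo 'load curated tables into BigQuery'",
    )

    # Linear dependency chain: extract -> transform -> load.
    extract >> transform >> load
```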
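
For the Dataflow, Pub/Sub, and BigQuery responsibilities, a sketch of a streaming pipeline written with the Apache Beam Python SDK (Dataflow is Beam's managed runner on GCP). The topic, table, and schema names are placeholders:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows


def run():
    # To run on Dataflow rather than locally, also pass --runner=DataflowRunner,
    # --project, --region, and a staging/temp GCS location on the options.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")        # placeholder topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
            | "Window" >> beam.WindowInto(FixedWindows(60))        # 1-minute windows
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "event_count": kv[1]})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.user_event_counts",   # placeholder table
                schema="user_id:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```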
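
And for the Python scripting and automation side, a short utility using the google-cloud-storage client to archive pipeline artifacts in GCS. The bucket name and prefixes are hypothetical:

```python
from google.cloud import storage


def list_and_archive(bucket_name: str, prefix: str, archive_prefix: str) -> None:
    """Copy blobs under `prefix` to `archive_prefix` and delete the originals."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    for blob in client.list_blobs(bucket_name, prefix=prefix):
        target_name = blob.name.replace(prefix, archive_prefix, 1)
        bucket.copy_blob(blob, bucket, target_name)   # copy within the same bucket
        blob.delete()
        print(f"archived {blob.name} -> {target_name}")


if __name__ == "__main__":
    # Hypothetical bucket and prefixes.
    list_and_archive("my-data-bucket", "staging/", "archive/")
```
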
Requirements

To be successful in this role you should meet the following requirements:

  • Programming: Strong proficiency in Java with experience in building scalable and high-performance applications.
  • Basic to intermediate knowledge of Python for scripting and automation.
  • Stream and Batch Processing: Knowledge of Apache Beam for unified batch and stream data processing.
  • Workflow Orchestration: Proficiency in Apache Airflow for building and managing workflows.
  • Experience with Composer on GCP is a plus.
  • Cloud Platform Expertise: Strong experience with Google Cloud Platform (GCP) services, including Dataflow, BigQuery, Pub/Sub, GCS, and Composer.
  • Familiarity with GCP IAM, networking, and cost optimization.
  • Soft Skills: Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Ability to work in an agile environment and adapt to changing requirements.
  • Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming.
  • Knowledge of other cloud platforms (AWS, Azure) is a plus.
  • Familiarity with Helm charts for Kubernetes deployments.
  • Experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
  • Knowledge of security best practices for cloud and Kubernetes environments.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Candidate User Guide: India HTC - IND HSDI : IJP candidate user guide (service-now.com)

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India

Recruiter name
Tejal Lande
Recruiter email
tejal.lande@hsbc.co.in