Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:

  • Design and Develop Scalable Data Pipelines: Architect and implement end-to-end data workflows using Apache Airflow for orchestration, integrating multiple data sources and sinks across cloud and on-prem environments (a minimal DAG sketch follows this list).
  • BigQuery Data Modeling and Optimization: Build and optimize data models in Google BigQuery for performance and cost-efficiency, including partitioning, clustering, and materialized views to support analytics and reporting use cases (see the partitioning example after this list).
  • ETL/ELT Development and Maintenance: Design robust ETL/ELT pipelines to extract, transform, and load structured and semi-structured data, ensuring data quality, reliability, and availability.
  • Cloud-Native Engineering on GCP: Leverage GCP services such as Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions to build resilient, event-driven data workflows (see the Pub/Sub sketch after this list).
  • CI/CD and Automation: Implement CI/CD for data pipelines using tools like Cloud Composer (managed Airflow), Git, and Terraform, ensuring automated deployment and versioning of workflows.
  • Data Governance and Security: Ensure proper data classification, access control, and audit logging within GCP, adhering to data governance and compliance standards.
  • Monitoring and Troubleshooting: Build proactive monitoring for pipeline health and data quality using tools such as Cloud Monitoring (formerly Stackdriver) and custom Airflow alerting mechanisms (see the failure-callback sketch after this list).
  • Collaboration and Stakeholder Engagement: Work closely with data analysts, data scientists, and business teams to understand requirements and deliver high-quality, timely data products.
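
For illustration, a minimal sketch of the kind of Airflow orchestration described above, assuming Airflow 2.4+ (where the schedule argument replaces schedule_interval); the DAG name, task names, and data are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Stand-in for pulling rows from an upstream source.
        return [{"id": 1, "amount": 42.0}]

    def transform(ti, **context):
        # Read the upstream task's output from XCom and reshape it.
        rows = ti.xcom_pull(task_ids="extract")
        return [{**r, "amount_doubled": r["amount"] * 2} for r in rows]

    def load(ti, **context):
        rows = ti.xcom_pull(task_ids="transform")
        print(f"Would load {len(rows)} rows into the warehouse")

    with DAG(
        dag_id="example_etl",              # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_t = PythonOperator(task_id="extract", python_callable=extract)
        transform_t = PythonOperator(task_id="transform", python_callable=transform)
        load_t = PythonOperator(task_id="load", python_callable=load)
        extract_t >> transform_t >> load_t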
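
Similarly, a sketch of the BigQuery partitioning and clustering work, assuming the google-cloud-bigquery Python client; the project, dataset, and table names are made up:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Partitioning prunes the bytes scanned (and billed) by date;
    # clustering co-locates rows so selective filters read less data.
    ddl = """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.events` (
      event_ts    TIMESTAMP,
      customer_id STRING,
      amount      NUMERIC
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY customer_id
    """
    client.query(ddl).result()  # blocks until the DDL job completes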
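
For the event-driven piece, a minimal sketch of a Pub/Sub-triggered Cloud Function (2nd gen), assuming the functions-framework library; the handler name is illustrative:

    import base64

    import functions_framework

    @functions_framework.cloud_event
    def handle_message(cloud_event):
        # Pub/Sub delivers the payload base64-encoded under message.data.
        payload = base64.b64decode(
            cloud_event.data["message"]["data"]
        ).decode("utf-8")
        print(f"Received: {payload}")  # kick off downstream processing here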
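
And for custom Airflow alerting, a sketch of a failure callback, assuming Airflow 2.2+ (for the logical_date context key); the webhook endpoint is hypothetical:

    import requests

    def notify_ops(context):
        # Called by Airflow with the task's execution context on failure.
        ti = context["task_instance"]
        requests.post(
            "https://alerts.example.com/hook",  # hypothetical endpoint
            json={
                "dag": ti.dag_id,
                "task": ti.task_id,
                "run": str(context["logical_date"]),
            },
            timeout=10,
        )

    # Attach it per task, or DAG-wide via default_args:
    # default_args = {"on_failure_callback": notify_ops}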

Requirements

To be successful in this role, you should meet the following requirements: 

  • 2+ years of hands-on working experience with GCP BigQuery (mandatory)
  • 2+ years of hands-on working experience with Apache Airflow (mandatory)
  • 2+ years of hands-on working experience with Python (mandatory)
  • 2+ years of hands-on working experience with Linux/Unix (mandatory)
  • 2+ years of hands-on working experience with PL/SQL scripting (mandatory)
  • 2+ years of hands-on working experience with ETL tools such as DataStage, Informatica, or Prophecy (mandatory)
  • GCP Associate Cloud Engineer (ACE) certification is an added advantage.

You’ll achieve more when you join HSBC.

www.hsbc.com/careers 

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India