Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top or simply in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:

  • Design, develop, and optimise data pipelines using Azure Databricks, PySpark, and Prophecy.
  • Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration.
  • Develop and optimise complex SQL queries and Python-based data transformation logic.
  • Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes.
  • Automate deployment of data pipelines using CI/CD practices in Azure DevOps.
  • Ensure data quality, security, and compliance with best practices.
  • Monitor and troubleshoot performance issues in data pipelines.
  • Collaborate with cross-functional teams to define data requirements and strategies.

Requirements

To be successful in this role, you should meet the following requirements:

  • 5+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL.
  • Hands-on experience with Prophecy for data pipeline development.
  • Proficiency in Python for data processing and transformation.
  • Experience with Apache Airflow for workflow orchestration.
  • Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes.
  • Familiarity with GitHub and Azure DevOps for version control and CI/CD automation.
  • Solid understanding of data modelling, warehousing, and performance optimisation.
  • Ability to work in an agile environment and manage multiple priorities effectively.
  • Excellent problem-solving skills and attention to detail.
  • Experience with Delta Lake and Lakehouse architecture.
  • Hands-on experience with Infrastructure as Code (IaC) tools such as Terraform.
  • Understanding of machine learning workflows in a data engineering context.

You’ll achieve more when you join HSBC.

www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and where their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI