Job description

The health and safety of our employees and candidates is very important to us. Due to the current situation related to the Novel Coronavirus (2019-nCoV), we’re leveraging our digital capabilities to ensure we can continue to recruit top talent at the HSBC Group.  As your application progresses, you may be asked to use one of our digital tools to help you through your recruitment journey.  If so, one of our Resourcing colleagues will explain how our video-interviewing technology will be used throughout the recruitment process and will be on hand to answer any questions you might have.


Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of GCP Data Engineer.


In this role, you will:

  • Deliver large-scale, high-volume data enrichment through business configuration of data pipelines on Google Cloud Platform (GCP)
  • Work within the Data Engineering team and collaborate with the Solution Architect, Product Owner and Business Analysts
  • Support and enhance existing data enrichment engines, tools and utilities, and design and build new data enrichment pipelines
  • Assist with the design and build-out of data pipelines incorporating the data enrichment engines
  • Work as part of an agile team, taking on complex problems and solving them through intelligent design and scalable, robust code
  • Support issue resolution and improve processing performance on GCP, including BigQuery, Cloud Storage and Stackdriver


To be successful in this role, you should meet the following requirements:

  • Excellent understanding of GCP, with at least one project delivery using BigQuery, Compute Engine, Pub/Sub and Cloud Storage
  • Strong knowledge of industry best practice for ETL design, principles and concepts
  • At least one year's experience with BigQuery SQL: creating BigQuery tables, updating metadata, authorised views, query optimisation, and partitioning and clustering techniques
  • DevOps and Agile engineering practitioner with experience of test-driven development
  • Ability to work independently on specialised assignments within the context of project deliverables
  • Ownership of providing solutions and tools that iteratively increase engineering efficiency
  • Designs that embed standard processes, systems and operational models into the BAU approach for end-to-end execution of data pipelines
  • Excellent communication skills; a team player and collaborator
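For illustration, the partitioning and clustering techniques referred to above can be sketched as a BigQuery DDL statement. The dataset, table and column names here are hypothetical, not part of the role description:

```sql
-- Hypothetical example: a date-partitioned, clustered BigQuery table.
-- Partitioning by date limits the data scanned per query; clustering
-- by customer_id co-locates related rows to reduce cost further.
CREATE TABLE my_dataset.enriched_events
(
  event_ts    TIMESTAMP,
  customer_id STRING,
  payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
OPTIONS (description = 'Enriched event data');
```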


Required Experience: Google Cloud Platform; BigQuery, Stackdriver

Desired Experience: Dataflow, Dataproc, Git, Continuous Integration / Continuous Deployment (CI/CD), Test Automation


You’ll achieve more when you join HSBC.





HSBC is committed to building a culture where all employees are valued and respected and their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.


Issued by – HSBC Software Development India