Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

  • The GBI Transformation is a large and complex data integration programme spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements, from Exco down, drawing on over 80 data sources in multiple time zones across Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics that allow senior management to measure the success of their BAU and CTB investment.
  • We are looking for a GCP developer who can design, develop, test and deploy ingest pipelines connected to a variety of on-prem and Cloud data sources – both data stores and files. We will be using mainly GCP technologies such as Cloud Storage, BigQuery and Data Fusion (a minimal sketch of this kind of load follows this list).
  • You will also need to work with our devops tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.
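
As a rough illustration of the kind of ingest work involved (not a prescription of our internal tooling), the sketch below loads a CSV extract landed in Cloud Storage into a BigQuery table using the Google Cloud Java client; the bucket, dataset and table names are hypothetical placeholders.

```java
// Illustrative only: load a CSV extract from a Cloud Storage landing path
// into a BigQuery staging table. All names below are hypothetical.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
import com.google.cloud.bigquery.TableId;

public class GcsToBigQueryLoad {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    TableId target = TableId.of("ops_metrics", "daily_kpi_staging");        // hypothetical dataset.table
    String sourceUri = "gs://example-landing-bucket/kpi/2024-01-01/*.csv";  // hypothetical landing path

    LoadJobConfiguration config = LoadJobConfiguration.newBuilder(target, sourceUri)
        .setFormatOptions(FormatOptions.csv())
        .setAutodetect(true)  // schema autodetect for the sketch; real feeds use agreed data-contract schemas
        .setWriteDisposition(JobInfo.WriteDisposition.WRITE_APPEND)
        .build();

    Job job = bigquery.create(JobInfo.of(config)).waitFor();
    if (job == null || job.getStatus().getError() != null) {
      throw new RuntimeException("Load failed: "
          + (job == null ? "job no longer exists" : job.getStatus().getError()));
    }
  }
}
```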

In this role, you will:

  • Onboard new data sources - negotiate, agree, define and document effective IT data contracts with source data providers (security, formats, schemas, extraction and load schedules, SLAs, data validation rules, error scenarios, retry mechanisms, etc.)
  • Design, build, test and deploy performant and effective Cloud data ingest pipelines (GCP Data Fusion, Spark, etc.) via API, SFTP, etc. into the GCP data warehouse
  • Develop, test and deploy GCP Data Fusion custom plugins (see the sketch after this list).
  • Build automated tests to validate ETL pipelines. 
  • Handle incremental and full data loading strategies for structured and semi-structured data of medium-to-high volume, velocity and variety.
  • Perform data enrichment, standardisation, cleansing and aggregation in Data Fusion, ensuring data integrity, consistency and compliance with business and wider organisational standards, data governance and data sovereignty requirements.
  • Develop procedures and scripts for data migration, back-population, and feed-to-warehouse initialization.
  • Carry out required DataOps activities, ensuring pipeline health, performance and on-time delivery of data to consumers.
  • Protect solution data with masking and lineage capabilities as needed.
  • Review, refine, interpret and implement business and technical requirements.
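
For a flavour of the custom plugin work, the sketch below shows a minimal Data Fusion (CDAP) transform plugin that normalises a single string field. It assumes the standard CDAP transform plugin API; the class and field names are illustrative, not taken from the GBI codebase.

```java
package com.example.plugins;  // illustrative package

import io.cdap.cdap.api.annotation.Description;
import io.cdap.cdap.api.annotation.Name;
import io.cdap.cdap.api.annotation.Plugin;
import io.cdap.cdap.api.data.format.StructuredRecord;
import io.cdap.cdap.api.data.schema.Schema;
import io.cdap.cdap.etl.api.Emitter;
import io.cdap.cdap.etl.api.Transform;

// Hypothetical transform: trims and upper-cases one string field while
// passing every other field through unchanged.
@Plugin(type = Transform.PLUGIN_TYPE)
@Name("FieldNormalizer")
@Description("Trims and upper-cases a single string field.")
public class FieldNormalizer extends Transform<StructuredRecord, StructuredRecord> {

  private static final String FIELD = "trade_source";  // illustrative field name

  @Override
  public void transform(StructuredRecord input, Emitter<StructuredRecord> emitter) {
    StructuredRecord.Builder out = StructuredRecord.builder(input.getSchema());
    for (Schema.Field field : input.getSchema().getFields()) {
      Object value = input.get(field.getName());
      if (FIELD.equals(field.getName()) && value instanceof String) {
        value = ((String) value).trim().toUpperCase();
      }
      out.set(field.getName(), value);
    }
    emitter.emit(out.build());
  }
}
```
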
Requirements

To be successful in this role, you should meet the following requirements:

  • Tech stack:
    • GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager
    • Git, Ansible Tower / Ansible scripts, Jenkins
    • Java, Python, Terraform, Cloud Composer/Airflow
  • Must Have
    • Proven (3+ years) hands-on experience in designing, testing and implementing data ingestion pipelines on GCP Data Fusion, CDAP or similar tools, including ingestion, parsing and wrangling of CSV, JSON, XML and similarly formatted data from RESTful and SOAP APIs, SFTP servers, etc.
    • In-depth understanding of modern data contract best practices, with proven experience (3+ years) of independently directing, negotiating and documenting best-in-class data contracts.
    • Good Java knowledge and experience in Java development.
    • Proficiency in working with Continuous Integration (CI), Continuous Delivery (CD) and continuous testing tools, ideally for cloud-based data solutions.
    • Experience working in an Agile environment and with Agile toolsets.
    • Strong problem-solving and analytical skills
    • Enthusiastic willingness to learn and to develop technical and soft skills rapidly and independently, as needs require.
    • Strong organisational and multi-tasking skills.
    • Good team player who embraces teamwork and mutual support.

Nice to Have

  • Hands on experience in Cloud Composer/Airflow, Cloud Run, Pub/Sub
  • Hands on development in Python, Terraform
  • 2+ years' Java experience in development, testing and deployment (ideally of custom plugins for Data Fusion)
  • Strong SQL skills for data transformation, querying and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and on concurrency/data integrity (ideally in the BigQuery dialect); a minimal incremental-load sketch follows this list
  • Development, testing and implementation of data transformation/ETL/ELT pipelines, ideally in BigQuery
  • Experience working in a DataOps model
  • Experience in Data Vault modelling and usage.
  • Proficiency in Git usage for version control and collaboration.
  • Proficiency in designing, creating and maintaining CI/CD processes/pipelines in DevOps tools such as Ansible and Jenkins for cloud-based applications (ideally on GCP)
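
For a flavour of the BigQuery work, the sketch below runs an illustrative incremental (upsert) load as a MERGE statement through the BigQuery Java client; the dataset, table and key names are hypothetical placeholders.

```java
// Illustrative only: upsert a staging table into a target table with a
// BigQuery MERGE statement. All table and column names are hypothetical.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;

public class IncrementalMerge {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    String mergeSql =
        "MERGE `ops_metrics.daily_kpi` AS target\n"
      + "USING `ops_metrics.daily_kpi_staging` AS source\n"
      + "ON target.kpi_id = source.kpi_id AND target.business_date = source.business_date\n"
      + "WHEN MATCHED THEN\n"
      + "  UPDATE SET kpi_value = source.kpi_value, load_ts = source.load_ts\n"
      + "WHEN NOT MATCHED THEN\n"
      + "  INSERT (kpi_id, business_date, kpi_value, load_ts)\n"
      + "  VALUES (source.kpi_id, source.business_date, source.kpi_value, source.load_ts)";

    // Standard SQL is the default dialect for QueryJobConfiguration.
    bigquery.query(QueryJobConfiguration.newBuilder(mergeSql).build());
  }
}
```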

You’ll achieve more when you join HSBC.
www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India