Department: Application Solutions & Engineering
Business Unit: Financial Services (FSI)
Role Title: Senior Data Engineer
Reporting To: Data Engineering Chapter Lead / Head of Data & AI Engineering
Employment Type: Full-time / Permanent
1. Role Overview

The Senior Data Engineer is responsible for designing, building, and maintaining large-scale, secure, and high-performance data pipelines supporting critical Financial Services workloads. The role focuses on data modernization, regulatory data aggregation, and AI/ML enablement across domains such as Core Banking, Payments, Risk, Treasury, and Regulatory Reporting.

2. Key Responsibilities

- Design, implement, and optimize ETL/ELT data pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse.
- Build and operationalize real-time streaming pipelines leveraging Kafka / Confluent / Azure Event Hubs for risk and liquidity data.
- Integrate and transform data across Core Banking, Trade, Payments, Treasury, CRM, and Compliance systems.
- Implement data quality, validation, and lineage controls using tools such as Great Expectations / Deequ / dbt tests.
- Develop and maintain data models and schemas (3NF, Dimensional, Data Vault 2.0).
- Collaborate with Security and Governance teams to implement data security, masking, encryption, and tokenization in compliance with MAS TRM / PDPA / PCI-DSS.
- Participate in data platform modernization projects (Teradata / DB2 → Snowflake / Databricks / Synapse).
- Collaborate with Data Scientists and AI Engineers to deploy ML feature stores and model-serving pipelines.
- Support regulatory reporting (MAS 610/649) and Basel III/IV data flows.
- Maintain CI/CD pipelines for data infrastructure using Azure DevOps / Terraform / GitHub Actions.

3. Required Technical Skills

Category       | Tools / Technologies
Languages      | Python, PySpark, SQL, Scala
Data Platforms | Azure Data Lake, Synapse, Databricks, Snowflake
Orchestration  | Apache Airflow, Azure Data Factory, dbt
Streaming      | Kafka, Confluent, Event Hubs
Governance     | Apache Atlas, Azure Purview, Collibra
Security       | Encryption, RBAC, Tokenization, Audit Logging
CI/CD & IaC    | Terraform, Azure DevOps, GitHub Actions

4. Experience and Qualifications

- 6-10 years of experience in data engineering, with at least 3 years in BFSI (banking, insurance, or capital markets).
- Proven experience building real-time and batch data pipelines on Azure or AWS.
- Exposure to regulatory data models (MAS 610, Basel III, IFRS 9/17, BCBS 239).
- Familiarity with DevOps and MLOps integration.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications preferred: Microsoft Azure Data Engineer Associate, Databricks Data Engineer Professional, Snowflake SnowPro Core.

5. Key Attributes

- Strong analytical and problem-solving mindset.
- Ability to work across multi-disciplinary and geographically distributed teams.
- Excellent written and verbal communication skills.
- High accountability and ownership for quality and delivery.
Job Type: Contract
Pay: $3,966.62 - $12,175.27 per month