Data Engineer (IDMC, AWS, Redshift, ETL, Databricks)

SG, Singapore

Job Description

Responsibilities:



  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes for structured and unstructured data.
  • Define and implement data models and warehouse architectures (star/snowflake schema) for analytics and reporting.
  • Develop and optimize ETL workflows using IDMC (Informatica Intelligent Data Management Cloud).
  • Perform data ingestion, transformation, and loading across multiple sources such as Oracle, MSSQL, MySQL, and Teradata.
  • Manage and maintain databases across platforms (Oracle, MSSQL, MySQL, Teradata).
  • Write and optimize complex SQL queries and stored procedures, and perform performance tuning for large datasets.
  • Integrate and process large-scale data using Big Data technologies (e.g., Hadoop, Spark, Hive, or cloud equivalents such as AWS Glue, Redshift, S3, Azure Data Factory, or GCP Dataflow).
  • Implement data validation, profiling, and quality checks to ensure accuracy and reliability.
  • Collaborate with data governance teams to maintain metadata, lineage, and compliance with security standards.
  • Document data flows, integration processes, and design specifications for maintainability.
  • Support migration and modernization initiatives (e.g., moving from an on-premises DWH to cloud-based systems).



Requirements:



  • Minimum 5 years of experience in data warehousing, big data, or advanced analytics solutions.
  • Experience with databases (e.g., Oracle, MS SQL, MySQL, Teradata, Databricks).
  • Expertise in data repository design (e.g., operational data stores, data marts, data lakes).
  • Proficiency in data query techniques (e.g., SQL, NoSQL, Spark SQL).
  • Hands-on experience with Databricks (Delta Lake, MLflow, Spark).
  • Experience with Informatica Intelligent Data Management Cloud (IDMC) for data integration, transformation, and governance.
  • Must-have: strong knowledge of AWS cloud services (e.g., AWS Glue, Redshift, S3, Lambda, Kinesis, Athena, EMR).
  • Experience in building and optimizing ETL/ELT workflows using AWS native tools, Databricks, or IDMC.
  • Understanding of event-driven architectures and microservices.
  • Data modeling experience (e.g., Star Schema, Snowflake Schema).
  • Experience with data visualization tools (e.g., Power BI, Tableau).
  • Infrastructure as Code (IaC): Terraform, CloudFormation.
  • CI/CD and DevOps best practices for data pipelines and cloud infrastructure.
  • Identity and Access Management (IAM), security best practices, and data governance.
  • AWS Certified Solutions Architect - Associate or Professional.
  • Databricks Certified Data Engineer.
  • Informatica IDMC Certification.



Job Detail
Job Detail

  • Job Id
    JD1665587
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    SG, Singapore
  • Education
    Not mentioned