Big Data Platform Engineer

Singapore, Singapore

Job Description


Job Title: Big Data Platform Engineer
Location: Singapore
Team: Global Data Platform
We are looking for a passionate and experienced Big Data Platform Engineer to join our dynamic Global Data Platform team. This role offers the opportunity to work on cutting-edge technologies and contribute to building and operating resilient, scalable, and secure data platforms.
Key Responsibilities:

  • Manage and operate core Global Data Platform components such as VM Servers, Kubernetes, Kafka, and applications within the Apache stack, including Collibra, Dataiku, and similar tools.
  • Automate infrastructure and security components, and implement CI/CD pipelines to ensure seamless and efficient execution of ELT/ETL data pipelines.
  • Enhance data pipeline resilience through monitoring, alerting, and health checks, ensuring high standards of data quality, timeliness, and accuracy.
  • Apply DevSecOps principles and Agile methodologies to deliver robust and integrated platform solutions incrementally.
  • Collaborate with enterprise security, digital engineering, and cloud operations teams to define and agree on architectural solution frameworks.
  • Investigate system issues and incidents, identify root causes, and implement continuous improvements to optimize platform performance.
  • Stay up to date with emerging technologies and industry trends to drive innovation and new feature development.
Required Skills and Experience:
  • Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
  • 5-7 years of experience designing or building large-scale, fault-tolerant distributed systems (e.g., data lakes, data meshes, streaming data platforms).
  • Strong hands-on expertise with distributed technologies like Kafka, Kubernetes, Spark, and the broader Hadoop ecosystem.
  • Experience in storage migration (e.g., from HDFS to S3 or similar object storage).
  • Proficiency in integrating streaming and batch ingestion pipelines using tools such as Kafka, Control-M, or AWA.
  • Demonstrated experience with DevOps and automation tools such as Jenkins and Octopus; familiarity with Ansible, Chef, XL Release, or XL Deploy is a plus.
  • Strong programming skills in Python and Java (or other languages such as Scala or R), along with Linux/Unix scripting, automation using Jinja and Puppet, and firewall configuration.
  • Experience with Kubernetes pod scaling, Docker image management via Harbor, and CI/CD deployment of containers.
  • Familiarity with data serialization formats such as Parquet, ORC, or Avro.
  • Exposure to machine learning and Data Science platforms like Dataiku is a plus.
  • Cloud migration experience is advantageous.
  • Comfortable working in Agile environments (e.g., Scrum, SAFe).
  • Knowledge of the financial services industry and its products is a strong asset.
Soft Skills:
  • Excellent communication skills with the ability to collaborate across technical and business teams.
  • Detail-oriented and highly organized, with strong prioritization and multitasking abilities.
  • Proactive, customer-focused, and collaborative approach to problem-solving and project execution.
  • A strong advocate for data-driven culture and the democratization of data across the organization.


Job Detail

  • Job Id
    JD1518902
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Singapore, Singapore
  • Education
    Not mentioned