Data Engineer

Singapore, Singapore

Job Description


At Singtel, our mission is to Empower Every Generation. We are dedicated to fostering an equitable and forward-thinking work environment where our employees experience a strong sense of Belonging, make a meaningful Impact, and Grow both personally and professionally. By joining Singtel, you will be part of a caring, inclusive and diverse workforce that creates positive impact and a sustainable future for all.

Be a Part of Something BIG!

The Singtel Group IT team delivers the right technologies and innovative capabilities. We ignite our digital future together as we deliver innovative technologies and capabilities in partnership with our business stakeholders, customers, our partners and each other. We want to deliver solutions that enable inspiration, education, optimism, and connections, allowing people to work smarter as they pioneer, entertain, learn, and dream.

This role is responsible for designing and delivering data engineering products, solutions, and data pipelines, and for running operations.

We are committed to celebrating inclusion and diversity and strongly believe in upskilling and nurturing all individuals. Come join us today as we build our team, and Empower Every Generation to live, work and play in new ways!

Make an Impact by:

  • Design, develop, and automate large-scale, high-performance distributed data pipelines (batch and/or real-time streaming) that meet both functional and non-functional requirements
  • Deliver high-level and detailed designs that meet business requirements and align with data architecture principles and technology stacks
  • Partner with business domain experts, data scientists, and solution designers to identify relevant data assets, domain data models, and data solutions. Collaborate with product data engineers to coordinate backlog feature development of data pipeline patterns and capabilities
  • Drive Modern Data Platform operations using DataOps, ensuring data quality and monitoring data systems; also support the Data Science MLOps platform
  • Drive and deliver industry-standard DevOps (CI/CD) best practices, automating development and release management
  • Support and contribute to data pipeline documentation and development guidelines and standards for data pipelines, data models, and layer design
Skills for Success
  • Bachelor's degree in IT, Computer Science, Software Engineering, Business Analytics, or equivalent
  • Minimum of 5 years of experience in Data Engineering, Data Lake infrastructure, Data Warehousing, Data Analytics tools, or related areas, designing and developing end-to-end scalable data pipelines and data products
  • Experience building and operating large, robust distributed data lakes (multiple PBs) and deploying high-performance, reliable systems with monitoring and logging practices
  • Experience designing and building data pipelines using some of the most scalable and resilient open-source big data technologies: Spark, Delta Lake, Kafka, Airflow, and related distributed data processing frameworks
  • Build and deploy high-performance, modern data engineering and automation frameworks using programming languages such as Scala or Python, and automate big data workflows such as ingestion, aggregation, and ETL processing
  • Good understanding of data modeling, high-level design, and data engineering / software engineering best practices, including error handling and logging, system monitoring, fault-tolerant pipelines, data quality, and ensuring deterministic pipelines with DataOps
  • Excellent experience using ANSI SQL with relational databases such as Postgres, MySQL, and Oracle, and knowledge of advanced SQL on distributed analytics engines
  • Experience working with Telco Data Warehouse and/or Data Lake engines such as Databricks SQL, Snowflake, etc.
  • Proficiency in programming languages such as Scala, Python, or Java, or scripting languages such as Bash
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform, including cloud data engineering experience in at least one cloud (Azure, AWS, GCP) and experience with Databricks (Cloud Data Lakehouse)
  • Experience with the Hadoop stack: HDFS, YARN, Hive, HBase, Cloudera, Hortonworks
  • Experience building and deploying using CI/CD toolkits
Rewards that Go Beyond
  • Hybrid work arrangements
  • Full suite of health and wellness benefits
  • Ongoing training and development programs
  • Internal mobility opportunities
Your Career Growth Starts Here. Apply Now!

We are committed to a safe and healthy environment for our employees & customers and will require all prospective employees to be fully vaccinated.

Singtel




Job Detail

  • Job Id
    JD1367827
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Singapore, Singapore
  • Education
    Not mentioned