Data Analyst/Engineer

Toa Payoh, S00, SG, Singapore

Job Description

You tame messy data, model truth, and ship dashboards people actually use. You move at a fast clock speed with your sleeves rolled up. You'll build our Data Hub so every team pulls from one clean, performant source, freeing product engineers to ship product.

Mission

Own the pipelines, models, and metrics layer that power decisions at Grain. Turn chaos into clarity: reliable datasets, crisp dashboards, and self-serve tools that scale.

Outcomes (first 90 days)

Data Hub foundation: documented sources, core datasets modelled and tested

Leadership has reliable data for top decision areas (such as demand forecasting, product and channel performance breakdowns, and customer retention); no more "let me check and get back to you"

Team productivity: a 30% reduction in ad-hoc requests as teams self-serve common questions

Responsibilities

Build data pipelines: pull data from our systems (sales, inventory, finance, marketing), clean it, and make it queryable
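
To make this concrete: below is a minimal extract-clean-load sketch in Python, using an in-memory SQLite database only so it runs anywhere. The raw_orders/stg_orders tables and their columns are invented for illustration; they are not our actual schemas.

    # Minimal extract-clean-load sketch. Table and column names are invented;
    # a real pipeline would read from the sales/inventory/finance/marketing
    # systems and load a proper warehouse, not SQLite.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT, channel TEXT)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [("A-1", "120.50", "online"), ("A-2", " 80", "Retail "), ("A-2", " 80", "Retail ")],
    )

    # Clean: cast amounts to numbers, normalise channel labels, drop duplicates.
    conn.execute("""
        CREATE TABLE stg_orders AS
        SELECT DISTINCT
            order_id,
            CAST(TRIM(amount) AS REAL) AS amount,
            LOWER(TRIM(channel))       AS channel
        FROM raw_orders
    """)

    # The cleaned table is now queryable by anyone downstream.
    for channel, revenue in conn.execute(
        "SELECT channel, SUM(amount) FROM stg_orders GROUP BY channel"
    ):
        print(channel, revenue)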

Create the metrics layer: define key business metrics once so everyone uses the same definitions (no more "which revenue number is right?")
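
As a rough sketch of the idea (in practice this layer would more likely live in dbt or a dedicated semantic layer), picture a single registry where each metric is defined exactly once as SQL. The METRICS names and the stg_orders table below are invented:

    import sqlite3

    # Hypothetical single source of metric definitions: each business metric is
    # written once, so every dashboard and ad-hoc query agrees on what it means.
    METRICS = {
        "gross_revenue": "SELECT COALESCE(SUM(amount), 0) FROM stg_orders",
        "orders": "SELECT COUNT(DISTINCT order_id) FROM stg_orders",
    }

    def metric(conn: sqlite3.Connection, name: str):
        """Evaluate a registered metric; unknown names raise KeyError."""
        return conn.execute(METRICS[name]).fetchone()[0]

    # Demo against a throwaway table with the same illustrative schema as above.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_orders (order_id TEXT, amount REAL, channel TEXT)")
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                     [("A-1", 120.5, "online"), ("A-2", 80.0, "retail")])
    print(metric(conn, "gross_revenue"))   # 200.5
    print(metric(conn, "orders"))          # 2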

Ship dashboards people use: fast, clear visualisations that answer real questions; teach teams to find their own answers

Keep it fast and cheap: optimise queries, manage warehouse costs, monitor performance

Ensure quality: write tests, set up alerts when data breaks, establish freshness expectations
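
For flavour, a small sketch of the kind of check this means. In practice such tests would usually live in dbt or the orchestrator; the assert_fresh_and_populated helper and the table and column names below are invented:

    import sqlite3
    from datetime import datetime, timedelta, timezone

    def assert_fresh_and_populated(conn, table, loaded_at_col, max_lag_hours=24):
        """Raise if the table is empty or its latest load breaches the freshness SLA."""
        # Illustrative only: table/column names are trusted here, which you
        # would not do with untrusted input.
        rows, latest = conn.execute(
            f"SELECT COUNT(*), MAX({loaded_at_col}) FROM {table}"
        ).fetchone()
        if rows == 0:
            raise AssertionError(f"{table} is empty")
        lag = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
        if lag > timedelta(hours=max_lag_hours):
            raise AssertionError(f"{table} is stale: last loaded {latest}")

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_orders (order_id TEXT, loaded_at TEXT)")
    conn.execute("INSERT INTO stg_orders VALUES ('A-1', ?)",
                 (datetime.now(timezone.utc).isoformat(),))
    assert_fresh_and_populated(conn, "stg_orders", "loaded_at")
    print("stg_orders passed freshness and row-count checks")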

Protect customer data: handle PII safely, control who sees what, maintain audit trails
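
A tiny, purely illustrative example of the mindset (not a description of our actual controls): keyed hashing lets downstream models join on a stable token without exposing the raw identifier. The pseudonymise helper and the key below are placeholders:

    import hashlib
    import hmac

    # Illustrative pseudonymisation: downstream models join on the token, never
    # on the raw email. The key is a placeholder; a real key would come from a
    # secrets manager, and access to raw PII would stay tightly scoped.
    SECRET_KEY = b"replace-with-a-managed-secret"

    def pseudonymise(email: str) -> str:
        normalised = email.strip().lower().encode("utf-8")
        return hmac.new(SECRET_KEY, normalised, hashlib.sha256).hexdigest()

    print(pseudonymise("Jane.Doe@example.com"))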

Push data where it's needed: send clean data back to sales, marketing, and support tools

Raise the bar: write docs people actually read, run demos, make everyone more data-literate

Competencies

Strong SQL: efficient, readable queries that answer tough questions. You understand execution plans and can optimise slow queries
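
If you want a feel for the execution-plan side, here is a small example using SQLite (chosen only because it runs anywhere; the same habit applies to BigQuery, Snowflake, Redshift, or Postgres plans). The orders table is invented:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
    )

    query = "SELECT SUM(amount) FROM orders WHERE customer_id = ?"

    # Without a suitable index the plan shows a full scan of the table...
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

    # ...and after adding one it switches to an index search.
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())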

Data modelling: fact/dimension tables, slowly changing dimensions, event schemas, and when to break the rules
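
By way of illustration only, the type 2 slowly-changing-dimension idea in miniature, with invented data and plain Python structures: a change closes the current row and opens a new one instead of overwriting history.

    from datetime import date

    # Toy customer dimension with type 2 history: one row per version,
    # valid_to = None marks the current record.
    dim_customer = [
        {"customer_id": 7, "segment": "consumer",
         "valid_from": date(2024, 1, 1), "valid_to": None},
    ]

    def apply_scd2_change(dim, customer_id, new_segment, changed_on):
        """Close the customer's current row and append a new current row."""
        for row in dim:
            if row["customer_id"] == customer_id and row["valid_to"] is None:
                if row["segment"] == new_segment:
                    return  # nothing changed, keep history as-is
                row["valid_to"] = changed_on
        dim.append({"customer_id": customer_id, "segment": new_segment,
                    "valid_from": changed_on, "valid_to": None})

    apply_scd2_change(dim_customer, 7, "business", date(2024, 6, 1))
    for row in dim_customer:
        print(row)
    # Queries "as of" May 2024 still see the customer as "consumer".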

Modern stack: dbt + orchestration (Airflow/Dagster/Prefect), cloud warehouse (BigQuery/Snowflake/Redshift/Postgres)

BI tools: you've built dashboards people actually use (Looker, Metabase, Superset, or similar)

Quality: testing, lineage, monitoring; you catch issues before stakeholders do

Stakeholder work: turn "show me engagement" into concrete metrics and actionable insights

Bonus points: Python scripting, CI/CD, infrastructure-as-code (Terraform), event tracking, privacy-by-design thinking

How we work

Problem > Prototype > Prove value > Productionise. One source of truth > many spreadsheets. Stewardship over flash: reliable, observable, cost-aware. We value integrity, excellence, and service: use data to uplift people.

What's in it for you

Autonomy and ownership of the data stack. Ship work used daily by every function. Mentorship, with growth into a Staff/Analytics Engineer or Data Platform Engineer path. Competitive compensation and birthday leave.

What to include in your application

CV or LinkedIn + GitHub (if any).

2-3 dashboards or repos you've built (screenshots/links), with a short note on impact.

A one-pager: your approach to building a "Data Hub" in 90 days.

Interview process (typical)

Intro (45m): values, motivations, how you approach messy data.

Technical deep dive (60-90m): SQL/design exercise + performance tuning discussion.

Take-home (3-4h max): model a tiny domain in dbt + a dashboard; include tests & docs.

Stakeholder panel (45m): walk-through, trade-offs, storytelling.

References.

Job Detail

  • Job Id
    JD1660936
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Toa Payoh, S00, SG, Singapore
  • Education
    Not mentioned