We’re building a next-generation data platform at Kitchen Warehouse and we need someone who can own the blueprint and build it too. As our Data Architect you will design, implement, and govern a scalable Medallion Architecture on Google Cloud Platform, then roll up your sleeves and deliver the critical pipelines and data models that bring that architecture to life.
This is not a “draw diagrams and hand them off” role. You will be the design authority for our data ecosystem while staying deeply hands-on with BigQuery, dbt, Fivetran, and LookML every day. You’ll set the standards the team follows, and you’ll be the first person to live by them.
Key Responsibilities
- Platform Architecture: Design and continuously evolve the end-to-end data architecture: Bronze/Silver/Gold layers in BigQuery, ingestion patterns via Fivetran, and transformation orchestration through dbt.
- Hands-On Engineering: Build and maintain production data pipelines, dbt projects (macros, tests, documentation, CI/CD), and BigQuery datasets. You architect it, you ship it.
- Governance & Data Quality: Implement data governance using Google Dataplex for cataloguing, lineage, and quality rules. Define and enforce PII masking strategies via Google Sensitive Data Protection (SDP).
- Semantic Layer Ownership: Author and maintain LookML models to create a single source of truth for business metrics, dimensions, measures, explores, and derived tables.
- Performance & Reliability: Optimise BigQuery slot utilisation, partition/cluster strategies, and query performance. Monitor pipeline health and build automated alerting.
- BI Enablement: Partner with analysts to enable high-performance reporting in Looker and Power BI, ensuring dashboards are backed by governed, Gold-layer data.
- Standards & Documentation: Define naming conventions, code review standards, branching strategies, and architectural decision records (ADRs) for the data team.
- Team Uplift: Mentor junior data engineers on best practices, dbt modelling patterns, SQL optimisation, and architecture thinking.
Qualifications & Skills
- Experience: 5+ years in data engineering or analytics engineering, with at least 2 years in an architecture or tech-lead capacity.
- GCP Ecosystem: Deep, production-level experience with BigQuery (partitioning, clustering, materialised views, SQL optimisation) and broader GCP services.
- dbt Mastery: Proven track record managing complex dbt projects end-to-end: modular SQL, custom macros, data tests, documentation, and CI/CD integration.
- Data Integration: Hands-on experience with Fivetran (or equivalent managed ELT tools) for source-to-warehouse ingestion at scale.
- Semantic Modelling: Experience writing and maintaining LookML (dimensions, measures, explores, derived tables) or an equivalent semantic layer.
- Governance Mindset: Familiarity with data cataloguing, lineage tracking, and PII management tooling (Dataplex, SDP, or equivalents).
- Communication: Ability to translate architectural decisions into plain language for non-technical stakeholders, e.g., articulating why a Silver layer exists before Gold.
- Version Control: Comfortable with Git-based workflows, pull request reviews, and CI/CD for data assets.
Nice to Have
- Experience in Retail and/or eCommerce data environments (POS, inventory, order management, promotional analytics).
- Familiarity with Medallion Architecture patterns (or Lakehouse architecture) in a production setting.
- Experience with Apache Airflow or similar orchestration tools.
- Power BI development experience (DAX, data modelling, dataflows).
- Google Cloud certifications (e.g., Professional Data Engineer).
What We Offer
- A greenfield opportunity to design a data platform from scratch, not maintain someone else’s legacy.
- Direct partnership with the Head of Engineering & Architecture: short feedback loops and real influence on technical direction.
- A modern, opinionated stack: GCP, BigQuery, dbt, Fivetran, Looker, Dataplex.
- Competitive compensation aligned with senior-level technical expertise.
