
Data Engineer

Noise Digital
Greater Vancouver Metropolitan Area
Full-time
Tools: Cloud Composer, BigQuery
Applications go directly to the hiring team

Full Description

Noise Digital is looking for a Data Engineer to join our engineering team. This role focuses on the technical execution of data pipelines within our Google Cloud Platform environment. You will work alongside our lead engineers to develop orchestration workflows in Cloud Composer (Apache Airflow) and manage data models within BigQuery.

This is a hands-on technical role centered on building reliable data infrastructure, designing schemas, and developing custom integration logic.

Core Responsibilities

* Pipeline Orchestration: Develop, test, and maintain Airflow DAGs in Cloud Composer. This includes managing task dependencies, implementing custom operators, and monitoring production workflows.

* Data Modeling & Architecture: Implement schema designs and table structures in BigQuery. You will be responsible for creating ERDs, optimizing query performance through partitioning/clustering, and managing Dataform scripts for SQL transformations.

* Python Development: Write modular Python for custom API integrations, data processing tasks, and internal engineering tools.

* Team Collaboration: Work within the existing engineering framework to support project-specific data requirements, following established Git-based CI/CD workflows and documentation standards.

* Performance Optimization: Assist in the audit and optimization of BigQuery resources to manage processing costs and improve execution speeds.

Technical Requirements

* GCP Experience: Professional experience working within the Google Cloud Platform ecosystem.

* Cloud Composer / Apache Airflow: Proven experience building and troubleshooting complex orchestration workflows.

* Python: Proficiency in Python for data engineering and systems integration.

* SQL & BigQuery: Expert-level SQL skills and experience with structured data modeling, including schema design and ERDs.

* Version Control: Experience using Git/GitHub for collaborative development and CI/CD.

Preferred Qualifications

* Certification: GCP Professional Data Engineer (PDE).

* Domain Experience: Experience modeling data for marketing and/or tourism performance/attribution.

* AI Exposure: Experience building LLM agents or data retrieval layers for LLM agents (RAG, embeddings, MCP).

* Tooling: Hands-on experience with Dataform or dbt for version-controlled SQL modeling.

Working Conditions

* This position operates under a hybrid work model. Employees are currently required to work from the office once per week to foster team collaboration and culture. While the role primarily allows for work-from-home arrangements, the required in-office frequency is subject to change based on evolving company and departmental needs.
