AI Data Engineer - GCP
Eliassen Group
Remote
Our client seeks an AI Data Engineer focused on GCP and AI to unify BigQuery data products and automate development workflows. This is a one-year contract-to-hire role. The role will design and operate LLM-driven multi-agent systems that generate code, validate data, and orchestrate pipelines across GCP. Success will be measured by improved delivery speed, data quality, and scalability for analytics and AI use cases across the enterprise.
Due to client requirements, applicants must be willing and able to work on a W2 basis. We are seeking candidates who can convert to full-time permanent employment without visa sponsorship or transfer. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, a 401(k) with company matching, and life insurance.
Rate: $55.00 to $60.00/hr. (W2)
Responsibilities
* Build and operationalize LLM-based multi-agent workflows that:
  * Read Jira stories and interpret requirements.
  * Generate SQL, Python, and Dataform-ready code.
  * Perform automated data validation and unit testing.
  * Prepare artifacts for deployment and raise pull requests.
* Apply agent patterns such as planners, evaluators, and tool-using agents to automate engineering tasks.
* Integrate LLM agents with BigQuery, Python, ETL/ELT pipelines, and internal frameworks.
* Automate repetitive development and testing steps to reduce manual engineering work.
* Collaborate with teams exploring VS Code Copilot, Auto Model, Cloud Model Sonnet, Gemini MCP, and Claude Code.
* Support multiple Orion squads including SKU, Customer, Fulfillment, Stores, DCs, and Website, and contribute to future program waves.
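As a rough illustration of the responsibilities above, the sketch below wires a planner, a code-generating agent, and an evaluator into a simple loop. All names here (`Story`, `plan`, `generate`, `evaluate`, `run_workflow`) are hypothetical stand-ins, not part of any stated framework; in a real system the stubbed steps would call an LLM and BigQuery, and approved artifacts would be packaged into a pull request.

```python
# Hypothetical sketch of a planner -> generator -> evaluator agent loop.
# The stubbed bodies stand in for LLM calls and data-validation queries.

from dataclasses import dataclass


@dataclass
class Story:
    key: str
    summary: str


def plan(story: Story) -> list[str]:
    # Planner agent: decompose the Jira story into engineering tasks.
    return [
        f"generate SQL for: {story.summary}",
        f"write unit tests for: {story.summary}",
    ]


def generate(task: str) -> str:
    # Code-generating agent (stub): would prompt an LLM for SQL/Python.
    return f"-- artifact for task: {task}"


def evaluate(artifact: str) -> bool:
    # Evaluator agent (stub): would run validation queries and unit tests.
    return artifact.startswith("--")


def run_workflow(story: Story) -> list[str]:
    # Orchestrate the agents; keep only artifacts that pass evaluation.
    approved = []
    for task in plan(story):
        artifact = generate(task)
        if evaluate(artifact):
            approved.append(artifact)
    return approved  # a later step would raise a pull request from these


if __name__ == "__main__":
    story = Story("ORION-123", "unify SKU data product in BigQuery")
    for artifact in run_workflow(story):
        print(artifact)
```

The planner/evaluator split mirrors the agent patterns named in this posting: the planner decomposes work, the generator produces artifacts, and the evaluator gates what moves toward deployment.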
Experience Requirements
* 5+ years total engineering experience.
* Strong Python development background.
* Hands-on experience with LLM agents, multi-agent frameworks, or AI orchestration patterns.
* Strong understanding of prompt engineering, context management, and agent-based task decomposition.
* Solid data engineering experience across ETL/ELT pipelines, data modeling, orchestration tools, and cloud data ecosystems.
* Experience with GCP and BigQuery, or AWS with some BigQuery exposure.
* Ability to integrate LLM agents with data pipelines, APIs, and enterprise systems.
Preferred Skills
* Exposure to VS Code Copilot
* Exposure to Auto Model
* Exposure to Cloud Model Sonnet
* Exposure to Gemini MCP
* Exposure to Claude Code
* Familiarity with knowledge graph concepts
* Experience with Dataform or Jenkins-based orchestration