AI Data Engineer
DeepRec.ai
AI Data Engineer – Python, Snowflake & Machine Learning (Contract, Remote)
We’re looking for an experienced AI Data Engineer to build scalable data pipelines and deploy production-ready machine learning solutions within a modern cloud data platform.
The role:
Own end-to-end data and ML workflows, from ingestion and transformation through to model deployment and monitoring, with a focus on reliability and scalability in Snowflake and Python.
What you’ll be doing
* Build and maintain data ingestion pipelines into Snowflake (structured and time-series data)
* Prepare ML-ready datasets (feature engineering, aggregations, train/test splits)
* Develop, train, and deploy ML models using Python (scikit-learn, XGBoost, LightGBM)
* Operationalise ML workflows in Snowflake using Snowpark
* Write model outputs back to Snowflake for downstream use
* Monitor pipelines and models, including data quality checks and retraining triggers
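As one concrete illustration of the "train/test splits" duty above: for time-series data, splits are normally chronological rather than random, so that no future rows leak into training. A minimal standard-library sketch (function name and record keys are illustrative, not part of this posting):

```python
from datetime import date

def chronological_split(rows, ts_key, cutoff):
    """Split time-ordered records for model training without leakage:
    rows strictly before the cutoff go to train, the rest to test."""
    train = [r for r in rows if r[ts_key] < cutoff]
    test = [r for r in rows if r[ts_key] >= cutoff]
    return train, test

# Example: four daily records split at 2024-01-03
rows = [{"day": date(2024, 1, d), "value": d * 1.5} for d in (1, 2, 3, 4)]
train, test = chronological_split(rows, "day", date(2024, 1, 3))
```

In practice the same cutoff logic would run as a Snowflake SQL filter or a Snowpark DataFrame operation rather than in plain Python, but the leakage-avoidance principle is identical.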
Technical environment
* Snowflake (SQL, Streams, Tasks, Snowpipe)
* Python for data engineering and ML (including Snowpark)
* ML frameworks: scikit-learn, XGBoost, LightGBM
* Time-series data processing (desirable)
* Azure Data Lake / Microsoft Fabric (nice to have)
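The monitoring duty listed earlier (data quality checks and retraining triggers) can be as simple as comparing recent feature statistics against a training-time baseline. A deliberately minimal stdlib sketch, assuming a mean-shift heuristic; the function name and threshold are illustrative, and production systems typically use proper distribution tests instead:

```python
import statistics

def retraining_needed(baseline, recent, z_threshold=3.0):
    """Flag retraining when the recent feature mean drifts more than
    z_threshold baseline standard deviations from the baseline mean.
    A simple drift heuristic; real pipelines often use statistical
    tests (e.g. Kolmogorov-Smirnov) over full distributions."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    if sigma == 0:
        # Degenerate baseline: any change at all counts as drift
        return statistics.mean(recent) != mu
    z = abs(statistics.mean(recent) - mu) / sigma
    return z > z_threshold
```

In a Snowflake setup, a check like this could run on a schedule via Tasks, reading recent rows from a Stream and writing the drift flag back for the retraining pipeline to pick up.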
Contract details
* Initial contract with strong extension potential
* Fully remote (must be based in Europe)
* €300-€325 per day