Azure Data Engineer

Aarorn Technologies Inc
Toronto, Ontario, Canada
Contract
CAD $50 – $55 / hour
Key technologies:
Azure Data Factory
Azure SQL MI
Azure Functions

Job Title: Azure Data Engineer

Location: Toronto, ON (onsite 3 days per week)

Employment Type: Contract

Pay Rate: CAD $50 – $55/hr INC

Job Description

Design, build, and maintain Azure Data Factory (ADF) pipelines for batch and event-driven data integration. Develop ETL workflows with Azure SQL MI, Azure Functions, and other Azure services. Support event-based data processing, handle structured and semi-structured data, and implement monitoring, error handling, and performance tuning. Collaborate across teams to ensure secure, reliable, and optimized data delivery.

Key Skills & Experience

* Strong hands-on ADF development

* ETL and SQL (Azure SQL MI) expertise

* Azure Functions and event-driven architectures

* Data integration patterns and pipeline orchestration

* Apache Kafka (producer/consumer, event streaming)

* Messaging/streaming platforms in Azure or hybrid setups

* Enterprise data platform experience

* Ability to work independently

Detailed Job Description

* Design, develop, and maintain Azure Data Factory (ADF) pipelines for batch and event-driven data integration.

* Build and optimize ETL workflows integrating Azure SQL MI, Azure Functions, and other Azure services.

* Develop and support event-based data processing solutions.

* Work with structured and semi-structured data sources.

* Implement error handling, logging, monitoring, and performance tuning for data pipelines.

* Collaborate with platform, application, and downstream consumer teams to ensure reliable data delivery.

* Follow best practices for security, performance, and maintainability in Azure data solutions.

Mandatory Skills & Experience

* Strong hands-on experience with Azure Data Factory (ADF) development

* Solid experience in ETL development and SQL (Azure SQL MI)

* Experience with Azure Functions

* Proven experience with event-based or event-driven architectures

* Strong understanding of data integration patterns and pipeline orchestration

* Ability to work independently

* Apache Kafka experience (producer/consumer concepts, event streaming)

* Exposure to messaging or streaming platforms in Azure or hybrid environments

* Prior experience working in enterprise data platforms

Disclaimer: AI tools may assist in the recruitment process; however, all hiring decisions are made by the recruitment team based on a comprehensive evaluation of candidates.