Data Engineer
KTek Resourcing

As a Data Engineer at KTek Resourcing, you'll have the opportunity to shape scalable Lakehouse data platforms using advanced tools like Databricks on Google Cloud Platform. This role allows you to influence data migration strategies and collaborate with multidisciplinary teams to innovate and enhance data-driven solutions.
Skills & Expertise
Key Responsibilities
Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
Lead the migration of jobs from other cloud data platforms to Databricks.
Collaborate with data engineers and analytics teams to enable scalable data products.
Full Description
Role name: GCP + Databricks Architect (10+ years)
Responsibilities:
* Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
* Architect integrations and migrations from native Google Cloud Platform data services to Databricks on GCP. (MUST)
* Design robust batch and streaming data pipelines leveraging Apache Spark, Structured Streaming, and modern ELT patterns.
* Lead the migration of jobs from other cloud data platforms to Databricks.
* Implement secure data governance, access control, and lineage using Unity Catalog.
* Optimize performance and manage compute costs through efficient cluster configuration and Spark workload tuning.
* Collaborate with data engineers, analytics teams, and ML engineers to enable scalable data products and analytics.
* Define best practices for data quality, reliability, and observability across the data platform.
* Provide technical leadership, architecture guidance, and mentorship to data engineering teams.
Requirements:
* 10+ years of experience in data engineering or data architecture on Big Data platforms.
* 3+ years of hands-on experience with Databricks platform architecture.
* Strong expertise in Apache Spark, Delta Lake, Databricks architecture, SQL, Python, and GCP platform architecture.