Data Engineer
KTek Resourcing
Role name: GCP + Databricks Architect (10+ years)
Responsibilities:
* Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
* Architect integrations and migrations from native Google Cloud Platform services to Databricks on GCP. (Must-have)
* Design robust batch and streaming data pipelines leveraging Apache Spark, Structured Streaming, and modern ELT patterns.
* Lead migration of jobs from other cloud data platforms to Databricks.
* Implement secure data governance, access control, and lineage using Unity Catalog.
* Optimize performance and manage compute costs through efficient cluster configuration and Spark workload tuning.
* Collaborate with data engineers, analytics teams, and ML engineers to enable scalable data products and analytics.
* Define best practices for data quality, reliability, and observability across the data platform.
* Provide technical leadership, architecture guidance, and mentorship to data engineering teams.
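To illustrate the pipeline work described above, here is a minimal PySpark sketch of incremental ingestion into a bronze Delta table using Databricks Auto Loader and Structured Streaming. The catalog, schema, table, and GCS paths are illustrative assumptions, not part of this role description:

```python
# Sketch: incremental ingestion from GCS into a bronze Delta table on Databricks.
# All names (catalog, schema, bucket, checkpoint path) are hypothetical examples.

def delta_table_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level Unity Catalog name used to address a Delta table."""
    return f"{catalog}.{schema}.{table}"

def start_bronze_stream(spark, source_path: str, target_table: str, checkpoint: str):
    """Incrementally load raw JSON files with Auto Loader (cloudFiles) and write
    them to a Delta table via Structured Streaming. Requires a Databricks runtime."""
    return (
        spark.readStream
        .format("cloudFiles")                        # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", checkpoint)
        .load(source_path)                           # e.g. "gs://raw-events/"
        .writeStream
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)                  # process available files, then stop
        .toTable(target_table)                       # e.g. "main.bronze.events"
    )
```

The `availableNow` trigger lets the same streaming code serve the batch pattern: each scheduled run drains whatever files have arrived and exits, keeping one code path for both modes.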
Requirements:
* 10+ years of experience in data engineering or data architecture in Big Data platforms.
* 3+ years hands-on experience with Databricks platform architecture.
* Strong expertise in Apache Spark, Delta Lake, Databricks architecture, SQL, Python, and GCP platform architecture.
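The Unity Catalog governance responsibility listed above follows a catalog → schema → table privilege model. As a sketch, the helper below generates the GRANT statements for read-only access to one schema; the catalog, schema, and group names are hypothetical:

```python
# Sketch: minimal Unity Catalog GRANTs for read-only access to a schema.
# Principal and object names below are illustrative assumptions.

def uc_read_grants(catalog: str, schema: str, principal: str) -> list:
    """Return the GRANT statements for read-only access to one schema.
    USE CATALOG and USE SCHEMA are prerequisites for SELECT to take effect."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{principal}`",
        f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{principal}`",
    ]

# Each statement would be run via spark.sql(...) or the Databricks SQL editor.
```

Granting SELECT at the schema level (rather than per table) keeps access control coarse-grained and auditable, which matches the governance-by-catalog approach Unity Catalog encourages.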