Big Data Developer
Galent
Full Description
Key Responsibilities
* Design, develop, and maintain scalable ETL pipelines using Big Data technologies such as Hadoop, Spark, and Hive
* Work extensively with Snowflake for data warehousing, data integration, and performance optimization
* Develop and maintain data processing applications using Java or Scala
* Build and optimize complex SQL queries and data models
* Integrate data from various sources using ETL tools like Informatica, Talend, or Apache Airflow
* Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
* Work with cloud platforms (preferably AWS) for deploying and managing data solutions
* Leverage generative AI tools to enhance development productivity and automate code generation
* Ensure data quality, reliability, and performance of data pipelines
* Implement best practices for CI/CD, version control, and infrastructure-as-code
Required Skills
* Strong experience with Big Data technologies: Hadoop, Spark, Hive
* Expertise in Snowflake platform
* Proficiency in Java or Scala programming
* Advanced knowledge of SQL and data modeling
* Experience with ETL tools (Informatica, Talend, Apache Airflow)
* Familiarity with AWS or other cloud platforms
* Understanding of API development