Data Engineer (Python & AWS)
SoftBlue SA
Join an experienced team and build modern IT solutions together for international customers. If you prefer working in a friendly environment without unnecessary formalities and care about the quality of the code you write, you are in the right place!
SoftBlue SA is a place full of passion, mutual inspiration and challenges. We have a modern Research and Development Center, where we develop and test innovative solutions in programming and IoT.
We are looking for a highly skilled Data Engineer to join our client in the healthcare sector.
Our Requirements
Technical Expertise:
* Python Programming: 10+ years of experience with strong Software Engineering skills focused on data processing.
* AWS & Serverless: Extensive experience with AWS Cloud, specifically focusing on Serverless Architecture and services (including AWS Lambda, AWS S3 Tables, and AWS Cognito).
* Automated Testing: Proven experience in developing End-to-End (E2E) automated tests to ensure pipeline reliability, utilizing tools such as Boto3 for AWS resource validation.
* Data Concepts: Practical knowledge of Data Mesh and Medallion Architecture, along with general data processing and analysis.
* DevOps & IaC: Hands-on experience with Infrastructure as Code (AWS CDK or Terraform) and CI/CD pipelines (GitLab pipelines or GitHub Actions).
* Modern Tooling: Experience with ETL/ELT solutions, dbt, and GraphQL.
Communication & Soft Skills:
* English Language: Minimum B2 level, enabling smooth daily technical and business communication in a global environment.
* Collaboration: Excellent communication skills and the ability to thrive in a collaborative, international team.
* Standards: A strong commitment to high standards of ethics, quality (Clean Code), and reliable delivery.
Nice To Have
* Experience with Snowflake and SQL.
* Knowledge of Data Vault 2.0 modeling.
* Experience with Databricks.
* Familiarity with the Microsoft ecosystem: C# / .Net, T-SQL, SQL Server, and Azure DevOps.
* Experience with Star Schema database modeling.
* Knowledge of Descriptive Statistics.
Your Responsibilities
* Design and build scalable Data Ingestion pipelines within the AWS cloud ecosystem.
* Lead technical delivery and implementation of core platform components, ensuring architectural integrity across the entire data lifecycle.
* Collaborate with Engineering Managers and cross-functional teams in Poland and around the globe.
* Drive technical excellence by improving team processes, architecture standards, and engineering best practices.
* Support and consult with stakeholders to ensure successful delivery of high-impact, data-driven solutions.
* Contribute to the growth and maturity of the team’s cloud and data engineering capabilities.
We Offer
* A challenging role within a company that creates innovative solutions,
* Work in an international environment on demanding projects,
* Remote work model,
* Subsidized private medical care, life insurance, multisport card,
* Integration meetings,
* Employee referral program.
If you have deep expertise in Python and AWS, and building scalable Serverless data architectures is where you truly excel, this is the perfect role for you!