Senior Data Engineer
Scytale

About Scytale
Scytale is a global leader in AI-powered compliance, transforming how organizations achieve and maintain trust. Our platform automates frameworks such as SOC 2, ISO 27001, SOX ITGC, GDPR, PCI DSS, and 60+ more, making compliance smarter, faster, and continuously audit-ready.
Built for startups, scale-ups, and enterprises, Scytale combines intelligent automation, real-time monitoring, and AI-driven insights to reduce manual work and eliminate compliance blind spots. Recognized as a G2 Leader in GRC and an AWS Rising Star Partner of the Year, Scytale is trusted by hundreds of companies worldwide and known for its technology and partnership-driven approach.
Role Overview
We are seeking a Senior Data Engineer to join our growing engineering team. This is a key role for a motivated and technically skilled individual with a solid foundation in software engineering and data systems. You will work on building scalable data infrastructure, implementing robust data integrations, and collaborating with cross-functional teams to solve real-world data challenges.
What You Bring
* 7+ years of professional experience as a Data Engineer or in a similar role developing ETL and data pipelines
* Advanced proficiency in Python for backend development and scripting
* Strong SQL skills with hands-on experience in querying and modeling relational databases
* Experience with cloud platforms such as AWS, GCP, or Azure
* Hands-on experience with containerization and orchestration technologies such as Docker and Kubernetes
* Solid understanding of RESTful APIs
* Experience with version control systems (GitHub, GitLab, Bitbucket) and CI/CD workflows
* Strong grasp of software development lifecycle (SDLC) and principles of clean, maintainable code
* Demonstrated ability to work independently, own projects end-to-end, and mentor junior engineers
* Familiarity with AI concepts and prompt engineering is a plus
Nice To Have
* Experience with data security, privacy compliance, and access controls
* Knowledge of infrastructure-as-code tools (e.g., Terraform, Helm)
* Background in event-driven architecture or stream processing
What You’ll Do
* Design, develop, test, and maintain reliable data pipelines and ETL processes using Python and SQL
* Build and manage API-based data ingestion workflows and real-time data integrations
* Apply software engineering best practices: modular design, testing, version control, and documentation
* Own and optimize data workflows and automation, ensuring efficiency and scalability
* Collaborate closely with senior engineers, data scientists, and stakeholders to translate business needs into technical solutions
* Maintain and enhance data reliability, observability, and error handling in production systems
* Develop and support internal data-driven tools
* Implement data operations best practices, including automated monitoring, alerting, and incident response for pipeline health
* Apply DataOps principles: CI/CD for data workflows, infrastructure-as-code, and containerized ETL deployments
Why Join Scytale?
* Innovative Work: Be part of a cutting-edge product shaping the future of security and compliance.
* Learning & Growth: Access courses, conferences, and mentorship to grow your career.
* Hybrid Work Model: Enjoy the flexibility of hybrid working.
* Collaborative Culture: Work with inspiring colleagues in a supportive environment.
* Relaxation & Fun: Take breaks in our relaxation room or join our team events, happy hours, and holiday celebrations.
* Family First: Personal and family priorities always come first.
Ready to innovate and grow with us? Join Scytale and help transform cybersecurity compliance for companies worldwide!