Robotics Engineer

AI Futures
Bavaria, Germany
Full-time
Applications go directly to the hiring team

Full Description

Autonomy & Sensor Fusion Engineer | Bavaria | Competitive Salary | Hybrid work model

I am currently partnering with a start-up that is seeking an experienced Autonomy & Sensor Fusion Engineer to help build robust navigation and perception capabilities for autonomous robotic platforms operating in challenging environments. This role focuses on reliable localization, estimation, and autonomy in conditions where GNSS may be degraded or unavailable, with real‑time performance and system integration at its core.

You’ll work across perception, navigation, and control layers, contributing advanced algorithms and production‑quality software that is validated in simulation and deployed onto real hardware.

Key Responsibilities

* Design, develop, and validate navigation, guidance, and state‑estimation algorithms for autonomous systems operating in GNSS‑denied or degraded environments

* Develop and implement advanced multi‑sensor fusion techniques, including EKF, UKF, particle filters, nonlinear observers, and factor‑graph‑based SLAM

* Fuse and synchronize data from multiple sensor modalities, including cameras, LiDAR, radar, IMUs, GNSS, magnetometers, and barometers

* Build and optimize real‑time data processing pipelines to meet low‑latency and reliability requirements

* Architect, integrate, and maintain a cohesive ROS 2–based autonomy stack, combining perception, localization, navigation, and (where applicable) control components

* Interface autonomy software with underlying control or vehicle systems, ensuring robust command, telemetry, and state feedback

* Deploy, optimize, and validate autonomy software on edge compute platforms (e.g. NVIDIA Jetson), accounting for real‑time constraints and hardware limitations

* Perform extensive validation in simulation, bench testing, and real‑world testing scenarios, including debugging complex interactions across the autonomy stack

* Research, prototype, and transition new algorithms into production systems

* Produce and maintain high‑quality technical documentation covering algorithms, software architecture, testing procedures, and system behavior

* Collaborate closely with software, hardware, controls, and systems engineering teams to ensure reliable end‑to‑end system integration

Your Profile

* MSc or PhD in Robotics, Computer Vision, Aerospace, Computer Science, or a related engineering discipline

* Strong C++ software development skills with experience debugging complex real‑time robotics systems

* Deep expertise in sensor fusion and probabilistic state estimation techniques

* Solid understanding of SLAM, visual‑inertial odometry, inertial navigation, and robotic perception pipelines

* Hands‑on experience with ROS 2 and simulation environments for algorithm development and validation

* Experience with sensor calibration, time synchronization, and multi‑sensor alignment

* Strong foundation in linear algebra, probability, estimation theory, and 3D geometry

* Proficiency with Linux environments, version control, CI/CD, and modern software engineering practices

* Self‑motivated, organized, and comfortable owning technically complex problems from design through deployment

Nice to Have

* Experience implementing or tuning guidance and control algorithms for autonomous systems

* Familiarity with platform integration and communication protocols (e.g. MAVLink) or similar middleware

* Proficiency in Python, Rust, or GPU‑accelerated programming (CUDA)

* Experience deploying autonomy software to embedded or edge platforms (e.g. NVIDIA Jetson, FPGA)

* Exposure to machine learning frameworks (PyTorch, TensorRT) or computer vision libraries (OpenCV)

* Prior experience with autonomous vehicles, drones, or robotics operating in challenging or GNSS‑denied environments
