Cerebras
Cerebras Systems builds the world's fastest AI inference, powering the future of generative AI. We're a team of pioneering computer architects, deep learning researchers, and engineers building a new class of AI supercomputers from the ground up. Our flagship system, the Cerebras CS-3, is powered by the Wafer Scale Engine 3, the world's largest and fastest AI processor. CS-3s cluster effortlessly into the largest AI supercomputers on Earth while abstracting away the complexity of traditional distributed computing. From sub-second inference to breakthrough training performance, Cerebras makes it easier to build and deploy state-of-the-art AI, from proprietary enterprise models to open-source projects downloaded millions of times.

Here's what makes our platform different:
🔦 Sub-second reasoning – Instant intelligence and real-time responsiveness, even at massive scale
⚡ Blazing-fast inference – Up to 100x performance gains over traditional AI infrastructure
🧠 Agentic
Open Positions at Cerebras
Applied Machine Learning Research Scientist
Python / PyTorch Developer — Frontend Inference Compiler – Dubai
Applied AI/ML Scientist
Senior Technical Program Manager – AI Infrastructure, Site Operations
Senior Software Engineer, AI Inference Platform
Manager – AI Infrastructure Operations
Product Manager, Strategic Verticals