Job Description

Overview

Cerebras Systems builds the world's largest AI chip, 56 times larger than the largest GPU. Our wafer-scale architecture provides the AI compute power of dozens of GPUs on a single chip, with the programming simplicity of a single device. This approach delivers industry-leading training and inference speeds and enables machine learning users to run large-scale ML applications without managing hundreds of GPUs or TPUs.

Cerebras' current customers include top model labs, global enterprises, and cutting-edge AI-native startups. OpenAI recently announced a multi-year partnership with Cerebras to deploy at scale, transforming key workloads with ultra high-speed inference.

Thanks to its wafer-scale architecture, Cerebras Inference delivers fast generative AI inference, outperforming GPU-based hyperscale cloud inference services and enabling real-time iteration and increased intelligence through additional computation.

Location options: Sunnyvale, Toronto

A...

Apply for this Position

Ready to join Cerebras? Submit your application for this position.