Cerebras: What You Need To Know About The Nvidia Competitor After Wild IPO
Cerebras debuted on public markets at a valuation of nearly $100 billion, signaling strong demand for AI chips beyond Nvidia's GPUs. The company makes the world's largest chip, a custom ASIC optimized for AI inference, and has secured major deals with OpenAI and Amazon Web Services. Its success is opening the door for a new wave of custom ASIC competitors challenging Nvidia's dominance.
Summary
Cerebras Systems completed a landmark IPO on Thursday, debuting at nearly $100 billion in market capitalization and placing it among the largest first-day valuations in tech history, alongside Facebook and Alibaba. The milestone was seen as a strong signal of surging demand for AI chips, particularly as Big Tech looks for alternatives to Nvidia's dominant GPU architecture.
Founded in Silicon Valley in 2016, Cerebras differentiates itself by producing the largest chip ever built in the semiconductor industry — roughly the size of a dinner plate. Unlike Nvidia's general-purpose GPUs, which excel at the parallel math required for training large AI models, Cerebras makes a custom ASIC (application-specific integrated circuit) optimized for AI inference — the process of running already-trained models. As AI moves into the agentic era, inference workloads are becoming increasingly central, making specialized chips like Cerebras' more strategically valuable.
Cerebras originally filed to go public in 2024 but withdrew after scrutiny over its heavy reliance on a single customer, G42, an AI firm based in the UAE. The company has since pivoted its business model, shifting from selling chips directly to operating them inside its own data centers as a cloud service. This puts Cerebras in direct competition with cloud giants like Google, Microsoft, Oracle, and CoreWeave. The company secured a $20 billion cloud deal with OpenAI in January and announced a partnership with Amazon Web Services in March.
Despite competitive pressures, Cerebras reports that demand for its fast inference product is outpacing supply to the point that it is sold out into 2027. The company is aggressively expanding both manufacturing and data center capacity to meet demand.
Cerebras is not alone in the custom ASIC space. Hyperscalers like Google and Amazon develop their own in-house chips, while specialized firms such as Groq — recently acquired by Nvidia for $20 billion in its largest purchase to date — also compete directly. Cerebras' IPO is also seen as paving the way for other ASIC startups including Rebellions, SambaNova, and D-Matrix, all of which are positioning themselves to capitalize on the unprecedented wave of AI chip demand.
Key Insights
- Cerebras claims to have built the biggest chip in the history of the semiconductor industry, roughly the size of a dinner plate, and argues that larger chips process more information faster, enabling quicker AI results.
- As AI shifts toward the agentic era, inference — not training — is becoming the key workload, which favors custom ASICs over Nvidia's general-purpose GPUs, according to the video's framing of the competitive landscape.
- Cerebras reports it is sold out of its fast inference product into 2027, with its biggest challenge being supply rather than demand, even as it aggressively expands manufacturing and data center capacity.
- Nvidia acquired Groq — one of Cerebras' closest ASIC competitors — for $20 billion in December, its largest acquisition to date, and announced custom Groq language processing units at GTC in March, signaling Nvidia's intent to compete in the custom ASIC inference space.
- Cerebras originally withdrew its IPO filing in 2024 after heavy regulatory scrutiny over its reliance on a single customer, UAE-based AI firm G42, and has since restructured to operate chips as a cloud service, now competing directly with Google, Microsoft, Oracle, and CoreWeave.