Cerebras

Overview

Cerebras Systems Inc., based in Sunnyvale, California, is a pioneer in AI computing, specializing in wafer-scale processors for AI training and inference workloads. Founded in 2016 by Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, and Jean-Philippe Fricker, Cerebras has raised over $720 million in funding at a valuation of approximately $4.1 billion. The company is best known for its Wafer-Scale Engine, which accelerates AI processing by integrating compute, memory, and interconnect fabric on a single wafer-sized chip.

Recent Developments

  • July 2025: Cerebras announced the deployment of Alibaba's Qwen3 reasoning model on its hardware, cutting reasoning time from roughly 60 seconds on GPUs to about 0.6 seconds at more than 1,500 tokens per second, positioning Cerebras as a leader in high-speed AI inference (a throughput-measurement sketch follows this list).
  • May 2025: CEO Andrew Feldman stated his intention to take Cerebras public in 2025, following clearance from the Committee on Foreign Investment in the United States for its share sale to UAE-based Group 42.
  • April 2025: Cerebras announced a DARPA contract, working with Canadian startup Ranovus, to develop energy-efficient computing systems for military and commercial applications.
  • March 2025: The company announced plans for six new AI data centers across North America and Europe, a roughly twentyfold increase in its global AI inference capacity, which it says will make it the world's leading provider of high-speed inference services.
  • February 2025: Partnered with French company Mistral AI to set speed records in AI-driven applications, responding to user queries at roughly 1,000 words per second.
  • January 2025: Cerebras opened its systems to support DeepSeek's R1 70B reasoning model, delivering inference speeds up to 57 times faster than GPU-based solutions.
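
Figures such as 1,500 tokens per second describe generation throughput, which can be checked independently by timing a request. The sketch below shows one way to do this against an OpenAI-compatible inference endpoint; the base URL, model name, and environment variable are illustrative assumptions, not details taken from this article.

```python
# Minimal sketch: measuring end-to-end generation throughput against an
# OpenAI-compatible inference endpoint. The base URL, model name, and
# credential variable are assumptions for illustration only.
import os
import time

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],  # hypothetical credential variable
)

prompt = "Summarize wafer-scale integration in one paragraph."

start = time.perf_counter()
response = client.chat.completions.create(
    model="qwen-3-32b",  # placeholder model identifier
    messages=[{"role": "user", "content": prompt}],
)
elapsed = time.perf_counter() - start

completion_tokens = response.usage.completion_tokens
print(f"{completion_tokens} tokens in {elapsed:.2f} s "
      f"({completion_tokens / elapsed:.0f} tokens/s)")
```

Note that wall-clock timing includes network latency and any queueing, so a measurement like this understates the raw model throughput reported by the provider.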

Company Information

Founding Date: 2016
Headquarters: Sunnyvale, California, USA
Founders: Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, Jean-Philippe Fricker
Revenue: $136.4 million (first half of 2024)
Profits: N/A
Key Investors: Alpha Wave Ventures, Abu Dhabi Growth Fund, Benchmark, Eclipse Ventures
Industry: Semiconductors, Artificial Intelligence
Number of Employees: 525 (as of 2025)

Early History

Cerebras Systems was conceptualized in 2015 and officially founded the following year by a group of former SeaMicro engineers, including Andrew Feldman, who envisioned a new approach to AI computing. The team set out to build the Wafer-Scale Engine, an AI processor spanning an entire silicon wafer, with the goal of drastically improving AI computation efficiency. Early funding rounds, including a $27 million Series A and subsequent Series B and C rounds, financed development of the initial prototype. In 2018, Cerebras secured $88 million in Series D funding and achieved unicorn status, followed by a 2019 Series E round that lifted its valuation to $2.4 billion. The launch of the first Wafer-Scale Engine in 2019 marked a significant milestone, setting new standards in the AI hardware industry.

Company Profile and Achievements

Cerebras' business model revolves around the design and production of AI-specific processors that deliver extraordinary computation speeds. These processors are embedded in their CS systems, which are deployed as supercomputers capable of supporting some of the largest AI models. The company's Wafer-Scale Engine is noted for its unprecedented scale and performance, integrating compute, memory, and network capabilities into a single silicon chip.

  • 2019: Launched the first Wafer-Scale Engine, the largest chip ever constructed.
  • 2021: Released the CS-2 system, enhancing AI processing capabilities with the WSE-2 processor and its 850,000 cores.
  • 2022: Revealed Andromeda, a supercomputer comprising 16 WSE-2 chips, establishing new AI computation benchmarks.
  • 2023: Expanded with innovative supercomputing networks such as the Condor Galaxy in partnership with G42.
  • 2024: Unveiled the WSE-3, a third-generation processor built on a 5nm process, supporting advanced AI operations.
  • 2025: Began rolling out AI data centers aimed at delivering the fastest inference capabilities globally.

Current Operations and Market Position

Cerebras remains at the forefront of AI hardware innovation, led by its Wafer-Scale Engine technology. Its processors are deployed in supercomputing centers and large-scale AI installations, and recent partnerships and data center expansions have solidified its standing as a key player in AI hardware, in direct competition with Nvidia and other incumbents. The company's recent focus on reducing AI inference time and cost continues to strengthen its market position.

Conclusion

Through its pioneering technology and strategic initiatives, Cerebras Systems has positioned itself as a leader in the semiconductor and AI industries. The company's advancement of wafer-scale processing is not only a technical triumph but also a critical factor for enabling next-generation AI applications. As Cerebras moves toward a possible public offering and continues to expand its infrastructure, it is poised to further influence AI processing on a global scale, making significant contributions to industries ranging from healthcare to defense.