Cerebras

Invest in Cerebras

Artificial Intelligence

Founded: 2016

Headquarters: Sunnyvale, California

cerebras.net

Status: Available

Last Round Valuation: $50 Billion

Share Price: $50.00



About Cerebras

Cerebras Systems develops computing chips built for the singular purpose of accelerating AI, along with complete computer systems for demanding deep learning applications. With a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types, Cerebras creates chips the size of entire silicon wafers – the discs that would normally be cut into many individual chips – to train AI models faster. The Company’s flagship product, the CS-2 system, powered by the world’s largest processor, the 850,000-core Cerebras WSE-2, enables customers to accelerate their deep learning work by orders of magnitude over graphics processing units.

Key Facts

“At Cerebras Systems, our goal is to revolutionize computing. It’s thrilling to see some of the most respected supercomputing centers around the world deploying our CS-2 system to accelerate their AI workloads and achieving incredible scientific breakthroughs in climate research, precision medicine, computational fluid dynamics and more,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems.

With enterprises in every sector recognizing the importance of AI, adoption is increasing. However, most AI projects still fail because they are not cost effective.

In recent surveys, business leaders have put the failure rate of AI projects at between 83% and 92%, with poor cost efficiency a leading cause. “As an industry, we’re worse than gambling in terms of producing financial returns,” Aible CEO Arijit Sengupta says.

"AI will be in every electronic product we buy and sell and use from toys to automobiles. And the ability to make AI useful depends on really good engineers, really good AI modelers, data scientists, as well as the hardware. But all that will be for naught if you're not running it on efficient hardware because you can't afford it. So this is all about making AI pervasive and affordable and – people overuse the term – but democratizing AI is still a goal. It's not where we are, but it's gonna take a lot of work to get there,” according to the Founder of Cambrian AI Research.

Cerebras Systems’ main offering is the Wafer-Scale Engine (WSE), a revolutionary central processor for the Company’s deep learning computer systems. The second-generation WSE (WSE-2) powers the CS-2 system: it is the largest computer chip ever built and the fastest AI processor on Earth.

Unlike legacy, general-purpose processors, the WSE was built from the ground up to accelerate deep learning.

The WSE-2’s main competitive advantages are: (1) massive high-bandwidth on-chip memory; (2) speed (faster than a traditional cluster could possibly achieve); (3) cost and power efficiency (one chip is equivalent to a cluster of legacy machines); (4) ease of programming; (5) radically reduced programming complexity; and (6) reduced wall-clock compute time and time to solution.

Cerebras’ WSE-2 delivers unprecedented levels of compute, memory, and interconnect bandwidth on a single, wafer-scale piece of silicon. Sparsity harvesting – skipping the multiplications by zero that are common in neural network workloads – pushes effective compute even higher. The outcome is enormous performance on an integrated chip with no off-chip bottlenecks, in which every core is programmable and operates independently of the others. With this revolutionary approach to AI, companies can reduce the cost of curiosity.
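For intuition, the toy Python/NumPy sketch below counts how many multiply-accumulate operations a zero-skipping engine could avoid on a single matrix multiply. The 70% activation sparsity is an assumed, illustrative figure, and the sketch models only the arithmetic count – it is not Cerebras’ scheduler or hardware.

import numpy as np

rng = np.random.default_rng(0)

# Activations with an assumed ~70% zeros, as might follow a ReLU layer.
m, k, n = 256, 512, 128
acts = rng.standard_normal((m, k))
acts[rng.random((m, k)) < 0.7] = 0.0
weights = rng.standard_normal((k, n))

# For C = acts @ weights, each nonzero activation takes part in n
# multiply-accumulates; a dense engine performs all m*k*n of them regardless.
dense_macs = m * k * n
useful_macs = np.count_nonzero(acts) * n

print(f"dense MACs:  {dense_macs:,}")
print(f"useful MACs: {useful_macs:,}")
print(f"theoretical speedup from skipping zeros: {dense_macs / useful_macs:.1f}x")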

The net result of the Company’s innovation to date is unmatched utilization, performance levels, and scaling properties that were previously unthinkable.

Cerebras offers a revolutionary AI infrastructure with its CS-2 system. The CS-2 is designed from the ground up to power, cool, and deliver data to the revolutionary WSE-2 processor so that it can deliver unparalleled performance to users. The package is easy to deploy, operate, and maintain in client datacenters today. For customers, this means peak performance without the compromises or complexity of a large-scale cluster deployment.

Where large-scale data centers rely on massive numbers of machines, Cerebras delivers the same output with far fewer systems. Customers can therefore deploy datacenter-scale AI computers and unlock world-leading innovation in just a few days or weeks – getting greater performance in a space- and power-efficient package built for the job.

Cerebras also offers the Cerebras Software Platform, CSoft. The Cerebras Software Development Kit allows researchers to extend the platform and develop custom kernels – empowering them to push the limits of AI and HPC innovation. The Cerebras Machine Learning Software integrates with the popular machine learning frameworks TensorFlow and PyTorch, so researchers can effortlessly bring their models to the CS-2 system.
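As a rough illustration of what bringing a model means in practice, the sketch below defines an ordinary PyTorch model with no Cerebras-specific code at all; under CSoft, compiling and running such a graph on a CS-2 is handled by the Cerebras software stack, whose exact API calls are not shown here and are not assumed.

import torch
import torch.nn as nn

# A standard PyTorch model: nothing at this layer is tied to GPUs or to the
# WSE-2. Mapping the graph onto Cerebras hardware is the software stack's job.
class TinyClassifier(nn.Module):
    def __init__(self, vocab_size=30522, hidden=256, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)          # (batch, seq) -> (batch, seq, hidden)
        x = self.encoder(x)                # self-attention over the sequence
        return self.head(x.mean(dim=1))    # pool and classify

model = TinyClassifier()
logits = model(torch.randint(0, 30522, (8, 128)))  # 8 sequences of 128 tokens
print(logits.shape)  # torch.Size([8, 2])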

Cerebras also offers the Cerebras Cloud @ Cirrascale, which puts a blazing-fast AI solution right at customers’ fingertips. It provides access to the latest Cerebras technology hosted by Cirrascale Cloud Services, a specialist in deep learning infrastructure. This joint solution delivers the model-training performance of many processors in a single instance, so users can develop new neural network architectures, machine learning methods, and algorithms that were previously impractical – all without worrying about infrastructure management.

Cerebras’ compute scales roughly linearly with power: doubling the computing its chips perform requires roughly double the power, whereas current clustered systems need more than twice as much power to double their computing capacity. Training today’s large models can run for months and draw tens of megawatts – the equivalent of a small city’s power – while Cerebras delivers the same compute with far less energy and in less time.
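A minimal numeric sketch of that scaling claim, with assumed figures (the base power draws and the 1.3 cluster exponent are illustrative placeholders, not measured values):

# Linear power scaling (the claim above) versus a hypothetical cluster whose
# power grows faster than its delivered compute because of interconnect and
# synchronization overhead. All constants here are assumptions for illustration.
def linear_power_kw(compute_multiple, base_kw=20.0):
    return base_kw * compute_multiple

def cluster_power_kw(compute_multiple, base_kw=20.0, exponent=1.3):
    return base_kw * compute_multiple ** exponent

for x in (1, 2, 4, 8):
    print(f"{x}x compute -> linear: {linear_power_kw(x):6.1f} kW, "
          f"cluster: {cluster_power_kw(x):6.1f} kW")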

Cerebras gives researchers and organizations with budgets in the tens of thousands of dollars access to AI training capabilities that were previously available only to far larger, better-funded organizations.

In August 2024, Cerebras filed confidentially for an initial public offering.

In September 2024, Cerebras announced a Memorandum of Understanding (MoU) with Aramco to deliver high-performance AI inference to industries, universities, and enterprises in Saudi Arabia. Aramco plans to leverage Cerebras' cutting-edge CS-3 systems to build, train, and deploy world-class large language models, accelerating AI innovation in the region.

In September 2024, Cerebras Systems publicly filed for an initial public offering (IPO). According to its filing with the U.S. Securities and Exchange Commission, the company reported a net loss of $66.6 million on $136.4 million in revenue for the six months ended June 30, 2024, compared to a net loss of $77.8 million on $8.7 million in revenue over the same period a year earlier.
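For context, a quick back-of-the-envelope pass over those reported figures (amounts in millions of US dollars, as stated in the filing):

# Period-over-period change implied by the filing figures cited above.
rev_h1_2024, rev_h1_2023 = 136.4, 8.7      # revenue, $M
loss_h1_2024, loss_h1_2023 = 66.6, 77.8    # net loss, $M

print(f"revenue growth: {rev_h1_2024 / rev_h1_2023:.1f}x year over year")
print(f"net loss narrowed by ${loss_h1_2023 - loss_h1_2024:.1f}M "
      f"({(loss_h1_2023 - loss_h1_2024) / loss_h1_2023:.0%})")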



