Cerebras has unveiled the Wafer Scale Engine 3 (WSE-3), the largest computer chip ever built, containing 4 trillion transistors. The company says the chip is designed to power the next generation of artificial intelligence (AI) systems.

The WSE-3 is the third generation of Cerebras’ wafer-scale supercomputing platform and is built to train large AI models in the class of OpenAI’s GPT-4 and Anthropic’s Claude 3 Opus.

At 8.5 by 8.5 inches (21.5 by 21.5 centimeters), the WSE-3 matches the footprint of its predecessor, the WSE-2, and integrates 900,000 AI cores. Cerebras says it delivers twice the performance of the WSE-2 at the same power draw, roughly in keeping with Moore’s Law, the observation that transistor counts double approximately every two years.

By comparison, Nvidia’s H200 graphics processing unit (GPU), the leading chip for AI training today, contains 80 billion transistors—a 50-fold difference from Cerebras’ creation.
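The transistor-count comparison follows directly from the two figures quoted above; a minimal sketch of the arithmetic (using only the numbers in this article):

```python
# Transistor counts quoted in the article.
wse3_transistors = 4_000_000_000_000  # WSE-3: 4 trillion
h200_transistors = 80_000_000_000     # Nvidia H200: 80 billion

# Dividing the two gives the size of the gap between the chips.
ratio = wse3_transistors / h200_transistors
print(ratio)  # 50.0
```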

The WSE-3 will power the Condor Galaxy 3 supercomputer, to be built in Dallas, Texas, according to company statements released on March 13. Made up of 64 Cerebras CS-3 AI system “building blocks,” each built around the WSE-3 chip, the machine is expected to deliver 8 exaFLOPs of computing power when fully operational.
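As a rough sanity check on those figures, 8 exaFLOPs spread evenly across 64 CS-3 systems works out to 125 petaFLOPs per building block. This sketch assumes only the numbers quoted in the article and an even split across systems:

```python
# Figures quoted in the article.
total_exaflops = 8    # Condor Galaxy 3 aggregate compute
num_cs3_systems = 64  # CS-3 "building blocks"

# 1 exaFLOP = 1,000 petaFLOPs; assume compute is split evenly.
per_system_petaflops = total_exaflops * 1000 / num_cs3_systems
print(per_system_petaflops)  # 125.0
```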

Networked with the existing Condor Galaxy 1 and Condor Galaxy 2 systems, the combined machine will reach 16 exaFLOPs. For context, Oak Ridge National Laboratory’s Frontier, currently the world’s most powerful supercomputer, delivers roughly 1 exaFLOP—though that figure is measured at double precision, while Cerebras quotes lower-precision AI performance.

The Condor Galaxy 3 supercomputer’s primary mission is to train AI systems larger than any yet built, surpassing even GPT-4 or Google’s Gemini, according to the company. Leaked figures suggest GPT-4 was trained with around 1.76 trillion parameters; Condor Galaxy 3 is designed to handle AI systems with up to 24 trillion parameters.

In short, the WSE-3 pushes wafer-scale computing a generation further, promising to extend the scale of trainable AI models and reinforcing Cerebras’ position at the forefront of AI hardware.

By Impact Lab