The new machine will be a close relative to Tesla’s upcoming Dojo supercomputer.
By Chris Young
Tesla’s Senior Director of AI, Andrej Karpathy, unveiled the electric vehicle automaker’s new supercomputer during a presentation at the 2021 Conference on Computer Vision and Pattern Recognition (CVPR).
Last year, Elon Musk highlighted Tesla’s plans to build a “beast” of a neural network training supercomputer called “Dojo”.
For several years, the company has been teasing its Dojo supercomputer, which Musk has hinted will be the world’s fastest, outperforming the current leader, Japan’s Fugaku supercomputer, which runs at 415 petaflops.
The new supercomputer seems to be a predecessor to the Dojo project, with Karpathy stating that it is the number five supercomputer in the world in terms of floating-point operations per second (FLOPS).
This supercomputer is certainly not lacking in the processing department. As Karpathy highlighted in his presentation, the machine has 720 nodes of 8x NVIDIA A100 80GB GPUs (5,760 GPUs in total), delivering 1.8 EFLOPS (720 nodes * 312 FP16 TFLOPS per A100 * 8 GPUs/node). It also packs 10 PB of “hot tier” NVMe storage at 1.6 TB/s and 640 Tbps of total switching capacity.
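The quoted 1.8 EFLOPS figure follows directly from the cluster's layout. A minimal back-of-envelope check, assuming 312 TFLOPS is each A100's peak FP16 throughput as the presentation states:

```python
# Sanity-check the aggregate compute figure quoted above.
nodes = 720
gpus_per_node = 8
tflops_per_gpu = 312  # peak FP16 throughput per A100, per the presentation

total_gpus = nodes * gpus_per_node                        # 5,760 GPUs
total_eflops = total_gpus * tflops_per_gpu / 1_000_000    # TFLOPS -> EFLOPS

print(total_gpus)                 # 5760
print(round(total_eflops, 2))     # 1.8
```

Note that this is peak theoretical throughput at FP16 precision, not a sustained benchmark result, so it isn't directly comparable to Fugaku's 415 petaflops, which is a measured FP64 Linpack score.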
