https://x.com/KanekoaTheGreat/status/1866939781302849650
KanekoaTheGreat (@KanekoaTheGreat)
NEW: Gavin Baker discusses how @elonmusk and @xai achieved what was thought impossible with the construction of Colossus, the world's largest AI supercomputer.
They successfully connected over 100,000 GPUs, a task considered impossible by engineers at Microsoft, Google, and Meta.
"Elon, as he so often does, focused deeply on this, thought about it from First Principles, and he came up with a very different way of designing a data center, and he was able to make 100,000 GPUs coherent."
"No one thought it was possible."
"Engineers at Meta and Google and other firms said, we can't do it. There's no way he can do it."
"As a result of that, Grok 3 is now training on this giant Colossus supercomputer, the biggest in the world, with 100,000 GPUs."
"No one had built a cluster bigger than 32,000 H100s… Grok 3 is the first new data point to support whether scaling laws are breaking or holding because no one thought you could make 100,000 hoppers coherent."
"You will have a friend in your pocket with an IQ of maybe 130 who knows everything, has more up-to-date knowledge of the world, and is more grounded in factual accuracy."
"If there's a stock down 25%, I ask every AI, why is the stock down? Generally, Grok is the one who knows. Because of X's dataset, Grok knows exactly what is happening in the world today."
xAI, the company behind Grok, is reportedly valued at around $50 billion, suggesting Elon Musk has leveraged X's data to turn his $44 billion Twitter acquisition into a profitable investment.
9:15 AM · Dec 11, 2024 · 127.8K Views