Elon Musk revealed on social media platform X that his AI startup, xAI, is training the third version of its Grok model on a massive amount of hardware: up to 100,000 Nvidia H100 GPUs.
The H100 is Nvidia's most powerful chip and the hottest tech product of 2023. With 80 billion transistors, it's the strongest tool available for running large language models like GPT, which power apps such as ChatGPT and Bard. Nvidia CEO Jensen Huang even called the H100 the "iPhone of AI."
The H100 is also one of Nvidia's priciest processors, costing about $40,000 each. At that price, Grok 3 is being trained on AI chips worth roughly $3-4 billion.
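The arithmetic behind that estimate is simple to check; a quick back-of-the-envelope calculation using the figures cited in the article (100,000 GPUs at about $40,000 each):

```python
# Back-of-the-envelope check of the article's figures.
gpu_count = 100_000   # up to 100,000 H100 GPUs reportedly used for Grok 3
unit_price = 40_000   # ~$40,000 per H100, per the article

total = gpu_count * unit_price
print(f"${total / 1e9:.1f} billion")  # prints "$4.0 billion"
```

The $4 billion result sits at the top of the article's $3-4 billion range, which reflects uncertainty in both the exact GPU count ("up to" 100,000) and the per-unit price.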
The use of 100,000 GPUs marks a big step up for Grok 3 compared to Grok 2. In an April interview with Nicolai Tangen, CEO of Norges Bank Investment Management, Musk said Grok 2 only needed about 20,000 GPUs for training.
X.AI Corp has been pitched to Silicon Valley investors by highlighting Musk's achievements at Tesla Inc. and SpaceX. Marketing materials emphasize that xAI and the Grok chatbot can use high-quality data from X, the social media platform Musk renamed from Twitter.
So far, xAI has released two versions: Grok-1 and Grok-1.5. The newest version is available only to early testers and current X subscribers. According to the Tesla CEO, Grok 2 will launch in August, with Grok 3 following by year-end.
The 100,000-GPU figure sounds huge but isn't unreasonable: tech giant Meta is stockpiling even more of these AI chips than xAI.
Specifically, CEO Mark Zuckerberg said in January that Meta would buy about 350,000 H100 GPUs by the end of 2024, and that the company would own up to 600,000 chips overall, including other GPU models.