July 22, 2025@elonmusk →
230k GPUs, including 30k GB200s, are operational for training Grok @xAI in a single supercluster called Colossus 1 (inference is done by our cloud providers). At Colossus 2, the first batch of 550k GB200s & GB300s, also for training, start going online in a few weeks. As Jensen

Quoted tweet from @ajtourville: Nvidia CEO Jensen Huang on Elon Musk and @xAI: "Never been done before – xAI did in 19 days what everyone else needs one year to accomplish. That is superhuman – There's only one person in the world who could do that – Elon Musk is singular in his understanding of engineering."
Engagement: 53.3K Likes · 8.3K Retweets · 4.5K Replies
First Principles AI