Grok 5 Training on Colossus 2: xAI's 1.5GW Supercluster Expansion Detailed

Training for Grok 5 is underway on Colossus 2, and the infrastructure story behind it is remarkable. xAI's Memphis supercluster is on track to reach 1.5 gigawatts of power capacity for compute by April 2026, a scale with no direct equivalent in the industry. In March, the company filed a $659 million construction permit for a new 312,000 square foot facility adjacent to the existing Colossus 2 site, signaling that the build-out is far from finished.
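To put 1.5 gigawatts in perspective, a rough back-of-envelope sketch helps. Every hardware figure below is an assumption for illustration, not a reported Colossus 2 spec: ~700 W is the published TDP of an H100 SXM accelerator, while the PUE and per-GPU host overhead are plausible round numbers.

```python
# Back-of-envelope: how many accelerators fit inside a 1.5 GW power budget?
# All hardware figures are illustrative assumptions, not Colossus 2 specs.

site_power_w = 1.5e9   # 1.5 GW target site capacity
pue          = 1.3     # assumed facility overhead (cooling, conversion losses)
gpu_power_w  = 700.0   # H100 SXM TDP; assumed hardware class for round numbers
host_power_w = 400.0   # assumed per-GPU share of CPU, networking, and storage

it_power_w = site_power_w / pue                 # power left for IT equipment
gpus = it_power_w / (gpu_power_w + host_power_w)
print(f"~{gpus / 1e6:.1f}M accelerators")       # prints: ~1.0M accelerators
```

Under these assumptions, a 1.5 GW site works out to roughly a million H100-class accelerators, which is the order of magnitude that makes the "no direct equivalent" claim concrete.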

Grok 5 itself is expected to use a mixture-of-experts architecture with trillions of parameters, a design in which a routing network activates only a small subset of the model's experts for each token, keeping per-token compute manageable despite enormous total capacity (a minimal sketch follows below). If the training run proceeds on schedule, Grok 5 would land as one of the largest models ever trained on a single unified supercluster, a meaningful distinction in a field where compute access increasingly determines model capability ceilings.
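To make the selective-activation idea concrete, here is a minimal top-k mixture-of-experts layer in PyTorch. This is an illustrative sketch of the general technique, not xAI's architecture; the class name, expert shapes, and routing loop are all hypothetical, and production systems use batched expert dispatch plus load-balancing losses rather than this per-expert masking loop.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k mixture-of-experts layer: a gating network scores all
    experts, but only the k highest-scoring experts run for each token, so
    per-token compute stays small while total parameter count grows with
    the number of experts."""

    def __init__(self, dim: int, num_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)  # router: token -> expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, dim)
        scores = self.gate(x)                         # (tokens, num_experts)
        weights, idx = scores.topk(self.k, dim=-1)    # keep only k experts per token
        weights = F.softmax(weights, dim=-1)          # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):                    # naive dispatch, fine for a sketch
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e              # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Example: 16 tokens with a 512-dim hidden state; only 2 of 8 experts fire per token.
layer = TopKMoE(dim=512)
tokens = torch.randn(16, 512)
print(layer(tokens).shape)  # torch.Size([16, 512])
```

With k = 2 of 8 experts active, each token touches only a quarter of the layer's feed-forward parameters, which is how a multi-trillion-parameter model can keep training and inference compute closer to that of a much smaller dense model.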

The Colossus 2 expansion represents xAI's most direct response to the compute advantage that Microsoft-backed OpenAI and Google DeepMind have historically enjoyed. With a gigawatt-class training environment coming fully online this spring, xAI is no longer playing catch-up on infrastructure — it is building the kind of sustained compute capacity that frontier model development requires over multiple generations. Grok 5's arrival, whenever it comes, will be the first real test of what that investment can produce.

Read the full article at Digital Applied →