Elon Musk has finally admitted that xAI used OpenAI's models to train Grok via model distillation.
xAI's pitch was always openness: a transparent alternative to the "closed" OpenAI. But training that "open" alternative on the closed model's output is the ultimate irony. The shortcut is common in the industry, but for Musk it was a strategic brand failure.
Distillation is AI training's shortcut: you use a giant, expensive "teacher" model to label data or generate synthetic data for a smaller, cheaper "student" model. It's efficient, but when the teacher belongs to a competitor, it amounts to intellectual property theft by proxy.
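To make the mechanics concrete, here is a minimal, illustrative sketch of the core of knowledge distillation: the teacher's logits are softened with a temperature, and the student is trained to match that soft distribution via cross-entropy. All names and numbers here are hypothetical; real pipelines (including whatever xAI did) operate at vastly larger scale and often distill from generated text rather than raw logits.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits into probabilities; a higher temperature spreads
    # probability mass across classes, exposing the teacher's "dark knowledge"
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's softened distribution -- minimized during student training
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# Hypothetical 3-class example: a student already close to the teacher
# incurs a lower loss than one that disagrees with it
teacher = [3.0, 1.0, 0.2]
close_student = [2.8, 1.1, 0.3]
far_student = [0.0, 0.0, 3.0]
```

The point of the temperature is that a near-certain teacher still leaks its ranking over the "wrong" answers, and that ranking is much of what the student actually learns.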
The battle for AI supremacy is no longer about who has the most original data, but who can best distill the intelligence of their competitors.