Twelve months ago, the conventional wisdom was that frontier AI required frontier capital. The compute requirements, the training data, and the engineering talent necessary to compete with GPT-4 or Claude were simply beyond the reach of open-source communities.
That consensus is fracturing. Meta's decision to release the Llama family under a community licence that permits commercial use triggered a wave of fine-tuning by researchers and hobbyists, producing models that are competitive with commercial offerings on many standard benchmarks.
The implications for enterprise adoption are significant. A company that previously faced a choice between paying for metered API access and training a model from scratch now has a third path: start from open weights, fine-tune on proprietary data, and deploy on infrastructure it controls.