
xAI released Grok-1 (314B parameters) as open source under the Apache 2.0 license

On March 17, 2024, xAI released the complete weights and architecture of Grok-1, its 314 billion parameter Mixture-of-Experts model, under the Apache 2.0 license. Musk stated the decision aimed to 'foster innovation and promote accountability and public evaluation' in response to growing demand for transparency in AI. Unlike many proprietary releases, Grok-1 offered full transparency: xAI published the raw base-model weights and the network architecture.

Scoring Impact

Topic                    Direction  Relevance  Contribution
Corporate Transparency   +toward    secondary  +0.50
Open Source              +toward    primary    +1.00

Overall incident score = +0.664

Score = avg(topic contributions) × significance (high ×1.5) × confidence (0.59)
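The score formula above can be checked with a short arithmetic sketch; the contribution values, the high-significance multiplier (×1.5), and the confidence (0.59) are taken directly from the table and formula in this entry:

```python
# Sketch of the incident-score arithmetic described above.
contributions = {"Corporate Transparency": 0.50, "Open Source": 1.00}

avg = sum(contributions.values()) / len(contributions)  # (0.50 + 1.00) / 2 = 0.75
significance = 1.5   # multiplier for "high" significance
confidence = 0.59    # stated confidence factor

score = avg * significance * confidence
print(round(score, 3))  # prints 0.664
```

Working through the numbers: 0.75 × 1.5 = 1.125, and 1.125 × 0.59 ≈ 0.664, matching the overall incident score shown above.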

Evidence (1 signal)

Confirms: product_decision · Mar 17, 2024 · verified

xAI released Grok-1 314B parameter model weights and architecture under Apache 2.0 license

On March 17, 2024, xAI released the complete Grok-1 model weights and architecture on GitHub under the permissive Apache 2.0 license. The repository includes the base-model checkpoint and the network implementation. Musk stated the release aimed to 'foster innovation and promote accountability.'
