Elon Musk’s xAI has publicly released the base model weights and network architecture of its Grok AI model, though without any training code. On GitHub, the company describes Grok-1 as a ‘314 billion parameter Mixture-of-Experts model.’
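For context, a Mixture-of-Experts (MoE) model routes each input token through only a small subset of many ‘expert’ sub-networks, chosen by a learned gating function, so only a fraction of the total parameters is active per token. The sketch below is a toy illustration of top-2 routing in plain Python/NumPy; the expert count, dimensions, and function names are illustrative assumptions, not xAI’s actual Grok-1 implementation.

```python
import numpy as np

def moe_layer(x, gate_w, experts, k=2):
    """Toy MoE layer: route token x to its top-k experts and mix the outputs."""
    logits = x @ gate_w                      # gating scores, one per expert
    top_k = np.argsort(logits)[-k:]          # indices of the k highest-scoring experts
    weights = np.exp(logits[top_k])          # softmax over the selected experts only
    weights /= weights.sum()
    # Only the chosen experts run; the rest of the parameters stay idle,
    # which is how a large MoE model keeps per-token compute low.
    return sum(w * experts[i](x) for w, i in zip(weights, top_k))

# Hypothetical setup: 8 experts, each a simple linear map on a 16-dim token.
rng = np.random.default_rng(0)
dim, num_experts = 16, 8
experts = [lambda x, W=rng.standard_normal((dim, dim)) / dim: x @ W
           for _ in range(num_experts)]
gate_w = rng.standard_normal((dim, num_experts))

token = rng.standard_normal(dim)
print(moe_layer(token, gate_w, experts).shape)  # -> (16,)
```

This selective routing is why a model with 314 billion total parameters can cost far less per token to run than a dense model of the same size.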
xAI’s open-source Grok base model ships without training code
In a blog post, xAI said the model is not fine-tuned for any particular application, such as holding conversations. The company added, without giving further details, that Grok-1 was trained on a ‘custom’ stack. The model is released under the Apache License 2.0, which permits commercial use.
Musk announced on X last week that xAI would open up the Grok model to the public this week. The company launched Grok last year as a chatbot for Premium+ subscribers of the X social network. Notably, while the chatbot could draw on some X data, the open-source release has no connection to the social network.
Several well-known companies have released some of their AI models publicly, including Meta’s Llama, Mistral, Falcon, and AI2. Google also released two new open models in February: Gemma 2B and Gemma 7B.
Some developers of AI-driven tools are already discussing integrating Grok into their products. Perplexity CEO Aravind Srinivas announced on X that Grok would be enhanced for conversational search and made available to Pro users.
Musk sued OpenAI earlier this month, alleging that the company had betrayed its original nonprofit AI mission, and the two sides have been locked in a legal battle since. He has repeatedly criticized OpenAI and Sam Altman on X.