Stability AI has unveiled a new open-source AI language model called StableLM. The company hopes to repeat the catalyzing effect of Stable Diffusion, its open-source image-synthesis model, and says StableLM could be used to build an open-source alternative to ChatGPT.
StableLM is currently available in alpha form on GitHub in 3 billion and 7 billion parameter model sizes, with 15 billion and 65 billion parameter models to follow, according to Stability AI.
“Language models will form the backbone of our digital economy, and we want everyone to have a voice in their design,” Stability AI writes in a blog post. “Models like StableLM demonstrate our commitment to AI technology that is transparent, accessible, and supportive.”
The company adds, “The release of StableLM builds on our experience in open-sourcing earlier language models with EleutherAI, a nonprofit research hub. These language models include GPT-J, GPT-NeoX, and the Pythia suite, which were trained on The Pile open-source dataset. Many recent open-source language models continue to build on these efforts, including Cerebras-GPT and Dolly-2.”
StableLM purports to achieve performance similar to OpenAI’s benchmark GPT-3 model while using far fewer parameters: 7 billion for StableLM versus 175 billion for GPT-3. “Our StableLM models can generate text and code and will power a range of downstream applications,” says Stability AI. “They demonstrate how small and efficient models can deliver high performance with appropriate training.”