The large language model sector continues to swell as Stability AI, maker of the popular image-generation tool Stable Diffusion, has launched a suite of open-source language models.
Dubbed StableLM, the publicly available alpha versions of the suite currently contain models with 3 billion and 7 billion parameters, with 15-, 30- and 65-billion-parameter models noted as “in progress” and a 175-billion-parameter model planned for future development.
Announcing StableLM❗ We’re releasing the first of our large language models, starting with 3B and 7B param models, with 15-65B to follow. Our LLMs are released under CC BY-SA license. We’re also releasing RLHF-tuned models for research use. Read more → https://t.co/R66Wa4gbnW
By comparison, GPT-4 has a parameter count estimated at one trillion, roughly six times that of its predecessor, GPT-3.
Parameter count alone may not be an accurate measure of LLM efficacy, however, as Stability AI noted in the blog post announcing StableLM’s launch.
It’s unclear at this time exactly how robust the StableLM models are. The Stability AI team noted on the organization’s GitHub page that more information about the models’ capabilities would be forthcoming, including model specifications and training settings.
Related: Microsoft is developing its own AI chip to power ChatGPT
Provided the models perform well enough in testing, the arrival of a powerful open-source alternative to OpenAI’s ChatGPT could prove interesting for the cryptocurrency trading world.
As Cointelegraph reported, people are building advanced trading bots on top of the GPT API and new variants that incorporate third-party tool access, such as BabyAGI and AutoGPT.
The addition of open-source models into the mix could give builders of such tools a free, customizable alternative.