Image: Shutterstock

Stable Diffusion, the image-generation tool, has gained significant popularity as a transparent and scalable alternative to proprietary AI tools. Now its developer, Stability AI, has entered the large-language-model sector with a model named StableLM.

An open-source model, StableLM is available in an Alpha version with 3 billion and 7 billion parameters. Models with 15 billion, 30 billion, and 65 billion parameters will follow, and a 175 billion parameter model is planned further out. By comparison, GPT-4 is reported to have a parameter count of around 1 trillion.

Developers can inspect, use, and adapt the models for commercial or research purposes. Stability AI has set three goals for the language model: transparency, accessibility, and support for its users. Transparency, in particular, could help researchers build trust in the tool.

Entities across the public and private sectors could modify the models for their own applications, and could do so without sharing sensitive data or giving up control of their own AI infrastructure.

Stability AI also aims to make the model accessible to everyday users, helping developers build applications without relying on proprietary AI companies. In turn, this could democratize AI and benefit the broader academic and research communities.

Regarding support for its users, Stability AI stated, "We build models to support our users, not replace them. We are focused on efficient, specialized, and practical AI performance – not a quest for god-like intelligence. We develop tools that help everyday people and everyday firms use AI to unlock creativity, boost their productivity, and open up new economic opportunities."

Stability AI also has past experience with open-source language models through its association with the nonprofit research hub EleutherAI.
Most of those language models were trained on the open-source dataset The Pile, and include the Pythia suite, GPT-NeoX, and GPT-J. Recent models such as Cerebras-GPT and Dolly-2 were also built using similar resources. StableLM itself is trained on an experimental dataset built on The Pile, comprising 1.5 trillion tokens of content.

Shashank is the founder of yMedia. He ventured into crypto in 2013 and is an ETH maximalist. Twitter: @bhardwajshash