
Meta Launches OPT-175B Sharing Access to Large-Scale Language Models to Public

News | Rokas Jurkėnas

Meta (formerly known as Facebook), the largest social media company, has announced the release of the Open Pretrained Transformer (OPT-175B), a language AI system with 175 billion parameters.


In their blog post[1], Meta AI describes the launch as an effort to democratize access to powerful AI for research. The model is currently released under a noncommercial license, focusing on research use cases.

“Access to the model will be granted to academic researchers; those affiliated with organizations in government, civil society, and academia; along with industry research laboratories around the world.” It is not yet clear, however, at what scale access will be granted to researchers. If you want to request access to the model, you can simply fill out the form.

About Large Language Models

These models are natural language processing systems trained on large volumes of text to generate creative and coherent text in almost any format. They can write news articles, legal summaries, and movie scripts, and provide assistance as customer service chatbots.

Currently, OpenAI’s GPT-3, with 175 billion parameters, is one of the leaders among large language models. It is available for both personal and commercial use.

Open Pretrained Transformer (OPT-175B) is the large-scale language model Meta has launched. The release includes not only the model itself but also the codebase, together with extensive notes and logbooks documenting the training process. A suite of smaller-scale baseline models with lower parameter counts has been released alongside it.
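
As an illustration of what working with the release can look like in practice, here is a minimal sketch of running one of the smaller OPT baselines for text generation. It assumes the Hugging Face transformers library and the publicly hosted facebook/opt-125m checkpoint; it is a generic usage example, not Meta’s own release tooling.

```python
# Minimal sketch: text generation with one of the smaller OPT baseline models.
# Assumes `pip install transformers torch` and the publicly hosted
# facebook/opt-125m checkpoint (not Meta's own release tooling).
from transformers import pipeline

generator = pipeline("text-generation", model="facebook/opt-125m")
output = generator("Large language models can", max_new_tokens=30, do_sample=True)
print(output[0]["generated_text"])
```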

Solving the Deep Learning Carbon Problem

Image: Meta OPT-175B large language model (image credit: Meta)

A pattern is easy to spot: the most innovative, cutting-edge AI research comes almost exclusively from tech giants like Google, Meta, Microsoft, and Nvidia. Training and running big AI models takes enormous amounts of energy and computing power, which makes the work expensive and leaves a large carbon footprint.

A 2019 research study by Roy Schwartz and Jesse Dodge highlighted that “the computations required for deep learning research have been doubling every few months, resulting in an estimated 300,000x increase from 2012 to 2018. These computations have a surprisingly large carbon footprint.”
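
A quick back-of-the-envelope check makes that figure concrete (a sketch, assuming a constant doubling period over the six-year span): a 300,000x increase between 2012 and 2018 implies compute doubling roughly every four months.

```python
# Back-of-the-envelope check of the Schwartz & Dodge figure, assuming
# a constant doubling period over 2012-2018.
import math

growth = 300_000                 # reported compute increase, 2012 -> 2018
months = 6 * 12                  # span in months
doubling = months / math.log2(growth)
print(f"Implied doubling period: {doubling:.1f} months")  # ~4.0 months
```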

Meta has claimed to have addressed the carbon side of the problem by cutting the carbon footprint to roughly one-seventh of OpenAI’s GPT-3’s. According to the paper, Meta trained the model on 992 80-gigabyte Nvidia A100 GPUs, with a carbon-emissions footprint of 75 tons, whereas GPT-3 had an estimated carbon budget of 500 tons. OpenAI has yet to confirm or deny that estimate.
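
The quoted numbers are easy to verify against the one-seventh claim (a simple check using only the figures above):

```python
# Ratio of the estimated training emissions quoted above:
# GPT-3 (~500 tons) vs. OPT-175B (~75 tons).
gpt3_tons = 500
opt_tons = 75
print(f"GPT-3 / OPT-175B emissions ratio: {gpt3_tons / opt_tons:.1f}x")  # ~6.7x, i.e. ~1/7th
```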

Wrap Up

With this announcement, we should see faster innovation in deep learning research. Still, certain ethical and moral questions, such as “What is the responsible use of AI?”, need to be addressed at a global level. Along with Meta, we are hopeful that the AI community, including academic researchers, civil society, policymakers, and industry, will come together to answer them.


References

[1] Meta AI, “Democratizing access to large-scale language models with OPT-175B,” Meta AI Blog, May 2022.
