
Mark Zuckerberg announced that Meta trained a new large language model, LLaMA

On Friday, CEO Mark Zuckerberg announced that Meta has trained and will soon launch a new large language model named LLaMA. The purpose of this model is to assist scientists and engineers in exploring AI applications, including answering questions and summarizing documents.

The release of LLaMA by Meta’s Fundamental AI Research (FAIR) team comes as tech giants and well-funded startups race to showcase advances in AI and integrate them into commercial products. Large language models are the foundation of applications such as Microsoft’s Bing AI, OpenAI’s ChatGPT, and Google’s unreleased Bard.

According to Zuckerberg’s announcement, LLM technology shows promise for solving mathematical problems and supporting scientific research. “LLMs have exhibited great potential in generating text, holding conversations, summarizing written material, and performing more complex tasks, such as solving math theorems or predicting protein structures,” he wrote.

Here is the abstract of the LLaMA announcement paper: “We introduce LLaMA, a collection of foundation language models ranging from 7B to 65B parameters. We train our models on trillions of tokens and show that it is possible to train state-of-the-art models using publicly available datasets exclusively, without resorting to proprietary and inaccessible datasets. In particular, LLaMA-13B outperforms GPT-3 (175B) on most benchmarks, and LLaMA-65B is competitive with the best models, Chinchilla-70B and PaLM-540B. We release all our models to the research community.”
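To make the research-use framing concrete, here is a minimal sketch of how a researcher might prompt a LLaMA-family checkpoint for one of the tasks mentioned above, such as summarizing a document. It assumes the weights have already been obtained through Meta’s research access program and converted to the Hugging Face transformers format; the local path "llama-7b" is a placeholder for illustration, not an official model identifier.

```python
# Hedged sketch: prompting a locally available LLaMA-family checkpoint.
# Assumes converted weights in Hugging Face format at "llama-7b" (placeholder path)
# and the `transformers`, `torch`, and `accelerate` packages installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "llama-7b"  # hypothetical local directory with converted weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # half precision keeps a 7B model near ~14 GB
    device_map="auto",          # let accelerate place layers on available devices
)

# One of the use cases the announcement names: summarizing written material.
prompt = "Summarize the following paragraph in one sentence:\n<document text>\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a base foundation model like this is not instruction-tuned, so prompts generally need to be phrased as completions rather than chat-style requests.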
