
Meta presents its large language model. Zuckerberg revealed what performance to expect compared with ChatGPT


The race for market dominance started by OpenAI continues. More companies are joining it, trying to stand out from the crowd and claim a strong position. This time, Meta has stepped in, presenting four language models, one of which outperforms GPT-3 despite having fewer parameters and a smaller size. That opens the possibility of such models running on our mobile devices in the near future.

Meta presents its language model LLaMA-13B, which in benchmarks proves much more efficient than GPT-3, notably while running on a single GPU.



Yesterday (24/02/2023), Meta presented its Large Language Model (LLM) called LLaMA-13B, part of a larger family of models named LLaMA (Large Language Model Meta AI). According to the company, despite its much smaller size it is to outperform GPT-3, the model underlying ChatGPT. Thanks to this, such a language assistant could run locally on mobile devices and computers. The models in the collection range from 7 to 65 billion parameters (compared to GPT-3's 175 billion and ERNIE's 260 billion). The four new models differ in their capabilities and intended applications, and will form the basis for their successors. All of them are trained on publicly available data sets, such as Wikipedia and Common Crawl.
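The parameter counts quoted above translate directly into memory requirements, which is why smaller models can plausibly run locally. A back-of-the-envelope sketch (the fp16 assumption and the function itself are illustrative, not figures from the article):

```python
# Rough memory needed just to store the weights, ignoring activations,
# the KV cache and framework overhead — a back-of-the-envelope estimate.

def weight_memory_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate GiB required to hold the weights (fp16 by default)."""
    return params_billion * 1e9 * bytes_per_param / 2**30

for name, params in [("LLaMA-7B", 7), ("LLaMA-13B", 13),
                     ("LLaMA-65B", 65), ("GPT-3", 175)]:
    print(f"{name}: ~{weight_memory_gib(params):.0f} GiB in fp16")
```

By this crude measure, the 13B model's weights fit on a single high-memory GPU, while a 175B model's do not, which is consistent with the single-GPU comparison the article describes.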


In performance tests, the LLaMA-13B model turned out to be better than GPT-3 while working on a single GPU. This is quite interesting, because it has far fewer parameters, and parameter count plays one of the most important roles in a language model's capabilities. The more parameters a model has, however, the larger it becomes. The news that this model can achieve results similar to its seemingly more powerful rival therefore points to a huge increase in efficiency. “I think we’ll be able to run language models that have some of ChatGPT’s capabilities on our smartphones and laptops within a year or two,” said independent AI researcher Simon Willison. As LLaMA-13B is roughly ten times smaller than GPT-3, this may well turn out to be true. At the moment you can download a stripped-down version of the model from Meta's GitHub; to receive the full model, you must fill in the appropriate form.
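The "ten times smaller" figure can be sanity-checked against the parameter counts quoted earlier (13 billion vs. 175 billion); a quick calculation of my own, not from the article:

```python
# Size ratio between GPT-3 and LLaMA-13B, using the parameter
# counts quoted in the article.
gpt3_params = 175e9
llama_13b_params = 13e9

ratio = gpt3_params / llama_13b_params
print(f"GPT-3 has ~{ratio:.1f}x the parameters of LLaMA-13B")
```

The exact ratio is closer to 13.5x, so "ten times smaller" is a round-number way of saying roughly an order of magnitude.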


Source: Ars Technica
