
Cristiano Amon, President and CEO of Qualcomm, speaks during the Milken Institute Global Conference on May 2, 2022, in Beverly Hills, California.

Patrick T Fallon | AFP | Getty Images

Qualcomm and Meta announced Tuesday that the social networking company’s new large language model, Llama 2, will be able to run on Qualcomm chips in phones and PCs starting in 2024.

So far, large language models have mainly run on large server farms powered by Nvidia graphics processors, because of the technology’s massive computational and data requirements. That demand has boosted Nvidia’s stock, which is up more than 220% this year. But the AI boom has largely passed over companies that make leading processors for phones and PCs, such as Qualcomm. Its stock is up about 10% so far in 2023, trailing the Nasdaq’s 36% gain.

Tuesday’s announcement signals that Qualcomm wants to position its processors as well-suited for AI “on the edge,” meaning on a device, rather than “in the cloud.” If large language models can run on phones instead of in large data centers, it could significantly lower the cost of running AI models and could lead to better, faster voice assistants and other applications.

Qualcomm will make Meta’s open-source Llama 2 models available on Qualcomm devices, which it believes will enable applications such as intelligent virtual assistants. Llama 2 can do many of the same things as ChatGPT, but it can be packaged into a smaller piece of software, allowing it to run on a phone.

Qualcomm’s chips include a “tensor processor unit,” or TPU, which is well-suited to the kinds of calculations AI models require. Still, the processing power available on a mobile device pales in comparison with a data center stocked with the latest GPUs.

Meta’s Llama is notable because Meta has published its “weights,” the set of numbers that governs how a given AI model works. Doing so allows researchers, and eventually commercial organizations, to use the model on their own computers without asking permission or paying. Other notable LLMs, such as OpenAI’s GPT-4 and Google’s Bard, are closed source, and their weights are closely held secrets.

Qualcomm has worked closely with Meta in the past, particularly on chips for its Quest virtual reality devices. It has also demonstrated some AI models running slowly on its chips, such as the open-source image generator Stable Diffusion.
