Qualcomm and Meta are reportedly planning to work together to optimize on-device execution of Meta's Llama 2 large language model. Running AI directly on the device removes the dependence on cloud services. In addition, by running generative AI models like Llama 2 on smartphones, computers, VR/AR headsets, and cars, developers can offer users more personalized experiences while saving on cloud costs.
The goal of the collaboration is to bring Llama 2-based AI to devices, enabling developers to deliver more innovative AI applications. Companies will also be able to build on these capabilities to create intelligent virtual assistants, content-creation platforms, and entertainment applications. Another advantage of on-device AI is that it can be used in places where there is no internet access.
Meta and Qualcomm have a long history of collaboration, and Qualcomm's leadership in on-device AI is expected to bring Meta's Llama 2-based technology to millions of smartphones, cars, XR headsets, glasses, computers, and IoT devices.
The technology is expected to reach users from 2024 onward, but app developers can already start optimizing their apps for on-device AI today using the Qualcomm AI Stack, a set of dedicated AI tools that enables on-device artificial intelligence even on small, thin devices.
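To illustrate what on-device inference looks like in practice, the minimal sketch below loads a locally stored, quantized Llama 2 model and generates text without any network call. It uses the open-source llama-cpp-python bindings rather than the Qualcomm AI Stack (whose APIs are not described in the article), and the model filename and prompt are hypothetical placeholders.

```python
# Illustrative sketch only: on-device text generation with a locally stored,
# quantized Llama 2 model via the open-source llama-cpp-python bindings.
# The model path and prompt are hypothetical; Qualcomm's AI Stack exposes its
# own on-device tooling, which is not shown here.
from llama_cpp import Llama

# Load a quantized model file from local storage -- no cloud service involved.
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Run inference entirely on the device.
result = llm(
    "Suggest a short caption for a sunset photo.",
    max_tokens=64,
    stop=["\n"],
)
print(result["choices"][0]["text"].strip())
```

Because the model weights and the computation both stay on the device, the same call works with no internet connection, which is the scenario the article highlights.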
Source: Gizmochina