At Mobile World Congress, Qualcomm is demonstrating its vision for better exploiting artificial intelligence with the help of on-device computing and connectivity. Credit: Irene Iglesias / Computerworld España

While last year it was still a matter of tapping into the potential of ChatGPT and similar services via browsers and apps, efforts are now being made to run generative AI directly on the end device, or in a hybrid mode. From the device manufacturers’ point of view, this newly awakened interest is understandable: AI offers an opportunity to get the somewhat dormant PC and smartphone market back on track. But do users in the enterprise also benefit from this?

On-device AI from Qualcomm

One of these players is Qualcomm. The company laid the foundation for AI-optimized chips on the device with the Snapdragon 8 Gen 3 smartphone chipset and the Snapdragon X Elite notebook processor, both unveiled at the end of 2023 and both with an integrated NPU (Neural Processing Unit). While notebooks with the Snapdragon X Elite are not expected until the summer, the newly unveiled Xiaomi 14, Xiaomi 14 Ultra, and Honor Magic 6 Pro are recent examples of smartphones with Qualcomm’s Snapdragon 8 Gen 3 that already use on-device AI in different ways.

At Mobile World Congress in Barcelona, the company demonstrated the performance of the Snapdragon SoC on an Android smartphone using the Large Language and Vision Assistant (LLaVA). According to Qualcomm, this is the first multimodal large language model (LLM) to run on a smartphone. The model, which reportedly has more than seven billion parameters, accepts not only text but also images and speech as a prompt. In one of the demonstrations, images of different ingredients are shown, after which a recipe for those ingredients is created offline and the calorie count of the resulting meal is estimated.

In another demo, the open-source graphics program GIMP with a Stable Diffusion plugin runs on a Snapdragon X Elite laptop alongside an x86 laptop with an Intel Core Ultra 7, to demonstrate the benefits of hardware designed for generative AI. Qualcomm claims the Snapdragon machine is three times faster at image generation than the system without an NPU.

A developer platform for AI models

In addition, Qualcomm unveiled the AI Hub, a platform that offers a library of pre-optimized AI models for seamless deployment on devices powered by Snapdragon and Qualcomm platforms, including smartphones, PCs, AR/VR hardware, and more. The list includes not only well-known generative AI models such as Stable Diffusion, Llama, and ControlNet, but also various models for speech recognition, image classification, object recognition, image upscaling, and the like. According to Qualcomm, developers can run these models with just a few lines of code, even on cloud-hosted devices running Qualcomm platforms.
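To give a sense of what “just a few lines of code” can look like, here is a minimal sketch of compiling and profiling a model for a cloud-hosted Snapdragon device with Qualcomm’s qai_hub Python client. It follows the publicly documented workflow, but the device name, input shape, and exact call signatures are assumptions that should be checked against the current AI Hub documentation.

```python
# Illustrative sketch of the Qualcomm AI Hub workflow, not official sample code.
# Assumes the qai_hub client is installed and configured with an API token.
import torch
import torchvision
import qai_hub as hub

# Take an off-the-shelf PyTorch model and trace it for export.
model = torchvision.models.mobilenet_v2(weights="DEFAULT").eval()
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Compile the traced model for a specific Snapdragon-powered target.
compile_job = hub.submit_compile_job(
    model=traced,
    device=hub.Device("Samsung Galaxy S24 (Family)"),  # device name is an assumption
    input_specs=dict(image=(1, 3, 224, 224)),
)

# Profile the compiled model on a real, cloud-hosted device; latency and
# memory figures then show up in the AI Hub dashboard.
profile_job = hub.submit_profile_job(
    model=compile_job.get_target_model(),
    device=hub.Device("Samsung Galaxy S24 (Family)"),
)
```

The pre-optimized models in the AI Hub library, including the generative AI models mentioned above, are also published as a companion qai_hub_models Python package that wraps this flow on a per-model basis.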
As a modem manufacturer, Qualcomm is of course not only focusing on on-device AI, but is convinced that combining it with the cloud can bring further benefits. “The future of generative AI is hybrid,” Qualcomm CEO Cristiano Amon said. “The intelligence on the device works with the cloud to provide more personalization, privacy, reliability, and efficiency.” Amon emphasized the importance of connectivity in scaling and extending generative AI across cloud, edge, and device.

AI-Optimized Connections

At the same time, AI can also help deliver next-generation connectivity. The newly introduced Snapdragon X80 5G modem-RF system, for example, with up to six-carrier aggregation (6CA) and six receive antennas (6Rx), uses AI to optimize how a smartphone’s multiple antennas are used. According to Qualcomm, this improves signal quality and thus data throughput, and also increases energy efficiency. The Snapdragon X80 additionally supports NB-NTN (Narrowband Non-Terrestrial Network) satellite connections.

Qualcomm’s latest connectivity chip, the FastConnect 7900, likewise uses AI to increase performance and improve energy efficiency. The main idea is to recognize what the Wi-Fi connection is being used for, such as watching videos, listening to music, or joining online meetings, and to optimize the connection accordingly. According to Qualcomm, this can save up to 30 percent of energy compared to operation without AI. In addition, many Wi-Fi parameters can be tuned more precisely to provide an optimal user experience. The FastConnect 7900 supports Wi-Fi 7, Wi-Fi 6E, and Wi-Fi 6 with peak speeds of up to 5.8 Gbps, and integrates Bluetooth and ultra-wideband (UWB) on a single chip. It is also said to consume less power than the previous generation.
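To make the traffic-aware idea more tangible, here is a deliberately simplified sketch of classifying a Wi-Fi flow and choosing a power profile for it. The categories, thresholds, and profiles below are invented purely for illustration and say nothing about how the FastConnect 7900 actually implements this in silicon.

```python
# Toy illustration of traffic-aware Wi-Fi tuning; all values are hypothetical.
from dataclasses import dataclass

@dataclass
class FlowStats:
    avg_bitrate_kbps: float    # sustained downlink bitrate
    packet_interval_ms: float  # typical gap between packets
    uplink_share: float        # fraction of traffic sent by the device

def classify(flow: FlowStats) -> str:
    """Guess what the link is being used for from coarse flow statistics."""
    if flow.uplink_share > 0.35 and flow.packet_interval_ms < 25:
        return "online_meeting"    # symmetric and latency-sensitive
    if flow.avg_bitrate_kbps > 3000:
        return "video_streaming"   # high bitrate, but buffer-friendly
    if flow.avg_bitrate_kbps > 96:
        return "music_streaming"
    return "background"

# Hypothetical power profiles: latency-tolerant traffic lets the radio sleep longer.
POWER_PROFILES = {
    "online_meeting":  {"sleep_between_bursts": False, "target_latency_ms": 10},
    "video_streaming": {"sleep_between_bursts": True,  "target_latency_ms": 100},
    "music_streaming": {"sleep_between_bursts": True,  "target_latency_ms": 250},
    "background":      {"sleep_between_bursts": True,  "target_latency_ms": 500},
}

if __name__ == "__main__":
    meeting = FlowStats(avg_bitrate_kbps=1200, packet_interval_ms=20, uplink_share=0.45)
    kind = classify(meeting)
    print(kind, POWER_PROFILES[kind])
```

The point of the sketch is only the shape of the decision: the less latency-sensitive the detected use case, the more aggressively the radio can sleep between transmissions, which is where the claimed energy savings would come from.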