MediaTek has partnered with Meta to enhance on-device generative AI in edge devices using Llama 2. The resulting ecosystem combines Llama 2 models with MediaTek's APUs and NeuroPilot AI Platform, enabling faster, more secure, and more cost-effective AI applications.
MediaTek, a prominent chip manufacturer, has announced a partnership with Meta built around Llama 2, Meta's open-source Large Language Model (LLM), to improve on-device generative AI in edge devices. MediaTek aims to create a complete edge computing ecosystem by combining Llama 2 with its latest APUs (AI Processing Units) and its NeuroPilot AI Platform. The ecosystem is designed to accelerate the development of AI applications for a wide range of devices, including smartphones, IoT devices, cars, smart homes, and more.
MediaTek's use of Llama 2 models enables generative AI applications to run directly on-device, in contrast to conventional approaches that rely on cloud infrastructure for processing. This approach offers significant benefits to developers and consumers alike: better performance, stronger privacy and security, improved reliability, lower latency, the ability to operate in areas with poor connectivity, and lower operating costs.
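The latency argument above can be sketched with a simple back-of-the-envelope comparison: cloud inference pays a network round trip on every request, while on-device inference does not. The numbers below are purely hypothetical assumptions for illustration, not measurements of any MediaTek or Meta system.

```python
# Illustrative comparison of cloud vs. on-device inference latency.
# All numbers are hypothetical assumptions for demonstration only.

def total_latency_ms(network_rtt_ms: float, inference_ms: float) -> float:
    """Total time for one request: network round trip plus model inference."""
    return network_rtt_ms + inference_ms

# Cloud inference: a fast datacenter GPU, but every request pays the
# network round-trip cost (assumed 120 ms RTT, 80 ms inference).
cloud = total_latency_ms(network_rtt_ms=120.0, inference_ms=80.0)

# On-device inference: a slower mobile APU (assumed 150 ms inference),
# but zero network round trip, and it still works with no connectivity.
on_device = total_latency_ms(network_rtt_ms=0.0, inference_ms=150.0)

print(f"cloud: {cloud} ms, on-device: {on_device} ms")
```

Under these assumed numbers the on-device path comes out ahead despite slower raw inference; the real crossover point depends on the network and the hardware.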
JC Hsu, Corporate Senior Vice President and General Manager of MediaTek's Wireless Communications Business Unit, emphasized the growing importance of generative AI in the digital transition. He underscored MediaTek's commitment to providing Llama 2 users and developers with the tools they need to drive innovation in AI.
To take full advantage of on-device generative AI, edge device makers need powerful, low-power AI processors as well as reliable connectivity. The System-on-Chips (SoCs) in MediaTek's current 5G smartphones already include APUs designed for a variety of generative AI features, including AI Noise Reduction, AI Super Resolution, and AI MEMC (motion estimation and motion compensation).
Furthermore, MediaTek's forthcoming flagship chipset, scheduled for release later this year, will include a software stack optimized for Llama 2. It will also feature an upgraded APU with Transformer backbone acceleration, improved DRAM bandwidth, and a smaller footprint. These improvements are expected to deliver a noticeable performance boost for LLM and AIGC (AI-generated content) workloads.
MediaTek expects smartphones built on the next flagship SoC, due by the end of the year, to support Llama 2-based AI applications. The partnership reflects ongoing advances in on-device AI processing and the growing significance of edge computing in AI application development.
Conclusion
MediaTek has partnered with Meta around Llama 2, Meta's open-source Large Language Model (LLM), to enhance on-device generative AI in edge devices. The ecosystem aims to bring AI applications to a wide range of devices, including smartphones, IoT devices, cars, and smart homes. By running Llama 2 models directly on-device, it promises better speed, privacy, security, and reliability, along with lower latency and operating costs. The partnership underscores the growing importance of edge computing in AI application development and the need for powerful, low-power AI processors and reliable connectivity. MediaTek's upcoming flagship chipset will include a software stack optimized for Llama 2, with Transformer backbone acceleration, improved DRAM bandwidth, and a smaller footprint.