MediaTek, the global semiconductor company, has announced that it is working closely with Meta to bring Llama 2, Meta's next-generation open-source large language model (LLM), to edge devices.
MediaTek aims to build a complete edge computing ecosystem.
By combining Meta's LLM with its latest APUs and the NeuroPilot AI platform, MediaTek aims to build a complete edge computing ecosystem designed to drive the development of AI applications across smartphones, IoT devices, vehicles, smart homes, and other edge computing devices.
Currently, most generative AI processing is done in the cloud. Running Llama 2 models on MediaTek-powered hardware, however, will allow generative AI applications to run directly on the device.
This offers several benefits to developers and users, including smoother performance, enhanced privacy, greater security and reliability, lower latency, the ability to operate in areas with little or no connectivity, and lower operating costs.
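To make the on-device scenario concrete, the following is a minimal sketch of local Llama 2 inference using the open-source llama-cpp-python bindings and a 4-bit quantized checkpoint. The model file name and generation parameters are illustrative assumptions, and this is not MediaTek's NeuroPilot toolchain, which provides its own path for targeting the APU.

```python
# Minimal sketch: running a quantized Llama 2 model entirely on-device,
# with no cloud round-trip. Uses the open-source llama-cpp-python bindings;
# the model path and parameters below are illustrative assumptions.
from llama_cpp import Llama

# Load a 4-bit quantized Llama 2 7B chat checkpoint from local storage
# (hypothetical file name).
llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",
    n_ctx=2048,   # context window
    n_threads=8,  # CPU threads; an NPU-backed stack would offload this work
)

# Generate a completion locally; the prompt never leaves the device.
result = llm(
    "Summarize the benefits of on-device generative AI in one sentence.",
    max_tokens=64,
    temperature=0.7,
)
print(result["choices"][0]["text"].strip())
```

Because both the prompt and the generated text stay on the device, this setup illustrates the privacy, latency, and offline-availability benefits described above.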
To truly leverage generative AI on edge devices, device manufacturers will need to adopt high-performance, low-power AI processors and faster, more reliable connectivity. Every 5G mobile chip MediaTek currently offers is equipped with an APU designed to handle a variety of generative AI features, such as AI-powered noise reduction and AI-powered resolution enhancement.
Additionally, MediaTek's next-generation flagship processor, expected to be introduced later this year, will feature a software stack optimized for running Llama 2, along with an upgraded APU offering Transformer acceleration, a smaller memory-access footprint, reduced DRAM bandwidth usage, and further improved LLM and AIGC performance. These advancements pave the way for rapid development of on-device generative AI use cases.
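As a rough back-of-the-envelope illustration of why memory footprint and DRAM bandwidth dominate on-device LLM performance, the sketch below estimates the weight storage and per-token memory traffic of a 7-billion-parameter model at different precisions. The figures are generic arithmetic, not MediaTek benchmarks or product specifications.

```python
# Back-of-the-envelope: why weight precision drives DRAM footprint and
# bandwidth for on-device LLM inference. Generic arithmetic, not vendor data.
PARAMS = 7e9  # parameter count of a Llama 2 7B class model


def weight_size_gb(bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes at a given precision."""
    return PARAMS * bits_per_weight / 8 / 1e9


for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    size = weight_size_gb(bits)
    # In a memory-bound decode step, roughly all weights are streamed from
    # DRAM once per generated token, so weight size ~ bytes moved per token.
    print(f"{label}: ~{size:.1f} GB of weights, ~{size:.1f} GB read per token")
```

On this rough model, moving from FP16 to 4-bit weights cuts both the resident footprint and the per-token DRAM traffic by roughly a factor of four, which is why on-device LLM stacks lean heavily on low-bit quantization and on hardware features that reduce memory access and bandwidth demands.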
MediaTek expects Llama 2-based AI applications to be available on smartphones powered by its next-generation flagship SoC, which is expected to launch later this year.