Samsung Unveils Revolutionary LPDDR5X DRAM for Next-Gen AI Processing in Mobile Devices

April 18, 2024

Samsung, the world leader in advanced memory technology, has announced the development of the industry's first LPDDR5X DRAM to support speeds of up to 10.7 gigabits per second (Gbps) for AI applications. The company expects this technology to expand adoption into PCs, accelerators, servers, and automobiles, significantly enhancing AI processing capabilities across a range of devices.
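To put the 10.7 Gbps figure in perspective, here is a minimal sketch that converts the per-pin data rate into theoretical peak package bandwidth. The 64-bit bus width is an assumption (typical of flagship-phone memory subsystems); the article states only the per-pin rate.

```python
def peak_bandwidth_gb_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in gigabytes per second.

    per_pin_gbps   -- per-pin data rate (10.7 Gbps per the article)
    bus_width_bits -- total bus width; 64 bits is an assumed typical value
    """
    return per_pin_gbps * bus_width_bits / 8  # 8 bits per byte


# Under these assumptions, a 64-bit LPDDR5X interface at 10.7 Gbps
# would deliver roughly 85.6 GB/s of theoretical peak bandwidth.
print(peak_bandwidth_gb_s(10.7, 64))
```

Real-world throughput will be lower once refresh, command overhead, and access patterns are accounted for; this is only the raw interface ceiling.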

The new LPDDR5X DRAM is built on Samsung's 12-nanometer (nm) class process technology, yielding the smallest chip size among existing LPDDR products and solidifying Samsung's leadership in the low-power DRAM market. The innovation arrives as demand for low-power, high-performance memory grows beyond mobile devices into those adjacent markets.

The new LPDDR5X DRAM not only improves performance by more than 25% and capacity by more than 30% over the previous generation but also raises the maximum single-package capacity of mobile DRAM to 32 gigabytes (GB). This makes it an optimal solution for the on-device AI era, which demands high-performance, high-capacity, and low-power memory.
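The ">25% performance" claim can be sanity-checked with a quick calculation. The 8.5 Gbps figure used here for the previous LPDDR5X generation is an assumption based on earlier product specifications; the article itself cites only the new 10.7 Gbps rate.

```python
# Sanity-check the generational uplift cited in the article.
prev_gbps = 8.5   # assumed previous-generation per-pin rate (not in the article)
new_gbps = 10.7   # per-pin rate stated in the article

uplift = (new_gbps - prev_gbps) / prev_gbps
print(f"Performance uplift: {uplift:.1%}")  # ~25.9%, consistent with ">25%"
```

Under that assumption the uplift works out to roughly 25.9%, which lines up with the "more than 25%" claim.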

The LPDDR5X incorporates specialized power-saving technologies such as optimized power variation and expanded low-power mode intervals, enhancing power efficiency by 25% over the previous generation. This improvement enables mobile devices to provide longer battery life and allows servers to minimize total cost of ownership (TCO) by lowering energy usage when processing data.

Samsung’s new LPDDR5X DRAM is expected to power the next generation of mobile devices, including Samsung’s own lineup of smartphones, tablets, and other AI-enabled devices. Could this pave the way for a fully AI-powered Bixby, Samsung’s voice assistant? Or open the door to new on-device LLMs?

The company plans to collaborate closely with customers to innovate and deliver optimized products for the upcoming on-device AI era, further solidifying its position as a global leader in memory technology.

In addition to its advancements in memory technology, Samsung has also been making strides in custom CPU and GPU architectures. Samsung’s Austin R&D Center (SARC) has been instrumental in developing high-performance, low-power CPU and System IP (coherent interconnect and memory controller) architectures and designs. The company has also been working on its GPU IP, called “S-GPU,” and has shipped four generations of its custom Mongoose CPU cores, from M1 to M4.

With Samsung’s continued investments in research and development and strategic partnerships, the company is well-positioned to lead the charge in the on-device AI era, delivering innovative solutions that meet the growing demands of consumers and businesses alike.
