Samsung Unveils Ultra-Slim Chips for Faster AI on Mobile Devices

Samsung has begun mass production of its ultra-compact DRAM chips to improve performance and thermal management for AI workloads

Ben Wodecki, Junior Editor - AI Business

August 12, 2024

2 Min Read

Samsung has kicked off mass production of its LPDDR5X DRAM semiconductors, tiny chips as thin as a fingernail, designed to improve how mobile devices run AI workloads.

They are the industry’s thinnest 12 nanometer (nm)-class memory chips and come in two capacities: 12GB and 16GB.

The chips are designed to handle memory operations directly on the device, letting the phone’s operating system exchange data with storage more quickly and run AI workloads more efficiently.

Samsung’s new LPDDR5X units are ultra-slim to provide more space in a mobile device.

The slim design accommodates a larger processor dedicated to AI tasks, enhancing performance while improving airflow — a crucial feature as advanced AI applications generate more heat.

The chips are 9% thinner than prior Samsung DRAM 12 nm units and offer around 21% improved heat resistance.

Samsung is adding AI across its line of mobile devices. Its Galaxy AI software provides consumers using Samsung’s mobiles with various generative AI applications, such as Circle to Search, which lets users circle an object in a photo to search the web for it.

The new DRAM chips aim to provide improved memory handling for users running Galaxy AI workloads in future Samsung mobiles.

Beyond mobiles, these chips could also be used in smartwatches and IoT devices for faster on-device memory processing.


“Samsung’s LPDDR5X DRAM sets a new standard for high-performance on-device AI solutions, offering not only superior LPDDR performance but also advanced thermal management in an ultra-compact package,” said YongCheol Bae, Samsung’s executive vice president of memory product planning.

Beyond the new 12 nm units, Samsung said it plans to develop 6-layer 24GB and 8-layer 32GB modules for future devices.

“We are committed to continuous innovation through close collaboration with our customers, delivering solutions that meet the future needs of the low-power DRAM market,” said Bae.

This story first appeared in IoT World Today's sister publication AI Business.

About the Author

Ben Wodecki

Junior Editor - AI Business

Ben Wodecki is the junior editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to junior editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others.

