Micron has unveiled a high-density 256GB SOCAMM2 memory module designed specifically for AI servers. The module is built from 64 monolithic 32Gb LPDDR5X dies, delivering exceptional capacity, bandwidth, and power efficiency for modern AI workloads.
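The stated configuration can be sanity-checked with simple arithmetic: 64 dies at 32 gigabits each yields exactly 256 gigabytes. A quick sketch using only the figures from the article:

```python
# Capacity check for the SOCAMM2 module described above
dies = 64
die_density_gb = 32          # gigabits (Gb) per monolithic LPDDR5X die
module_capacity = dies * die_density_gb / 8  # 8 bits per byte -> gigabytes

print(module_capacity)       # 256.0 GB per module
```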
Boosting AI Server Memory Capacity
Large language models (LLMs) and advanced inference pipelines require vast memory pools, prompting a redesign of server architectures. The new 256GB SOCAMM2 module addresses this by expanding memory capacity per processor: in an eight-channel server CPU configuration, eight modules provide 2TB of LPDRAM, roughly a one-third increase over the prior 192GB generation.
This added capacity enables larger context windows and supports more demanding inference tasks.
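The per-socket figures above follow directly from the module counts; a minimal arithmetic sketch, using only numbers given in the article:

```python
# Per-socket capacity: one SOCAMM2 module per memory channel
channels = 8
new_module_gb = 256   # current generation
old_module_gb = 192   # prior generation

new_total = channels * new_module_gb   # 2048 GB (2 TB)
old_total = channels * old_module_gb   # 1536 GB (1.5 TB)
increase = (new_total - old_total) / old_total

print(f"{new_total} GB total, {increase:.0%} increase")
# 2048 GB total, 33% increase
```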
Superior Efficiency and Compact Design
The SOCAMM2 module offers advantages over traditional server memory. “Micron’s 256GB SOCAMM2 offering enables the most power-efficient CPU-attached memory solution for both AI and HPC,” states Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. “Our continued leadership in low-power memory solutions for data center applications has uniquely positioned us to be the first to deliver a 32Gb monolithic LPDRAM die, helping drive industry adoption of more power-efficient, high-capacity system architectures.”
It consumes about one-third the power of comparable RDIMMs and takes up just one-third the physical space. These features support higher rack density in data centers, reduce thermal loads, and lower infrastructure costs. The modular design simplifies maintenance, future upgrades, and integration with liquid-cooled systems as AI models and datasets expand.
Performance Gains in AI Workloads
In unified memory architectures, the module accelerates time-to-first-token by more than 2.3x on long-context inference via key-value cache offloading. In pure CPU-based workloads, it achieves over three times better performance per watt than standard server memory.
Micron’s LPDRAM lineup includes components from 8GB to 64GB and SOCAMM2 modules from 48GB to 256GB. Customer samples of the 256GB version are now shipping.