Micron launches a 256GB SOCAMM2 memory module using 64 32Gb LPDDR5X chips — and yes, hyperscalers can shove 8 into an AI server to reach 2TB of capacity: mere mortals need not apply
Micron introduces a 256GB SOCAMM2 LPDDR5X memory module designed for AI servers, enabling configurations of up to 2TB while reducing power consumption.
