Samsung has introduced its high-capacity HBM3E memory for faster AI training and inference. Called Samsung HBM3E 12H DRAM, the memory comes with advanced TC NCF technology. The HBM in the memory’s name stands for “high bandwidth memory”. So what are the other details?
Samsung HBM3E 12H DRAM memory specifications
In October, Samsung introduced HBM3E Shinebolt, an improved version of its HBM3 memory, capable of reaching 9.8 Gbps per pin and roughly 1.2 terabytes per second of bandwidth per stack.
The 12H in the memory’s name refers to the number of DRAM dies stacked vertically in each module. Stacking twelve dies instead of eight lets Samsung increase capacity by 50%, from 24 GB in the 8H design to 36 GB. Bandwidth, however, remains at 1.2 terabytes per second.
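Those figures roughly check out. The quick sketch below multiplies the per-pin speed by an assumed 1024-bit stack interface to arrive at the ~1.2 TB/s bandwidth, and multiplies an assumed 3 GB per die by the stack height to get the 24 GB and 36 GB capacities; the interface width and per-die capacity are assumptions for illustration, not numbers given in the announcement.

```python
# Back-of-the-envelope check of the quoted figures.
# Assumptions (not stated in the article): a 1024-bit HBM interface per stack
# and 3 GB of capacity per DRAM die.

pin_speed_gbps = 9.8         # per-pin data rate, in Gbit/s
interface_width_bits = 1024  # assumed interface width per stack

bandwidth_gbit_s = pin_speed_gbps * interface_width_bits  # Gbit/s per stack
bandwidth_tb_s = bandwidth_gbit_s / 8 / 1000              # convert to TB/s
print(f"Bandwidth per stack: ~{bandwidth_tb_s:.2f} TB/s")  # ~1.25 TB/s

die_capacity_gb = 3                   # assumed capacity per die, in GB
capacity_8h = 8 * die_capacity_gb     # 24 GB
capacity_12h = 12 * die_capacity_gb   # 36 GB
gain = (capacity_12h / capacity_8h - 1) * 100
print(f"8H: {capacity_8h} GB, 12H: {capacity_12h} GB (+{gain:.0f}%)")  # +50%
```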
Let’s talk about TC NCF, short for thermal compression non-conductive film. This is the material placed between the vertically stacked chips. Samsung has worked to make the film thinner and has managed to reduce its thickness to just 7 µm.
An added benefit of TC NCF is its improved thermal properties, which help with cooling. Even better, the method used in the Samsung HBM3E 12H DRAM also improves production efficiency.
So what will this memory be used for? Artificial intelligence, of course. As you know, AI models require enormous amounts of memory, and last year Nvidia added Samsung to its list of high bandwidth memory suppliers.
Now the company appears to be positioning itself to win those orders. What do you think about Samsung HBM3E? Don’t forget to share your thoughts with us in the comments section.