News

Enfabrica Corporation, an industry leader in high-performance networking silicon for artificial intelligence (AI) and ...
Generative AI is arguably the most complex application that humankind has ever created, and the math behind it is incredibly ...
Samsung Electronics is reportedly pushing back the mass production of its next-gen high-bandwidth memory (HBM) chips to 2026, ...
The High-Bandwidth Memory Chips Market is segmented by Type (HBM2, HBM2E, HBM3, HBM3E, Others) and by Application (Servers, Networking Products, Consumer Products, Others): Global Opportunity Analysis and ...
Enfabrica, a Silicon Valley-based chip startup working on solving bottlenecks in artificial intelligence data centers, on ...
This article explains what compute-in-memory (CIM) technology is and how it works. We will examine how current ...
Ray Wang of Futurum says SK Hynix will be able to hold on to its lead in high bandwidth memory chip technology despite ...
Demand for high-bandwidth memory, which is used alongside GPUs for AI applications, has risen sharply, driving a nearly 50% sequential increase in HBM revenue over ...
It began shipping its next-generation HBM4 memory in early June 2025, delivering 36 GB, 12-high HBM4 samples to important customers, reportedly including Nvidia.
HBM4 addresses these challenges by offering significantly higher bandwidth (up to 1.5 TB/s per stack) and greater capacity (64 GB or more per stack), while also improving power ...
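The per-stack figures above can be sanity-checked with some quick arithmetic. This sketch assumes HBM4 keeps a 2048-bit interface per stack (the direction of the JEDEC HBM4 work); the per-pin data rate and the 8-stack GPU configuration are illustrative assumptions, not figures from the snippet.

```python
# Rough sanity check of the quoted HBM4 per-stack bandwidth.
# Assumption: a 2048-bit interface per stack (JEDEC HBM4 direction);
# the per-pin rate below is derived from the quoted figure, not quoted itself.

BUS_WIDTH_BITS = 2048   # assumed HBM4 interface width per stack
TARGET_BW_TBPS = 1.5    # per-stack bandwidth quoted above (TB/s)

# bandwidth (Tb/s) = bus_width_bits * per_pin_rate; solve for the per-pin rate
per_pin_gbps = TARGET_BW_TBPS * 8 * 1000 / BUS_WIDTH_BITS
print(f"Implied per-pin data rate: {per_pin_gbps:.2f} Gb/s")  # ~5.86 Gb/s

# Hypothetical accelerator with 8 such stacks
stacks = 8
print(f"8-stack aggregate bandwidth: {TARGET_BW_TBPS * stacks:.1f} TB/s")  # 12.0 TB/s
```

At roughly 5.9 Gb/s per pin across 2048 pins, the 1.5 TB/s claim is internally plausible; the aggregate number shows why multi-stack HBM4 packages are attractive for bandwidth-bound AI workloads.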
High Bandwidth Memory (HBM) is the commonly used type of DRAM for data center GPUs like NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface.
SEOUL, South Korea--(BUSINESS WIRE)--Samsung Electronics Co., Ltd., the world leader in advanced memory technology, today announced that it has developed the industry's first High Bandwidth Memory ...