High-bandwidth memory products designed to accelerate next-generation AI systems

Author
Details

Ostapovych Taras

Postgraduate student, Kyiv National Economic University named after Vadym Hetman, Kyiv, Ukraine

DevOps engineer at Rapid Die Cut, ostapovych@meta.ua

Annotation. HBM is the kind of technology that enables the AI functionality of the modern world. It became possible after years of large-scale innovation by companies that continually push the limits of what is technologically possible, while keeping in mind that all of this functionality ultimately serves human well-being. HBM may not be the last step of technical innovation, but it is certainly an important one.

High-bandwidth memory products are designed to accelerate next-generation AI systems, professional visualization workstations, and high-performance computing. High-bandwidth memory (HBM) is essentially a stack of memory chips, small components that store data. These stacks can store more information and transfer data faster than conventional DRAM (dynamic random access memory) modules. HBM chips are commonly used in graphics cards, high-performance computing systems, data centers, and autonomous vehicles. Most importantly, they are indispensable for running increasingly popular AI applications, including generative AI, which runs on AI processors such as graphics processing units (GPUs) made by Nvidia and Advanced Micro Devices. “The processor and memory are two important components of AI. Without memory, it’s like having a brain with logic but no memory,” said G. Dan Hutcheson, vice chairman of TechInsights, a research organization specializing in microcircuits.
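HBM's advantage over conventional DRAM comes mainly from its very wide stacked interface rather than from faster individual pins. A minimal sketch of the peak-bandwidth arithmetic, using representative figures (a 1024-bit HBM3 interface versus a 64-bit DDR5 channel, both at an assumed 6.4 Gb/s per pin; these are illustrative values, not vendor specifications):

```python
# Illustrative peak-bandwidth comparison: HBM's edge comes from interface
# width, not per-pin speed. Figures below are assumed, representative values.
def peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in gigabytes per second."""
    return bus_width_bits * pin_rate_gbps / 8  # bits -> bytes

ddr5_channel = peak_bandwidth_gbps(bus_width_bits=64, pin_rate_gbps=6.4)
hbm3_stack = peak_bandwidth_gbps(bus_width_bits=1024, pin_rate_gbps=6.4)

print(f"DDR5 channel: {ddr5_channel:.1f} GB/s")  # 51.2 GB/s
print(f"HBM3 stack:   {hbm3_stack:.1f} GB/s")    # 819.2 GB/s
print(f"Ratio: {hbm3_stack / ddr5_channel:.0f}x")  # 16x
```

At the same per-pin rate, the sixteen-times-wider interface yields sixteen times the bandwidth, which is why stacking is the defining design choice of HBM.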

The first high-bandwidth memory chip, manufactured by South Korean company SK Hynix, was completed in 2014 to meet the growing need to store large amounts of data for computing. Although artificial intelligence was still in its infancy at the time, there was already a clear need to keep developing HBM beyond those first chips. Samsung was the first to develop a second-generation HBM chip in 2015, and SK Hynix and US company Micron Technology followed suit. Development of subsequent generations continued, and demand for HBM chips skyrocketed as the need for greater speed, power, and efficiency exploded with the development of AI.

Imagine an HBM chip as a library. Early generations of HBM chips were the equivalent of a small-town library: a single-story building with a couple of librarians checking out books. Storage capacity was small and information delivery was slow. The latest generation of HBM chips, by contrast, is the equivalent of a 50-story library with thousands of librarians. The amount of data the chips hold and their processing speed far exceed those of earlier generations. In addition, the latest chips process data more efficiently, which keeps temperatures lower and better protects the chips from damage. The development of HBM over the past 10 years is a remarkable feat of innovation, with no signs of slowing down.

The HBM chip market is still dominated by the same top three players: SK Hynix, Samsung, and Micron Technology, two of which are based outside the United States. This dynamic is no coincidence: for decades, Asian countries have been the preferred destination when American technology companies move production to markets with lower labor costs and a well-educated workforce. While Micron competes with the other two companies, SK Hynix and Samsung are the clear leaders.

From a growth perspective, Samsung represents a good opportunity in the HBM space. Samsung’s ability to maintain market share in the rapidly growing HBM chip market makes it an important part of the next technological leap. In addition, Samsung has high operating margins, a strong balance sheet, and a more diversified business than its industry peers. 2024 could be the year of AI, but the rise of AI is likely to have repercussions well into 2025 and beyond. The impact of major technological developments is rarely limited to the select few companies that lead the change. By considering the direct and indirect beneficiaries of such developments, and by maintaining a long-term investment mindset and valuation discipline, investors should be able to continue finding interesting opportunities that the market overlooks or discounts. According to experts, the HBM chip market will grow by 42% annually until 2033 due to its importance for AI computing.
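To give the cited 42% annual growth figure some scale, compound growth can be projected from a normalized base. A minimal sketch, assuming 2024 as the base year and a hypothetical normalized market size of 1.0 (the article does not state an absolute market size):

```python
# Compound-growth projection for the cited 42% annual rate.
# The base-year size of 1.0 is a hypothetical normalization,
# not a figure from the article.
def project(base: float, annual_growth: float, years: int) -> float:
    """Market size after compounding `annual_growth` for `years` years."""
    return base * (1 + annual_growth) ** years

BASE_2024 = 1.0  # normalized 2024 market size (hypothetical unit)
for year in (2025, 2029, 2033):
    size = project(BASE_2024, annual_growth=0.42, years=year - 2024)
    print(f"{year}: {size:.1f}x the 2024 market")
```

Under this assumption, a 42% rate compounds to roughly a 23-fold market by 2033, which illustrates why all three manufacturers are investing heavily in capacity.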