How a ‘zombie’ chipmaker became Nvidia’s vital AI ally

Many internal opponents were horrified by the idea of taking over Hynix. “There was an outcry within SK Telecom,” says Lee In-sook, director of consulting firm Platform 9¾ and co-author of Super Momentum, a book on Hynix published last month. “They said, ‘If we take over that company, we will fail. We will only pour money into it and then we will go under too’.”

Just two months after the deal went through, SK chair Chey was charged with embezzlement of group funds. He would end up serving more than two years in prison before being given a presidential pardon.

Chey, however, made one wise decision: the appointment of longtime Hynix engineer Park Sung-wook as chief executive. Hynix employees trusted Park, and his technical background meant “you didn’t have to explain anything and you couldn’t lie to him,” Lee said in an interview.

While many employees had fled Hynix during its years of uncertainty, those who remained had developed a do-or-die, underdog mentality. At company drinking sessions, according to Lee, staff would shout “Go man go, is man is!” — a ‘Konglish’ expression essentially meaning: “If you’re going, go; if you’re staying, stay.”

Park, who stepped down as chief executive in 2019, encouraged his team to prioritise long-term research over short-term financial performance. They focused on HBM technology at a time when few others believed in it. Its first use was in a prohibitively expensive graphics card aimed at gamers. Hynix barrelled ahead anyway, increasing R&D spending an average of 14 per cent a year between 2010 and 2024.

For years, HBM seemed like a solution looking for a problem, but then came the AI boom. “AI workloads require high volumes of memory … it wasn’t until the launch of ChatGPT spurred an explosion in demand for AI servers that Hynix’s multiyear bet finally paid off,” says Chris Miller, author of Chip War.

According to an HSBC research note, the HBM market grew from US$1 billion in 2022 to US$16 billion in 2024, and by 2027 it should reach US$87 billion. Hynix’s revenue has increased from Won44.6 trillion to Won97.1 trillion over the past three years.

According to Prof Kwon, the memory crunch is likely to continue until the fourth quarter of 2027. “Forward capacity is being locked up well in advance … everybody agrees memory is a chokepoint now,” he said.

While Hynix’s position in HBM production is the result of what Kwon calls “a multi-decade accumulation of knowhow”, there are plenty of potential threats, including arch-rival Samsung, which was slower to bring HBM to the AI market but has since developed formidable capability of its own.

Ray Wang, an analyst at SemiAnalysis, said Samsung’s improvement meant SK Hynix could face a “more competitive environment” for the next generation of HBM.

Kwon believes the power of core customers such as Nvidia is a greater concern in the long run. Nvidia has a market share of 85 per cent in the graphics processing units (GPUs) that use HBM to power AI. With Hynix, Samsung and Micron vying to supply HBM, Nvidia may be able to insist on ever higher standards and customised chips, says Kwon.

Chinese competitors are also making inroads at the cheaper end of the market: Hefei-based CXMT has recently claimed a 5 per cent market share in Dram.

SK Hynix also risks becoming caught up in geopolitical tension around semiconductor supplies, including U.S. President Donald Trump’s desire for far more chips to be built within the U.S. The company has navigated big-power rivalry, maintaining a facility in Wuxi in China where it produces about 40 per cent of its Dram products, while recently announcing investment of US$10 billion in a U.S.-based subsidiary that will focus on data centres and infrastructure for AI.
