A man walks past a logo of SK Hynix at the lobby of the company’s Bundang office in Seongnam on January 29, 2021.
Jung Yeon-Je | AFP | Getty Images
SK Hynix, one of the world’s largest memory chipmakers, on Thursday said second-quarter profit hit its highest level in six years as it maintained its leadership in advanced memory chips critical for artificial intelligence computing.
Here are SK Hynix’s second-quarter results compared with LSEG SmartEstimate, which is weighted toward forecasts from analysts who are more consistently accurate:
- Revenue: 16.42 trillion Korean won (about $11.86 billion), vs. 16.4 trillion Korean won
- Operating profit: 5.47 trillion Korean won, vs. 5.4 trillion Korean won
Operating profit in the June quarter hit its highest level since the second quarter of 2018, rebounding from a loss of 2.88 trillion won in the same period a year ago.
Revenue from April to June increased 124.7% from the 7.3 trillion won logged a year earlier, making it the highest quarterly revenue in the firm’s history, according to LSEG data available since 2009.
SK Hynix on Thursday said that a continuous rise in overall prices of its memory products — thanks to strong demand for AI memory including high-bandwidth memory — led to a 32% increase in revenue compared with the previous quarter.
The South Korean giant supplies high-bandwidth memory chips catering to AI chipsets for companies like Nvidia.
Shares of SK Hynix fell as much as 7.81% Thursday morning.
The declines came as South Korea’s Kospi index lost as much as 1.91% after U.S. tech stocks sold off overnight, following disappointing Alphabet and Tesla earnings. Those reports marked investors’ first look at how megacap companies fared during the second quarter.
“In the second half of this year, strong demand from AI servers are expected to continue as well as gradual recovery in conventional markets with the launch of AI-enabled PC and mobile devices,” the firm said in its earnings call on Thursday.
Capitalizing on the strong AI demand, SK Hynix plans to “continue its leadership in the HBM market by mass-producing 12-layer HBM3E products.”
The company said it will begin mass production of the 12-layer HBM3E this quarter after providing samples to major customers, and expects to start shipping to customers in the fourth quarter.
Tight supply
Memory leaders like SK Hynix have been aggressively expanding HBM capacity to meet the booming demand for AI processors.
HBM requires more wafer capacity than regular dynamic random access memory, or DRAM, a type of computer memory used to store data, which SK Hynix said is also facing tight supply.
“Investment needs are also rising to meet demand of conventional DRAM as well as HBM which requires more wafer capacity than regular DRAM. Therefore, this year’s capex level is expected to be higher than what we expected in the beginning of the year,” said SK Hynix.
“While overcapacity is expected to increase next year due to the increased industrial investment, a significant portion of it will be utilized to ramp up production of HBM. So the tight supply situation for conventional DRAM is likely to continue.”
SK Kim of Daiwa Capital Markets said in a June 12 note that the brokerage expects “tight HBM and memory supply to persist until 2025 on a bottleneck in HBM production.”
“Accordingly, we expect a favourable price environment to continue and SK Hynix to record robust earnings in 2024-25, benefitting from its competitiveness in HBM for AI graphics processing unit and high-density enterprise SSD (eSSD) for AI-servers, leading to a rerating of the stock,” Kim said.
High-bandwidth memory chip supplies have been stretched thanks to explosive AI adoption fueled by large language models such as ChatGPT.
The AI boom is expected to keep supply of high-end memory chips tight this year, analysts have warned. SK Hynix and Micron said in May that their high-bandwidth memory chips were sold out for 2024, and that stock for 2025 was also nearly sold out.
Large language models require large amounts of high-performance memory, as such chips allow the models to remember details from past conversations and user preferences in order to generate humanlike responses.
SK Hynix has largely led the high-bandwidth memory chip market, having been the sole supplier of HBM3 chips to Nvidia before rival Samsung reportedly cleared tests for the use of its HBM3 chips in Nvidia processors for the Chinese market.
The firm said it expects to ship its next-generation 12-layer HBM4 chips from the second half of 2025.