March 7, 2023
SEOUL – As the latest frenzy over ChatGPT, OpenAI’s artificial intelligence chatbot, has prompted global tech giants to jump into commercializing their own large-scale language models and services, anticipation is growing in the semiconductor industry that the new tech market could create lucrative business opportunities and help them clear out mounting inventories.
Reflecting the fever for ChatGPT, shares of Nvidia, whose graphics processors are in high demand for high-capacity AI services, have surged more than 50 percent this year alone.
Because such services run on highly advanced processors, industry watchers believe demand for chipsets and their components — such as GPUs and CPUs — will grow exponentially if the generative AI market takes hold.
The AI market will also be a lucrative target for memory chipmakers, as the need for high-capacity memory chips will follow. Generative AI creates all kinds of data — not only texts but also images, videos and codes.
“The increase of natural language-based, conversation-style AI services will have a positive impact on future memory demand,” Samsung Electronics Executive Vice President Kim Jae-june said during the company’s earnings call for the fourth quarter of 2022 held in January this year.
“That is why we are planning to actively capture the increase in demand related to AI services by developing high-performance, high-density memory products,” Kim said.
SK hynix Vice Chairman Park Jung-ho also said that proliferating AI chatbots will become “killer applications” that create large demand for advanced memory chips as the technology develops.
The rise of the AI market, however, does not promise immediate relief for inventory, industry officials here say, as South Korea — home to the world’s top memory chipmakers, Samsung and SK — is witnessing its largest stockpile of chips in 26 years.
“It is only once the AI industry is more settled that we expect more demand will be created for highly advanced data centers, and also for GPUs and CPUs, which are the brains enabling the operation of the AI,” said an industry official who asked not to be named. Advanced GPUs and CPUs are used to train AI and deep learning models by running many computations in parallel.
“It would still be an opportunity for chipmakers producing those processors, allowing them to restructure and add a new customer base when the market is fully established. For memory chipmakers, it would only be after that that they would supply their advanced high-capacity memory products to support the chipsets produced by companies like Nvidia, AMD and Qualcomm,” he added.
With tech giants such as Microsoft, Google, Meta and Naver all announcing plans to build or already introducing their generative AI services and programs, AI appears to hold great potential to lead the future tech world.
Samsung Electronics, the world’s top memory chipmaker, has also joined hands with Naver to jointly develop AI chip solutions, signing an agreement late last year.
But the market is still at a very nascent stage, so it is too early to conclude whether generative AI can become a lucrative source of revenue for chipmakers, another industry official said.
“ChatGPT’s introduction has proven the technology can now be commercialized, but even ChatGPT has not yet shown how the service will be used to make a profit,” the official said.
“The technology is already there. The point is how ChatGPT will be actively used. For memory chipmakers, it is just a matter of whether the market creates concrete demand, and then we can supply the advanced chips to meet the demand.”
As the world’s No. 2 foundry player, Samsung could also become a beneficiary of AI’s growth by potentially snapping up orders from chip designers, but this idea too is only an assumption, the official added.
The memory chip industry expects demand to emerge for high-performance High Bandwidth Memory, which feeds data to processors, as well as for high-density 128-gigabyte server DRAM.
AI-based models require high-performance processors capable of large-scale computation, and high-performance, high-density memory chips work in tandem with them to enable the technology, Kim of Samsung said during the earnings call.
“Most specifically, we can expect long-term demand growth for high-performance HBM, which provides data directly to CPUs and AI accelerators, as well as for high-density server DRAM such as 128-gigabyte products,” the Samsung executive said.
In the meantime, the growth of the AI market is expected to do little to reduce the mounting inventory of “lower-tier” memory chips, except for some types of DRAM, sources said.
According to industry estimates, Samsung’s inventory assets stood at an all-time high of 52.2 trillion won ($40.3 billion) as of the fourth quarter of last year, up 26 percent from the same period a year earlier. SK’s inventory assets also jumped 75 percent on-year to some 15.6 trillion won over the cited period.