AI Chip Sector Projected to Reach $138bn by 2028 – New Report
New research says the market for AI chips will grow to $138bn by 2028, thanks to rising demand in datacentres.
A recent report from research firm Futurum Group predicts that the global market for datacentre processors and accelerators for AI applications will grow at a compound annual growth rate (CAGR) of 30%, rising from $38bn in 2023 to $138bn in 2028. The forecast reflects the surge in demand for AI chips that has driven Nvidia’s share price sharply higher over the past year.
That is good news for Nvidia, currently the market leader in AI chips, which cannot keep up with demand for its graphics processing units (GPUs). In 2023, Nvidia accounted for 92% of the $28bn in GPU sales for AI workloads in datacentres, a segment representing 74% of the total market value. Better still for the company, that segment is projected to grow at 30% a year over the next five years, reaching $102bn in 2028.
The Futurum team points out that it is not just about GPUs, though. GPU sales made up 74% of the $38bn datacentre AI chip market in 2023, while CPU sales accounted for a further 20%, or $7.7bn. CPUs will remain important: that segment is forecast to be worth $26bn in 2028.
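For readers who want to check the arithmetic, the short sketch below recomputes the compound annual growth rates implied by the report’s headline figures. It is purely illustrative: only the dollar figures come from the report, and the 30% CAGR it quotes is a rounded value.

# Minimal sanity check of the report's growth arithmetic (illustrative only;
# the dollar figures are from the report, the CAGR formula is standard).

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Report figures, in $bn, for 2023 -> 2028 (five years of growth)
segments = {
    "Total datacentre AI chips": (38.0, 138.0),
    "GPUs":                      (28.0, 102.0),
    "CPUs":                      (7.7, 26.0),
}

for name, (v2023, v2028) in segments.items():
    rate = cagr(v2023, v2028, years=5)
    print(f"{name}: ${v2023}bn -> ${v2028}bn, implied CAGR ~ {rate:.1%}")

# Prints roughly 29-30% for the total and GPU segments and about 28% for CPUs,
# consistent with the report's rounded 30% headline figure.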
And even though Nvidia leads the field right now, it is not alone. “With the rise of AI and its supporting semiconductor solutions, we are witnessing the most profound technological revolution,” said Daniel Newman, CEO of the Futurum Group. “Companies like Nvidia, AMD and Arm are seeing big revenue growth as AI technology improves. But the market is expected to become more competitive as new companies and startups try to get a piece of the action and keep innovation going.”
Key Players and Market Segmentation in AI Chips
The report’s full title is ‘AI Processors and Accelerators Used in Data Centre Market for AI Applications: 1H 2024, Global Market Sizing and Forecast Report, 2023–2028’. It examines the roles of 18 chip developers, including Intel and Qualcomm, and, alongside GPUs and CPUs, it provides market values for custom cloud accelerators and dedicated accelerators (XPUs), which together account for 3% of the market value.
Hyperscale cloud services companies such as Amazon Web Services (AWS), Google, Microsoft and Oracle are major buyers of processors and accelerators for AI applications in the datacentre market. Together they accounted for 43% of the market’s value in 2023, and the Futurum team expects that share to rise to 50% by 2028. (The team notes that processors and accelerators built solely for AI applications used by Apple, Meta, Tesla and other companies were excluded from the study because they are used only for processing in private datacentres.)
The Futurum team also notes that separate research puts the market value of the AI chip sector at more than $5tn, or about 30% of the weighting of the S&P 500 Index, a market value-weighted measure of 500 of the largest publicly traded companies in the US. The index has a strong and well-deserved reputation as one of the best gauges of the performance not only of big US companies but of the stock market as a whole, even though it is not an exact mirror of the top 500 companies.
Elements of AI have been in development for a long time, but the sudden arrival of ChatGPT in late 2022 sparked enormous excitement and set the AI hype machine in motion. Much of the news coverage that followed was overblown, but the adoption of various AI capabilities is now a genuine trend that is already fundamentally changing fields such as datacentres, drug and chemical discovery, education, finance, healthcare and manufacturing. Cloud companies are supporting these changes by offering AI processing services and tools, either standalone or as components of their own products.
The Importance of Datacentres in AI Development
Datacentres are central to handling the huge volumes of data that must be processed, to pattern recognition, and to training neural networks. Datacentre managers are under pressure to keep pace with AI developments while also trying hard to reduce energy consumption; in many cases, the ever-growing energy demands of AI are pushing facilities close to their limits.
Because the global datacentre sector is so important and attracts so much investment, AI chip makers are finding new ways to improve their processors. One approach is the AI accelerator: hardware designed specifically to speed up the very complex calculations behind AI algorithms. Sometimes called neural processing units (NPUs) or deep learning processors, accelerators are suited to data-driven parallel computing, which makes them very good at processing multimedia data (such as images and video) and the data that flows through neural networks. They excel at AI-related tasks such as speech recognition, blurring the background in video calls, and photo or video editing features such as object detection.
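As a rough illustration of how such offloading typically looks in practice (a minimal sketch assuming PyTorch as the framework; nothing here comes from the report), the code below dispatches a data-parallel workload to an accelerator when one is available and falls back to the CPU otherwise.

# Illustrative sketch of offloading data-parallel work to an AI accelerator.
# Assumes PyTorch is installed; the workload and sizes are made up for the example.
import torch

# Pick an accelerator if one is present, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A batch of 256 "images" flattened to feature vectors, and one layer of weights.
inputs = torch.randn(256, 1024, device=device)
weights = torch.randn(1024, 512, device=device)

# A single batched matrix multiply: exactly the kind of highly parallel,
# data-driven arithmetic that GPUs and NPUs are built to run quickly.
outputs = torch.relu(inputs @ weights)

print(f"Ran on {device}, output shape {tuple(outputs.shape)}")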
AI “inference”, on the other hand, relies on “inference engines” that apply logical rules to a knowledge base to evaluate and examine new data. In machine learning (ML), information is recorded, stored and labelled so that such systems can be “trained”. Once a machine has been trained sufficiently, the learnt knowledge is used to interpret new data and draw conclusions from it without any human interaction or input.
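To make that train-then-infer split concrete, the minimal sketch below trains a classifier on labelled examples and then runs inference on unseen data. The synthetic dataset and the scikit-learn model are assumptions for illustration only, not anything described in the report.

# Minimal train-then-infer sketch (illustrative only; the data is synthetic).
import numpy as np
from sklearn.linear_model import LogisticRegression

# "Training": labelled examples are recorded and stored up front.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4))              # 200 examples, 4 features
y_train = (X_train[:, 0] + X_train[:, 1] > 0)    # labels derived from the features

model = LogisticRegression()
model.fit(X_train, y_train)                      # the learning step

# "Inference": previously unseen data is classified with no human input.
X_new = rng.normal(size=(5, 4))
predictions = model.predict(X_new)
print(predictions)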
Current and Future Use Cases of AI Chips
According to the Futurum study, the leading current use case for AI chipsets is visual and audio analytics (24%), followed by simulation and modelling (21%). Object identification, tracking and monitoring is the fastest-growing use case.
The study covers 17 industry verticals. Manufacturing, industrial, and media and entertainment each hold an 11% market share. Over the next five years, the study says, IT and telecoms will grow at the fastest rate (about 40%), followed by financial services and insurance (39%) and healthcare (37%). By 2028, the top three use cases are expected to be visual and audio analytics, simulation and modelling, and text generation, analysis and summarisation.