Chip makers are abuzz about the latest hot thing in tech: artificial-intelligence tools that generate text with minimal prompting, require massive computing power to run and promise a lucrative new revenue stream.
For semiconductor makers, the new tools, if widely adopted, could result in tens of billions of dollars in net annual sales, analysts estimate.
Excitement over so-called generative AI has reached fever pitch since the release late last year of San Francisco-based OpenAI’s chatbot, called ChatGPT. The technology has captivated users by producing convincingly real, if sometimes inaccurate, responses, helping it attract billions of dollars from Microsoft Corp. and other investors.
Jensen Huang, the chief executive officer of Nvidia Corp., America’s largest chip company by market value, said the technology has reached an inflection point. “Generative AI’s versatility and capability has triggered a sense of urgency at enterprises around the world to develop and deploy AI strategies,” he said as the company posted quarterly earnings Wednesday and unveiled a new cloud-computing initiative to capitalize on the business opportunity.
Nvidia shares were up more than 12% in early Thursday trading.
The interest in such AI tools is causing companies to quickly reset their business expectations, he said. “There’s no question that whatever our views are of this year as we enter the year has been fairly dramatically changed as a result of the last 60, 90 days.”
The excitement comes as the chip industry is wrestling with a sharp downturn, with sales of personal computers, smartphones and other electronics flagging. Most chip makers have reported slowing sales as recession concerns have caused consumers and businesses to pull back on spending.
Nvidia is the undisputed market leader in chips used for AI in the unglamorous world of data centers where tools such as ChatGPT make computations and spit out results. It had about an 80% share of such AI processors as of 2020, according to an Omdia estimate.
With so much money up for grabs, though, other chip makers want in on the action.
Intel Corp. CEO Pat Gelsinger said Wednesday that his company had a broad suite of chips to address the generative-AI opportunity, including specialist chips geared toward AI computation, graphics chips for data centers and a new generation of data-center central processing units—the digital brains of computers—that he said performed well in AI work.
“That performance we expect will become much more of the mainstream of computing as AI gets infused into every application going forward,” he said.
Advanced Micro Devices Inc., which makes CPUs, graphics chips and other hardware tailored for AI, is also betting large cloud-computing companies that run many of the computations essential to the technology will be investing heavily in chips. That business should start to become more meaningful next year, AMD CEO Lisa Su said late last month.
Generative AI could add $20 billion a year to the overall AI chip market by 2027, according to Vivek Arya, an analyst at Bank of America. Nvidia, he said, should be able to maintain at least a 65% market share in AI chips.
Internet-search giant Google, a unit of Alphabet Inc., this month offered a glimpse of a homegrown rival to ChatGPT that it calls Bard. China’s Baidu Inc. is developing an AI-powered chatbot similar to ChatGPT called Ernie Bot, which it plans to launch next month. Microsoft is already giving users a limited taste of ChatGPT within its Bing search engine results.
In the near term, at least, Nvidia’s dominance in AI may position it best to cash in. The company gained its lead by allowing software developers to exploit properties of its graphics chips that proved adept at AI starting about 15 years ago. Now, the company’s chips are the only viable products that can be used to create massive AI language systems, UBS analysts said in a note, adding that they estimate that ChatGPT requires around 10,000 of the company’s graphics chips to train.
Mr. Huang suggested the company may update its outlook next month for the size of its potential market, after projecting roughly a year ago a $1 trillion opportunity for its business, which spans chips for videogaming to cars.
“Because of the incredible capabilities and versatility of generative AI and all of the convergence breakthroughs that happened toward the middle and end of last year, we’re probably going to arrive at that [market size] sooner than later,” he said. “There’s no question that this is a very big moment for the computer industry.”
Nvidia is trying to get there faster by starting to offer a cloud-computing service for businesses to develop generative AI chatbots and other tools using its hardware and software. The service, which would be offered through established cloud-computing companies, aims to lower barriers to entry for the spread of AI’s use in business.
Nvidia said it is working with all the major cloud-computing providers, which include Amazon.com Inc., Microsoft and Google, on generative AI tools, as well as with consumer internet companies and startups.
Write to Asa Fitch at asa.fitch@wsj.com
Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.