Nvidia is not the only firm cashing in on the AI gold rush

A GREY RECTANGULAR building on the outskirts of San Jose houses rows upon rows of blinking machines. Tangles of colourful wires connect high-end servers, networking gear and data-storage systems. Bulky air-conditioning units whirr overhead. The noise forces visitors to shout. The building belongs to Equinix, a company that leases data-centre space. The equipment inside belongs to companies ranging from corporate giants to startups, which are increasingly using it to run their artificial-intelligence (AI) systems.

The AI gold rush, spurred by the astounding sophistication of “generative” systems such as ChatGPT, a hit virtual conversationalist, promises to generate rich profits for those who harness the technology’s potential. As in the early days of any gold rush, though, it is already minting fortunes for the sellers of the requisite picks and shovels.

On May 24th Nvidia, which designs the semiconductors of choice for many AI servers, beat analysts’ revenue and profit forecasts for the three months to April. It expects sales of $11bn in its current quarter, half as much again as Wall Street was predicting. As its share price leapt by 30% the next day, the company’s market value flirted with $1trn. Nvidia’s chief executive, Jensen Huang, declared on May 29th that the world is at “the tipping point of a new computing era”.

Other chip firms, from fellow designers like AMD to manufacturers such as TSMC of Taiwan, have been swept up in the AI excitement. So have providers of other computing infrastructure—which includes everything from those colourful cables, noisy air-conditioning units and data-centre floor space to the software that helps run the AI models and marshal the data. An equally weighted index of 30-odd such companies has risen by 40% since ChatGPT’s launch in November, compared with 13% for the tech-heavy NASDAQ index (see chart). “A new tech stack is emerging,” sums up Daniel Jeffries of the AI Infrastructure Alliance, a lobby group.

On the face of it, the AI gubbins seems far less exciting than the clever “large language models” behind ChatGPT and its fast-expanding array of rivals. But as the model-builders and the makers of applications that piggyback on those models vie for a slice of the future AI pie, they all need computing power in the here and now—and lots of it.

The latest AI systems, including the generative sort, are much more computing-intensive than older ones, let alone non-AI applications. Amin Vahdat, head of AI infrastructure at Google Cloud Platform, the internet giant’s cloud-computing arm, observes that model sizes have grown ten-fold each year for the past six years. GPT-4, the latest version of the model that powers ChatGPT, analyses data using perhaps 1trn parameters, more than five times as many as its predecessor. As the models grow in complexity, the computational needs of training them increase correspondingly.

Once trained, AIs need less number-crunching capacity to be used, in a process called inference. But given the range of applications on offer, inference will, cumulatively, also demand plenty of processing oomph. Microsoft has more than 2,500 customers for a service that uses technology from OpenAI, ChatGPT’s creator, nearly half of which the software giant owns. That is up ten-fold since the previous quarter. Google’s parent company, Alphabet, has six products with 2bn or more users globally—and plans to turbocharge them with generative AI.

The most obvious winners from surging demand for computing power are the chipmakers. Companies like Nvidia and AMD get a licence fee every time their blueprints are etched onto silicon by manufacturers such as TSMC on behalf of end-customers, notably the big providers of the cloud computing that powers most AI applications. AI is thus a boon to the chip designers, since it benefits from more powerful chips (which tend to generate higher margins), and more of them. UBS, a bank, reckons that in the next one or two years AI will increase demand for specialist chips known as graphics-processing units (GPUs) by $10bn-15bn. As a result, Nvidia’s annual data-centre revenue, which accounts for 56% of its sales, could double. AMD is bringing out a new GPU later this year. Although it is a much smaller player in the GPU-design game than Nvidia, the scale of the AI boom...