Amazon.com Inc.’s cloud-computing unit announced updated versions of its in-house computer chips while also forging closer ties with Nvidia Corp., dual efforts designed to ensure it can get enough supplies of crucial data-centre processors.
New homegrown Graviton4 chips will have as much as 30% better performance than their predecessors, Amazon Web Services (AWS) said at its annual re:Invent conference in Las Vegas. Computers using the processors will start coming online in the months ahead.
The company also unveiled Trainium2, an updated version of a processor designed for artificial intelligence (AI) systems. It will begin powering new services next year, Amazon said. The chip provides an alternative to the so-called AI accelerators sold by Nvidia, processors that have been vital to the build-out of AI services.
But Amazon also touted “an expansion of its partnership” with Nvidia, whose chief executive officer, Jensen Huang, joined AWS counterpart Adam Selipsky on stage. AWS will be the first big user of an updated version of that company’s Grace Hopper Superchip, and it will be one of the data centre companies hosting Nvidia’s DGX Cloud service.
The new in-house chips are part of AWS’s push to maintain its lead over Microsoft Corp.’s Azure and Alphabet Inc.’s Google Cloud Platform. Earlier this month, Microsoft announced its own processors and AI accelerators. Like Amazon, it has pointed to the efficiency that they bring compared with buying off-the-shelf parts from traditional sellers such as Intel Corp.
AWS said it has built more than 2 million Graviton processors since beginning the project some five years ago. All of the top 100 users of EC2, AWS’s family of processing power for rent, have opted for Graviton-based computing, the company said.
“By focusing our chip designs on real workloads that matter to customers, we’re able to deliver the most advanced cloud infrastructure to them,” said Dave Brown, a vice president at AWS. “Graviton4 marks the fourth generation we’ve delivered in just five years. And with the surge of interest in generative AI, Trainium2 will help customers train their machine learning models faster, at a lower cost and with better energy efficiency.”
Amazon faces a delicate balancing act. Though it’s keen to tout the technological prowess of its new chips, the company also needs to maintain its relationship with Nvidia. Getting sufficient supply of Nvidia products has become a status symbol in the technology industry, with figures such as Larry Ellison and Elon Musk boasting about their ability to get the chips.
For now, Nvidia’s products are considered the industry’s best. Until Trainium or other alternatives can match their capabilities, companies like Amazon can’t afford to alienate Nvidia. And catching up won’t be easy. Nvidia’s prized H100 chip is getting a more advanced update called H200 in the first half of 2024, followed by a whole new design later in the year.
Nvidia also continues to diversify. With the DGX Cloud service, it’s offering software and services, aiming to spread the use of AI more widely. AWS will also operate a supercomputer based on Nvidia hardware. When not in use for internal research and development, the “supercluster” will be made available to other users as a service.