AMD reveals next-generation AI chips with OpenAI CEO Sam Altman

Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 series, which will ship next year.

The MI400 chips can be assembled into a full server rack called Helios, AMD said, which will allow thousands of the chips to be tied together so they can be used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared onstage with Su and said his company would use the AMD chips.

"When you first started telling me about the specs, I was like, there's no way, that just sounds totally crazy," Altman said. "It's gonna be an amazing thing."

AMD's rack-scale setup will make the chips look to a user like one system, which is important for most artificial intelligence customers such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and use massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," said Su, comparing it against Nvidia's Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit at the Grand Palais in Paris, on February 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also allows its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for developing and deploying AI applications.

OpenAI, a notable Nvidia customer, has been giving AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips, AMD is planning to compete against Nvidia on price, with a company executive telling reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the market for data center GPUs, partly because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said that AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia using its "proprietary" CUDA software.

"It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress," Su said.

AMD shares are flat so far in 2025, signaling that Wall Street doesn't yet see the company as a serious threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less to operate and less to acquire.

"Across the board, there is a meaningful cost of acquisition delta that we then layer our performance competitive advantage on top of, so significant double-digit percentage savings," Dieckmann said.

Over the next few years, big cloud companies and countries alike are poised to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the total market for AI chips to exceed $500 billion by 2028, although it hasn't said how much of that market it could claim. Nvidia currently has over 90% of the market, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, rather than every other year, underscoring how fierce the competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase earlier this year of ZT Systems, a server maker that developed the technology AMD needed to build its rack-sized systems.

"These AI systems are getting super complicated, and full-stack solutions are really critical," Su said.

What AMD is selling now

Currently, the most advanced AMD AI chip being installed by cloud providers is its Instinct MI355X, which the company said began shipping in production last month. AMD said it would be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for "inference," or the computing power needed for actually deploying a chatbot or generative AI application, which can use much more processing power than traditional server applications.

"What has really changed is the demand for inference has grown significantly," Su said.

AMD officials said Thursday that they believe their new chips are superior to Nvidia's for inference. That's because AMD's chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the amount of computing power as its predecessor, AMD said. Those chips will be able to compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said that its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

Oracle plans to offer clusters with over 131,000 MI355X chips to its customers, AMD said.

Officials from Meta said Thursday that they were using clusters of AMD's CPUs and GPUs to run inference for its Llama model, and that the company plans to buy AMD's next-generation servers.

A Microsoft representative said the company uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost. It doesn't sell chips by themselves, and end users usually buy them through a hardware company such as Dell or Super Micro Computer. But the company is planning for the MI400 chips to compete on price.

The Santa Clara company is pairing its GPUs with its own CPUs and with networking chips from its 2022 acquisition of Pensando to build its Helios racks. That means greater adoption of its AI chips should also benefit the rest of AMD's business. It's also using an open-source networking technology called UALink to closely integrate its rack systems, as opposed to Nvidia's proprietary NVLink.

AMD claims its MI355X can deliver 40% more tokens, a measure of AI output, per dollar than Nvidia's chips because its chips use less power than its rival's.

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies usually buy them in large quantities.

AMD's AI chip business is still much smaller than Nvidia's. It said it had $5 billion in AI sales in its fiscal 2024, but JPMorgan analysts expect 60% growth in the category this year.

WATCH: AMD CEO Lisa Su: Chip export controls are a headwind, but we still see growth opportunity
