
Of myths and monsters —

Nvidia introduces the H200, an AI-crunching monster GPU that may speed up ChatGPT

The H200 will likely power the next generation of AI chatbots and art generators.

Benj Edwards - 11/13/2023, 8:44 PM

Eight Nvidia H200 GPUs covered with a fanciful blue explosion that figuratively
represents raw compute power bursting forth in a glowing flurry.
Nvidia | Benj Edwards


On Monday, Nvidia announced the HGX H200 Tensor Core GPU, which utilizes the
Hopper architecture to accelerate AI applications. It's a follow-up to the H100
GPU, released last year and previously Nvidia's most powerful AI GPU chip. If
widely deployed, it could lead to far more powerful AI models—and faster
response times for existing ones like ChatGPT—in the near future.

According to experts, lack of computing power (often called "compute") has been
a major bottleneck of AI progress this past year, hindering deployments of
existing AI models and slowing the development of new ones. Shortages of
powerful GPUs that accelerate AI models are largely to blame. One way to
alleviate the compute bottleneck is to make more chips, but you can also make AI
chips more powerful. That second approach may make the H200 an attractive
product for cloud providers.


Further Reading

Nvidia’s powerful H100 GPU will ship in October

What's the H200 good for? Despite the "G" in the "GPU" name, data center GPUs
like this typically aren't for graphics. GPUs are ideal for AI applications
because they perform vast numbers of parallel matrix multiplications, which are
necessary for neural networks to function. They are essential in the training
portion of building an AI model and the "inference" portion, where people feed
inputs into an AI model and it returns results.
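
For a concrete sense of what those parallel matrix multiplications look like, here is a minimal, hypothetical sketch in NumPy (illustrative only, with made-up layer sizes, and not code from Nvidia or any real model): a toy two-layer forward pass whose arithmetic is dominated by two matmul calls.

import numpy as np

# Toy two-layer neural network forward pass (inference), with made-up sizes.
batch, d_in, d_hidden, d_out = 32, 1024, 4096, 1024

rng = np.random.default_rng(0)
x = rng.standard_normal((batch, d_in), dtype=np.float32)       # input activations
W1 = rng.standard_normal((d_in, d_hidden), dtype=np.float32)   # layer 1 weights
W2 = rng.standard_normal((d_hidden, d_out), dtype=np.float32)  # layer 2 weights

h = np.maximum(x @ W1, 0.0)   # matrix multiply + ReLU nonlinearity
y = h @ W2                    # another matrix multiply

# Nearly all of the arithmetic above happens in the two `@` (matmul) calls;
# that is the parallel work GPUs like the H200 are built to accelerate.
print(y.shape)  # (32, 1024)

Training adds further matrix multiplications for the backward pass and repeats this over enormous datasets, which is why both phases are so hungry for GPU throughput.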

"To create intelligence with generative AI and HPC applications, vast amounts of
data must be efficiently processed at high speed using large, fast GPU memory,"
said Ian Buck, vice president of hyperscale and HPC at Nvidia, in a news release.
"With Nvidia H200, the industry’s leading end-to-end AI supercomputing platform
just got faster to solve some of the world’s most important challenges."

For example, OpenAI has repeatedly said it's low on GPU resources, and that
causes slowdowns with ChatGPT. The company must rely on rate limiting to provide
any service at all. Hypothetically, using the H200 might give the existing AI
language models that run ChatGPT more breathing room to serve more customers.
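
The article doesn't describe how OpenAI throttles traffic, but rate limiting in general just means capping how many requests each client can make per time window. A minimal, purely illustrative token-bucket sketch (not OpenAI's implementation) might look like:

import time

class TokenBucket:
    """Hypothetical per-user rate limiter: allow `rate` requests/second, bursts up to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, then spend one if available.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # request rejected; the client must retry later

bucket = TokenBucket(rate=3, capacity=5)  # e.g., 3 requests/second, bursts of 5
print([bucket.allow() for _ in range(7)])  # the first 5 pass, the rest are throttled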

4.8 terabytes/second of bandwidth

Eight Nvidia H200 GPU chips on an HGX carrier board.
Nvidia

According to Nvidia, the H200 is the first GPU to offer HBM3e memory. Thanks to
HBM3e, the H200 offers 141GB of memory and 4.8 terabytes per second of bandwidth,
which Nvidia says is 2.4 times the memory bandwidth of the Nvidia A100 released
in 2020. (Despite the A100's age, it's still in high demand due to shortages of
more powerful chips.)
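
As a quick back-of-the-envelope check on those figures (my arithmetic, not an Nvidia spec sheet): dividing 4.8 TB/s by the claimed 2.4x factor implies roughly 2 TB/s for the A100, which lines up with the published memory bandwidth of the 80GB A100.

# Back-of-the-envelope check of the bandwidth claim (simple arithmetic, not a spec sheet).
h200_bandwidth_tb_s = 4.8      # H200 memory bandwidth with HBM3e, per Nvidia
claimed_ratio_vs_a100 = 2.4    # "2.4 times the memory bandwidth of the Nvidia A100"
implied_a100_tb_s = h200_bandwidth_tb_s / claimed_ratio_vs_a100
print(f"Implied A100 bandwidth: {implied_a100_tb_s:.1f} TB/s")  # ~2.0 TB/s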

Nvidia will make the H200 available in several form factors. This includes
Nvidia HGX H200 server boards in four- and eight-way configurations, compatible
with both the hardware and software of HGX H100 systems. It will also be available
in the Nvidia GH200 Grace Hopper Superchip, which combines a CPU and GPU into
one package for even more AI oomph (that's a technical term).


Further Reading

Nvidia’s new monster CPU+GPU chip may power the next gen of AI chatbots

Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud
Infrastructure will be the first cloud service providers to deploy H200-based
instances starting next year, and Nvidia says the H200 will be available "from
global system manufacturers and cloud service providers" starting in Q2 2024.

Meanwhile, Nvidia has been playing a cat-and-mouse game with the US government
over export restrictions that limit sales of its powerful GPUs to China. Last
year, the US Department of Commerce announced restrictions intended to "keep
advanced technologies out of the wrong hands" like China and Russia. Nvidia
responded by creating new chips to get around those barriers, but the US
recently banned those, too.

Last week, Reuters reported that Nvidia is at it again, introducing three new
scaled-back AI chips (the HGX H20, L20 PCIe, and L2 PCIe) for the Chinese
market, which represents a quarter of Nvidia's data center chip revenue. Two of
the chips fall below US restrictions, and a third is in a "gray zone" that might
be permissible with a license. Expect to see more back-and-forth moves between
the US and Nvidia in the months ahead.



Benj Edwards is an AI and Machine Learning Reporter for Ars Technica. In his
free time, he writes and records music, collects vintage computers, and enjoys
nature. He lives in Raleigh, NC.
