NVIDIA introduced a new generation of artificial intelligence (AI) chips and software at its developer conference in San Jose on Monday, aiming to cement its position as the preferred supplier for AI companies.
NVIDIA's Powerful H100 Chips
NVIDIA's H100 chips, essential for training large language models like ChatGPT, propelled it to become one of the world's most valuable companies in less than two years. On Monday, NVIDIA unveiled Blackwell, a next-gen platform with chips up to 30 times faster and 25 times more energy-efficient than the H100.
Blackwell: Engine to Power Industrial Revolution
NVIDIA CEO Jensen Huang declared at the company's annual GTC event in San Jose, attended by thousands of developers, that "Blackwell GPUs are the engine driving this new Industrial Revolution." In the press release, Huang added that generative AI is the defining technology of our time and that, by collaborating with the most innovative companies globally, "we will realize the promise of AI for every industry."
NVIDIA's Blackwell chips are named after mathematician David Harold Blackwell, renowned for his work in game theory and statistics. Claiming the title of the world's most powerful chip, Blackwell offers a significant performance boost for AI companies, delivering 20 petaflops of AI compute compared to the H100's four petaflops. This leap is primarily attributed to Blackwell's 208 billion transistors, up from the H100's 80 billion. To achieve this, NVIDIA joined two large chip dies interconnected at speeds of up to 10 terabytes per second.
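For readers keeping score, here is a minimal back-of-the-envelope sketch using only the figures quoted above; note that the "30 times faster" claim in the announcement appears to refer to large-language-model inference throughput rather than these raw petaflop numbers, so the ratios below are simply the arithmetic implied by the quoted specs, not an official NVIDIA benchmark.

```python
# Rough comparison of the Blackwell vs. H100 figures quoted in this article.
# These are the article's numbers, not independent measurements.

h100_petaflops = 4            # H100 AI compute, as quoted above
blackwell_petaflops = 20      # Blackwell AI compute, as quoted above
h100_transistors = 80e9       # 80 billion transistors
blackwell_transistors = 208e9 # 208 billion transistors

print(f"Raw compute ratio: {blackwell_petaflops / h100_petaflops:.0f}x")       # 5x
print(f"Transistor ratio:  {blackwell_transistors / h100_transistors:.1f}x")   # 2.6x
```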
Other Tech CEOs' Confident Endorsements of NVIDIA's Chips
Demonstrating the extent of the modern AI revolution's reliance on NVIDIA's chips, the company's press release features endorsements from eight CEOs overseeing companies with a combined worth of trillions of dollars. These leaders include Sam Altman from OpenAI, Satya Nadella from Microsoft, Sundar Pichai from Alphabet, Mark Zuckerberg from Meta, Demis Hassabis from Google DeepMind, Larry Ellison from Oracle, Michael Dell from Dell, and Elon Musk from Tesla.
In the statement, Musk emphasizes that NVIDIA hardware is the top choice for AI, and Altman expresses enthusiasm for Blackwell's significant performance improvements, indicating a desire to continue collaborating with NVIDIA to advance AI computing.
NVIDIA has not revealed the pricing for Blackwell chips. As reported by CNBC, the H100 chips are currently priced between $25,000 and $40,000 per chip, with entire systems utilizing these chips reaching up to $200,000 in cost.
Despite their high price tags, NVIDIA's chips remain in strong demand. Over the past year, delivery delays stretched as long as 11 months, and access to NVIDIA's AI chips has become a status symbol for tech firms aiming to lure AI talent. Earlier this year, Zuckerberg highlighted Meta's initiatives to build extensive infrastructure to support its AI efforts, saying the company expects to have approximately 350,000 NVIDIA H100s by the end of the year, or about 600,000 H100 equivalents of compute once other GPUs are factored in.