Nvidia CEO Jensen Huang said he believes that generative AI and accelerated computing will “redefine the future” during a keynote at the annual Computex event in Taiwan.
“Today, we’re at the cusp of a major shift in computing,” Huang said. “Generative AI is reshaping industries and opening new opportunities for innovation and growth.”
Nvidia has positioned itself as a leader in the AI space. Its accelerated computing hardware, including its GPUs, is sought after by businesses looking to power and scale new generative AI applications and services.
Huang said AI is redefining accelerated computing across consumer-facing AI PCs and enterprise-level computing platforms in data centers.
“The future of computing is accelerated,” Huang said. “With our innovations in AI and accelerated computing, we’re pushing the boundaries of what’s possible and driving the next wave of technological advancement.”
Following the unveiling of the company’s next-gen Blackwell GPUs earlier this year, Huang used his keynote to showcase their eventual successor: Rubin.
Huang showcased a road map with an updated Blackwell arriving in 2025, Rubin in 2026 and an updated Rubin in 2027.
The Nvidia CEO said the company has adopted a “one-year rhythm,” enabling businesses to deploy routinely updated hardware whose power and performance scale as AI workloads increase.
“Our basic philosophy is very simple: build the entire data center scale, disaggregate and sell to you parts on a one-year rhythm and push everything to technology limits,” Huang said.
Huang said the updated hardware will cut costs for businesses running AI applications, as the new GPUs use less power and deliver up to 100 times the performance.
He said the energy required to run OpenAI’s GPT-4 on Blackwell has been cut by a factor of 350.
The performance gains of Nvidia’s hardware outpace the rate predicted by Moore’s Law, which holds that the number of transistors on an integrated circuit doubles about every two years.
Nvidia has seen a 1,000x increase in AI compute in just eight years, Huang said, going from 19 TFLOPS on its Pascal GPUs in 2016 to 20,000 TFLOPS on the new Blackwell GPUs.
“Whenever we bring the computation high, the cost goes down,” Huang said. “Accelerated computing is sustainable computing.”
Physical AI, Advancements in Robotics
Huang said the next wave of AI will be physical, highlighting advances in robotics.
“AI that understands the laws of physics, AI that can work among us,” Huang said. “Robotics is here. Physical AI is here. This is not science fiction and it’s being used all over Taiwan. It’s just really, really exciting.”
Nvidia already caters to robotics applications through its suites of pretrained models for robot training.
The company announced at its GTC event earlier this year that it’s expanding into the humanoid robot space with its GR00T humanoid robot platform, which enables robots to understand natural language and mimic human movements.
Huang said in his keynote that all future factories will be robotic.
“The factories will orchestrate robots and those robots will be building products that are robotic,” he said.
This article first appeared in IoT World Today's sister publication AI Business.