Matthew Hull, Vice President of Global AI Platform Solutions at NVIDIA, spoke to master's students from the UW Foster School of Business on March 11 as part of UW's AI Task Force Town Hall. The town hall was hosted by Emer Dooley, an affiliate instructor of management and organization at the UW Foster School of Business.
Hull, who previously worked for Seagate and HP, began by tracing the evolution of computer hardware. Earlier in his career, Hull realized that most of his work dealt with hardware for data storage. That path culminated in his work at NVIDIA, which develops the high-powered graphics processing units (GPUs) widely adopted in AI for their ability to quickly run the many parallel computations required to train large AI models.
“We happen to build chips, but that’s not our ultimate goal,” Hull said. “It’s about building a complete solution.”
Hull highlighted how building successful AI models relies on more than hardware; it also requires software, along with the data and data storage that feed those models. Hull also noted that AI is very computationally expensive, and that OpenAI and xAI models are trained on more than 100,000 GPUs.
The cost of this training also shows up in power consumption. Hull explained that chips become more efficient with each generation, but the amount of raw computing power required for AI models still outpaces these optimizations.
“As time goes by, inference will become exponentially broader than training,” Hull said.
Hull went on to explain that although AI relies heavily on pretraining, the crucial task a model faces is inference: predicting outcomes on inputs outside the training data the model has access to.
Beyond reasoning, AI engineers have recently been working on adding agency to their models. Hull illustrated the idea with the example of agentic inference: training an AI model to reason through a problem in steps and solve those sub-problems with other AI models.
Hull acknowledged the significant trade-off between the benefits of using data to get better results from AI models and the potential risks of that data appearing in published models.
“We don’t want to over-tune our AI out of fear,” Hull said. “But we don’t want to underregulate it, so these regulatory guardrails have to come over time.”
On the regulation of AI, particularly autonomous driving, Hull drew a comparison to technologies like nuclear fission, noting that overregulation stunted the growth of the nuclear industry.
Hull then moved to a discussion of the impact on students. Specifically, both Hull and Dooley agreed that AI is a tool, and that humans should remain part of the decision-making process.
“I think humanity can focus on higher values than what we do today,” Hull said. “AI will replace many everyday tasks, which will lead to a higher quality of life for everyone.”
More information about NVIDIA’s research on AI agents can be found here.
Reach reporter Jack Li at news@dailyuw.com. X: @jackli789
Like what you’re reading? By donating here, you can support high-quality student journalism.