8:48 Webinar

NVIDIA and Pure Storage: Unlocking enterprise AI and building a proven solution together

With NVIDIA and Pure Storage, it isn’t just GPU and flash; we’ve built an entire proven solution together.
This webinar first aired on June 18, 2025
00:25
Welcome. Let me welcome Matthew Hall, Vice President of Global AI Solutions. Welcome. Pleasure to be here. Great to have you here. Good morning, Matthew. Tell us a little bit about you and your role at NVIDIA.
00:39
Yeah, I just want to get started by saying, I know Charlie asked who wanted to buy 100,000 GPUs. And he said he didn't see any hands and I know there's a lot of people here, so maybe you missed them or people are too shy. So if you're in the market for 100,000 GPUs, please come see me after. We'll have a chat.
00:57
Um, in all seriousness, Sean, I've been with NVIDIA for about eight years. My role is to democratize artificial intelligence: to make sure that every enterprise customer out there has the ability to deploy AI, and to do it in a very simple way that helps enable their business. I would love to be democratized. Democratized is good.
01:18
Why is it important for people here to know why Pure and NVIDIA are working together? At its most basic level, AI is three things. There are three legs to the AI stool: there's data, there's lots of processing power, compute, and then there's the math to bring it together. If you think about data and compute, what two better companies to bring AI to life?
01:43
We talk about an AI factory. Our CEO talks a lot about an AI factory. In the factories of years ago, your inputs would be coal and water generating steam, and your outputs would be a car or tires or whatever it may be. The factories of today are far more than data centers.
02:02
The input is energy, the input is data, and the output is business intelligence that actually drives enterprises forward. So it's a huge deal for us to be working together to very simply deliver these AI factories to all of our customers. Interesting, AI factories. So, um, you often talk about simplicity a lot
02:21
in your presentations. As you're sitting at the center of this adoption worldwide, we would love to understand: what do you hear your customers saying prevents them from becoming an AI factory? What do you see as the obstacles that you think we should most care about? Yeah, absolutely.
02:36
So this has been a journey. Um, AI has been around since the 1950s with the Turing test, but what's happened over the past decade is that we've brought more data and more compute power to the world than we've ever seen, so it's actually making AI a reality. And three years ago ChatGPT was released, and it was a wake-up call to the world that wow,
02:56
this actually works. What it meant is that all of our kids could go cheat on their homework and write haikus about unicorns, but what it really meant for us was that enterprises started to experiment in real ways. Three years later, we're at the point where every enterprise I talk to has a plan about how to
03:16
go very big on AI. It's no longer the experimentation phase. It's how do I do this in the right way for my business? How do I serve my constituents within the organization? Years ago when this started in the enterprise, quite frankly, IT wasn't the best helper.
03:33
Folks went out and brought data scientists into the marketing organization and the operations organization. And they said, we love these GPUs, we love PyTorch, we love TensorFlow, we need new storage technologies. And IT said, sorry, we don't have that for you.
03:49
So what happened is we saw the advent of what I call shadow AI. You had all these little pockets around the enterprise that started doing AI their own way: their own data sets, their own compute resources, spinning up an instance in AWS, buying a workstation. What we're seeing now is CIOs and IT leadership coming in
04:08
and really trying to reel in all of these AI applications. It's now hundreds of applications that enterprises are looking at, providing the right resources: the data, the ability to store the data and move it quickly into the compute, and the right compute resources to serve the entire enterprise. From shadow AI to mainstream AI: controlled, thoughtful, performant, and cost-efficient.
04:32
I'm seeing the same dynamic in my house. I was the one using ChatGPT in the beginning, but now my kids and wife have taken it over. Yes, and we get family reunions run by ChatGPT. Yes, it's great. OK, well. Um, I would be interested in understanding, since
04:47
we talked a little bit about this AI enterprise index of how customers can move from one phase to another, what do you see as the biggest obstacles to getting that under control? Yeah, I love the index. Um, I talk to enterprises that live at stages 1, 2, 3, and 4, and they're all great stages.
05:05
What we're seeing a lot of enterprises endeavor to do is adopt a number of different AI capabilities, hybrid capabilities. There are a lot of benefits for folks building their own physical AI factories, whether in their own data center or in a co-location facility. There's cost control, security, and privacy that come into play.
05:26
So number one, it's about the solution itself, and we're making it very simple. It couldn't be simpler. It's not just storage and compute; we have the entire package: AI factories that we can roll out with Pure Storage and NVIDIA in a matter of weeks, right? Number two is the organizational structure and the thought processes.
05:45
This is a sea change in thinking. There's connective tissue between different folks in the organizations that hasn't existed before, so that connective tissue, and the way that decisions are made across those various groups, is extremely important. We need executive ownership. We've seen the companies that have been really
06:03
successful get the buy-in at the C-suite level, get the funding at the C-suite level, and have a forcing function from the C-suite to pull all the teams together, and it makes them go very quickly. The ones that are struggling to get that coordination are the ones that are a little bit behind in enterprise AI. We love this focus on simplicity that you mentioned.
06:22
We've tried really hard to work with you to make sure those designs are pre-vetted so, uh, CIOs feel like they can trust that. Um, you've written a lot online about data gravity and how AI follows the data; I think that was the last post you made. Can you tell us a little bit about what you mean by that and what that means for our
06:41
customers here? Every single customer I talk to tells me that they want to bring the compute to the data. They don't want to lift and shift the data to take it to some compute in some abstracted location. As we look at the AI journey, a lot of the data is gonna live in a data center.
07:00
It's gonna live maybe in S3 in the public cloud. It's gonna start to live at the edge. If you think about where we're going with AI, all of the focus to date has really been about large-scale training in the data center. The next phase is inference. Data is gonna be created and stored in the self-driving cars,
07:17
the robots that are doing surgery. There are gonna be considerations about how that data is moved and how you compute on it at those various locations. But I think we all need to approach AI with a very data-centric point of view. Wherever the data lives is where we want to bring the compute,
07:32
where we want to do the work. So we talk a lot about unifying the data plane like you said, so edge to core, making sure it's one common unified platform; that's important to us. Um, tell us a little bit about what it means for AI to happen everywhere. A little more about that. What do you mean? The old world of AI
07:49
training was very different from inference. What happens in the new world you envision? The new world, as I mentioned: robots, self-driving cars. I mean, these things are real. And when a robot is performing surgery on a relative, or you're in a car with your son or daughter doing 80 miles an hour in a self-driving car,
08:07
the latency of the decision making is really important, and there has to be structure around where that data is going to reside. It's gonna reside everywhere from the data center all the way out to the devices. You're gonna really have to think about where you're doing the inference along that path, but it's all about being close to the data. Interesting. Well, you shared a few really interesting
08:26
things with us: AI factories, moving from shadow AI to mainstream AI, simplicity, and AI happening everywhere. Thank you so much, really a pleasure to be with you today. Thank you. Thank you, Matthew. Thank you.
  • Artificial Intelligence
  • NVIDIA
  • Pure//Accelerate