14:40 Video

Building AI Centers of Excellence with Pure + Nvidia

Join Vaughn Stewart, Pure Storage VP of Global Technology Alliances, and Matt Hull, NVIDIA VP of Global AI Data Center Solutions Sales, as they discuss the latest developments and the future of AI Centers of Excellence using Pure AIRI incorporating NVIDIA DGX.
Transcript
Hey everyone, this is Vaughn Stewart with Pure Storage, and it is my privilege to share with you our new AI-ready infrastructure, AIRI//S, and to have a little bit of dialogue about how it came to be. I'm joined by Matt Hull, Vice President of Global AI Data Center Solutions at NVIDIA. Matt, it's always a privilege when we get a chance to sit down. How are you?

Absolutely, it's been too long. It's great to be here, and I'm looking forward to the next 15 minutes of conversation with you. A lot of exciting things are coming out, and we've done a lot over the past four years together.
So I look forward to sharing the good news with everybody who tuned in here.

Yes, I couldn't have said it any better. Obviously the AI and data science ecosystem is exploding. It's been great to partner with NVIDIA and be a part of this journey, and today we're going to talk about the latest incarnation of our joint AI-ready infrastructure, a.k.a. AIRI. The new version is called AIRI//S. But before we dive into what's new, maybe it'd be great to pause a little and talk about how AIRI came to be, what the AI market looked like at that time, and some of the areas where we've seen success. Then let's talk about today's and tomorrow's requirements, and where the new AIRI//S is in a better position and more capable to help. How does that sound to you?

Sounds like a plan.

Sounds like a plan. All right, so obviously in 2018, four years ago, we kicked off AIRI together. We were leaning into the partnership, helping to shape what I think eventually became the DGX POD reference architecture from NVIDIA, and we were pioneers in the space. But back then, and you're probably in a better position to comment on this than I am, the state of AI and AI-based data science was much more immature than it is today. I recall vividly a lot more siloed, maybe departmental, efforts; not shared infrastructure, nothing sitting in the core data center. Am I painting the right picture of what the world looked like four years ago when we kicked off AIRI?

You're spot on. Boy, this industry is moving faster than anything I've ever been fortunate to be part of. I've been at NVIDIA about five years, and it's been a blur. I was surprised to learn that it's been four years since we launched our first AIRI; that was really an eye-opener. When I joined NVIDIA, we were at the stage where AI was starting to come out of the novels, and the science fiction books had become a reality. The reason for that is we have more data, and the data is stored and curated better than it's been in the past. We have higher compute power than we've ever had with our GPUs, and we have more science and math with a lot of the AI models that are coming to bear. That's what's taken AI from an abstract idea to something that enterprises can adopt. Five years ago it was still somewhat abstract; folks realized that they could start to operationalize it, but it was really, really hard.

What we realized at NVIDIA is that we had an absolute woodchipper of a power tool with our GPU, allowing us to churn through the data, but that's just one piece of the equation. When we came together four years ago, we said, hey, look, this is all about the data; we need to make sure that we marry the compute and the data together.
That was really the spark, and the journey that we've been on over those four years: continuing to build out the entire ecosystem and infrastructure around the compute and the data, while making it really easy and affordable for every enterprise to adopt artificial intelligence and operationalize it. Four years ago it was a lot of back-office research happening; today we are deploying infrastructure for enterprises that are using it in production, and they're really starting to see the benefits of artificial intelligence on their bottom line, their top line, and their time to market. It's really stunning.

Yeah. And four years ago, we came together to try to help customers capture, or harness, the power of those GPUs. The architecture at that time was DGX-based, married up with FlashBlade, and we had multiple fabric partners to provide us high-speed links. And we've updated AIRI as we've gone through the years, right?
We introduced the A100 and the MIG technology, and it was almost a leapfrog in terms of performance, power, and parallelism within the A100 platform, so that customers could run more jobs and enable more teams; you could almost think of it as bringing a more shared AI infrastructure, allowing more departments, teams, and tools to take part. And with that we've seen some great wins, whether in the federal government and public sector, powering information and national security missions; in areas like healthcare and life sciences, accelerating research, whether oncology-based or COVID-based, or automating the reading of medical images; in telco, where we've helped with 5G rollouts together; as well as automotive and semiconductor. And I'm sure I'm missing some more.

But that brings us up to where we're at today, which is, with the launch of the new AIRI//S, the storage is catching up to the leapfrog that the A100 and the MIG technology took inside of AIRI. By introducing FlashBlade//S, we're taking a significant increase in our performance, our scale, and our density, so that we can continue to provide that power and keep the data fueling the GPUs, if you will. I said a lot there, my apologies. So today we've got AIRI//S: A100-based, FlashBlade//S-powered for the storage, and we brought in NVIDIA switching so that we can have the fastest fabric out there and really simplify that stack from a vendor-support model as well. Maybe I should pause here. How do you look at AIRI//S today relative to its predecessor, and how it aligns to customers' needs today?

It's a good question. The faster we can make the technology, the simpler we can make the technology, and the more inclusive we can make the technology, the more it can be adopted, and we really see this snowball.
So four years ago, when we launched AIRI, our first version, it was adopted, but folks were still looking to figure out how they could deploy AI at scale. At that point in time we could count the number of use cases on our fingers and toes; there were 15 or 20 use cases that people were pursuing, and we knew the industries that we were selling into. I'll tell you, things have changed in the past four or five years, and we are selling into every industry. Healthcare has obviously been a huge industry for us, particularly with what's happened with COVID. The energy market is changing very quickly; there's the consumer internet market with recommender systems and speech recognition, financial services, retail, telecommunications. The number of applications, the number of industries taking AI and turning it into something that benefits their business, and the number doing this at scale, has really just snowballed. We actually did a press release last year with a pizza company, and I said, if we've gotten into production with a pizza company using artificial intelligence to make sure they have the right toppings on the pizza and to optimize the routes their drivers are driving, we are truly in every single industry. And us marrying the new AIRI infrastructure, with FlashBlade//S and the DGX A100, just makes it faster and easier for folks to deploy.

I've been in the industry a long time, and if you look at processors, it was riding the x86 curve; it was either a tick or a tock, and you chose to adopt or not adopt. Where we are with accelerated computing and parallel processing with our GPUs, we are revolutionizing things every two to three years, and you don't have a choice to sit on the sidelines and say, I'll wait for the tick or the tock. The gains are so large. And you guys at Pure are doing a fantastic job of keeping pace with us and making sure that our storage and the compute, with their combined network, act as one giant concert to make sure that AI happens, it happens quickly, and the companies that we're working with benefit tremendously.

I think that's well said. What I would share from our customer engagements is that four years ago, for a lot of organizations, AI was this science experiment where maybe they didn't know what they could do with it; AI was almost some type of foreign technology. Where we're at today, and what I hear in customer conversations, is that AI is just the modern form of data science. Data science used to be spreadsheet-based, then it became computational, and today it's GPU-based. And every time we take this advancement in technology, we gain one or multiple orders of magnitude in performance, and in what's possible for us to learn and to teach a system to decipher from data. What I really find powerful about AIRI is that we're providing customers a jointly validated and supported reference architecture that really fits the goals of what we're seeing customers do, which is moving data science and AI into the IT department, into the core data center: starting to build shared infrastructures where they're able to pair up the capabilities within the A100, the MIG technology, or virtual GPUs to slice and dice this infrastructure up and assign it to jobs, right?
And gain more time on those GPUs. I have always believed that GPU performance is intrinsically linked to the performance of the infrastructure, meaning whether that's the Spectrum switches or the FlashBlade//S on our side. So I really see it as a way to let customers adopt simply, as well as have a blueprint to scale.

Now, if I pivot, one of the things I'm really proud of in our teams' efforts as we worked to build up to AIRI//S is that we jointly sat down and committed to a roadmap to continue to evolve. We've got a pretty good purview of what the evolution of AIRI//S looks like.
Whether we're talking about planned support for the H100s when those become generally available to the market, or a plan to begin to integrate storage management into the Bright Cluster Manager software from the NVIDIA side, another step towards simplifying that ecosystem, and beyond. I can't share everything here, because I don't want to get into some of the other things where we're still trying to lock down the commitment from engineering, but suffice to say, from a programmatic approach to growing our business, I feel that we're in a much stronger position than when we started four years ago with the first AIRI. I think we were just happy to be there, and we were kind of unsure where the ship was sailing.

Yeah, I couldn't agree more. Things have changed dramatically. Obviously our technology has changed, the software that we're running on these systems has changed dramatically, and it's all about the software. If you look at what we're doing internally with our DGXs, we have thousands and thousands of them running internally at NVIDIA, and we have thousands of researchers using these DGXs as the tool for them to do AI science. They're using the operating system and the software that runs on them, and the researchers pound on it to make the software better and better.
And anybody that's using an AIRI has that DGX software stack running on it. We also do that with our networking; we're running our networking internally. So we have the FlashBlade, we have our networking, and we have the whole concert of software that we've tested. And the industry has changed. As we talked about, the number of industries that have started to adopt AI is really stunning, and those that haven't are probably going to be the most impacted. I call it the lemming effect: a lot of companies are starting to look over their shoulder and say, oh my God, my peer group is starting with AI and they're breaking away from the pack; I need to get started. Four years ago, IT really didn't know what GPUs were, they didn't know what data science was, and we didn't have the entire industry of data centers and other folks behind us. Where we are now is that we have massive momentum across the industry, going well beyond just what we're selling as the AIRI//S system.
Obviously that's the core of it, but the acceptance and the excitement around AIRI 2 is definitely very palpable.

Hey, you just shared the project code name. The product name is AIRI//S. That's a good one; we're keeping it in this video. Well, Matt, I really appreciate your time. I think we've eclipsed our time window. For those of you who are interested in a simple, scalable, jointly supported, turnkey AI infrastructure, check out AIRI//S from Pure Storage and NVIDIA. You can head over to purestorage.com or nvidia.com to find more information. With that, Vaughn Stewart and Matt Hull, for Pure Storage and NVIDIA, respectively. Have a great one. And so, thanks.
  • Artificial Intelligence
  • AIRI
  • Video
  • Pure//Accelerate