Innovation
A platform built for AI

Unified, automated, and ready to turn data into intelligence.

Find Out How
June 16-18, Las Vegas
Pure//Accelerate® 2026

Discover how to unlock the true value of your data. 

Register Now
NVIDIA GTC San Jose 2026
Experience the Everpure difference at GTC

March 16-19 | Booth #935
San Jose McEnery Convention Center

Schedule a Meeting
4:44 Video

Everpure Data Stream Services + NVIDIA Collaborative Video

Fireside chat with Pure's Kaycee Lai and NVIDIA's Jacob Liberman
Transcript
00:00
Hi everyone. My name is Kaycee Lai, Vice President of AI at Pure Storage. Really excited to have with me in the studio today Jacob Liberman from NVIDIA. Jacob, very, very nice to see you again. Thank you. Yes, would you mind telling us a little about yourself and what you do at NVIDIA?
00:18
Yeah, so my name's Jacob Liberman, and I'm a director of Enterprise Product at NVIDIA, where we've launched a new initiative we've named the AI Data Platform. Very cool, very exciting. Well, Jacob, the first thing I want to talk about: I think everyone, when they think of AI, they picture the coolest copilots, right? The biggest and baddest models, and they
00:36
often forget about the data. And so I'd love to talk to you about that. Why do you think data matters for AI? What's your perspective on this? Well, this is one of my favorite subjects, Kaycee. So, yeah, there's been a massive rush of enthusiasm around gen AI and all of its capabilities. But somewhere in the midst of it all,
00:56
we lost sight of the fact that data is still king. So whether you're training a model, fine-tuning a model, or retrieving additional context through RAG to inform your LLM generations, you need secure access to high-quality data. Basically, you don't want the garbage-in, garbage-out problem and crazy hallucinations.
01:19
Right? That's true. All right. Well then, in addition to that, I think performance also matters, right? Because you need to ensure the data gets to the GPUs fast enough so you don't have idle GPUs.
01:31
Nobody likes that. Right. So without GPUs, it's really not possible to prepare data for AI at scale, and to keep up with the velocity of the data: the rate at which data changes and the rate at which it grows. So, that's number one. Number two, you know, building these pipelines to make data AI-ready is complex,
01:55
and they have many stages. There are many handoffs between different personas and users of the data. And at any moment during one of those handoffs, somebody could drop the ball. Totally. Yeah, I see this as probably one of the biggest challenges that's getting in the way of AI, from an inference
02:12
perspective. You know, up to now, most workloads have been training, so people have been really, really focused on that. But now it's going to shift, where most of the time, effort, and money is actually going to be focused on inference. And so the reason why it's interesting is because of what you just said.
02:28
The minute you get to inference, you can only get good inference and good consumption if the data is actually AI-ready. Well, there are many challenges. I mean, first of all, enterprise data is unstructured. Ninety percent of the data an enterprise acquires is unstructured in nature.
02:46
And there are many modalities: video, audio, text, PDFs with graphics, images, presentations, spreadsheets. Combine those things, and it becomes quite challenging to extract insight from the data. Right? Well, if you can't even get the data to be AI-ready, you're not getting any insights, right?
03:08
I think that's definitely key. And so I think that's why we're very excited about what we're doing here at Pure Storage. We announced at GTC last week the introduction of a new product called Pure Storage Data Stream, where we are specifically focused on that challenge. So, the first thing Data Stream is going
03:27
to do is address that specific area, so that you get one workflow, one product that's going to automate the whole process to actually generate datasets for AI in minutes. The second thing we're going to do is make sure it's super easy to consume the output, and put some governance around it. Right? What they should use and what not to use,
03:51
who can see, who cannot see: those types of things should be in there. And then third, we're entering an age where we have agents, right? Agents are part of our digital workforce. So you have to now think about how agents are going to consume it. I think these are very important capabilities in Data Stream,
04:07
right, to accelerate and simplify the process of making data AI-ready. Right. And so you can think of Pure Storage as really taking an active role to be there for the customer, for every step of their AI journey. And what I love about it is that it's all centered around the data,
04:25
which is really the core competency of Pure Storage: protecting that data. And then it builds on top of that, but data is always at the core. Jacob, it was a pleasure having you in the studio. Thank you so much for doing this. We really enjoyed it. Had a blast! Yeah, me too.
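The retrieval step mentioned in the conversation above (RAG: fetching relevant context to ground LLM generations) can be sketched minimally. This is an illustrative toy only: real pipelines use embedding models and vector databases rather than the simple term-overlap scoring shown here, and the sample documents and function names are invented for the example.

```python
# Toy sketch of RAG retrieval: rank documents against a query and return
# the top-k as context for an LLM. Scoring here is naive term overlap,
# normalized by document length; production systems use embeddings.
from collections import Counter
import math

def tokenize(text):
    return [w.strip(".,?!").lower() for w in text.split()]

def score(query_tokens, doc_tokens):
    # Count shared terms (multiset intersection), dampened by doc length.
    overlap = sum((Counter(query_tokens) & Counter(doc_tokens)).values())
    return overlap / math.sqrt(len(doc_tokens) or 1)

def retrieve(query, documents, k=2):
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: score(q, tokenize(d)), reverse=True)
    return ranked[:k]

docs = [
    "Quarterly revenue grew 12% driven by subscription services.",
    "The cafeteria menu now includes vegetarian options.",
    "Subscription churn declined after the pricing change.",
]
context = retrieve("What happened to subscription revenue?", docs)
```

The retrieved `context` would then be inserted into the LLM prompt; garbage-in, garbage-out applies directly here, since irrelevant or low-quality documents in `docs` degrade every downstream generation.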
  • Artificial Intelligence
  • Video
  • NVIDIA
09/2025
Everpure FlashArray//X: Mission-critical Performance | Everpure
Pack more IOPS, ultra-consistent latency, and greater scale into a smaller footprint for your mission-critical workloads with Everpure® FlashArray//X™.
Data Sheet
4 pages

Personalize your Everpure experience
Select a challenge, or skip and build your own use case.
Future-proof virtualization strategies

Storage options for all your needs

Enable AI projects at any scale

High-performance storage for data pipelines, training, and inferencing

Protect against data loss

Cyber resilience solutions that defend your data

Reduce cost of cloud operations

Cost-efficient storage for Azure, AWS, and private clouds

Accelerate applications and database performance

Low-latency storage for application performance

Reduce data center power and space usage

Resource efficient storage to improve data center utilization

Confirm your outcome priorities
Your scenario prioritizes the selected outcomes. Modify them, or choose Next to confirm.
Primary
Reduce My Storage Costs
Lower hardware and operational spend.
Primary
Strengthen Cyber Resilience
Detect, protect against, and recover from ransomware.
Primary
Simplify Governance and Compliance
Easy-to-use policy rules, settings, and templates.
Primary
Deliver Workflow Automation
Eliminate error-prone manual tasks.
Primary
Use Less Power and Space
Smaller footprint, lower power consumption.
Primary
Boost Performance and Scale
Predictability and low latency at any size.
What’s your role and industry?
We've inferred your role based on your scenario. Modify or confirm it, then select your industry.
Select your industry
Financial services
Government
Healthcare
Education
Telecommunications
Automotive
Hyperscaler
Electronic design automation
Retail
Service provider
Transportation
Which team are you on?
Technical leadership team
Defines the strategy and the decision-making process
Infrastructure and Ops team
Manages IT infrastructure operations and the technical evaluations
Business leadership team
Responsible for achieving business outcomes
Security team
Owns the policies for security, incident management, and recovery
Application team
Owns the business applications and application SLAs
Describe your ideal environment
Tell us about your infrastructure and workload needs. We chose a few based on your scenario.
Select your preferred deployment
Hosted
Dedicated off-prem
On-prem
Your data center + edge
Public cloud
Public cloud only
Hybrid
Mix of on-prem and cloud
Select the workloads you need
Databases
Oracle, SQL Server, SAP HANA, open-source

Key benefits:

  • Instant, space-efficient snapshots

  • Near-zero-RPO protection and rapid restore

  • Consistent, low-latency performance

AI/ML and analytics
Training, inference, data lakes, HPC

Key benefits:

  • Predictable throughput for faster training and ingest

  • One data layer for pipelines from ingest to serve

  • Optimized GPU utilization and scale
Data protection and recovery
Backups, disaster recovery, and ransomware-safe restore

Key benefits:

  • Immutable snapshots and isolated recovery points

  • Clean, rapid restore with SafeMode™

  • Detection and policy-driven response

Containers and Kubernetes
Kubernetes, containers, microservices

Key benefits:

  • Reliable, persistent volumes for stateful apps

  • Fast, space-efficient clones for CI/CD

  • Multi-cloud portability and consistent ops
Cloud
AWS, Azure

Key benefits:

  • Consistent data services across clouds

  • Simple mobility for apps and datasets

  • Flexible, pay-as-you-use economics

Virtualization
VMs, vSphere, VCF, vSAN replacement

Key benefits:

  • Higher VM density with predictable latency

  • Non-disruptive, always-on upgrades

  • Fast ransomware recovery with SafeMode™

Data storage
Block, file, and object

Key benefits:

  • Consolidate workloads on one platform

  • Unified services, policy, and governance

  • Eliminate silos and redundant copies

What other vendors are you considering or using?
Your personalized, guided path
Get started with resources based on your selections.