
Fireside Chat

Dec 18, 2025

Moderator

In this new era, we know that leadership, technology, and partnerships matter. Google Cloud's partnership with NVIDIA is paving the way forward in this new era. We have a very famous and senior leader from NVIDIA joining us today, who's been working with Google for over a decade. Fresh off GTC DC, where we made several announcements together, I'm delighted to introduce the Vice President of Hyperscale and High-Performance Computing at NVIDIA, Ian Buck.

Moderator

Thank you, NVIDIA. So glad to have you here. And of course, please welcome back to the stage the CEO of Google Cloud, Thomas Kurian. Let's have a seat.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Thanks.

Moderator

Ian, you're filling in. You can sit wherever you want to sit, my friend.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Thank you so much.

Moderator

Thank you so much for being here, both of you, and Ian, thank you in particular.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

I know Jensen wanted to be here, and he sends his regards.

Moderator

Yes.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Unfortunately.

Moderator

We miss him, but we're thrilled to have the man behind the man, so thank you. We just heard Thomas talk, Ian, about sort of the importance of our ecosystem, and clearly, our partnership with NVIDIA is a critical part of that. Let's start with what this partnership means to you. I know you've been deeply involved in it for a decade.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

I mean, two giants in AI. I was there at the very beginning. It all started, of course, with Google, with that first demo ever done by Jeff Dean and the Google team, teaching AI to recognize cats in videos. It all starts with cats. Since then, we've been partners from the very beginning to build the best possible infrastructure and the best possible software, two partners on this journey to bring forward the capabilities of AI and now to bring it to enterprise and to federal. Here at Google, and at our GTC conference, which coincided with this event, we're super excited about the collaboration on Gemini.

It's been an 18-month journey of NVIDIA engineers working directly with Google engineers, a collaboration between two companies that just know how to work together, to bring Gemini to Blackwell, to bring it to Google Distributed Cloud, to the air-gapped and on-prem applications. And it's only one of the many announcements and many partnerships that we've done over the years.

Moderator

Thank you, Ian. Thank you so much. Thomas, why don't you comment on the partnership as well?

Thomas Kurian
CEO, Google Cloud

You know, we share Jensen's vision of this, what he calls the AI Factory. Foundationally, if you look at what we work on with Ian, Jensen, and the team, it's optimizing AI through the entire stack, and it starts at the foundation. There's a lot of work between our teams to bring in the latest NVIDIA systems. We are very proud to announce Grace Blackwell 200, 300, as well as the new RPU 600 family technology now available through Google Cloud, but it also spans the software layers. We spend a lot of time optimizing technologies like JAX, for example, to be super efficient from a performance and cost point of view. We're working on technology to make inference, for example, work super efficiently. And we've done work to optimize our tool sets so that when you use, for example, Gemini for Government, it runs super efficiently on these platforms.

Now, one example of the collaboration is about 14 months ago, Jensen sent Sundar, our CEO, and me a note saying, "Hey, today, if you're a government agency or a company that has very sensitive data or classified data or applications that cannot be moved to a cloud, you don't have the choice to use a frontier model like Gemini. Can you make that available?" And over the last 12 months, we've done a lot of work to make Gemini now available in an air-gapped environment and also running on NVIDIA Blackwell GPUs in your data center. And that's yet another example where an idea from Jensen sparked a conversation between our teams and solved a problem because we're both working together for all of you.

Moderator

Yeah, thank you, Thomas. And Ian as well. So both NVIDIA and Google Cloud have made, excuse me, huge commitments to the public sector. And certainly coming off of GTC DC this week and here today with the Google Public Sector Summit, I wonder if you could both elaborate for this audience how your businesses, your companies are supporting innovation across the entirety of the federal, state, education, and research organizations.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Certainly, AI is the new industrial revolution. It's a catchphrase, but it's true. The Industrial Revolution didn't actually start in the U.S., but the U.S. was a fast adopter, and that revolution totally transformed our nation and brought it to the forefront at the beginning of the last century. Now, this century, AI has that same opportunity. To bring it to market, to make it possible, we have to activate an ecosystem: an ecosystem of developers, of computing platforms, and of all the technologies that improve productivity, enable new discovery, let us ask questions of our data, and enable entirely new business models. I feel like Google and NVIDIA can work together to bring the best of those platforms to this entire market. Our federal government is a giant enterprise, Karen, as you know. It's at the scale of hyperscale.

And yet it has yet to truly adopt AI across its platforms. Partly, that may be due to the nature of the business. Unfortunately, we're in a shutdown right now. But AI is a productivity tool that I think actually has the largest opportunity here. I work with certain lab directors, and I've met with many federal agencies here. Just the productivity opportunity of applying AI alone is massive for many of our federal workers, who need the help right now to get their work done, to be more productive, and to scale efficiently.

Thomas Kurian
CEO, Google Cloud

And just to add to what Ian said, we look at AI as the frontier of bringing intelligence. When you look at how we look at an AI chip, a GPU, for example, we see it as the first chip that actually generates intelligence, because every time you generate a token, it's a form of intelligence. What are the things we're working on with NVIDIA and the broader ecosystem? We believe in open platforms. So if you look at Gemini for Government, when we designed it, we said it's got to interface with all the stuff you have. So things like Office 365, SharePoint, ServiceNow, and Slack: we provide connectivity to all of those. We also allow you to manage all the AI models and agents you may have, whether those are Gemini or not.

That's part of our commitment to having an open platform. One part is an open ecosystem. The second is making it easy for you. We've done a lot of work in the foundation to make sure, for example, that data and security are built in. When you use these models and systems, you don't have to bolt on security separately. You don't have to build auditing separately. You can trust that when you put data into the model, it's kept secure, because we have perimeter defense and a variety of other solutions to simplify it for you. The third thing: if you think of computational programming, it went from assembly language to procedural languages to object-oriented languages. That was the trajectory. If you look at these models, they're becoming much, much more sophisticated.

So when Chris showed you basically building an agent without needing to write extremely sophisticated prompts and tuning it, it's because the model is doing the work behind the scenes. And so what we're doing also is to allow everyone in every department to, whether it's an educational institution that's looking at how do we handle grants and the process of application for grants, or if you're in a state government and you're monitoring and auditing, implementing permitting and Department of Motor Vehicles to streamline how people apply for licenses, or in a federal agency transforming the core. By simplifying this and providing an incredibly powerful model behind the scenes, you get the benefit of being able to use it. And one last thing to give you a sense of the commitment.

The Gemini model that we're making available to you is exactly the same model that sits under Search, that sits under Maps, that sits under YouTube, that sits under all our surfaces. It's available to you on the very same day that it goes in those surfaces, and that's part of our commitment to bring you the best of the best. Now, there's a ton of engineering work that goes behind the scenes. It's easy for me to say we're doing that, but if you look at how many hours of work go on making sure that the model performs and optimizes on NVLink and how the Protocol Buffers are managed, I mean, you have no idea how much work goes on behind the scenes, but that's to enable all of you to get the benefits of all that engineering work that we're doing behind the scenes.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Yeah, Thomas, you mentioned all of those optimizations and capabilities. As someone who oversees all the engagements between NVIDIA and Google, I can say those are deep technical engagements. I don't think your customers and everybody here realize that they're getting the best of what NVIDIA and Google combined can offer. On the back end, many of the Vertex AI platforms incorporate or use NVIDIA technologies. Take NVIDIA NeMo Guardrails, for example, which makes sure an AI agent stays on course, doesn't get confused, doesn't hallucinate. Google has adopted it to keep the model on topic and focused without getting distracted.

We've partnered on inference software as well: Google is using our latest NVIDIA Dynamo to do distributed inference, so a model doesn't run on just one GPU but can run across all 72 GPUs in that GB200, and now GB300, rack in a distributed environment. That not only improves the performance, intelligence, or size of the model, whether it's Gemini or any of the open models, but lets you do it for cheaper, think longer, and get a better answer. We're also working on data science with Dataproc and Dataflow, being able to ask questions of the data using NVIDIA GPUs, working with the entire Google Cloud to serve the models in real time.

All of the things that Thomas mentioned and many more are deep technical engagements between our two companies, two leaders in AI, making it easy, and we'll do all the hard work, I guess, to make sure that we continue to bridge that gap with all the customers here.

Moderator

That's fantastic. Thank you for that. Ian, yesterday I had the pleasure of attending part of Jensen's keynote at GTC DC, and he had an amazing way of talking about the AI factory that Thomas mentioned, and really this moment we find ourselves in with AI, the Industrial Revolution. I know I'm asking you to speak for him, but I'd rather you put in your own words what he means by that, because I think it's so compelling.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Yeah, the term AI factory is an interesting one. Just like a factory: raw materials go in, and out come finished products, whether it's a car or a washing machine. That assembly line was the foundation of the first Industrial Revolution, the ability to scale it, take it to market, and grow. We're in a revolution now with AI factories. In comes the data, and out come tokens. Tokens are simply the queries, questions, and answers that build and provide the intelligence behind that business insight.

Whether it's a coder who wants to generate a block of code or summarize an email, as we just saw, or someone who has to understand a complicated operational logistics problem, like the CDO who has to manage billions of gallons of petrol and gas across 100 different countries around the world for the U.S. government, that operation is done by an AI factory. And it's also a model for revenue. Unlike AI in the past, which was perceived as expensive and costly because of training all these models, it's actually inference, as tokens get generated, that's generating the revenue and the opportunity. The more tokens we can generate in these AI factories, the more opportunity we create for businesses to grow revenue, gain insight, and find things faster. It's also happening at exponential scale.

I have to say, it's a once-in-a-lifetime opportunity for everyone here in this room. We're literally driving a triple exponential of AI innovation: bringing the cost of each token down, pushing the performance and capability of every data center up, and increasing the insight and intelligence, being able to think longer, come up with more answers, and apply AI as a tool for all of these use cases. That triple exponential is a once-in-a-lifetime opportunity. The sooner you can get on it, the better, of course. It's happening in all the regions. We're here to talk about federal, but I see it in healthcare. I see it, of course, in research, where a lot of it started. It's coming in logistics, in coding, in pretty much everything we do, and Google's right in front of it.

I myself use Gemini all the time to summarize, to understand, to do research, to figure out issues. When we get trip reports from conferences like these, if you just count the number of words, it's larger than the King James Bible. There's no way I can read all those emails. I have to use Gemini, with its long-context capability, to summarize, understand, and actually pull out insights. In this case, the data is just the sheer amount of email that I'm sure all of you are experiencing as well. These wonderful productivity tools, and it's a simple example, are changing how we work. They are speeding us up and allowing us to innovate and think at the pace of AI.

Moderator

Thomas, to that point, I wonder if you have an example or two, maybe commercial or federal, where AI and the AI factory are really speeding up productivity and changing the game for customers?

Thomas Kurian
CEO, Google Cloud

You know.

We've seen people really advance how they're doing a couple of things. One is understanding information within their organization. A lot of times, people used to ask us: I have all this data in my company. I can go on Google Search and search the internet and find an answer, but if I want to access the information in my own organization, even when I know where it sits, I can't. I have to ask eight people to run a report and get it for me. With these AI tools, as part of Gemini for Government, there's something called our data science and data agent. You can literally ask it questions, and as long as you hook it up to wherever your data sits, it can find the data and give it to you exactly the way Google Search does.

It allows you to find the data, summarize it, present it, and show the source of the data. And we have a lot of different government organizations now trying it, because from a productivity point of view it's helping people find information much more quickly. The second thing we've seen a lot of people do is automate a transaction flow. As agents have become more sophisticated in their understanding, we have agencies now building these. Think of the following: I go to a mobile app and say, "I'm applying for a permit." The agent asks, "What kind of permit are you applying for?" You walk through a conversation to get to a specific permit. It sends you a form, either on chat or in a secure email. You fill it out. You upload it.

The digital agent reads the form, extracts the information from it, and automates the process by talking to the back end. In the commercial sector, that process, going through Gemini, today handles 42% of all mortgages in the United States. You probably don't realize that's going on behind the scenes. We have banks, for example, issuing credit cards. There's a bank overseas that runs that process for credit card applications; it can decide yes or no on a credit card in five minutes, and that's what we mean by automation. The agents are able to do that because they're much more sophisticated now. The model can understand: this is what you're asking about, and this is the information you're submitting. It's multimodal, so the form is an image, and I can extract the information from it.

And I can actually talk to a set of tools to say, do this task and then do this task. For example, check the credit score, then execute the following sequence of steps. And that helps you from a productivity point of view, but also from the way you serve the citizens of the country.

Moderator

Thank you, Thomas. So, last question, lightning round. We've gotten to the end so quickly. You have 1,000 leaders in the room: federal agencies, state, local, education, research institutions. What's the one piece of advice you would give them about how they can more rapidly drive adoption, look at AI, and understand how their agency can use it? What would you tell them to do?

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

We get asked this question a lot.

Moderator

Yes.

Ian Buck
VP of Hyperscale and High-Performance Computing, NVIDIA

Certainly, Jensen does too. Internally, it's: this is coming, so where do we get going? This is a technology that can seem confusing, miraculous, almost magical. My feedback, and Jensen has shared this as well: don't wait for the perfect moment. Don't wait for the overarching AI strategy. The opportunity is now. At NVIDIA, we say that strategy isn't a document. Strategy is doing. It is saying what you are going to do, and doing it today. There are opportunities to apply AI today, even in the simple task of form processing. I'm sure in the federal government, moving data from one form into another is something that can really accelerate productivity. I will also say that AI is not just for the task worker; it's also for the executive.

You can't be an AI leader in this room if you haven't used these tools yourself. They're approachable, even if it's simply Gemini in the cloud. It gives you that familiarity and demonstrates that you're a champion of the technology. Everyone's looking to us, just like at a wedding: if the bride and groom are happy and excited about the technology, everyone else around you will be as well. You can set an example, even if it's just using Gemini to summarize an email. Tell people you're doing that. Show them that you're using the tool, that it is the front door. It is the doing and the leadership that we can provide to everybody in the room. Your whole organization will follow if they know that you're a practitioner.

Moderator

That's great advice. Thank you, Ian. Thomas, you get the last word.

Thomas Kurian
CEO, Google Cloud

I could not agree more with what Ian said. Just from our point of view, when we look at leaders and organizations, I would tell you two very simple things. First, doing AI well is not like a traditional large, ten-year, multi-billion-dollar ERP project. Often we see organizations that look at this tool and this capability and think, will it take me eight years to implement? These implementations, and we've done hundreds and hundreds of them, happen in months and weeks, not years. The proof of it is we have 780 companies that do over 500 billion tokens. Frankly, we didn't launch a lot of these models until about 18 months ago, so by definition, those projects could not have been going on for multiple years.

So the first thing I would tell you is pick the projects you want and start, because it's much easier to do these projects than people imagine. The second, and I agree 100% with Ian: as leaders, you should start using the tools, because when you use the tools, you surprise yourself first. You ask your teams, and you show by example. NVIDIA and Google are working together to give the United States the opportunity to be the leader in this new revolution. The Industrial Revolution happened many, many years ago, and we're now on the cusp of a new one. We have spent so many years working together to give the United States, and the government agencies in the United States, the opportunity to use this world-leading technology to transform the government.

And so my personal request to each and every one of you, you have the chance to lead now. Let's do it together.

Moderator

Thank you, Thomas. Yes, as we've been saying this morning, in this new era leadership is absolutely critical. And so is having a healthy disregard for the impossible. These are two leaders who always have a healthy disregard for the impossible. Thank you so much, Ian and Thomas.
