NVIDIA Corporation (NVDA)

Nasdaq Investor Conference in Partnership with Jefferies

Jun 10, 2025

Janet Harbison
Head of International Equities, Jefferies

Welcome to our 2025 Nasdaq Investor Conference. My name is Janet Harbison, and I lead International Equities at Jefferies here in London. We are proud to be partnering with Nasdaq for the 10th consecutive year on this event, and I can confidently say that this is the best lineup ever. I want to take a moment to thank Jack, Daniel McCart, and Andrea Hoff from Nasdaq for their incredible partnership and dedication. On the Jefferies side, a huge thank you to Abigail Charkham, Edyta Balsam, and Tanya Khosla for their tireless work behind the scenes to ensure a seamless experience for all of you. Of course, thank you to the corporates for making the trip to London. What an exceptional lineup we have. The Nasdaq has long been a bellwether for innovation-driven growth.

Over the past five years, it has consistently outperformed broader market indices, reflecting the strength and resilience of the Technology and Biotech sectors. This performance underscores Nasdaq's role not just as a stock exchange, but as a global platform for companies shaping the future, from AI and semis to Cloud Computing and Digital Health. A few quick words on Jefferies, as we will be better known to some of you than others. We're one of the fastest-growing investment banks globally: a 60-year-old firm with a $60 billion balance sheet and almost 7,000 professionals across more than 40 offices in Europe, the Middle East, Asia, and of course, the Americas. We focus exclusively on Global Markets, Investment Banking, and Asset Management. U.S. equities are a key focus of the team. Delighted that we're welcoming our Jefferies teams from Stockholm, Frankfurt, and Paris, as well as multiple U.S. offices, here today.

Within equities, clients are often surprised to learn that Jefferies now has the broadest global equity research coverage on the street, covering over 3,500 stocks. Most recently, we've added Latin America, MENA, and Canadian research. What truly differentiates Jefferies is the global nature of our business and the depth of collaboration between regions and teams. Please do speak to me or my colleagues if we can help you or your business further, or if you would like to be introduced to other parts of the business. This spirit of global collaboration and insight is at the heart of our upcoming lunchtime panel on semiconductors, where we look forward to hosting our Global and U.S. Semis Analyst, Blayne Curtis; Janardan Menon, Head of European Semis; and Edison Lee, Head of Asian Semis. We hope you'll join us for what promises to be a fascinating discussion.

Before I close, one small ask. In 2024, we had record results in Institutional Investor, ranking number five. Jefferies was the most improved firm for the fourth year running, and we have almost 80 analysts ranked in the U.S. and Europe. We care, and we would be incredibly grateful for your five-star votes in the U.S. survey that is currently running, especially for the tech team attending this conference and helping make it happen. Many of you in this room have been instrumental in our journey. Thank you for your trust and support. It is now my pleasure to hand over to Colette Kress, CFO of NVIDIA, and Blayne Curtis, Jefferies' Head of Semiconductor Research. NVIDIA has been a trailblazer in the tech industry, revolutionizing fields such as AI, Gaming, and Data Centers.

Under Colette's financial leadership, NVIDIA has achieved remarkable growth and innovation, making it one of the best-performing and most exciting tech stocks globally. Welcome to the Jefferies stage, Colette and Blayne.

Blayne Curtis
Head of Semiconductor Research, Jefferies

All right. Thank you all for joining. I'm Blayne Curtis, here with Colette Kress, and very happy to be kicking off the conference with NVIDIA. It's obviously been an incredible story over the last couple of years with AI particularly. I think we wanted to start on the demand side, because one of the new interesting drivers is sovereign AI. Jensen has talked about it as the next growth driver. In fact, at GTC, he talked about sovereigns maybe being the biggest spenders; he said non-CSPs. Maybe kicking off there: obviously, you've been talking about it for several quarters, there have been some Middle East announcements, and I think Jensen promised some European ones. You have GTC Paris coming up. Thank you for joining, and maybe start there.

Colette Kress
EVP and CFO, NVIDIA

Yeah. Thanks so much for having us here. I'm pleased to be here. It's been a while since I've been here at the conference and been able to speak to so many of the investors. I really appreciate that you all came out for today. I have a brief opening statement that I have to say. Before we begin, as a reminder, the content of this meeting may contain forward-looking statements, and investors are advised to read our reports filed with the SEC for information related to risks and uncertainties facing our business. First, I want to talk about some of the things that occurred over the last couple of days. Jensen was here in the U.K., working with the Prime Minister.

The Prime Minister and Jensen together really worked to develop opportunities within the U.K. and focusing on that sovereign piece of it. We will be looking to build out infrastructure here in the U.K., supporting many of the industries that are here, many of the startups that are here, focusing on what they can do for AI. We know this is an important time to help them, help them in terms of building that AI and infrastructure just to start that fuel that is going to be necessary for their AI solutions. Now, thinking about that here in the U.K., a lot of discussion about referring to it as the Goldilocks place. The Goldilocks place was really a common way that we try and think about the importance of the great talent that is here, the great AI talent, the great startups that are here in the U.K.

We couldn't be more proud to be here. We will also be at GTC Paris; that is correct. Shortly after today, we head on over to Paris, where we will also be talking about sovereign in a different part of the world, the EU. Sovereign is a very big piece and a focus of where we are concentrating. Keep in mind, the world of AI has moved probably the fastest of any technology across the globe that we've seen in history. From the onset of what we saw with ChatGPT, there was an instantaneous understanding worldwide of how important AI would be for our future. All countries, all enterprises, all people, all consumers are thinking about how AI would work for them.

We're happy to be a proud partner in so much of that work, in terms of our platform and what we've put together. Sovereign is a big piece. We have been in the Middle East, as you indicated, and we ended up speaking with not only Saudi Arabia, but also the UAE. I think it led to what you heard: the tens of gigawatts that will be available through many of those nations. It was an important time because it was the U.S. government together with the Middle East leadership as well. I think that will be a great start for such an important part of what they can do to influence both the capital and the data centers, with our help from a platform perspective. How large is sovereign?

How large is sovereign is always the question in front of us. It is going to be a very, very large piece. Look at it in this perspective: every country will need the ability to have their AI within their country. Rather than using just the standard foundational models, some of which are available from the United States, you're going to see many of these foundational models begin in a lot of the countries that are here. That's the ability for you to have your own language, your own culture, your own data that you would likely want to keep inside of that country. That's why this sovereign piece is such an important piece for us. It will likely grow just as your GDP does, and be a very big part of that.

Right now, we see probably tens of billions of dollars that will be surfaced. Again, when you look at the size of this over several years, it could be approaching close to a trillion dollars. These are key areas of why we're here, why we're here in this part of the world, and why we're focusing on a lot of different parts, because sovereign is going to be a big piece.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I want to follow up. You partially answered it, but the question I get a lot is: who's the ultimate customer? You have a sovereign-funded data center in the Middle East, say. Is the customer going to be Microsoft, and it's just a regional data center? I think you answered that there will be specific national efforts, models, data, and such. Maybe you can elaborate on that. In terms of timing, I get this a lot as well. We've seen some announcements. I'm assuming these are massive data centers, gigawatts, that probably need the buildings built first and then filled. Maybe you can walk us through a little bit of the timing behind some of these announcements.

Colette Kress
EVP and CFO, NVIDIA

Yeah, we get a lot of discussion, and there are a lot of folks interested in being a part of sovereign. All will be partners within what is built in terms of sovereign. What is necessary in each country will probably be a little bit different: what participation does the government take in many of the countries? Remember, they're also very important in terms of the telecom business, or what we need for the Internet. You can imagine that what they build for AI will also be backed in part by the government. What we'll see, though, is not necessarily a standard model. Every country will probably do it differently.

The governments are very focused on what they need to do to support the country as a whole and have been a very big part of a lot of the fundraising that will be necessary. You then go into who the builder is. The builder can absolutely be the CSPs that you're seeing, but you also see a new brand surfacing that will also be very important, what we often refer to as the neoclouds: regional clouds that will be stood up that may not be standard with the larger clouds that you see, but really customized and focused on more of a private cloud, providing specific data or a specific model for one or two different types of customers.

These may also be what you'll see in terms of enterprises in these nations, enterprises building AI factories through these neoclouds. Many can contribute to that. Much of the European Union had seen supercomputing as an important industry; this can be a focus of moving toward AI, included in the accelerated computing focus that they also had with supercomputing. A lot of opportunities for all to join from that perspective. Now, how soon? What will we see first? What we heard in the Middle East, for example, is some of the important foundational things that lead to these types of builds going forward, first being the focus in terms of capital.

The capital, the availability of capital that can be earmarked for these large clusters, is an important piece, but so is the support in terms of the data center builds. You have separate groups focusing on where the power that is necessary for these, and the data center complexity, will be put together. Those are some of the first things that we're already seeing, each going hand-in-hand in terms of what we'll build in AI.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I want to finish up on the demand side. To begin this year, there were a lot of questions about the sustainability of the level of spend, which you're going to get when you see that kind of growth. I thought it was interesting that Google talked about serving 480 trillion tokens in a month, which is up 50x. We've heard comments from the CSPs that they do not have enough GPUs; they cannot serve the inference that they need to. I'm curious, from your perspective, maybe wrap in the demand for Blackwell and just the overall demand perspective to start the year here. What are you seeing in terms of drivers between training and inference?

Colette Kress
EVP and CFO, NVIDIA

Yeah. So we're probably a little bit more than two years into this since ChatGPT. But keep in mind, we're just at the early beginnings of what is going to be necessary going forward. Yes, foundational models continue to be trained, but new and advanced models are very, very predominant at this time. What you see is reasoning models taking a significant amount of compute. The three scaling laws are still a big part of it, from the foundational part and moving into reasoning models. You see a significantly larger amount of compute that is necessary for those models. Additionally, that moves us to the inferencing, or what we often refer to as the token generation. This is where applications come to market and produce tokens, which you are seeing both with today's models and with the future agentic types of models.

Agentic models are essentially doing work for you, not just reasoning and giving you answers. It would be great to see so much of the work that we do today, such as the manual work, done with some of those agentic models. Blackwell has been engineered specifically for a lot of those reasoning models and particularly for inferencing. Right out of the gate, when we shipped our GB200 NVL72, several of our customers stood it up just to look at the size of the inferencing improvement. The inferencing improvement, as we have now focused on accelerating just about every part of that Blackwell infrastructure, has been key. That software platform is also very important in terms of the inferencing performance. As you've seen, what they can do in terms of token generation is an X factor greater than anything that they've seen before.

We're seeing folks actually use our Blackwell directly for inferencing, not just for the training upfront. Both of these are important factors driving demand. Many of our customers absolutely see more and more need for more compute as we continue to scale. It is not just focused on one industry or one part of the world; each and every industry is growing. Yes, as we recognized in the guidance that we provided for the quarter, we do see strong growth. We see strong growth in terms of Blackwell, even with the backdrop of some of the challenges that we've had in terms of what we're able to ship to China.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I want to ask you about the China market. Jensen talked about it being a $50 billion market. He's been quite vocal that he's against the restrictions that you guys have seen. Obviously, you had the diffusion rules that went away, but that was another area that I think you spoke out against as well, trying to address this demand and be the one who does it versus something homegrown. Maybe you can address the comment that, post the H20 ban, Jensen said a cut-down version would maybe not be competitive and that really, you shouldn't think about you guys addressing China. There are still rumors that you could cut down a chip and still address it. To the extent you can talk about it, maybe just explain why China is important, and then what is your plan to address, or not address, that market?

Colette Kress
EVP and CFO, NVIDIA

Yes. In the middle of our first quarter, we received notice from the U.S. government that we would not be able to ship our H20. Now, keep in mind, our H20 going to China was the only data center product of significance that we had for that market, developed through a lot of work on what we built for them and a lot of back and forth with the U.S. government, with continuous approval for us to do so and bring it to market. Unfortunately, they chose not to allow it to go. That means we stand in a situation where we really do not have anything for that market.

We've discussed that it wouldn't be appropriate for us to just start a new chip at this point, because essentially the H20, compared to our Blackwell architecture, was significantly lower in terms of what we were able to enable in China. That was about a 25x change from an H20 to what you would receive in terms of a Blackwell. We knew anything new that we want to do takes a discussion with the U.S. government. We know that our work in China is not about us alone, because remember, there is domestic competition in China when you are not able to ship your best.

At this time, we are going to continue to work to see what would be possible, what we could do, given that we've gone through this now and have had to stop in the middle. That's not something that we want to do going forward. It's a big market, though. China is a very, very big market; just today, or this year, it probably could be about a $50 billion market. That's a great opportunity for us to continue to innovate and continue to build the platform from the U.S. to the rest of the world. We think that's an important market for us to go after. Again, we're still in discussions with the U.S. government, and we'll see.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I want to ask you about the supply side, which was another concern entering the year. During Hopper, it was the availability of CoWoS and supply issues more at the chip level. This time around, with the GB200, it's more of a system issue. It's not that you ran into one huge problem; it's probably a lot of little problems in just standing these racks up. I think the interesting comment made on earnings was that you actually shipped 1,000 racks per customer per week, which is obviously a huge number. I think the point was that you're starting to catch up. Maybe you can elaborate on the supply equation.

You've had a decent Blackwell number in terms of revenue, but people look at these downstream data points and the amount of racks that the ODMs can produce, and it had been quite a low number to start the year. How is that improving?

Colette Kress
EVP and CFO, NVIDIA

Yeah. Our Blackwell architecture was a phenomenal decision on an architecture change. What we did is pretty much ship our customers full data center scale, versus what they had seen in, for example, the Hopper architecture, which was a standard, classic configuration of what we would be selling: a motherboard with about eight GPUs in it. What we did with Blackwell reflected the importance of understanding that each and every part of that data center needed to be accelerated, to focus on continuous performance improvement and the best efficiency, even from a power perspective.

That configuration, as sophisticated as it was, with probably about 1.2 million different components in it, landed with many of our system integrators, our OEMs and ODMs, who work to do pretty much what they would do to build out a data center and get that into market. Nothing unique about that; it was just the change from what they had received earlier to getting a full data center. All is moving quite seamlessly now. Yes, we are getting them back up to levels where they can move what they had received and get it stacked into the data centers, all racked up. Many of them have already started their work in terms of running workloads on those systems. We've also indicated that, importantly for Blackwell, our next architecture moves to the 300 series, or Blackwell 300.

It will be pretty much the same architecture, same electronics, same mechanicals. A change in terms of the chip, or a change in terms of the memory, is probably the only change to that. The customers are well briefed now on how to build out those GB systems, and we're excited to see that in the next architecture as well.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I was actually going to ask about that in terms of not only the GB300, but also the first generation of Rubin, per your roadmap; it's effectively the same rack. I also want to ask you about the concern people had, that maybe there were chips at the ODMs and somehow that wouldn't clear through. I mean, I guess in terms of...

Colette Kress
EVP and CFO, NVIDIA

It's all moving quite well. All moving well. In terms of the speed at which they're moving, the next important thing is getting them on a cadence where it's all showing up and moving quite quickly. Just as we do with our supply chain, you're now seeing this be an important part of standing up racks as well.

Blayne Curtis
Head of Semiconductor Research, Jefferies

The H200 transition was pretty quick; in just over one quarter, almost all of it switched over. When you think about the transition to the GB300, which you're sampling now, should it be a similar cadence, given that there's so much overlap with the platform?

Colette Kress
EVP and CFO, NVIDIA

You're going to see both. You're still going to see the 200 series and the 300 series ship. Keep in mind, we are still shipping, for example, the Hopper H200. So there is still the continuation. Many of them fill out their data centers for certain workloads. So you'll probably see both of them continue over several quarters.

Blayne Curtis
Head of Semiconductor Research, Jefferies

I want to ask you about the competitive side, particularly ASICs. I thought it was interesting: at COMPUTEX, NVLink Fusion was introduced, which allows some permutations with other people's silicon, whether it's a CPU or an accelerator. Maybe talk about that strategy, what you're seeing from ASICs as competitors, and why Fusion, which is the question I get a lot.

Colette Kress
EVP and CFO, NVIDIA

Yeah. Let's first start with NVLink. NVLink, as you know, is our very important connectivity that has been part of us for five generations of what we've put into market. Very important in terms of GPU-to-GPU connections as well as CPU-to-GPU connectivity. For example, in our GB200 NVL72, you have NVLink across 72 GPUs, together with the NVLink switching. It's a very, very important part of the configuration for working with the significant amount of traffic, particularly on the inferencing side. We had taken the best of breed of what we had seen in our InfiniBand and enabled that now in our Ethernet switching as well. Going back to NVLink and its importance, this is an opportunity for folks to still maintain our platform and get those capabilities.

If they want a different CPU, yes, we have our Grace CPU, but another CPU gives them an option, if they want an x86 or otherwise, to still stay connected to our full platform, having a license to that and working with our networking. It could be the same in terms of ASICs as well. Our opportunity here is to continue to expand the reach of our platform, both with NVLink as well as networking.

Blayne Curtis
Head of Semiconductor Research, Jefferies

That's a perfect lead-in. I want to ask you about networking. When you look at your roadmap, it's not just a GPU roadmap; you have a half dozen chips there that are all critical in making that system, which I think is the challenge. We're going to have a couple of AI days coming up for some of your competitors, and they're going to have to answer that equation, how they match the NVL72. Networking was $5 billion, up 64%. Maybe you can talk about the strength you're seeing. Also, Spectrum-X was $2 billion. We get this question a lot: InfiniBand versus Ethernet, your Ethernet versus others' Ethernet. What kind of traction are you seeing on the networking side?

Colette Kress
EVP and CFO, NVIDIA

Yeah, really good. Our networking is doing phenomenally well. Just as we discussed, the importance of accelerating pretty much every part of that data center is going to be essential for many of these AI workloads. Our networking business continues to expand. You have teams really focusing on how best to integrate with so much of the work that we do in terms of AI. We had best of breed in terms of InfiniBand. Keep in mind, many of your enterprises are on Ethernet, and we created Ethernet for AI, inclusive of Spectrum-X. Spectrum-X has been very important for many of our hyperscalers, and we talked about many of them on our earnings call.

What they are seeing is a great solution: Ethernet that works for many of the enterprise tenants that they have there, while keeping those key capabilities for the traffic that needs to be monitored so heavily, plus many other capabilities. Yes, it's reached a very strong level, and we are still shipping a great amount of InfiniBand as well. Together with our NVLink, our Ethernet platform, as well as InfiniBand, we have a really, really good attach rate. The rate at which they are choosing NVIDIA's networking to attach to our GPUs can be over 70% of what we're seeing. It's moving quite well.

Blayne Curtis
Head of Semiconductor Research, Jefferies

You do a great job talking about the strategy; I want to ask a CFO question. Just finishing on the ramp of the GB200, I think the gross margins have been a big focus. You did guide gross margins up sequentially and talked about mid-70s by the end of the year, or in the future, maybe; I do not want to put words in your mouth. Can you talk about what needs to happen to get those gross margins to the mid-70s?

Colette Kress
EVP and CFO, NVIDIA

Yeah. We made a pretty big change moving to Blackwell in terms of that configuration and all the different types of components. We were in catch-up mode for a good part of the first and second quarter getting Blackwell to market. We have now gotten to a fairly solid ramp. That is going to assist us in improving those gross margins, as we get more volume and more that we can work on in terms of the yield and the cost together. Let's not forget all of those different components that have to be put together. We made progress absolutely in Q1. We are guiding continued progress in Q2. Yes, we do see that path toward the mid-70s before the end of the year.

Blayne Curtis
Head of Semiconductor Research, Jefferies

We're running out of time, but I do want to ask: obviously, the data center and AI is the biggest part of the story, but I wanted to ask about gaming. You saw a great deal of strength there, and AMD saw strength as well. I think the question people have is, are we seeing some sort of gaming cycle? Is this AI that's driving the demand? I'm curious, from your perspective, what's driving the first-half strength in gaming?

Colette Kress
EVP and CFO, NVIDIA

Yeah. Thanks for the question on gaming. Gaming actually hit record levels, record levels in this last quarter. Keep in mind, it hit record levels while we were supply constrained. We have been working feverishly on getting our Blackwell architecture to market, and the volume that we need to serve those customers. I think we're getting stronger and stronger each quarter in terms of that supply. What are they excited about? They are excited about gaming; it is still such an important industry. There are other use cases that you can see in terms of AI, AI with a PC. In the future, this is going to be an important part, whether that be for your creatives, your independents, those that are really working; now you have a great AI PC just as much as you have a great gaming PC.

More to the growth and more to the Blackwell for gaming to come.

Blayne Curtis
Head of Semiconductor Research, Jefferies

In terms of expanding the story out, I want to ask you an open-ended question about where you see the biggest opportunities for AI over the next decade. You hear stories; obviously, you've been in autos for a while. It's funny, we're getting a renaissance of autonomous driving, if they don't burn all of them in L.A. You hear talk about humanoid robots, where you might have 10 per person, the biggest market ever. It seems futuristic, but it may not be as far away as you think. Obviously, in the data center, there are lots of applications on the R&D side. Maybe you can elaborate on where you see it all. Where are you most excited over a longer horizon?

Colette Kress
EVP and CFO, NVIDIA

There's a lot of amazing work being done that will really influence so much of the AI work. Starting first, what we see right out of the gate is that there are so many different software applications at an enterprise level. Infusing AI work within those is absolutely what you see so many of those companies working on. Those are going to be some of the first things that you see. Then the focus in terms of other enterprises: not a single enterprise on the planet doesn't have a call center. Wouldn't they just love the ability, using AI, to make that as efficient as possible, as well as a great experience for their customers? More and more agentic work will begin at the enterprises.

Agentic work that says: I can get that work done in the hours that I'm not at work, so that when I walk into the office, it has gone through the reasoning and figured out what work needed to be done. I can actually see that in something such as a finance organization deciding what we need to do in terms of booking accruals and those types of things. A lot of agentic work can happen in terms of AI. You brought up a good area for the next focus: automotive. AV cars, EV cars, such an important industry. Yes, we've been working 10 years to really see a lot of the Robotaxis on the road, or level two, level three, in market.

It is also an introduction to another big industry, which is physical AI and/or robotics. Change out some of the things that you see in terms of automotive, and you can see that exact same thing coming through in robotics: robotics in terms of humanoids and the multiple brains, the brains that will be back in the data center and the brains that will actually be inside of the robots, providing the landscape for them to actually do work as well. Manufacturing and industrial AI are very top of mind and very important in this part of the world, on the European side as well. Those are some of the big things that we'll see in the future.

Blayne Curtis
Head of Semiconductor Research, Jefferies

All right. Perfect. We're already out of time, but thank you for joining, and thank you to everybody for coming as well. Thank you.

Colette Kress
EVP and CFO, NVIDIA

Thank you.
