Advanced Micro Devices, Inc. (AMD)

BofA Securities 2024 Global Technology Conference

Jun 5, 2024

Vivek Arya
Managing Director, Bank of America Securities

Welcome to the session. I'm Vivek Arya. I cover semiconductors and semicap equipment at Bank of America Securities, and I'm absolutely delighted and honored to have Jean Hu, Executive Vice President and CFO of AMD, this morning. I'll start with a few of my prepared questions, but if you have anything you would like to bring up, please feel free to raise your hand, wait for a mic, and we'll be sure to get you in. With that, a very warm welcome to you, Jean. Really appreciate you joining.

Jean Hu
EVP and CFO, AMD

Thank you for having us, and thank you all for joining us this morning.

Vivek Arya
Managing Director, Bank of America Securities

Yeah, we had to move to a bigger room.

Jean Hu
EVP and CFO, AMD

Yeah

Vivek Arya
Managing Director, Bank of America Securities

Right? A lot of interest in AMD. So, Jean, it's been a very eventful week, as we were just discussing. Lisa gave the keynote session. I think it would be really great if you could help us get up to speed: what were the main announcements from Computex, and what are you most excited about?

Jean Hu
EVP and CFO, AMD

Thank you, Vivek. Let me start by sharing some key highlights from this week's exciting announcements by AMD at Computex. You're right, Lisa Su gave the keynote, featuring the leadership CPU, GPU, and NPU architectures from AMD, which are going to power AI end to end. AMD is the only company that has end-to-end solutions, from CPU to GPU to NPU, to address the significant AI demand: from the data center to end-user devices like gaming and AI PCs, and even to the edge with our embedded business. So it is an exciting time. If you look at the announcements, the first is our AI PC: the Ryzen AI 300 Series processors for ultra-thin notebooks.

This is literally one single chip that includes CPU cores, the latest GPU cores, and the most powerful NPU cores. It's an incredibly innovative product, with up to 50 TOPS of NPU performance; Microsoft's AI PC requirement is 40 TOPS. So we do have leadership performance against our competition on the AI PC side. We also announced our latest desktop processors; same thing, 2x the performance on the AI side. So it's quite exciting, and both products will be on shelves in July. On the data center side, AMD is the only company that has both the product portfolio and the technology to provide the CPU and the GPU. On the CPU side, we previewed Turin, our Gen 5 EPYC CPU processor. This generation will have up to 192 CPU cores.

It will extend our leadership in performance, power efficiency, and TCO across all server platforms. After the very successful Gen 4 family, which has continued to gain market share, this is another leadership product that will drive further share gains. Of course, on the GPU side, Lisa unveiled our expanded GPU roadmap, which is now on an annual cadence; we'll introduce a new product family every year. As you know, we launched the MI300 late last year, and it quickly became the fastest-ramping revenue product in AMD's history. MI300 today provides leading inference performance and is very competitive on the training side, too.

And then later this year, we'll have the MI325 with 288 GB of HBM3E memory, which, compared to the competition's smaller memory capacity, is quite significant. The way to think about it is: with that much capacity, one server with 8 GPUs can hold very large language models. That's the advantage of very large memory capacity and bandwidth. So we are really excited about the product. In 2025, we'll introduce the MI350 series, which is based on AMD's CDNA 4 architecture on a 3 nm process node and again supports 288 GB of memory. Generation over generation, it improves inference performance by 35x, which is very similar to our compute generation performance improvements, and this product will compete with the Blackwell B200.

When you look at the memory capacity, it's actually 1.5x that of the Blackwell B200, so we continue to lead on memory capacity and inference performance. In 2026, we'll introduce the MI400, alongside the competition's Rubin GPU. It's going to be based on another new architecture we call CDNA Next, and we'll continue to extend performance not only on the inference side but also on training. So it is exciting when you look at our portfolio and the product announcements from this week. We are very excited about the large, end-to-end AI opportunity.

Vivek Arya
Managing Director, Bank of America Securities

Excellent. Thank you, Jean. Thank you for the overview. So let's start with everyone's two favorite words: AI. On the MI300, you raised the forecast for this year from $3.5 billion to over $4 billion. Is that a supply-constrained number? Let's say you get enough supply of memory and CoWoS and so forth; what is dictating that number to be $4 billion and not higher this year?

Jean Hu
EVP and CFO, AMD

Yeah, as I said earlier, we literally launched the MI300X last December, right? We have ramped the MI300 in less than two quarters, and as we discussed on the last earnings call, we have more than 100 customers that we are engaging with, either in the development stage or in the deployment stage. So the update we gave at the last earnings call is really based on the engagements, the pace, the design wins, and the backlog that we have with our customers. Our supply chain team has done an excellent job, but supply is also quite tight; even for the first half of this year, we continue to face a very tight supply chain situation. So our job is to really continue to push, working with each customer through the process. The ramping process can be complex, right?

There are so many different models, different workloads, different customers. So you work with them, go through the initial production, then deployment. Different customers are at different stages of that process; that's what we are working on. We are actually exceeding our expectations because we have made tremendous progress with the ROCm software, so we can help customers bring up their production. We have said that we have more than $4 billion of supply this year, and you should expect us to update you when we make more progress going forward.

Vivek Arya
Managing Director, Bank of America Securities

Got it. Does the MI325 launch in Q4 provide upside potential also?

Jean Hu
EVP and CFO, AMD

As you know, when you launch a product, it typically takes some time to ramp up, right? So there will be some revenue in Q4, but meaningful revenue will be next year.

Vivek Arya
Managing Director, Bank of America Securities

Got it. Okay. And then finally, from a supply perspective, are you getting adequate support on memory? There have been reports about supply constraints, about product not being qualified, and so on. Are you satisfied with the memory supply that you have?

Jean Hu
EVP and CFO, AMD

Yeah, we work with all three memory suppliers. As you can see from the MI300X ramp, it's a very significant, very fast ramp. We get very good support, but the capacity is still tight. Our team is working with them closely to ensure we have enough supply to support our customers.

Vivek Arya
Managing Director, Bank of America Securities

Got it. So it's more a supply question rather-

Jean Hu
EVP and CFO, AMD

Absolutely. Yeah, if you look at the successful ramp of our MI300X in less than two quarters-

Vivek Arya
Managing Director, Bank of America Securities

That's very impressive.

Jean Hu
EVP and CFO, AMD

Yeah.

Vivek Arya
Managing Director, Bank of America Securities

Got it. Makes sense. Let me ask the question this way: most hyperscalers pretty much have a good sense of what they will deploy over the next several quarters, because they have to get the land, the power, the additions ready before they start getting all the electronics: compute, networking, and so forth. So are those decisions still dynamic as we go towards the end of the year? Just because your competitor has a very large product introduction in Q4, does that crowd out your opportunity near term in any way, or is that not how decisions are made, because they're already kind of set?

Jean Hu
EVP and CFO, AMD

Yeah, thanks for the question. You're absolutely right. Their planning is not just quarters, right? When you talk about the land, the power, the data center space, when you think about buying those things, it's actually multi-year. As Lisa said during our earnings call, and as we always talk about, we have been working with our hyperscale customers on multi-year roadmaps. So when you think about the kinds of decisions they are making and the CapEx they are spending, it has got to be multi-year; you have got to plan out beyond just this year. And it goes both ways: they have to invest significant resources, and we also have to invest significant resources. So that's how we work. It's not like, okay, if NVIDIA introduces some new product, it will change our opportunity or the trajectory for AMD.

We know the market has gone from almost nothing to $40 billion-$50 billion, and now to $100 billion, and going forward it's going to grow very significantly. With such a strong trajectory and pace, our progress will continue to improve.

Vivek Arya
Managing Director, Bank of America Securities

Got it. How do we think, Lisa... sorry, Jean, about the strategic positioning? On one side, NVIDIA has been in the market for a long time and has all the software, developer support, and scale. On the other side, you have a lot of custom chips, which, just due to their nature, are constrained. So how is AMD carving a niche for itself and making sure that it is sustainable over time?

Jean Hu
EVP and CFO, AMD

Yeah, Vivek, that's a great question. Maybe let me just take a step back. I'm new at AMD, but when you look at AMD's history, since Lisa and Mark Papermaster joined the company, they have had the strategy of driving high-performance CPU and GPU platforms. Later on, we added the NPU and adaptive FPGAs. But over the journey of the last decade, AMD, just like NVIDIA, shares the same legacy: starting from gaming graphics and then getting into the high-performance HPC market. Even the ROCm team was not talked about a lot, because it was literally just in the HPC market, but we started that software investment quite a while ago.

So it's very important to understand that in the GPU market, there are only two players who share that same legacy, and our team has a much deeper understanding of the GPU. That's why, when you look at the ROCm software, we have been able to make tremendous progress in a very short period of time: because we actually understand how the hardware works and how to make sure the GPUs run really efficiently. From that perspective, yes, we are a new entrant to the AI GPU market. But if you look at the competitive positioning of our product and how much progress we have made on software to catch up, I think we can be very competitive.

From our perspective, we talk about more than $4 billion expected for this year, and we'll continue to make progress. From the company's perspective, it's such a big market that we can absolutely continue to make progress addressing the opportunities here. And on custom chips: we all know that in the semiconductor market there is always ASIC, especially when a market matures, right? Because when the functionality is very fixed, the ASIC is cheaper. So it's no surprise when we think about the AI market; we talk about $400 billion, and we do think some portion of that in 2027, 2028 will be ASIC. For us, right now, we're really focused on the merchant opportunities. But if you look at AMD, we have been doing gaming for a long time, and-

Vivek Arya
Managing Director, Bank of America Securities

Semi-custom.

Jean Hu
EVP and CFO, AMD

Exactly. It's about what the customer needs. If customers need us to do something, we absolutely will do it. But right now, in the merchant market, the models change so quickly that I think it's going to be hard for ASICs, especially when two suppliers, both NVIDIA and AMD, have an annual cadence to address customers' needs. The key question is when the functionality gets fixed enough that they can use an ASIC.

Vivek Arya
Managing Director, Bank of America Securities

Got it. And the last question on that: you mentioned the $400 billion addressable market in 2027-2028. Does AMD still feel confident and comfortable with that kind of addressable opportunity? And then, more important than that, what are your market share aspirations within that?

Jean Hu
EVP and CFO, AMD

Yeah, when Lisa first talked about the $400 billion opportunity, it was a huge surprise to everyone. But since then, if you look at what the market has done in 2023 and 2024, it's tracking exactly as we projected from a market opportunity perspective. And we see more and more proof points of productivity improvement; people are getting a return on investment that justifies the market opportunity. Of course, it's about framing the trajectory of the market rather than precisely whether it's $300 billion or $400 billion, or whether it's 2027 or 2028. But it's the direction we feel strongly about. Right now we're quite small.

We are the new entrant, but we do have a set of competitive products. We are accelerating our roadmap because we see demand continuing to exceed our expectations. We see that customers need two suppliers, and it's a very, very large market.

Vivek Arya
Managing Director, Bank of America Securities

On the data center server CPU side, could you give us your perspective on both the AI workloads and the non-AI workloads? There is a perception that AI dollars are cannibalizing a lot of the non-AI, traditional server CPU demand, and you get to see both of it, so it would be really useful to get your perspective on that.

Jean Hu
EVP and CFO, AMD

Yeah, thank you. It is actually really interesting to look at AI workloads across both the hyperscale cloud and the enterprise data center. Different workloads need different compute engines, even for AI. If you look at AI inference, a lot of inference was done on server CPUs in the past. The GPU has a tremendous advantage in large language models, both training and inference. But at the end of the day, what customers really want is to do their job: managing their workloads and managing their applications. And when you think about all the different applications across the globe, traditional workloads will continue to run on CPUs.

Those kinds of workloads are actually much more efficient on CPUs and get the best TCO: your ERP system, Instagram, Facebook, chatbots, and all different things. Even some inference, we're hearing from enterprise customers, can run on CPUs. But for large language models, we definitely think the GPU is the compute engine, for both training and inference. So we have a broad product portfolio and can address customer needs, especially for enterprise customers. We show them both, and they make the choice of what's best for them.

Vivek Arya
Managing Director, Bank of America Securities

Got it. So a more near-term question. I think Q1 saw normal seasonality, right? Down high single digits to low double digits, and so forth. How are you thinking about the rest of the year? Do you think seasonality is the right way to model it, or, given the drivers, is there some above-seasonal trend in the background?

Jean Hu
EVP and CFO, AMD

Yeah, we have actually made tremendous progress with our CPU market share, and our revenue market share continued to climb through the end of Q1. We guided Q2 to strong year-over-year, double-digit growth, and sequentially we do expect the server business to be up. We think in the second half there are some tailwinds that will help us continue to drive the server business to grow faster than the first half. The launch you mentioned in the second half will definitely help us, because it will have leadership performance and continue to drive TCO for our customers, though the major ramp is actually going to be next year. And what we're seeing is that in the cloud market, the demand continues to be mixed.

But our Gen 4 family of products continues to gain, because for first-party workloads we have quite a significant footprint this year, and now on third-party workloads we're really making progress because of the TCO benefit. One of the things I pay a lot of attention to as a CFO is that with the Genoa family of products, you can run the same workloads with roughly 40% fewer servers. What that means is that not only can you cut the upfront CapEx, your operating cost comes down as well. For any CFO, and of course for CIOs, this is huge. And we have been enhancing our go-to-market approach for the last 18 months.

It's feet on the street: you need to talk to each enterprise customer, their CIO and CFO, about the TCO benefit. That work is now showing the benefit; you convert one large company at a time. And what we're seeing is the momentum of large enterprise customers converting to get the TCO benefit. So the family is actually going to get more traction in the second half and gain more market share again. And hopefully, the enterprise replacement market will be better, too.

Vivek Arya
Managing Director, Bank of America Securities

Moving quickly to PCs, there were a lot of announcements on the AI PC, right? You mentioned that Lisa announced AMD's latest product with 50 TOPS. I think Intel Lunar Lake is, like, 45 or 48, and I think Qualcomm is 45 or so. How important is chasing this kind of TOPS performance? First of all, is it tangible? And secondly, is it really just TOPS performance that will be the differentiator between these different product offerings?

Jean Hu
EVP and CFO, AMD

Yeah, it's a great question on the AI PC. AMD actually was the first to introduce the Ryzen 7000 Series, the first generation for AI PCs, and then we had the second generation. So even though we have had AI PCs and sold millions of units, fundamentally the AI PC requires a lot of AI applications, and the talk is that many different applications are coming in the second half. That's the most important thing. The product we introduced not only has the CPU, the GPU, and the NPU; the TOPS are important because the NPU, which the AI work is offloaded to, needs to have the performance to handle whatever applications are out there.

So we need to see which applications can fundamentally improve productivity and content creation; that's something we're very excited about. Because when those applications are introduced, it's not just about how many TOPS you have; you also need to handle the CPU and GPU sides. You do need all three of them, right? For the PC to continue to be the productivity tool for enterprises and for consumers, you need all three. That's why we feel strongly that AMD is best positioned: we have the best CPU, the best GPU, and-

Vivek Arya
Managing Director, Bank of America Securities

Arm recently said that they expect Arm-based PCs to be 50% of the market over the next five years. I imagine you would not agree with that.

Jean Hu
EVP and CFO, AMD

No.

Vivek Arya
Managing Director, Bank of America Securities

I thought so. And by the way, it's not just the total PC market; they actually said Windows-based, because they already have Apple, right? That's 10%-12%. So effectively, they're saying they'll be over half the market. And we have seen Microsoft loudly support Qualcomm, with that exclusivity until the end of the year, et cetera. So how big of a threat is Arm? Would AMD pivot to Arm-based architectures also?

Jean Hu
EVP and CFO, AMD

Yeah, this is a great question. The Arm PC has been around for a long time, right? It's not new. When you really think about the customers, enterprise or consumer, in the end, do they care whether it's x86 or Arm? It's all about performance. In the end, that's what people fundamentally care about; economics dictate that. If you talk to our CTO, Mark Papermaster, he will say that fundamentally, at the architecture level, there's not that much difference between Arm and x86; they just operate in different ecosystem environments.

The x86 ecosystem has been around for 15-20 years, and all your software, everything, is built on that ecosystem. For Arm, absolutely, it's newer today, and it's probably easier for an Arm PC to start fresh; the reason x86 carries a bit of a burden is backward compatibility everywhere. But we do think x86 will continue to get more and more competitive in providing the performance and battery life people want. Do people really ask what's inside the PC? I do think we are very well positioned, and Arm's share is very low, right? For a long, long time it has been in that range. The ecosystem is very important.

Vivek Arya
Managing Director, Bank of America Securities

Right. Makes sense. And since you are the CFO, I thought I would ask a financial question in the time we have left. On gross margins,

Jean Hu
EVP and CFO, AMD

Yeah, yeah.

Vivek Arya
Managing Director, Bank of America Securities

Do you think this annual cadence of launching products helps your top-line growth and share, but could it work against your gross margin ambition of 57%?

Jean Hu
EVP and CFO, AMD

No, we don't think so. We're making significant progress on our gross margin. If you look at last year, 2023, we were at 50%; in Q1 we were at 52.3%; and we guided 53% for Q2.

Vivek Arya
Managing Director, Bank of America Securities

Right.

Jean Hu
EVP and CFO, AMD

The second half will get better. In general, our data center gross margin is better, right? So as we continue to change the mix, that will help our gross margin over the long term.

Vivek Arya
Managing Director, Bank of America Securities

Even MI300, you think gross margin can become accretive despite the faster ramp?

Jean Hu
EVP and CFO, AMD

Yeah, absolutely. When you look at it, it will be accretive to the corporate average.

Vivek Arya
Managing Director, Bank of America Securities

Terrific. I have three more pages of questions, but we are out of our time. Thank you so much.

Jean Hu
EVP and CFO, AMD

Yeah, thank you so much.

Vivek Arya
Managing Director, Bank of America Securities

Thanks.

Jean Hu
EVP and CFO, AMD

Thank you, everyone.
