Advanced Micro Devices, Inc. (AMD)

Goldman Sachs Communacopia + Technology Conference 2023

Sep 5, 2023

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Great. I'd like to get started, if you can take a seat. Good morning, everyone. For those who missed the prior session, welcome to the Goldman Sachs Communacopia + Technology Conference. My name is Toshiya Hari. I cover the semiconductor and semi-cap equipment space. We're very excited and very honored to kick off the micro portion of the conference with Dr. Lisa Su, Chair and Chief Executive Officer of AMD. Lisa, thank you so much for doing this.

Dr. Lisa Su
Chair and CEO, AMD

Yeah, thank you.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

I know you're extremely busy, so.

Dr. Lisa Su
Chair and CEO, AMD

Thank you. Thank you for having me.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Thank you. We'll get started right away. We've got 35 minutes. You know, since your appointment as CEO back in 2014, you've grown the business by roughly 4x. At the time, I think you managed about 10,000 employees; you're now up to 25,000. As the CEO of the company, what are your key priorities? How do you spend your time? If you can kick us off at a high level, that would be great.

Dr. Lisa Su
Chair and CEO, AMD

Yeah, absolutely. Well, first of all, it's great to be here. Oh, can you guys hear me? Yes. All right, great. So it's great to be here this morning, Monday morning, nonetheless. I will say it is actually a Tuesday morning. The company has changed a lot over the last nine or ten years. We have many more markets, a lot of customers, a lot of opportunities, but the fundamentals have not changed very much. We're all about high-performance computing and how we really push the bleeding edge of technology.

So we spend a lot of time on roadmaps, making big strategic bets, 'cause the bets that we make now really come to fruition over the next three to five years. I spend a lot of time with customers and on where the market inflection points are. And then, of course, today, our first, second, and third priorities are AI, AI, and AI. There's a lot of opportunity when you look at AI and how it's impacting computing across all elements, from data center to edge to client; all of those are very significant opportunities for us. And as a result, I think the next five years will be even more exciting than the last five.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Great. I was hoping you would say AI, because my next question is on AI. So you have one of the most comprehensive portfolios as it pertains to addressing AI: you've got CPUs, you've got GPUs, you've got FPGAs, and you've got the team from Pensando. Can you speak to the long-term opportunity you see in AI and explain how you're different or better positioned relative to some of your peers?

Dr. Lisa Su
Chair and CEO, AMD

Yeah, I think the thing about AI that's become really clear is that it's transformational for every aspect, whether you're talking about business and enterprise, or research, or our productivity in our daily lives. And so you need computing everywhere. It's clearly the largest secular growth driver that we have. I think what makes AMD unique is that we've actually spent the last 10 years putting together all of the computing pieces that you need. You need CPUs, you need GPUs, you need FPGAs, you need AI accelerators, and you need the software capability around all of that. And that's really the portfolio that we have.

I think the most interesting part, or let's call it the highest-profile part right now, is what's happening in the data center, especially around training and inference for these large language models. We've been in this space for the last five years, really building up our intellectual property portfolio and all of the software required. I think we have a confluence of events right now, where you have a market that is skyrocketing, and you also have our technology that is very, very well positioned, particularly around the largest language models and what you need for AI in this space.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Maybe on that point, I was hoping to ask about the customer engagements you're having. I think on your most recent earnings call, you talked about a 7x increase in AI engagements in the quarter. Can you expand on that statement and describe in detail what customers are asking for in terms of workloads? Is it training? Is it inference? Is it both? And maybe the breadth of your engagements as well.

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So, it has definitely been an incredibly busy time with customers. You know, we said on our last earnings call that we had 7x more customer engagements just in the past quarter. What that means is, these are customers who are actively evaluating and using MI250s and, most recently, MI300s, either in their labs or in our labs. And that's across a number of workloads, so training as well as inference. On the inference side, we're very advantaged because we built the MI300 family so that it could be very flexible, depending on what was most important. And as a result, we have the highest memory bandwidth and memory capacity.

You know, even since our last earnings call, over the last 30 days, what we've seen is a continued acceleration of those engagements, and a number of those engagements have now turned into customer commitments, which we're really excited about. MI250 is an excellent product right now, so there are a number of folks that have just started deploying and will deploy through the second half of this year. And for MI300, there's tremendous excitement. I think those customer commitments come from people who want to ensure that they're very, very early in the cycle, because they see the capabilities of the hardware. They've given us really good feedback on the software and how we can work with them on it.

So I would say overall, we continue to see just huge opportunity around AI in the data center for us.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

And, Lisa, I guess on that point, feedback on the MI series is a big, big focus for investors. You talked about customer commitments, which is great. What has been the feedback on your hardware and your software? Again, in which applications or areas do you feel you have an edge over the incumbent? And how should we think about the MI250/MI300 ramp over the next, say, 12-18 months?

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So I think what we've heard from customers is, first of all, very, very active engagement. They're all excited about using MI250 and learning about our software ecosystem. Some are doing deployments with MI250. Many are very excited about MI300. I think the differentiation we have is, one, we're very, very competitive in terms of hardware capability for training. And for inference, as I said earlier, we actually have industry-leading memory capacity and memory bandwidth, which is extremely important for large language model inference. I think people see right now that, frankly, inference will be higher volume than training going forward.

Everyone's looking at how to optimize inference queries per dollar, and that's somewhere AMD does very well. A lot of the feedback conversation with customers has been around: is our software ready? Can customers readily deploy? I would say, with the number of engagements we've had, customers have been able to get up and running very quickly, sometimes out of the box without much optimization, especially if they're running something like PyTorch, where we've optimized our software very well for PyTorch, TensorFlow, and some of the other model frameworks. And then, sometimes in a short number of weeks, we help do some optimizations around their specific needs.

But the net is, I think it's gone well in terms of how customers have been able to adopt. From our standpoint, MI300 looks really good in the labs. We are on track to launch and start shipments in the fourth quarter of this year, ramping into next year. So a lot of activity, but we feel really good about it.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. Got it. And I guess the question we also get is about the crowding out of traditional server spend, if you will. Many companies, including yourselves, have spoken to this shift in wallet share away from traditional servers to acceleration. Do you see this as a one-time, transitory thing that is very specific to 2023? Or should we brace for a continued shift toward acceleration going forward?

Dr. Lisa Su
Chair and CEO, AMD

Well, I think the key is that the data center market continues to be a very attractive market, and any way you look at it, people need more compute. I think what we've seen this year is that we're coming off some of the heavy purchasing that happened through the pandemic, and so some of the cloud guys are optimizing their data center spend a little bit. But as you go forward, I think there'll be a nice balance between general-purpose compute and accelerated compute. Lots of workloads will continue to require CPUs, and general-purpose compute will continue to be important. I think what's more important there is efficiency.

So there's a lot of conversation about data center space and power being at a premium, so you want the most efficient compute, and AMD delivers the most efficient compute in the CPU space. And then there's a huge opportunity for accelerators. Our own data would say that accelerators could become a $150 billion-plus market in 2027, so that's significant growth. And we have the ability to really optimize across both general-purpose compute and accelerated computing. So from that standpoint, we will continue to see the data center being a very strong growth market overall, and for AMD, it'll be our largest growth opportunity.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. It's fair to say that CPUs aren't going away.

Dr. Lisa Su
Chair and CEO, AMD

CPUs are not going away, no. Especially very efficient CPUs.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. Good to hear. Maybe shifting gears a little bit, let's talk about the competitive landscape in server CPUs. You've done a really good job growing your footprint over the past several years, but when you look at the third-party data, the rate of share expansion over the past quarter or so has slowed a little bit. So we are getting investor questions about your competitive footing relative to your x86 competitor. Can you speak to what you're seeing there and any progress on the enterprise front? Historically, you've done really well in cloud, maybe a little bit less so in enterprise, where share tends to be stickier.

Dr. Lisa Su
Chair and CEO, AMD

Yeah, thanks for that. When you look at the progress we've made with our EPYC product portfolio over the last five years, it's been tremendous. We started with, let's call it, 1% share in 2017, and we've grown very, very nicely over the last four or five years. When I look at where we are today, though, I would say there's still significant opportunity for us. We see growth across a number of vectors in the server CPU space. We've done very well in cloud, for sure. We're very honored to have every major cloud company using AMD. We've done very well on internal workloads.

I think we have even more opportunity with cloud as it relates to external workloads. Amazon just announced their new M7a with Genoa. Google last week announced their C3D with Genoa. Oracle, Microsoft, all of the large guys have Genoa in their portfolio. And then enterprise is, again, a market where I think we're underrepresented. I can tell you that I've spent a ton of time with enterprise customers over the last number of months. They want the power efficiency and the performance capability; it's back to the optimization. There's a lot of old compute in these data centers, and it needs to be replaced. Genoa is 1.9x more performance and 1.8x better power efficiency.

Clearly, there's a value proposition there. Enterprise is a little bit muted right now; there are some decisions being made that are taking a little bit longer. But when I look at 2024 and beyond, I think there's still significant opportunity for us on the server CPU side.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

I guess within the context of enterprise, are there any verticals or applications you're particularly excited about?

Dr. Lisa Su
Chair and CEO, AMD

Well, I think we do well across the board, but whether you're talking about financial services or telco or manufacturing, anything that requires significant computing capability, I think we do very, very well. We've also partnered very well with the top ISVs to ensure that things are optimized for EPYC.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

As we've discussed, you've executed really well within the context of x86, but we are getting questions from investors who are a little bit concerned about competition with your customers. We talked about Graviton backstage, but how do you intend to compete with not just Amazon, but perhaps a larger group of customers who are looking to develop internal solutions?

Dr. Lisa Su
Chair and CEO, AMD

Yeah, I think the key thing about this whole computing landscape is that when you need all different kinds of compute for different workloads, it actually plays very well for someone like us, because we have all the pieces. We have CPUs, GPUs, FPGAs, we have DPUs. We have a very capable SoC ecosystem, and we're leading edge in terms of foundry and manufacturing. For those reasons, I believe we're extremely well positioned. Some of our customers are developing their own silicon, and I think that makes sense, right? When you have computing workloads that are so diverse, there are some places where you want to do custom silicon.

We also have the ability to do custom silicon with all of our various pieces, and for us, it's around leveraging our IP to ensure that there's the best, let's call it, workload-specific optimization. So I feel that our overall standard product roadmap is very, very well aligned to where our largest customers are going. But I do think that over time we'll see ourselves do more customization. As the volumes in compute get higher, it makes sense to do customization, and customization built off of our IP capability, I think, will do very well.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Okay. And when you say customization, is this custom ASICs, or your CPUs more tailored to certain customers and certain workloads, or a combination of both?

Dr. Lisa Su
Chair and CEO, AMD

I think it's a combination, particularly using our IP, whether it's CPUs or GPUs, and putting them together. If you think about what we've done, for example, in a very, very different market, the game console market, it's been for very high-volume applications: taking our IP and putting it in the most optimized format for a given workload, in that case, gaming. Now as we go into the data center, and especially around AI, we're gonna see some very similar characteristics, right? One, the volumes are gonna get significantly higher. And workloads are more optimized depending on whether you're doing training or inference. Are you talking about the largest language models? Are you talking about more tailored models? With our IP portfolio and our capability, we can optimize for those different workloads.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. I wanted to squeeze in a relatively near-term question on data center, if that's okay. I think the biggest sticking point with investors today is the implied second-half data center outlook. Half over half, it's a pretty robust outlook you have embedded in your guidance. Can you share your conviction level as it pertains to your ability to hit those targets?

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So when you look at the data center market today, I know there are a lot of puts and takes that people are trying to understand. I feel very good about the market. In the first half of the year, we did have some of that optimization going on with some of our largest cloud customers. As we move into the second half of the year, what we see, first of all, is that the positives are around AI. The AI demand is definitely there, and it's driving both CPU and GPU demand for us.

CPUs as the head node for some of these AI instances, and GPUs with, as I mentioned, the interest in MI250 that we see over the second half of this year into next year, as well as early shipments of MI300. So those are positives, and I continue to see that the AI demand is there. There are some things we're watching as it relates to enterprise; enterprise continues to be, I would say, a little bit muted, as I said, and there is some optimization going on in cloud.

So if you put all those factors together, we do see a very strong second half of the year for our data center business. It's really driven by some of the ramps in AI that we're seeing, and we feel good about it.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

That's great. Shifting gears a little bit, let's talk about PCs for a couple of minutes. It's really nice to see your fundamentals improve in the client business. A year ago, we were going through pretty rough times for you and the broader industry. Do you see a path back to 300 million-plus units for the PC industry, or is the current 240-250 million more of a steady state? And how should we think about AI as it becomes integrated into PCs? Could that drive a big replacement cycle?

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So the PC market, look, was very volatile over the last four or five quarters. Coming off the highs of the pandemic, there were a lot of inventory dynamics happening. Clearly, we're pleased with our PC business performance. The second quarter was strong, and we're guiding to a stronger second half of the year, as we see sell-in more closely match sell-through. When I look overall at the PC market, I think it's a strong market. What we see going forward is the opportunity for a very differentiated PC environment, where AI becomes much more important. We're leaning into AI in PCs. We've put a significant amount of effort into our AI engine, both hardware and software, working very closely with partners like Microsoft, ensuring that we have that capability going forward.

So I do see growth for the PC market going forward. For our own business, I would say we will grow better than market on the strength of demand for our products: our recently launched Ryzen 7000 series, where we're seeing strong demand for Phoenix, as well as some of the other products in the portfolio. Going into 2024 and 2025, when you think about PCs as your productivity tool with AI, we can all be much, much more productive with it.

So I do think there are opportunities for us to grow in the PC market as a market, and then as AMD growing above market.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. And then in terms of the competitive landscape in PCs, I think your market share peaked at 20%-plus; I believe it was the June quarter of last year. Since then, according to third-party data, you've lost about six percentage points of share. How should we think about your strategy in PCs going forward? Which segments do you pursue, and which segments do you perhaps deprioritize?

Dr. Lisa Su
Chair and CEO, AMD

Yeah.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Your differentiation. Talk a little bit about that.

Dr. Lisa Su
Chair and CEO, AMD

Yeah. You know, I think our strategy in PCs is about gaining profitable share, and that will continue to be our strategy. What that means is focusing on the segments of the market where we see significant differentiation: things like commercial, premium consumer, gaming, AI-driven PCs. Those all play very well to our strengths. We are not prioritizing the low end of the market as much. Those are what I sometimes call empty calories: they're good if you have a fab that you need to fill, but they're not necessarily good from a profitability standpoint.

What I like about our PC momentum, though, is that this focus on commercial is very synergistic with our enterprise data center focus. So much of what we're doing is really ensuring that we have a strong go-to-market there. The products are excellent, and as we drive more go-to-market capability, more feet on the street from an AMD standpoint selling to CIOs, we think there are opportunities for us to gain profitable PC share. Especially as we add AI more broadly throughout the portfolio, we think this is an opportunity for more enterprise adoption of AMD PCs.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

The Xilinx acquisition has been an absolute home run. It's been growing very fast and is obviously a very profitable business. I'm cognizant of the near-term headwinds, but how would you describe the long-term growth opportunities in the FPGA business? It would also be helpful if you could give us an update on how we should be thinking about some of the synergies.

Dr. Lisa Su
Chair and CEO, AMD

Yeah, absolutely. So, the Xilinx acquisition has exceeded everything in our acquisition case, so it's been very positive. The integration of the teams has gone very well. The embedded business in general is a great market; it's a very durable market. We actually saw exceptional growth over the last six quarters. We're getting into a period where there'll be a little bit of normalization in the second half of this year into the first half of next year. But when we look at longer-term opportunities, we really like these markets, right?

If you think about industrial, aerospace and defense, automotive, communications, networking, these are markets that play very well into needing more computing. We have a stronghold with the embedded Xilinx portfolio, and we're adding our embedded processing portfolio to that. And there are actually quite a few sales synergies we've seen. Customers who are, let's call it, traditionally Xilinx customers are now asking, "Hey, show me all of the AMD portfolio, 'cause I want to really leverage the full computing portfolio." But I will say probably the number one synergy of the acquisition is in AI. The addition of the Xilinx IP and the Xilinx talent has substantially strengthened the AMD AI portfolio.

For example, we were talking about MI300 and what we've done there: we've actually consolidated all of our software resources in one place so we can accelerate our ROCm software development, and our portfolio is much, much stronger with the combined teams. In addition, when I talk about PCs, the AI engine we're using in PCs actually came from the Xilinx acquisition, and we're using it to build out the accelerator portion. So we feel very good about the embedded business as a business, and we feel even better about how it is a better-together story, especially around AI. The timing couldn't be better to bring these assets together.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

I think you have a CPU part that's tailored to the comms infrastructure market as well. Is that part of the Xilinx story, too?

Dr. Lisa Su
Chair and CEO, AMD

That's right. When we talk about some of our communications and networking customers who were traditionally very strong Xilinx customers, they're now looking at the broader AMD portfolio, and processors become part of it. So if you look at just the embedded segment of our business, we would expect the processor portion of that to grow very nicely over the next few years.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. The geopolitical environment continues to be challenging, tricky. Remind us roughly what percentage of your total revenue today is derived from customers in China, or headquartered in China. And more importantly, how are you navigating the current backdrop? Has your strategy as it pertains to China evolved at all?

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So, China is an important market for us. From a revenue standpoint, it's probably about 20%-25% of our revenue. I would say that's not all indigenous; some of that comes back out of China. When I look at our strategy, we continue to believe that China's an important market. Obviously, we're monitoring all of the export rules and regulations, but we'll continue to invest in China because we think it's an important market.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. Somewhat related to that, a question on how to think about your supply chain. You have a very deep and long history with TSMC. That said, given the geopolitical environment, I'd imagine you and your team are thinking through all sorts of scenarios. Is TSMC's proposed expansion in Arizona good enough, or do you need to think of a plan B, if you will, as it pertains to your foundry relationship?

Dr. Lisa Su
Chair and CEO, AMD

Well, I think we've gotten extremely good at managing the supply chain; I would say that's one of our core strengths. TSMC has been a phenomenal partner for us in terms of advanced technology, both on the silicon side and on the packaging side, and we very much value that relationship. When you think about the geopolitical situation, geographic diversity is important to us. So the Arizona factory is very important to us. We're gonna be one of the early users; we're putting our first tape-outs in shortly, with the idea of being a significant user of Arizona.

I think we'll continue to look at geographic diversity as an important piece of it, and also just supply chain diversity. One of the things that comes up as we're talking about AI is: are we even able to supply everything that we need to? And frankly, we've been working on that for many quarters. So we think that supply chain is actually a strength for us, and we really look forward to being able to rapidly grow the company for all of the various opportunities that we have, given our supply chain strength.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

I guess on that point, Lisa, I think you've been very front-footed as it relates to supply chain diversity, and 18 or 24 months ago, substrates were an issue. But at this point, is there more to do in terms of supply chain durability, or are you pretty happy with how you're set up today?

Dr. Lisa Su
Chair and CEO, AMD

Well, I think we have invested ahead of the curve. We invested ahead of the curve for substrate capacity, and now we feel very good about the capability there. Right now, a lot of the conversation around the most advanced AI chips is around CoWoS capacity, as well as high-bandwidth memory, HBM, capacity. We have very strong relationships across the supply chain there, so we feel good about what we need to do. We were actually one of the very early users of 2.5D and 3D packaging, and I think that's given us good visibility into what needs to be done to secure that capacity. So, yeah, we feel good about it.

We'll always keep an eye on the supply chain, but I do think it's an area where we've made some good proactive decisions.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Got it. I guess to bring everything together, I wanted to ask a question on the long-term financial model. It's been a year-plus since your most recent analyst day. You threw out a couple of targets: I think a 20% revenue CAGR, gross margins higher than 57%, operating margins in the mid-30s, and 25%-plus cash flow margin. A lot has happened since last June in terms of the macro and in terms of AI. Are those targets still relevant? And what are some of the upside and downside scenarios?

Dr. Lisa Su
Chair and CEO, AMD

Yeah, absolutely. So a lot has happened, as you said, a lot of puts and takes. At a high level, the key drivers of our financial model are very much the same; there may be a few differences in the details. What's gonna drive our substantial growth over the next few years? It's gonna be our investments in data center, embedded, and commercial. Those are the key things. In the data center business, we now see just a huge opportunity across both our CPU and GPU franchises, and certainly AI has driven those opportunities. I think they're even more pronounced now than they were a year ago.

We continue to watch the macro like everybody does, so we need to see how all of that settles out in the near term. But we feel good about growth. Margin expansion for us is very much around product mix. As more of our business goes toward those data center and embedded businesses, those tend to be margin-accretive portions of our business. We've had a little bit of a headwind as it relates to the client PC business, as some of the inventory dynamics needed to be corrected. As we come out of that, we expect improvements there, so that's our margin expansion story.

With that, we expect to see leverage in the overall model that will drop to the bottom line. We'll go through the details as we think through what happens with the macro over the next number of quarters. But overall, I think the key fundamentals of the model are very much intact, and the punctuation on it is that AI growth will be a larger driver than perhaps we thought a year ago.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

You know, with your model, you generate very robust free cash flow. When you and Jean and the broader team sit down and think through capital allocation, how do you prioritize among investing in the business, which I assume is number one, maybe M&A, maybe shareholder returns? How do you think about those things?

Dr. Lisa Su
Chair and CEO, AMD

Yeah, I think we have a very, very strong balance sheet, so we can do all of the above. The number one priority is always going to be investing in the business to drive the kind of growth that we're talking about. We've committed to shareholder returns through buybacks, and we're going to continue to do that; that is something our board and our management team are very supportive of. I think we do have the opportunity to do some strategic acquisitions as well, mostly to build out some of our capabilities. So we just did a small software acquisition for AI.

We're looking at a few things, but mostly around building out our overall technology capabilities. I think the strength of our balance sheet gives us a lot of options as we move forward.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Okay, great. We do have a couple of minutes left. I wanted to pause here and see if we had any questions in the audience. I think we have mic runners. Any questions?

Dr. Lisa Su
Chair and CEO, AMD

You've asked them all.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Oh.

What are the biggest pieces of the puzzle you need to get right to have a very successful MI300 launch? Is it networking? Is it software? Is it other pieces? What's the thing you need to get right to really get that going?

Dr. Lisa Su
Chair and CEO, AMD

Yeah. So, we feel really good about where MI300 is today. I think the hardware is great, and I think we have strong software capability as well. The customers are very much actively working on it. I think for us right now, it's around execution. And on execution, we want to continue to ensure that as we tune the software, the product actually gets better. So as we see it, there are opportunities for performance to improve over time. But from what customers see today, they're very excited about the capabilities. So this is about execution, qualification, and getting products up in the data centers of our customers.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Yeah.

Thank you. Terrific discussion, Lisa Su. You talked a lot about AMD's advantages on the inference side. Could you expand on that some more, especially in the context of inference increasingly being done on the edge as opposed to in the cloud: essentially billions of devices, et cetera. It seems right now, generally looking at the field, that the consensus is focused on LLM AI in the cloud, and it feels like the edge side has not been thought through as much. I'd love to get your sense, given AMD's excellent history.

Dr. Lisa Su
Chair and CEO, AMD

Yeah. Look, it's a great point. I think a lot of the emphasis has been on the largest language models in the cloud, because that's the nearest-term, let's call it very steep, ramp. But over time, what we're going to see is that you're going to need inference everywhere. And what we like about our solution is that we have the solution for the largest language models, we also have the solution for the edge and for the client, and our software stack should work across all of those pieces. So we do think there's a large opportunity there. It's perhaps a little further out; think of it as a '24-'25 opportunity, whereas right now it's about large language model inference.

In any case, I think we have a broad portfolio to service it. Thank you.

Toshiya Hari
Managing Director and Lead Analyst, Goldman Sachs

Okay. With that, I think we're out of time. Lisa, thank you so much for supporting the conference. Really enjoyed the conversation. Thank you so much.
