Marvell Technology, Inc. (MRVL)

Citi 2023 Global Technology Conference

Sep 6, 2023

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Welcome to Day One of the Citi Global Technology Conference. My name is Atif Malik. I cover U.S. semiconductors, semiconductor equipment, and communication equipment stocks here at Citi. It's my pleasure to welcome Matt Murphy, CEO of Marvell Technology, and Ashish Saran, Marvell's VP of Investor Relations. I'm going to kick it off with my questions first, and then I'll open it up to the audience. If you have a question, just wait for the mic to come to you. Welcome, Matt.

Matt Murphy
Chairman and CEO, Marvell Technology

Great to see you. Thank you.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Matt, I'm going to start with a topic that is on everyone's mind, artificial intelligence, and you guys are a great play on that theme. When it comes to AI, Marvell is one of the companies with multiple AI opportunities. It is uniquely positioned from compute to networking to electro-optics.

Matt Murphy
Chairman and CEO, Marvell Technology

Mm-hmm.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Can you walk us through your strategy and why you think you are best positioned to address this theme?

Matt Murphy
Chairman and CEO, Marvell Technology

Sure, happy to do it, and great to see everybody. Just a quick side note, I was reflecting, you know, this was the first conference I did when I became CEO of Marvell in 2016. And, my esteemed friend here had a sell rating on the company, and I think our stock was at $10. It was a grim situation, and I think the sell rating was probably still warranted at the time. But anyway, how far we've come, right?

You fast-forward, and to your question about AI, just as a quick overview, the pivot we made back then was to refocus Marvell on what we believed, really, what I believed would be the biggest SAM growth opportunity in the semi industry, which was the growth of the data infrastructure and data platform companies driving huge semi growth, right? So you fast-forward, we're seven years later. I think that's played out well. Within this data infrastructure SAM, which has grown, the AI and generative AI piece, and accelerated computing even more broadly, has become a massively important growth driver as part of our strategy, if you follow me.

You know, and I would say that we think we're at the very beginning of a long cycle here. And I'm almost thinking about accelerated computing just like we thought about data infrastructure as a platform for us seven years ago, okay? And within that, it has the same attributes. It's gonna be high performance, it's gonna be large SAM growth, and it's going to be driven by multiple products and technologies to be successful. So of the three you mentioned, one was on the connectivity side, right?

Which is really our high-speed optical communications products, where we have a very strong leadership position in the technology that powers inside-data-center communications, both for traditional cloud infrastructure as well as AI clusters. We have leading technology that also connects data centers together, because that's gonna be a more and more important part of the equation as you scale out your data centers and move more inference and processing to the edge of the network, close to where the consumers are. We have a growing position in custom silicon, where we think growth will really accelerate with accelerated computing. We can talk about that.

Very strong offering in this five nanometer cycle we're in right now, and we've got some pretty exciting programs that we won several years ago, right, that are now coming to fruition. Because this isn't a business and infrastructure where you can just decide, "Oh, wow, this is super exciting. How do I get in on that?" I mean, we won some of these programs back in 2020, you know, and now they're finally going to production. But that's an exciting opportunity, especially as that whole compute SAM really opens up, with the move to acceleration. And then in networking, we have a growing position in switching technology, really driven by the acquisition of a company we did called Innovium.

And we're now in a great position with our latest product, which is a five nanometer technology, 51.2T switch. It's done on the Marvell process flow. It has our own IP, our own SerDes, but it has the great innovation and architecture from Innovium. And so those three things, if you think about it, are processing the data and moving the data, within data centers and between them. They're fundamental, right, to the performance of these types of systems. So we can talk in more detail about those, but we're in the heart of where all the action is right now.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Sure. Just to start with the electro-optics side, you know, that section of the business seems to be the primary driver right now of your AI sales. Your sales are growing to a $200 million quarterly run rate, or $800 million annual run rate, way ahead of schedule. Can you pull back the curtain a little bit and dive deeper into how cloud providers are approaching their AI build?

Matt Murphy
Chairman and CEO, Marvell Technology

Sure. Yeah, to your point, we're very pleased with how that business has really strengthened throughout the year. I think, you know, we're in the middle of a broader semi-cycle correction, and the folks that are levered to AI right now are obviously doing extremely well. And I'd say those order trends, forecasts, everything really continues to improve every month and is really taking off, starting back when ChatGPT was announced, and then there was a little bit of a lag. And just to put it into context, the $200 million a quarter, with, as you said, most of it coming from this electro-optics area...

This technology came from an acquisition we did of a company called Inphi, which, if you remember, I think you followed, or maybe your colleague did. When we bought the company, Inphi's trailing twelve months revenue was around $680 million. It was projected to do about $800 million the year we were going to close; I think the Street had them at $810 million. So fast-forward two years later, and we're talking about exiting the year, just on electro-optics for AI, at $200 million a quarter. That was the size of what we predicted the whole company to be just two years ago. So it's been a tremendous asset for us.

To answer your question, I think how they're thinking about deploying is very broad-based, and you're seeing it, you know, in NVIDIA's numbers and in the CapEx trend changes. It's a massive deployment cycle, and all of it is driven by obviously needing the right AI GPU technology or, in some cases, our customers' custom chips. But all of those have optical interconnect attached, every single system. And in some cases, the attach ratio is actually more than one to one, right? In terms of connectivity versus the AI element. Now, the ASPs are obviously much different, but it's driving a tremendous growth cycle.

We've had upsides, big upsides, that we're doing a great job of meeting with our manufacturing partners. So yeah, we see strong growth in that business clearly this year, and we see strong growth again going into next year, in that part of the portfolio, both on the between-data-center or DCI side as well as the inside-data-center side.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Matt, are you seeing most of the demand on 800G, or are you also seeing some demand for your Inphi products on 400G? And, you know, the beauty is that you guys are agnostic to InfiniBand or Ethernet. So maybe just talk about 800G versus 400G demand.

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah, I think if I characterize it at a high level, the way to think about it, and this is how we sort of project going forward, is that the AI systems are going to drive the highest-frequency and highest-performance optics. And so today, almost all of the growth in 800G is due to AI. Then the way to think about the rest of the PAM portfolio is that traditional cloud infrastructure is still really at either 200G or 400G PAM. And some haven't even upgraded to that yet, but that's coming next year. So the NRZ transition is not done yet, NRZ being the old technology that has now moved to PAM.

So going forward, we announced our next-generation product at OFC, which doubles the bandwidth to 1.6 terabits per second, and that'll be deployed in AI first. And then, you know, we'll see the traditional cloud stuff move to 800 gig, and by the time that moves to 1.6T, we'll probably be on to 3.2T. So that's the cycle we're on, and I'd say the AI refresh rate looks to be around half-

Ashish Saran
VP of Investor Relations, Marvell Technology

Yeah

Matt Murphy
Chairman and CEO, Marvell Technology

... you know, in terms of the speed. So, you know, call it 18 to 24 months versus three to five years on the other side. So I'd say the new-product development intensity has actually picked up on the optical side. And given that the raw computing capacity has gone up so much, there are real throughput limitations to actually get the data on and off the card, the cluster, et cetera. And so I think it's going to drive a pretty big refresh cycle in optics, switching, and data center interconnect as a tailwind or byproduct of all the growth in AI.

Ashish Saran
VP of Investor Relations, Marvell Technology

And maybe, Atif, just to add, I think the key takeaway to your question is, not only are we seeing a lot of growth in AI, which is fairly expected, but in our cloud business, even the non-AI portion, networking in particular, is also growing very strongly. It grew very strongly sequentially, Q2 to Q3, and we're expecting that to continue, right? So I think that's the other thing to keep in mind: we're seeing broad growth within the infrastructure. AI clearly faster, but even the non-AI portion, right?

Matt Murphy
Chairman and CEO, Marvell Technology

Right

Ashish Saran
VP of Investor Relations, Marvell Technology

After going through maybe a couple of quarters of an inventory correction very early in the year, it has started to come back again.

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah. I think that's an important point, because we've had some great meetings this morning, and it continues to be a worry on investors' minds: with the hard pivot to AI CapEx spend, what gets impacted on the traditional cloud infrastructure side? Who gets impacted? There are only so many dollars available, right? But what we're seeing in our business in real time, given our product mix, which on the traditional systems is not really compute-intensive, it's really networking- and connectivity-intensive, is that business growing really well. And we guided our third quarter-

Ashish Saran
VP of Investor Relations, Marvell Technology

Mm-hmm

Matt Murphy
Chairman and CEO, Marvell Technology

... in data center up kind of mid-teens. We said that was with a headwind, by the way. There's a piece of that data center business that's on-premise, that's kind of legacy, that's actually down. So you've got to think, "Okay, if that's down, then the rest of it's up." And we said, obviously, AI's up a bunch, but the traditional cloud infrastructure stuff is up double digits plus. Q2 to Q3, that's what we guided, and we said it was going to keep going in Q4-

Ashish Saran
VP of Investor Relations, Marvell Technology

That's right

Matt Murphy
Chairman and CEO, Marvell Technology

... and it was going to grow through next year. So yeah, I think because of the reasons I mentioned earlier, our product mix lends itself to growing in both segments of the cloud, if you will, even if there's a CapEx shift.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

So that's an interesting observation. Is this Marvell-specific, that the non-AI part of the cloud is also growing, or is there a lag effect between compute and networking when things grow?

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah, I don't think it's specific to us. I mean, you could ask our large peer competitors, and they're probably seeing that business grow. And I actually look out to next year, really the next couple of years. There's a big Ethernet refresh cycle coming again, right? Because remember, most of the networking today is done on the 12.8 terabit per second switch platforms that are out there. We have some portion of that; we have one large competitor that does really well there. Mostly, the industry skipped the 25.6 generation, and everyone waited for the next one at 51.2, because then you get a quadrupling of bandwidth.

So those products are gonna be released to the market industry-wide, you know, ourselves and really one other large competitor, and I think that's gonna drive a very significant networking silicon TAM expansion and upgrade cycle over the next few years. And that'll be driven partly by the AI stuff, but also it's been like four years, right, since you had this 50-gig I/O; this is the 100-gig transition. So anyway, it's pretty exciting, because you've got this networking tailwind that's a little bit independent but obviously helped by AI, and then the connectivity. You know, we actually have more content and more dollars if it's an AI system versus not, right? So that trend benefits us too.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Great. Just staying on Inphi, how do you see the changing AI landscape impacting DSPs versus linear drives versus, you know, co-packaged optics? I mean, it sounds like AI is accelerating everything, and that should help you guys kind of protect your 90%+ market share.

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah, so I think there's a short-term view of this, and then there's the longer-term view. In the short term, you know, all of these current AI systems that are out there being deployed, these designs and their qualification were done like two or three years ago, okay? So there's a lot that's been baked in already, because the qual process has been long. These systems have been under development. And so, for the current generations we see in the foreseeable future, this need for pluggable optics is only going to continue, okay? And that's gonna be the preponderance of all the deployments for a long time.

And there are a lot of reasons for that, but the main ones are scalability, interoperability, assurance of supply, and the fact that once it's qualified, it's qualified, and if you need to change it, it's pluggable, so you can remove it. There are some challenges with some of the other technologies you mentioned. But longer term, our view is, if you really believe the accelerated computing trend is gonna drive a massive disruption in the TAM for semis, the number of ports that's going to be deployed is going to explode, okay? Our view is we never want to get caught in the innovator's dilemma.

We're not, like, head in the sand, "Well, we have this one business." We're looking at, hey, how do you develop and deliver the best solution for these customers, right? So we're not opposed at all. We can get content, by the way, in linear direct drive. It's not a problem. We may not get the full DSP, but we can get a TIA, we can get a driver, we can actually help our customers solve problems. Same in the area of co-packaged optics or silicon photonics. I mean, we're shipping silicon photonics in high volume today. Every Marvell/Inphi 100-gig or 400-gig ZR module that we ship for data center interconnect has our own silicon photonics inside. We're in high-volume production, so we have that technology for sure. The question is, when is it needed?

Is it deployable at scale? Does it work technically or not? And then what's the trade-off between saying, hey, I can just ramp up today because I know I can get access to a myriad of pluggable optics supply from a number of companies, or do I go more proprietary and bespoke? And our view at Marvell is we're prepared to supply the necessary technology to the industry to really enable and drive accelerated computing, and it's not a negative. People shouldn't go, "Oh, my gosh! Well, if that happens, then..." You know, that was a worry back at OFC, right? Oh, Marvell's business is going to disappear this year because somebody showed a demo of linear direct drive. We've known what linear direct drive is for a long time. It hasn't made sense yet.

You know what I mean? And I don't think that's played out. I think our optics business has only gotten stronger this year, and it's only gonna get stronger next year. But we're not head in the sand, and I think if we do this right, we can be the invaluable supplier to our customers by providing a suite of options for them. And the pie will only get bigger if you can do things in a cost-optimized way. So if somebody doesn't need pluggables, that's okay. We have a whole plan to go address that. And if somebody wants to do really, really dense, customized designs that are controlled by them with silicon photonics, then that's something we can invest in as well. So we have the building blocks.

It's really about what actually makes sense and what's going to get deployed. So we're looking at that more than, you know, here's a PowerPoint we could show with some cool things. That's fine, but I think we tend to look at things very practically at the end of the day at Marvell, so we're kind of prepared. We're okay, because the port counts are gonna be so big, it doesn't even matter.

Ashish Saran
VP of Investor Relations, Marvell Technology

Right. Yeah, now, in a realistic timeframe, meaning three to five years-

Matt Murphy
Chairman and CEO, Marvell Technology

Mm.

Ashish Saran
VP of Investor Relations, Marvell Technology

I mean, pluggables, not just in our view but I think the industry's, remain the tech-

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah

Ashish Saran
VP of Investor Relations, Marvell Technology

... technology of choice. We're already shipping 800G. And we're the only ones who have actually announced 200 gigabits per wavelength, which is a 1.6T product and absolutely critical for increasing densities when you go to these next-generation AI clusters. The reality also is it's not just about what can compete on the current generation, which is where these alternative-technology demos have taken place. We've already gone to 1.6T, and you should believe we have 3.2T on the roadmap, right?

I think the feedback from our customers, which is what matters at the end of the day, is: pluggables are the primary choice, and for certain niche applications, we absolutely want you to invest in and investigate alternative solutions on a longer time frame in case we do need them at some point. So that's-

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah

Ashish Saran
VP of Investor Relations, Marvell Technology

Kind of the summary I would say of-

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah

Ashish Saran
VP of Investor Relations, Marvell Technology

of where we see the industry going.

Matt Murphy
Chairman and CEO, Marvell Technology

I would add a final point. The way to think about it, too, is the faster the beat rate of these upgrade cycles for AI systems, the longer, quite frankly, pluggables last. Because otherwise, you're making a trade-off to say, "Well, let me slow everything down, let me try this brand-new technology, and let's hope it works, so I can save a watt, save a buck, save a..." You know, and I think there will be a time for sure, but our thesis internally is, hey, as long as our customers want to keep cranking at this level of product development, pluggables are going to be around for a very, very long time, because it just doesn't make sense to halt everything and try to switch to something new.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Great. Just to finish off the data center discussion, you know, parts of that business were weaker on the last earnings call: storage, which, though it's growing, kind of remains subdued because of your end customers' demand weakness, and then Fibre Channel. So walk us through what parts of your data center business are slowing down, and when do you expect those areas to stabilize?

Matt Murphy
Chairman and CEO, Marvell Technology

Sure. Well, on the storage side, as it relates to data center, I guess our view is it can only go up. I mean, it basically completely bottomed, right, in our first quarter.

Ashish Saran
VP of Investor Relations, Marvell Technology

Mm-hmm.

Matt Murphy
Chairman and CEO, Marvell Technology

So we actually said, "Hey, good news, it grew in Q2, and it's going to grow again in Q3." But the real million-dollar question is: when does it come back to whatever run rate you want to pick? A lot of people want to use the pre-pandemic level, and we strongly believe it has to come back to at least where it was at some point, right? Whether it gets to 80% or 85% of that. So yeah, storage data center grew from Q1 to Q2, and it's going to grow again from Q2 to Q3. But as we said on the last call, we just wanted to reset expectations, 'cause it's hard for us to predict.

We're far back in the supply chain on this, and it's really hard for us to read through all the layers of the chain to figure out where the actual inventory is and what's going on there. But our view, probing all the way to the end-customer level, is that the TCO case for them to continue to invest in new storage technologies to drive all this exabyte growth is still intact. That was the reassuring thing, right? On the hard drive side and on the flash side, despite the inventory and how those companies in the middle are doing, the end usage is considered just mission-critical for these large cloud companies. It's actually how they measure, in a lot of ways, the value of their customer.

“Hey, how many petabytes are you going to bring me?” Right? Because that's storage in, and it's really hard to get the storage out. So we think exabytes will continue to come back and grow. We think that in the cloud, the preponderance will still be hard-drive-based and nearline-drive-based. It's just very hard for us to predict when it comes back, and even if we try, we're not going to get it right. So we've just pushed it out to the right and said at some point it comes back, and then what was a headwind becomes a tailwind at some point.

So, you know, we said it pushed out meaningfully from where we were before, which, a quarter or two ago, was basically that we thought we would probably be back by the fourth quarter, maybe a little lower than we were, but getting back in line. And it seems like the whole industry slid that a couple of quarters to the right.

Ashish Saran
VP of Investor Relations, Marvell Technology

Okay. Yeah, the way I think about it is, when it comes back at some point, it's a net positive for us. In the near term, it quite frankly doesn't matter. We're powering right through some of these mini-downturns, right? Whether it's storage, whether it's enterprise on-prem. If you look at our overall data center footprint, revenue from cloud is significantly, significantly higher than enterprise on-prem. Enterprise on-prem is a much smaller part of the business, and that's going to continue, quite frankly, even when it comes back, because the cloud portion just keeps growing much, much faster.

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah.

Ashish Saran
VP of Investor Relations, Marvell Technology

With AI, there's a kicker on acceleration, essentially, right? So as I look at the back half of this year, as we guided, we said we'll be up mid-teens sequentially, which is all driven by cloud, which is growing a lot faster, and you should expect that's going to be a bigger number as you get into Q4.

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah. There's always the glass half full, half empty, right? Take the example of 2019, when we had that fun little downturn; if you remember, there was the whole 2018 tariff thing and the correction. There was a storage correction we got hit with, and we took a lot of pain during that cycle. If you remember, we still had some legacy PC hard drive exposure. That always needed to get wrung out, and instead of taking, like, two years to wring out, we wrung it out in, like, two quarters, and then that exposure was gone. So by the time things actually came back, we had gotten it out of the way.

Kind of to your point, in some cases, that mix in our data center business of cloud and AI versus enterprise on-prem, as painful as it is to go through a down cycle, the good part, the growthy part, will be a higher percentage of that business just structurally, even when the legacy stuff normalizes, if that makes sense.

Ashish Saran
VP of Investor Relations, Marvell Technology

Yeah.

Matt Murphy
Chairman and CEO, Marvell Technology

So mix just kind of gets better. The glass half empty is it hurts now; the glass half full is you feel better on the other side.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Got it. And then on the custom compute side, you guys have talked about working with two hyperscaler customers, and we also hear from your competitors that they're involved with certain hyperscalers. So the question I get from investors is, how do we get confidence in you ramping up sales with those hyperscalers? And longer term, is there increasingly more competition in this market, particularly from fabless companies in Asia as well?

Matt Murphy
Chairman and CEO, Marvell Technology

Well, the first question is the million-dollar question. You know, I wish we could provide better visibility to investors at this point. But I think everybody understands how dynamic and fast-moving the whole gen AI thing is. I mean, if you just look at our optics as a small portion of that, look at how much our view of what that business could do this year changed in, like, two quarters. And so trying to call the ball now on next year's ramp of some of these custom programs, it's just a bit early. They're both, you know, tracking in line with what we said the last couple of quarters in terms of new-product-development and qual activities, so that's positive.

I think we need more time to really give investors a better view of the real scope and revenue expectation. The good news is, a year from now, that business will have ramped up. It'll be at a certain level, and then we'll be able to actually understand what the run rate is. And then, in theory, 2025 gets a little easier, if you know what I mean. But it's very new for us. On the competitive side, look, I think where the TAM in the custom silicon and custom ASIC market has really moved is to data center, right? It used to be heavy enterprise, heavy carrier, consumer, and the volume has really shifted, so companies are trying to move there.

I just think that to do the really bleeding-edge, state-of-the-art, tier-one-hyperscale-class custom ASICs at the bleeding edge of complexity, there's a huge barrier. In my view, there are really us and one very, very good competitor in North America who have the process technology, packaging, IP, scale, supply chain relationships, long-term planning, focus, and suite of products to sell, and who can be completely trusted to actually deliver, right? And deliver in a system for three, four, five years, and meet all of the requirements that are needed. And I think there are geopolitical concerns as well.

I think when you talk about who's going to trust their business ultimately to a partner, more and more the U.S. guys are going to look to U.S. guys, and so forth. And it's just really hard to do. And it only gets harder, and the distance only grows, quite frankly, as you have to make these technology jumps. Because it's not just nanometers; it's not, "Well, I had five nanometers, and now I'm going to three nanometers." You've got to double the I/O speeds. You've got a new CPU subsystem. You've got a whole new suite of IP you've got to go develop, and everything just gets harder.

Because Moore's Law is slowing down, you're just not getting the bang for the buck you used to. So you're having to solve thermals and power management in a different way. The complexity is going up dramatically with each of these generations, along with the cost and the scale required, and I just think it's going to be more and more rarefied air, who can really compete for the long term there.

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Let's see if there are any questions in the audience. If you have a question, please raise your hand and the mic will come to you. Question over there.

Speaker 4

Can you talk a little bit about your shareholder returns and capital allocation going forward? I know y'all talked about repaying some debt, I think maybe earlier this year, and that you're going to resume shareholder returns. But given some of the macro headwinds that you face, leverage is kind of trending higher right now after going down for a couple of years. How do you weigh capital allocation against the change in direction on leverage?

Matt Murphy
Chairman and CEO, Marvell Technology

Sure. Yeah, thanks for the question, and to your exact point, I think we've made really good progress on the leverage of the company over the last few years. We definitely leaned in and stretched to buy Inphi, and over time we've continued to drive the trailing twelve-month EBITDA up, and then we just paid down $500 million in June, so that's moving in the right direction. To your point on some of the macro stuff we're dealing with, it's definitely impacted free cash flow and a few other items. You guys all see that. We're having to work through that in this cycle.

But our view very much is to focus like crazy on that, and then really resume buybacks and shareholder return. I mean, we've been focused on shareholder return. We worked really hard to put together this portfolio of technology, and we did a lot of M&A to do it, so there were times when we were off spending money over there. But our view has been very consistent since our Analyst Day in 2021: we did what we needed to do to put together the portfolio we needed, and now it's really about driving our organic growth, and anything excess that we've got is focused back to shareholders. So we're laser-focused on it.

We're a little behind where I wanted to be, quite frankly, just given this macro pocket. At the same time, we've got some of our businesses going through an inventory correction, and then we've got this massive AI upside, and quite frankly, guys, that's tying up some capital, right? 'Cause we're doing a bunch of wafer starts and having to buy ahead. So I hope that answered your question. Do you wanna add to that? I mean-

Ashish Saran
VP of Investor Relations, Marvell Technology

Yeah, I mean, I don't think anything has actually changed from a financial policy standpoint. As Matt said, our focus is organic growth. As we said, we were going to pay that $500 million of debt down, which we did, and we started buybacks basically in Q3. And you should expect nothing to really change going forward, right? So we're back on a growth track. As you'll see, the operating leverage is starting to kick back in, and you'll see more of that as we go through the back half of this year into next year. So overall, I would say nothing has really changed, and you should expect us to remain very consistent from a financial policy perspective.

Speaker 5

Yes, I have a quick question on custom ASICs versus general-purpose GPUs, although "versus" might not even be the right word, since both of them seem to be growing together. One thing that would be helpful: as we move forward, perhaps toward more inference versus training, is that where custom ASICs will have a bigger opportunity, or is it when models start to get optimized? We just wanna have a better understanding of when custom ASICs will have their real moment versus GPUs.

Matt Murphy
Chairman and CEO, Marvell Technology

Okay, did everybody hear that? I'm worried the mic wasn't on. Let me just repeat it real quick. The high-level question was: in AI, you have GPUs and you have custom ASICs. What does that mix look like over time? And maybe a second question: how much of your stuff is training versus inference, and how does that play out? Our view, I think, is pretty consistent. The answer is, we've had a view for probably four years, okay, which was the following. And this is when AI was much more nascent, but we were involved in this. Some of you may remember when we acquired Cavium, we had an AI chip, an inference chip. We brought it over. It was called M1K.

We had a whole team working on it, and we actually shut it down in 2019. Got out, closed it, packed it in. Our view then, myself and my president, Raghib, was that this AI market is going to be NVIDIA plus the hyperscale companies doing their own custom chips. That's it. No one else is gonna be successful, and I think we were right, at least so far. And of course, NVIDIA has just gotten much bigger than anybody could have comprehended, which is just an amazing job they've done. At the same time, the growth of custom for AI, driven by one large company now, but I think there's more coming, has definitely happened too. That's a much bigger spend. So our view is it's gonna continue to coexist.

And I can't really get into what we do, what our mix is gonna look like, and what we're working on with our customers. That's obviously very, very sensitive, and there are already enough articles, rumors, and stories about Marvell won this or somebody won that, and it's this chip. So I really can't comment on our mix, but our view is that those continue to coexist, because there are different opportunities. In broader accelerated computing, it's not just that you make one chip and you've solved everything. And I think the people that are doing their own custom programs are gonna keep doing those and keep optimizing. And you see all of them are also announcing with NVIDIA too. So I think people are gonna figure out how to make all this work.

In the end, this will all be good for anybody who believes accelerated computing is a real, game-changing, industry-disruptive trend, and that it happens. I think both win. Of course, for us, we provide all sorts of basic building blocks to enable all that, even outside custom silicon, right? With our optics and our switching products.

Ashish Saran
VP of Investor Relations, Marvell Technology

Yes, Rich.

Speaker 6

Maybe just a follow-up to that. As you say, there's $4 billion of your competitor's business, and $3 billion of that is one product. Do you think that's the way custom silicon goes? Some lumpy whale hunting, or lots of little projects that you can kind of build up into a sizable business?

Matt Murphy
Chairman and CEO, Marvell Technology

Yeah, hard to say, but based on what's happened so far... Let me say this differently. To justify the investment to do one of these chips, it's gonna, by nature, be very large and very significant. There's honestly no such thing as a small-

Ashish Saran
VP of Investor Relations, Marvell Technology

Mm-hmm

Matt Murphy
Chairman and CEO, Marvell Technology

... five nanometer design, much less a three nanometer design. I mean, I don't know what you're gonna spend, but somebody's gotta be willing to spend, you know, $1 billion lifetime or... It's gotta be a big enough thing to justify the cost to develop it. So I think there are fewer and fewer big sockets as you go to these newer nodes, and so, by design, any custom program is gonna be big, I would say. So the stakes get higher, but so is the value you can deliver: if you do it really well, if you can really nail it, the TCO savings you get from that spending could actually be enormous. And that's not just an AI thing.

I mean, that's also true in other markets that we serve and other opportunities in data center. These companies are always gonna look at the TCO and the return, but so far, that trend is only going in one direction.
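Matt's justification math boils down to a simple break-even: a custom program only pencils out when lifetime TCO savings exceed the development spend. A minimal sketch, where the $1 billion figure is the one he floats and the per-unit savings is purely a hypothetical assumption:

```python
def breakeven_units(dev_cost_usd: float, savings_per_unit_usd: float) -> float:
    """Deployed units at which cumulative TCO savings repay the development cost."""
    return dev_cost_usd / savings_per_unit_usd

# Assumed: $1B lifetime program cost, hypothetical $2,000 TCO savings per part.
print(f"{breakeven_units(1_000_000_000, 2_000):,.0f} units")  # 500,000 units
```

At hyperscale deployment volumes, a break-even in the hundreds of thousands of units is why these programs are, by nature, large.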

Speaker 4

Hey, Matt, hey, Ashish, can I ask a question on the attach rate of PAM4 DSPs to GPUs? I guess you said, in some cases, it goes above one to one. My question is, what use cases drive that attach rate to go up? And as we move to 1.6T and beyond in the future, how does that impact the attach rate?

Ashish Saran
VP of Investor Relations, Marvell Technology

Yeah, maybe I'll take that. So in AI, the reality is the attach rate of our DSPs to an accelerator is almost always well higher than one to one. The one to one was just one example, essentially: attaching clusters to each other directly. But that's just the first level. Think about it like a server to top-of-rack connection. The first hop, essentially, is already optical in an AI system, versus typically not in traditional server infrastructure. So you get the one to one right there. But then remember, you've got an entire layer of leaf and spine switches which have to connect to each other, right, to actually form the network, and that adds a huge number of additional optical connections.

Now, every customer is slightly different, so we can't give you an exact ratio, but the key point, to your question, is that the attach rate is significantly higher than one to one in AI, starting today. Now, even using the highest-speed optical connectivity, which is 800 gigabits per second, you're actually not getting full throughput, right? The total bit rate within those clusters is almost 10 times higher than what's coming out of those clusters. So what happens as you go forward is you'll most likely see the density of optical connections go up, in terms of the physical number of connections. That's one way you fix the problem. The second way you fix it, to your point, is you go to the next generation: you go from 800 gig to 1.6T.

In reality, both will happen, right? You'll add more connections when possible, as well as go to the next higher speed, and that's how you essentially get more bandwidth. So that's how I would look at it.
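The two levers Ashish describes, more links and faster links, can be sketched with back-of-the-envelope arithmetic. All figures here are illustrative assumptions, not Marvell data; the cluster bandwidth simply echoes the rough 10x gap he mentions over a single 800G link:

```python
import math

def links_needed(cluster_bw_gbps: float, link_speed_gbps: float) -> int:
    """Minimum number of optical links to carry a cluster's aggregate bandwidth."""
    return math.ceil(cluster_bw_gbps / link_speed_gbps)

# Hypothetical cluster whose bandwidth demand is ~10x one 800G link.
cluster_bw = 8_000  # Gbps, assumed for illustration

print(links_needed(cluster_bw, 800))    # lever 1: more 800G links -> 10
print(links_needed(cluster_bw, 1_600))  # lever 2: fewer, faster 1.6T links -> 5
```

Doubling the per-link speed halves the link count for the same demand, which is why in practice deployments pull both levers at once as bandwidth demand keeps growing.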

Atif Malik
Managing Director and U.S. Semiconductor Capital Equipment and Specialty Semiconductors Analyst, Citi

Okay. We're almost out of time. Matt and Ashish, thank you for coming to Citi Conference.

Ashish Saran
VP of Investor Relations, Marvell Technology

Thank you, Atif.

Matt Murphy
Chairman and CEO, Marvell Technology

Thank you.

Ashish Saran
VP of Investor Relations, Marvell Technology

Thank you for hosting us.
