Rambus Inc. (RMBS)

Nasdaq 49th Investor Conference

Dec 6, 2023

Moderator

All right. Well, we are in the home stretch now. I'm pleased to welcome to the stage Rambus, a leading semiconductor supplier, with a rich heritage in intellectual property development and licensing, who's recently moved into memory interface chipsets, targeting service providers in the data center. With me today are Desmond Lynch, CFO, and Matt Jones, VP of Strategic Marketing. Guys, thanks for joining us.

Desmond Lynch
CFO, Rambus

Hey, Jeff, thanks for having us here today. It's a pleasure for Matt and me to be here at the conference. Before we begin today, I'd just like to mention our safe harbor statement about any forward-looking statements, and I would encourage everyone to please read the documents on file with the SEC, which contain a lot more information than we'll cover in this short discussion today.

Moderator

Fantastic. Thanks for covering that. All right, so as I mentioned, you know, the strategy has changed over the years for Rambus. Could you maybe start off and give us an overview of the current strategy and areas of focus?

Desmond Lynch
CFO, Rambus

Yeah. Certainly Rambus has continued to evolve. We've been a semiconductor company for the last 30+ years, really a pioneer within the semiconductor industry. The business was founded upon foundational IP associated with memory interfaces. And the backbone of our business has been our patent licensing business, which has provided millions of dollars of free cash flow each year, which we've been able to invest both organically and inorganically into product programs. From an end market perspective, the company is really centered around the data center, with 75% of our revenue last year coming from the data center. There are really three ways that we go to market as a company. First is the patent licensing business, which is the foundational IP associated with memory interfaces.

And that's used in all of today's modern compute systems. We have a robust patent portfolio of over 3,000 patents, and we continue to invest and innovate in our patent portfolio today. From a revenue perspective, this is around a $200 million-$220 million revenue opportunity each year. It's been very stable at the midpoint of that range. And what we're very pleased about is the certainty and the long-term cash flow generation that this business offers us. The second area that we go to market as a company is our silicon IP business. This is where we develop IP building blocks as a company. Companies buy our IP and integrate it into larger ASIC and SoC type solutions.

The portfolio today is really centered around security IP and controller IP, and it's positioned to take advantage of the growing opportunities within the data center, fueled by the growth of AI. The business is at scale today, at just over $100 million, with the expectation that it will continue to grow at 10%-15% on a go-forward basis. The last area that we go to market with our solutions is the chip business. This is memory interface chips, where we sell the chips to the DRAM providers, who integrate our chips into the DIMM modules themselves. What we see here is exceptional growth for the company.

In 2018, our revenue was $38 million, and last year, 2022, we grew the business to $227 million, primarily on the DDR4 cycle. And what we're very excited about is the growth opportunities of the business going forward with the DDR5 cycle happening just now. So overall, we have a really robust financial model that continues to generate a lot of cash each year, and what we see is continued growth opportunities fueled by the data center going forward.

Moderator

That's great. Well, let's dive in a little bit on those memory interface chips. So first, can you just talk a little bit more about how you've kind of managed to have so much growth, how you've taken share, and then kind of build on what you were saying about the opportunity around DDR5?

Desmond Lynch
CFO, Rambus

Yeah, that's a great question. We've been very pleased with our growth on the chip side of the business. We got into the chip business as a result of one of our patent licensees encouraging us to become involved in this area, given our rich heritage in memory interfaces. The memory interface chip is a buffer chip that we sell today. This is a chip that sits in between the memory and the processor, but it has a really critical function, as it controls speed and sends commands between the memory and the processor. Last year, as I mentioned, our revenue was $227 million, which works out to about 25% market share, which is the right way to think about last year.

What we've been able to do with each sub-generation of DDR4 is continue to grow and take market share. We've done that by being laser-focused in our execution and delivering quality, reliable products to the market. The DDR5 transition is taking place in the market just now, and we're very excited about the growth opportunities that it will offer us going forward.

Moderator

That's great. And I know you guys have also been investing in companion chips under the DDR5 cycle. So can you talk a little bit about the opportunity there and-

Desmond Lynch
CFO, Rambus

Yeah. So what happens on the companion chip side is that these are adjacent chip opportunities next to our memory interface chip, or buffer chip. Under the DDR4 cycle, these chips were on the motherboard, and under DDR5 they moved to the memory module, so that offers us the adjacent chip opportunity next to our buffer chip. What we've been able to do is take a really strategic approach to the DDR5 cycle and also the companion chips. The number one priority for us was the DDR5 buffer chip. We invested early into this technology, and what really happens on the buffer chip side is that you get to work with the ecosystem on signal integrity and the interoperability of these chips.

So we believe that we have a market share gain opportunity here, which will take us from 25% on the DDR4 cycle to 40%-50% on DDR5. This is the highest value chip for us as a company, and that's why we invested early and have that leadership position. The companion chips are the chips that moved from the motherboard to the memory module. These are the SPD hub, the temperature sensor, and the power management device. Last year, we announced two of these chips, the SPD hub and the temperature sensor, and we're shipping those in low volume today. The power management chip is the last of the companion chips.

Rambus is not known for its power management capabilities, so we had to build that skill set in-house, and we've done that by building a quality team. Just on the last earnings call, we talked about the fact that we had working silicon available for the power management chip, and the feedback from customers has been very positive. What we'll enter now is the qualification phase of the power management chip. Overall, we size the companion chips at about a $600 million market opportunity for us, and we'll really see the revenue ramp towards the second half of 2024 and into 2025. So we're very pleased with our position on DDR5.

It offers the market share gains on the buffer chip that we talked about and also the content gains from the companion chip as well.

Moderator

That's great. And so, kind of building on that further, you mentioned you're very heavily concentrated in the data center today. Can you talk about some of the opportunities in the client market and how important that is?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, Jeff, that's a great question. As Des talked about, the next layer of growth for us is beyond our traditional RCD chip that we entered the market with. As the performance of DDR5 memory, and memory in general in the client space, starts to evolve, the need for functionality like we provide to the data center starts to trickle down. So as we reach data rates of 6400 MT/s in the client space, you'll start to see the need for high-speed clocking control for those DIMM modules that go into the client space. And you'll see some clocking chips enter and be present on the DIMMs that weren't there before.

So we'll see continued power management needs, and other things trickle down from there. We've just started to talk to investors and customers about this in some depth, and you'll see more of that from us in the next year. But we're very excited about that opportunity as we see some of the functionality trickle down to the high end of the client system, and then permeate as we move forward there.

Moderator

That's great. As we think about kinda big opportunities, obviously, everybody's talking about AI. And so how does generative AI and more kinda data-intensive workloads impact your opportunities, Matt?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, really what we look at in terms of AI and how we impact it: a lot has been written about the last mile, if you will. You know, the training engines, the HBM that goes with that. Certainly, we play directly there. We provide a silicon IP product in the form of an HBM controller that serves that vital purpose of interfacing the compute engine, whether it be a GPU or TPU type of functionality, to the HBM stack on the device. So direct play there.

More fundamentally, or more foundationally, there's the need for the data pipeline from storage all the way through to that last mile of the training that's going on today. Certainly that infrastructure benefits from the upgrade to DDR5. And so helping drive that is something we've taken great pride in, and certainly something we think is an opportunity for us going forward as the industry converts towards DDR5. Finally, as we move into this world of AI, we're finally seeing, I think, very clear examples of heterogeneous compute. So general purpose CPUs coupled with GPUs. The data movement between those creates an increased security risk.

Security IP and protecting that data, both at rest in those far-end compute elements, if you will, and as it moves around the system, become more important. IPsec and MACsec become vital products, and that's a place where we've invested and lead the market in terms of those silicon IP products.

Moderator

That's great. Let's shift gears a little bit, talk about another longer-term opportunity. There's a new standard Compute Express Link, CXL. So, Matt, would love to hear about opportunities you have in both IP and silicon in that area.

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, it's a great tag on to the notion of heterogeneous compute. So CXL, you know, it's been talked about a great deal: a next generation serial interface with an overlaying protocol that enables memory attach. We have been participating from an IP perspective. We provide a silicon IP block to the industry that aids in that connectivity between chips, and we're seeing that roll out today in both the supporting processors and some of the silicon that's coming out as a means for chip-to-chip communication.

Given our foundation in memory, one of the things that we're very excited about is the chip opportunity that this brings us going forward, where we have serial attached memory via the CXL link, providing additional DRAM in the form of memory expansion, and also enabling some pooling and sharing architectures down the road. So we believe that CXL is gonna have very wide application. We're very excited about CXL-attached memory as we see it augmenting direct attached DRAM and the opportunities we have in the DDR5 space.

Moderator

That's great. So speaking of the silicon IP business, I was wondering if you could give us just kind of a little bit deeper dive on, you know, what is silicon IP and what's the value? What are some of the key drivers, and how do you think about your customer base when it comes to silicon IP?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, we may have fast-forwarded a bit. I talked about a number of silicon IP blocks there.

Moderator

Yeah.

Matt Jones
VP of Strategic Marketing, Rambus

So silicon IP, this is a product for us as a company. We develop IP blocks: specifically security engines and security blocks in the form of hardware root of trust, serial interfaces such as PCI Express and CXL controllers, and memory interfaces in the form of HBM and GDDR. We then sell these blocks to customers who integrate them into their SoCs or ASICs. This lets them focus on the algorithmic block at the heart of the chip and use our silicon-proven IP blocks as the standard interfaces around it.

It speeds time to market, and it provides us a great opportunity for customer diversification, because those customers are a much wider set of folks than we sell our buffer chips to, which tend to be the three main memory suppliers. So there are a lot of reasons that we like the silicon IP business, and it provides us a great growth engine. Again, that's with those blocks I talked about in security, CXL, PCI Express, HBM, and GDDR, as we see AI continue to drive both ASIC and in-house silicon developments.

Moderator

Great. And Desmond mentioned a little bit about the revenue profile of the business today, but what are some of the, you know, the longer-term growth opportunities there? Where do you see the, the revenue profile of that business?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, we certainly think that, for us, and again, you'll hear a lot about CXL and how big and vast it can be, it's going to be an ecosystem when you think about the revenue opportunities. There will be lots of opportunities in terms of silicon. When we talk about this, we're speaking about the chip opportunity here for that serial attached memory and the controller chips that would go into it. It's somewhere in the neighborhood of $600 million-$800 million as an opportunity for us. We're looking at this intercepting the market very late in 2025 and beginning to be material in 2026.

If you look at that, that's aligned with around the time CXL 3.1 or 3.x will become available on processors.

Moderator

Great. All right, Desmond, back to you. Maybe we could just dive in a little bit more on the patent licensing. You say it's kind of the core of the business, right? And so, you know, you obviously have some very long-standing customers there. I think Samsung and SK hynix both renewed on kinda 10-year deals. When you think about that side of the business, what are some of the big upcoming renewals, and how do you guys think about securing that?

Desmond Lynch
CFO, Rambus

Yeah. The patent licensing business really has been the bedrock of the company. This is the foundational IP associated with memory interfaces I mentioned earlier on. If you look at the business today, we describe it as in a range of $200 million-$220 million. It's been very stable at the midpoint of that range. And if I break that midpoint down a little bit further for you, $150 million of that comes from the top three DRAM companies, with the remaining $60 million coming from a variety of memory, SoC, and FPGA providers.

Just in the last 12 months, we've been very happy to announce that both Samsung and SK hynix have renewed for 10-year terms, which will extend through 2033 and 2034, which I think really speaks to the strength of our patent portfolio and innovation engine. It is important to note that the DRAM companies are our top licensees, but they're also our top customers on the product chip side that we talked about earlier today. So what the licensees see is that they pay us the licensing dollars, which are reinvested back into product programs that directly benefit them. The next major renewal that we have coming up is Micron, which will be in Q4 2024.

We've been very pleased with the execution, stability, and long-term cash generation that the patent business offers us.

Moderator

That's fantastic. Now, one thing I'd love to ask you about: you guys still report the patent licensing under 606, but I know some analysts still use 605. So I was wondering if you could talk a little bit about the difference between the two standards and, you know, if there are any gaps there.

Desmond Lynch
CFO, Rambus

Yeah. We adopted 606, which is the revenue standard, in 2018. As a result of 606, you cannot recognize some of the revenue because there are no performance obligations on the patents. So what we've been able to do is provide licensing billings, which is a proxy for revenue under ASC 605 and gives a more straight-line view of the revenue. Analysts use that as a substitute for our reported results. As of Q3, which was our last reported quarter, we had about a $30 million difference between licensing billings and our royalty revenue reported under 606. But what we've been able to do is structure contracts, such as Samsung and SK hynix, to make them revenue recognition friendly.

So what you will see is that Samsung will become revenue recognition friendly in this quarter, the December quarter. That will reduce this gap down to about $15 million. And then the next major event in this cycle will really be SK hynix, which will be revenue recognition friendly in Q3 of 2024. At that point, there'll be a very small difference between 605 and 606, so you will see our financials converge.

Moderator

That's great. Thanks for the color on that. I think it's an important point as people are checking out the financials. And speaking of, you have a really strong financial model, which generates a lot of cash. So as products become a bigger part of your revenue, I wanted to ask how you think about your long-term model.

Desmond Lynch
CFO, Rambus

Yeah, that's a great question, Jeff. You know, we've talked a little bit today about the revenue drivers. We'll continue to see the stability of the patent business, as we've talked about. Matt went into a bit of detail about the drivers on the silicon IP side, but that business is at scale above $100 million, growing at 10%-15%. But certainly, the largest opportunity that we see is on the product side, and we've touched upon that today with the DDR5 cycle coming up. We've talked about the client opportunities and CXL as well. So that's going to be the fastest growing area of our business going forward.

If you look at the gross margin profile of the company today, we're mid-80% on gross margin, so very healthy gross margins at the corporate level. Breaking that down a little further, patents are obviously 100%. Our silicon IP gross margins are 85%-90%, and we have very healthy product gross margins, which are in the 60%-65% range. As we think long term, we will see the total company gross margin tick down as a result of products becoming the fastest growing area. But what you will see is some very nice leverage on the OpEx side. We'll continue to be disciplined and invest in the right areas, which is important.

We continue to grow on the top line from a product perspective, but you will see the leverage on the SG&A side, as we've already built out the infrastructure to support the top-line growth from products. What you do see today is an operating margin around the mid-forties, 45%, with very strong cash generation, and that's what we'll continue to see going forward. We're highly profitable as a company with strong cash generation, and that will continue from here.

Moderator

Given all that strong cash flow generation, how do you think about capital allocation? You know, any changes as you kinda shift from patents and IP to, to products?

Desmond Lynch
CFO, Rambus

Yeah, that's a great question. You know, we're very fortunate to have a robust balance sheet. We're debt-free as a company, and we continue to generate a lot of free cash flow each year. Our capital allocation strategy has really been built around three pillars. Organically, we'll continue to fund the high growth opportunities ahead of us. As we've talked about today, that's certainly in the product areas. Inorganically, we'll continue to look at acquisitions. We've been acquisitive over the past four years, making five acquisitions, which have really been centered around the silicon IP area, and that's gotten that business to scale. We'll continue to look, and we have a healthy funnel on the M&A side, but we'll continue to be disciplined there, as it needs to make sense strategically, operationally, and financially.

Really, the last area on capital allocation is returning cash to shareholders. We have a commitment of returning 40%-50% of our free cash flow back to shareholders, and over the last three years, we've returned $300 million, which is above the high end of that commitment. Going forward, I would expect a similar playbook on capital allocation, which I think we've executed very well over the past couple of years.

Moderator

That's great. Would love to pause now, see if we have any questions from the audience. Yep, one in the back there, and we'll get a mic back to you.

Speaker 4

Okay. Hello. If I understand the architecture correctly, with advanced packaging, the HBM memory is put on top of the GPU chipsets. If this is correct, my question is: is your controller interface then still necessary?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, that's a great question. So yes, the way that the HBM is packaged, it's put onto chips that are stacked using wafer-on-silicon, or on-substrate, excuse me, types of manufacturing. And so the controller IP is integrated with the compute element, in the case of a GPU, the GPU matrix processing engines. That does control the memory that's on that same subsystem. It's a necessary element of it. It's not a chip in this case; it's a piece of IP that people choose to integrate rather than building it themselves.

Speaker 4

Oh, okay. Got it. Thanks.

Matt Jones
VP of Strategic Marketing, Rambus

Yep.

Speaker 4

Maybe a follow-up. Not sure if you mentioned it and I missed it, but regarding AI, can you share what part of your entire revenue comes from AI? Like, is 10% impacted by AI, or is it more like 30%?

Desmond Lynch
CFO, Rambus

You know, from an AI perspective, it's certainly positive for us as a company, and we see that in two areas. One is the silicon IP that Matt talked about. This is the HBM and the GDDR controllers. That's not something that we break out. We describe it more as part of our silicon IP revenue, which is the $105 million-$110 million. We also see an AI play in the chip space as we continue to see the progress of the processors being built out there. And really, that will continue to be beneficial alongside general compute going forward. But today we ship to the memory module makers themselves.

How our revenue breaks out between general compute and AI servers is not something we really have visibility into. But overall, the AI transition will be beneficial for us as a company.

Moderator

Any other questions from the audience? Now I've got a couple follow-ups here. Great. Well, one question I did want to dive into a little bit more. We talked about the opportunity around DDR5, but in past transitions, we've seen some price erosion on the older standard as the new standard comes out. Can you guys comment on what you expect to see as you move from DDR4 to DDR5?

Desmond Lynch
CFO, Rambus

Yeah. What we see from a pricing environment today is that DDR4 has been in the market for 7+ years now, so you can assume that we've been squeezed down on pricing. DDR5 is a generational change, so we get to see an ASP reset there, and there is a premium relative to DDR4, which is positive for us. What we also see on the DDR5 cycle, if you follow Intel's roadmap, is that the DDR5 sub-generations are coming around every 12 months, as opposed to every 24 months under DDR4. So with each sub-generation, you will see an ASP reset. As a company, we don't break out specific ASPs, given we're in a three-customer, three-supplier environment.

But what we've done is be very disciplined in our approach to pricing management, and we'll continue to drive cost savings to manage towards our long-term product gross margins of 60%-65%. And if you look at our history over the years, last year product gross margin was 61%. This year we're tracking towards 62%-63%, which is towards the midpoint of the long-term model we've talked about, and very healthy for a product chip business.

Moderator

Yeah, absolutely. Matt, one other follow-up for you just on the CXL opportunity.

Matt Jones
VP of Strategic Marketing, Rambus

Yeah.

Moderator

You know, how do you see the competitive landscape shaping up there, and how's Rambus gonna differentiate itself?

Matt Jones
VP of Strategic Marketing, Rambus

Yeah, certainly. So this is a nascent market, coming from nothing to, you know, like you said, depending on who you talk to, millions or billions. It certainly has the momentum of the industry behind it. If you look at the member companies in the CXL Consortium, it's a very united front, with great backing for this. We do see some competitors, ranging from the traditional competitors that we see in the memory interface space, to startups, through some of the bigger players that have talked about things. So you'll see a roster across the board, I think.

But I think we're very excited about that opportunity to compete with those folks. I think their interest in the market certainly validates, you know, what we're doing, and hopefully, we validate them going forward.

Moderator

Fantastic. Well, I wanna thank you guys both for your time today. Thanks for sharing your thoughts, and hope everybody enjoyed the conference.

Desmond Lynch
CFO, Rambus

Thanks, Jeff.
