Arteris, Inc. (AIP)

Rosenblatt’s 5th Annual Technology Summit - The Age of AI 2025

Jun 11, 2025

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Good morning, everyone, and welcome to Day 2 of Rosenblatt Securities' 5th Annual Age of AI Technology Summit. My name is Kevin Garrigan, and I am one of the semiconductor analysts here at Rosenblatt. We're pleased to have with us Arteris' CFO, Nick Hawkins, and Arteris' CEO, Charlie Janac, who will be joining shortly for this fireside chat. We currently have a buy rating on Arteris with a $14 price target. We're bullish on Arteris because of rising SoC design complexity, the shift to chiplets, and growing AI adoption driving demand for outsourced networking on-chip IP, which Arteris is a leader in. Throughout the fireside, we will ask for any questions from the audience. To ask a question, you can click on the quote bubble in the graphic on the top right-hand corner of your screen. I'll then read the questions to Charlie and Nick.

Thank you, Nick, for joining the conference. It's great to see you again.

Nick Hawkins
Executive Vice President and CFO, Arteris

You too, Kevin. You too.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

For anyone listening that may not know the Arteris story, I thought we'd just start out with, you know, just a brief overview of Arteris.

Nick Hawkins
Executive Vice President and CFO, Arteris

Sure. We are essentially the creators, the inventors, in a way, of what's called the network on chip, the NoC. The NoC is an essential element of a system on chip, of any complex IC. It is essentially the communication fabric of a chip, which these days is made up of multiple components, sometimes hundreds. Having an efficient, fast, low-heat NoC is super critical, especially in the current climate, with AI demands being hugely power-consumptive, for example, and in any mobile devices, including automobiles, where power consumption is a key criterion. You can't make a complex chip without interconnect, especially as we move into chiplets. Complexity is our friend. We also have another element of the business, which is what we call our SoC Integration Automation Software, or SIA for short.

That was really brought to us through two acquisitions: one of a company called Magellan in France, in Paris, in fact, in 2020, supplemented by a second acquisition in the U.S. of a company called Semaphore in 2022. We have a leading position in SoC Integration Automation Software, and, again, once you're designed in, it's very, very hard to live without that solution. Some people try to do it internally, but it's not as effective. We find that we have a very successful business unit there as well. Yeah, we're all about connectivity and communication within the chip, Kevin.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Perfect. Yeah, I appreciate that overview. I figured we'd start with a few industry topics. You know, there's a lot going on with SoC design complexity. Again, you mentioned chiplets. You know, on the SoC design complexity side, compared to four or five years ago, what is it about the design process that has kind of become more complex?

Nick Hawkins
Executive Vice President and CFO, Arteris

If you look at complex chips, they really are made up of multiple functional blocks. The NoC environment, which, as I say, was an Arteris creation back a decade and a half ago, became a necessity as block counts grew. Under about 10 functional blocks, a chip is broadly simple enough that you do not really need a complex interconnect. You can hardwire the functional blocks together, and that is how things used to be built. The way we created it, think of a Cisco network in an office and miniaturize that. Essentially, that is what we created: a miniature network, which is why it is called a network on chip.

Twenty years ago, 15 years ago, most chips were very small, very low complexity, sub-10. That seems to be the magic number of functional blocks on a chip. About eighteen months ago, we actually crossed a point where we found a chip, actually in Japan, that had more than 500 functional blocks. That is getting super complex on a single chip. It has now gotten so complex, Kevin, that the size of the die to contain all of those functional blocks becomes almost unworkably large. This is where the move to multi-die solutions, chiplets, and so on, emanates from: essentially breaking those huge chips with hundreds of functional blocks into smaller chips.

Of course, all of those smaller chips need to communicate. They'll come together in a single package, but they still need to communicate. In fact, the communication, once you get into 3D chips or multi-dies, between the layers and intra-layer, becomes obviously even more taxing. This is, in fact, what's driving, and I know this is a theme that you cover very well, the general trend to increasingly outsource NoC design to the commercial market, which is where Arteris plays and Arm plays, and a few other smaller competitors play. It's not a very well-populated space. We're certainly the second largest player in there after Arm. Arm is sort of more focused, I think, these days on its own chips, as you know. It's been well documented. They're still there. It's still a force.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

There it is. It's incredible how complex chips are getting these days. I mean, everyone's trying to add everything under the sun onto a single piece of silicon, and it's definitely getting a lot harder. That helps you guys out, so.

Nick Hawkins
Executive Vice President and CFO, Arteris

It does. As I said, complexity is our friend, because the more complex chips get, the more companies have to rely on complex interconnect. If you take any of the big semis, for example, a lot of them have their own teams; most of them do. A lot of them also use the commercial market, and increasingly they are moving in that direction. We see that as a significant tailwind vector for the company, because it is more efficient and effective to use a commercial solution that is silicon-proven and tested on hundreds of chip designs and billions of actual chips. By comparison, even the largest semiconductor company might have 30 or 40 designs in a year, and we have hundreds. Over the lifetime of the company, we've had many, many hundreds. It's a very exciting time for us. This is why I say complexity is our friend.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, no, absolutely. I have a couple of questions on kind of the shift from insourcing to outsourcing. Just before that, you talked about chiplets. There may just be a lot of people out there that do not understand kind of what chiplets are, the benefits of chiplets, why is there this transition. Can you just spend a minute or two providing a little bit more in-depth on what chiplets are and why you see the transition to more chiplet-based architecture?

Nick Hawkins
Executive Vice President and CFO, Arteris

I will give you sort of a novice view, but I think it's something that we also ought to ask Charlie when he's able to join from his Uber. Basically, this is what I was mentioning earlier on. Chiplets are a solution for complexity. As chips get more and more complex, they grow in size, and it becomes increasingly difficult to make a single chip perform all the functionality that's required from the multiple functions. As you say, the demands on chips are ever-increasing. While the chiplet market is actually still nascent, still in the early stages of development, it's clearly a developing theme for the world. You'll see that market growing. It's not a huge market right now, but it's a very important market because it's very future-oriented.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Okay, that makes sense.

Nick Hawkins
Executive Vice President and CFO, Arteris

Charlie can give us some more science and technology around that answer, which is mine is a sort of a high-level 40,000-foot view.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah. Separate topic, again, you talked about the shift over time from internal to commercial. How is that transitioning happening today? Is it faster than you expected previously? Is it kind of on par? Do you see it accelerating as we kind of move forward?

Nick Hawkins
Executive Vice President and CFO, Arteris

Yeah, let me give you some color on that. It is not an overnight process, is the way I'll preface it. If you look back to when I first got into the semis world, which was a couple of decades ago, even then, EDA was predominantly an in-house solution. Thirty years ago, it was almost exclusively in-house. And Charlie, as you probably remember, was employee number two at Cadence. He saw it when Cadence was a sort of $30 million-$40 million business. It was absolutely in its infancy, and 95% of the market was internal; only 5% was third-party commercial EDA solutions. If you roll forward to today, Cadence, Synopsys, and Siemens have the lion's share of the market. There are very few people who do EDA internally now. It's just too complex. That took several decades.

We are no longer at the beginning of that curve. We think that maybe 25% of the market right now is commercial. The other 75% is still internal. It is shifting. If you go back to pre-Arteris, 15-20 years ago, the situation was like the EDA market then, where commercial was in the low single digits percent. Almost everything was done internally. Now, roll forward to today, we are at 25%. It never gets to 100%, to be clear. Over the next decade, you will see that shifting. One of the big unknowns to us is how fast it shifts. I think maybe it gets to 75% commercial, 25% internal. Maybe it goes all the way to 90%-10%. That we do not know. For the internal market, there are two key limiting factors.

One is just simply the availability of hardware EEs with the right background. There are very few people coming out of universities and colleges now around the world who are specializing in that space. They're more software-oriented or solutions-oriented engineers. There's a lack of people, and some of the people who are in that market are retiring. There's little new blood coming in. That is a fundamental issue facing the whole semiconductor world, this limited pool of people who can do the work. That drives more towards the commercial market. The other driving vector towards the commercial market is simple economics. I was just talking two days ago with one of the larger global semis, who you will know very well, but who I can't name.

They have an interconnect team internally, even though they're a customer of ours, and it's about 60 people. If you look at the all-in cost of a typical engineer, they're really not cheap, especially if they're in the US. That's somewhere between $20 million and $30 million of annual OpEx to support that team. Now, imagine how many licenses of Arteris you can buy for that amount of money. It's almost half the size of our revenue. That gives you an idea, and that's just one company. We generally think there's somewhere between a four and eight times payback: every dollar that you invest in Arteris gives you $4 to $8 back on your own OpEx saving.
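The back-of-the-envelope economics Nick describes can be sketched as follows. The per-engineer all-in cost is an illustrative assumption chosen to land inside the $20 million-$30 million range he quotes; the 4x-8x payback multiples come from his remarks. None of these are Arteris pricing figures.

```python
# Illustrative sketch of the in-house vs. commercial interconnect economics
# described above. All dollar figures are assumptions, not Arteris pricing.

def inhouse_team_cost(engineers: int, all_in_cost_per_engineer: float) -> float:
    """Annual OpEx of an internal interconnect team."""
    return engineers * all_in_cost_per_engineer

def implied_license_budget(team_cost: float, payback_multiple: float) -> float:
    """Commercial license spend that would replace that OpEx,
    given an assumed payback multiple (the call cites 4x-8x)."""
    return team_cost / payback_multiple

# A 60-person team at a hypothetical ~$400K all-in per engineer:
team = inhouse_team_cost(engineers=60, all_in_cost_per_engineer=400_000)
print(f"Internal team OpEx: ${team / 1e6:.0f}M/yr")  # $24M/yr, inside the $20M-$30M range

for multiple in (4, 8):
    budget = implied_license_budget(team, multiple)
    print(f"At {multiple}x payback, equivalent license spend: ${budget / 1e6:.0f}M/yr")
```

At those assumptions, the $24 million of internal OpEx maps to roughly $3 million-$6 million of commercial license spend, which is the sense in which "every dollar gives you $4 to $8 back."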

As you know, all of the semis players right now are super focused on cost control. For anybody who has growth limitations or is just suffering price erosion or margin erosion, OpEx is a really major focus area. Plus, they can't get the people. Both those things drive an increasing push to the commercial market. What's really interesting is the product we launched at the back end of last year, which is now being monetized and is now in real negotiations with real customers for actual use cases, i.e., people are taking licenses.

That is a very interesting dynamic because FlexGen essentially allows people at customers with less technical knowledge and fewer skills to do the same work that a highly skilled engineer would have done previously in terms of designing the NoC into that complex chip. FlexGen is actually a huge additive. It is solving two industry problems: one is the lack of people, and two is the cost element. I see Charlie has joined now. He is on mute, so I guess he is waiting for a question so he can come off mute; I know he has joined a little late because his flight was delayed.

Maybe, Kevin, now would be a good time, unless you've got any follow-on questions on that whole shift dynamic, maybe now would be a good time to ask Charlie about the question you had on giving some more background and color on chiplets, what they are.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, hey, Charlie, how's it going?

Charlie Janac
Chairman, President, and CEO, Arteris

My humblest apologies due to a travel disaster.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, that's quite all right. We all have them. I had asked Nick earlier: one of the big industry movements is towards chiplets. Can you just give an overview of chiplets? Why is the transition happening? And when do you see maybe 40%-50% of the market really adopting chiplets?

Charlie Janac
Chairman, President, and CEO, Arteris

There are different types of chiplets, right? The homogeneous chiplets, where you have basically single dies becoming too large because of neural processing blocks, those are in production, right? What's also in production is heterogeneous chiplets that are made by a single company. This is like the Intel Meteor Lake, the AMD chiplet chips that are out there. Those are in production. What people are trying to figure out is how you get into production things where the chiplets are made by different companies on different processes, right? What's driving that is the need for compute, right? Moore's Law has slowed down substantially. People are trying to figure out how to make things that can process at sizes bigger than the reticle allows on a single die. There is a yield issue there.

There's a performance issue. There's the fact that a number of different functions on the chip don't really belong on a leading-edge process, right? It makes sense to have the CPU chiplet on the latest three-nanometer process, but the analog/digital is perfectly happy on 28-nanometer, for example, right? It also allows you to mix and match chiplets, right, rather than having to redo an entire die. There's a bunch of economic reasons for why this is going to happen, but today it's still a relatively minor portion of the market. The projects are starting, and obviously, it makes system IP very much more valuable and very much more sophisticated and complex, because now you have to worry not just about on-die communication but also about communication between dies. We think that chiplets are a major opportunity for Arteris.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, that makes a lot of sense. Are you seeing kind of any specific end market or are just all end markets eventually going to adopt chiplet architectures?

Charlie Janac
Chairman, President, and CEO, Arteris

You're seeing chiplets in data centers, right? You're starting to see it in HPC, in servers, infrastructure, even mid-range servers. You're starting to see it in automotive, where some of the dies are just too big given how many machine learning sections they need to have. There are some projects starting in storage, and there are some AI inference and AI training kinds of designs. Leading-edge stuff.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, okay. Yeah, that makes sense. I know we're still in the very early innings of AI and chiplets and SoC design complexity, but looking out 5, 10 years, what do you think will be the next kind of major disruption after AI and chiplets or anything to kind of look out for?

Charlie Janac
Chairman, President, and CEO, Arteris

I mean, we're just scratching the surface of AI, but I think the issue is going to be just an explosion of autonomous systems based on AI, right? You're going to have some relatively mundane machines making their own decisions. You're going to have some relatively sophisticated machines making their own decisions. For example, with chiplets, it's possible that one chiplet out of a six or seven chiplet set is basically silicon photonics, so you can get very high performance, high-speed inter-die communication, right? The other thing that's kind of surprising me is the evolution of space applications, right? There's just a lot of stuff going on for satellites, satellite communication, exploration vehicles, those kinds of things.

The next frontier is actually going to be space, in some sense. And then the other thing, down the road, is that I believe we're going to see a combination of biology and electronics, where the electronics do control with biological actuators, right? So, basically combining biology and electronics. That's a little bit farther out, but I don't think there's any shortage of major applications coming down the pike.

Nick Hawkins
Executive Vice President and CFO, Arteris

Yeah, just to add a little bit of color to that, Kevin: we were talking about AI and machine learning even before we IPOed, as you probably remember, when it wasn't particularly fashionable, when it wasn't a thing. And we mistakenly referred to it as one of our verticals. It's not a vertical at all. It's actually a horizontal. It goes across every vertical we're in. You have AI in vehicles, you have AI in space, you have AI in the communications sector, in industrial. It's across every chip area. There are some which will adopt it faster, automotive, for example. Now, look at the intensity, and I know this is something that you focus a lot on, in terms of how much of our business is actually related to AI and machine learning.

We actually measure that in terms of the proportion of our customers' design starts which are AI and machine learning related. When we first started talking about this back in pre-IPO days, it was a very small single-digit number, but it was there. Now, it is half of our customer design starts, and it is growing; every time you listen to one of our earnings calls, you hear an uptick in that number. I think that just represents what the industry is doing now and what the world is demanding in terms of use of AI and machine learning.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

I think that is actually a good segue into a couple of the other questions, more Arteris-focused questions that I kind of had. Are you guys specifically incorporating AI and machine learning to help with system IP design at all? I know a lot of, again, we said a lot of other companies are incorporating AI into their business. What are kind of the negatives to incorporating AI?

Charlie Janac
Chairman, President, and CEO, Arteris

I mean, we're not ready for any announcements, but I had a demo of an AI feature this morning, right before I got on the plane. We're obviously looking at that. I think you basically start off with AI-based analytics to help customers understand what's actually happening in a design. Next, you're looking at AI features for verification, where you're trying to have AI adaptive tests and things like this. Ultimately, some of these things are generated by algorithms, and right now, they're generated by heuristic algorithms. Eventually, some of them will be generated by AI algorithms. The downside is that you need a ton of data, and the AI technology, it's not even really AI technology at this moment. If you really look at the definition, it is very primitive, and it's not repeatable or deterministic, right?

No one really can, at least today, determine how an AI algorithm arrived at an answer. People are working on that, and they will solve that problem for sure over the next number of years, but the answer is that you've got to watch AI, because it can give you answers that are completely nonsensical, right? When you're dealing with a $300 million chip project, that's a very bad thing, right? You've got to be very, very careful about how you use AI, and you have to be very judicious to make sure that the answers are correct.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, that makes sense. You said something interesting, a $300 million design project. I mean, what are kind of the costs to AI chip designs these days?

Charlie Janac
Chairman, President, and CEO, Arteris

Yeah, I mean, $300 million would be a very big platform, right? That would be something like a smartphone platform or a large ADAS Level 3-plus platform, right? With mask sets costing, I don't know, $5 million, $10 million, something like that, these projects are relatively expensive to undertake. What we're trying to do is lower the complexity, lower the risk, and lower the cost of doing some of these things. Some of these AI features will be helpful, right? We are fighting constantly, because as the chips are getting way, way more complex, we are introducing features and capabilities and automation that keep that complexity and that cost at bay.

Nick Hawkins
Executive Vice President and CFO, Arteris

I'm sure, Kevin, that you asked also about the risks, the downsides, and the headwinds to the use of AI in companies. It's a great question. I think you and the audience will be aware that the adoption of AI by the semis world, and by the whole world, in fact, is focused right now on the mundane, high-frequency but low-complexity tasks, rather than the most complex pieces of software coding, for example. We even see that internally, in that we are already using AI tools in G&A, for example, for efficiency purposes. A great example is the creation of a 10-Q.

There are some elements of the 10-Q that are just basically rolling forward and adopting the language of the last 10-Q, shifting this quarter's numbers into last quarter's column, those kinds of things. That can be done very efficiently with AI. You could then say, okay, go into my NetSuite and pull out all the numbers and populate the current unannounced numbers, the actual numbers for the quarter we are reporting. Of course, the risk there is that unless you have an entirely fortress approach, internal only, not using any external feeds, which is very tough to do, your information can end up out in the world and available to the world.

This actually happened to one of the big semis, and I will not mention their name here, but it was well publicized. They actually put their chip design out and asked AI, help me debug this software for this design for my chip. Unfortunately, that then became public knowledge. They completely lost their edge because their design was now completely public. Those are the kinds of dangers. This is where you have to be very careful and judicious about which parts of the company's operations do not really matter if exposed. Somebody seeing last quarter's numbers populated in the prior-period column of a 10-Q, that is no risk, because it is already public. Putting this quarter's numbers out, that is a risk. That would have to be handled with a great deal of caution.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah, and I'm sure that's exactly, I'm sure there's going to be a lot more instances like that that we're going to see kind of in the future. It's kind of what makes, again, cybersecurity also one of the bigger themes as well.

Nick Hawkins
Executive Vice President and CFO, Arteris

We take cybersecurity extremely seriously. We're probably more advanced than most companies of our size in terms of cybersecurity because it's a big hairy risk.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

No doubt. Just switching gears a little bit: given the ongoing trade tensions between the U.S. and China, can you just remind us how much of your top line is driven by China customers, and how are business activities over there right now? And, multi-part question, but there was a recent ban on EDA sales to China that I think is still ongoing, though alleviated right now. Does that impact Arteris in any way?

Charlie Janac
Chairman, President, and CEO, Arteris

Not directly. There hasn't been any ban on IP, right? That's not direct. The effect of it is on our customers, right? Because we're part of a big ecosystem, if you can't get EDA tools, you're having trouble getting your designs done, right? This is basically the issue: there's some indirect impact. I keep thinking that we're going to work this out because the U.S. and China economies are large, but we just have to roll with the punches. There's no direct impact on IP at the moment, or at least as of this morning.

Nick Hawkins
Executive Vice President and CFO, Arteris

I think, Charlie, the other point, in terms of the teeth of that, is that most of our top line is derived from French-origin products. They are not actually subject to U.S. restrictions. So, directly, the impact is very muted, in fact zero. But as Charlie said, there is a sort of collateral impact on Chinese chip designers, which they are kind of working around. There are some native Chinese EDA solutions, but they do maybe 80% of the job of the U.S. EDA guys. There is a switch involved, and it is not easy.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

How much of your top line?

Nick Hawkins
Executive Vice President and CFO, Arteris

Yeah, sorry, that's what I was coming to. It's a great question. If you roll back to 2019, 50% of our business, our top line, came from China. Then the U.S. BIS created the entity list, and by about the beginning of 2023, that was down to 30%. Right now, it's closer to high teens percent of our top line in terms of deal creation, bookings essentially, which we don't disclose. When you look at the impact on revenue: bookings dropped around the midpoint of 2023 from around 30% to around 15-17% of the total, but because we have ratable revenue recognition, our run rate of revenue concentration from China stayed at about 30%.

It has now started to drop, as some of those earlier China contracts have ended and are not replaced. There are a couple of things going on in China that affected that drop. One was access to VC capital, which really completely dried up around that time for China. The second was the increased focus and the increased impact of the entity list. That is gradually coming through. What has been good? How have we carried on growing at 20%-ish, give or take, on the top line, just looking at revenue, for example? The answer is that there are other geos that are growing faster than the decline in China. That is a really good fact. Probably the highest focus area in geo growth for us is the U.S., which is a great thing.

We're also seeing growth in Japan and in Korea, for example. Excuse me. In the first quarter numbers, China dropped to about 25% of total revenue. By the time we fully lap this decline in China that started in 2023, by mid-2026, so maybe the beginning of the third quarter of 2026, we should see that settling at around high teens percent of total revenue, from the current 25%. All the time, other geos are outpacing that, because people still need chips. People still buy things. They're still consuming SoCs eventually, whether they come from the U.S., from Europe, from China, from Japan or Korea, which are the main centers, or from Taiwan. The consumption demand is still there. That is why we are seeing the overall growth rate still happening.

Charlie Janac
Chairman, President, and CEO, Arteris

Yeah, it is a combination of headwinds and tailwinds. The reason you want to be highly diversified, geographically, application-wise, and by customer size, is that when things like the China headwinds inevitably happen, you can compensate with tailwinds from things like AI chips, autonomous systems, government investment in Europe, and those kinds of things in the US.

[crosstalk]

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Sorry, Charlie.

Charlie Janac
Chairman, President, and CEO, Arteris

No, just a combination of tailwinds and headwinds.

Nick Hawkins
Executive Vice President and CFO, Arteris

A great example of that, Kevin, in terms of this diversification benefit that we have is just look at the automotive sector, for example. Right now, Chinese EVs are winning the game. They're doing extremely well. Tesla has had some issues and some challenges in terms of its units. Many of the US and the European manufacturers have had headwinds. Some of the Korean companies and the Japanese have been doing very well. We are designed into OEMs in all places. In China, in Korea, in Japan, in EMEA, and a lot in the U.S. If one OEM wins against the others, we do not mind because we're already designed into four OEMs in China, for example, and multiple SoC vendors into the Chinese automotive market. We're very well positioned. We have this luxury of being less focused in terms of having backed the right horse because we're backing all the horses, essentially. If one's winning, then we win. If another one wins, then we win.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Yeah. The benefits of being diverse, as you guys said. We have about five minutes left, and we got an investor question. One topic that I did want to hit on is the revenue model and pricing trends. Just to start on that, what is the average deal size that you guys are seeing? Where do you see it going? And is an increasing number of design starts and deals coming in at higher prices right now?

Nick Hawkins
Executive Vice President and CFO, Arteris

Charlie, do you want to take that, or would you like me to?

Charlie Janac
Chairman, President, and CEO, Arteris

Yeah. No, no, I'll take that one. The ASPs continue to grow. They continue to grow because the designs are getting more complex, and they also use more system IP, those kinds of things. The ASP is growing nicely. I think we'll be at a $1 million average by 2026, as we said. Now, you asked, I think on one of the earnings calls, the very insightful question: when we go into microcontrollers, isn't that going to dilute the ASP? The answer is yes, it is. We are essentially trying to capture entire generations of microcontrollers rather than individual microcontroller projects, right? The ASP calculation is really more for SoCs. The pricing's going up nicely because these chips are just very much more complex than they used to be. Generative AI adds complexity. Chiplets add complexity. System IP is becoming a very, very valuable category, to the point where I think ultimately it's the second most important IP category after the processor.

Nick Hawkins
Executive Vice President and CFO, Arteris

If I can give you some CFO-type responses as well and put some numbers around Charlie's excellent commentary there: one of our most important bedrock product families is our non-coherent interconnect, the Flex family. FlexNoC is the company's oldest product family. FlexNoC 4, which was really the mainstay when I joined the company back in 2019, is gradually being replaced by FlexNoC 5 as customers go through their next design iterations. Eventually, many of those customers will adopt FlexGen, which is FlexNoC 5 plus some automation features. FlexNoC 5 carries roughly a 30% list price increase over FlexNoC 4 because it does more, and FlexGen is another roughly 30% on top of FlexNoC 5, again because it does more.
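Those two successive 30% list-price uplifts compound. A minimal sketch of that arithmetic, using a normalized base price of 1.0 as a placeholder rather than any real Arteris list price:

```python
# Hypothetical illustration of the compounding list-price uplifts described
# above: FlexNoC 5 is ~30% over FlexNoC 4, and FlexGen is ~30% over FlexNoC 5.
# The base of 1.0 is a normalized placeholder, not an actual price.

def uplift(base: float, pct: float) -> float:
    """Apply a percentage list-price increase to a base price."""
    return base * (1 + pct)

flexnoc4 = 1.0                      # normalized base
flexnoc5 = uplift(flexnoc4, 0.30)   # 1.30x FlexNoC 4
flexgen = uplift(flexnoc5, 0.30)    # 1.30 * 1.30 = 1.69x FlexNoC 4

print(f"FlexNoC 5 vs FlexNoC 4: {flexnoc5:.2f}x")
print(f"FlexGen vs FlexNoC 4: {flexgen:.2f}x")
```

So the move from FlexNoC 4 all the way to FlexGen implies roughly a 69% list-price uplift, not 60%, because the second increase applies to the already-raised FlexNoC 5 price.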

As the customer base shifts to these later generations of non-coherent, just as one example, that lifts our ASPs, our project-level pricing. You also had another part of that question, and I'm keeping an eye on the clock here: the revenue model, which I think is a very important question, and when it generates non-GAAP profitability. The revenue model is ratable, so it's essentially a deferred revenue base. As we sign deals, the deal value goes onto the balance sheet and then amortizes into the revenue line over the design term, which on average is around three years. We have approximately $90 million of what we call remaining performance obligations, which is essentially deferred revenue: revenue from deals we've already signed that will be recognized in the future.
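The ratable mechanics described above can be sketched in a few lines. This assumes simple straight-line recognition over the design term; the $3M deal value and 3-year term below are hypothetical round numbers, not Arteris figures:

```python
# A minimal sketch of a ratable revenue model: a signed deal's value sits in
# deferred revenue / remaining performance obligations (RPO) and amortizes into
# recognized revenue over the design term (~3 years on average, per the
# discussion). Straight-line recognition is an assumption for illustration.

def recognize(deal_value: float, term_years: int, years_elapsed: int):
    """Return (revenue recognized so far, remaining RPO) under straight-line recognition."""
    years = min(years_elapsed, term_years)
    recognized = deal_value * years / term_years
    return recognized, deal_value - recognized

# A hypothetical $3M deal on a 3-year term, one year in:
recognized, rpo = recognize(3_000_000, 3, 1)
print(recognized, rpo)  # $1M recognized, $2M still in RPO
```

The key consequence is the one Nick describes: today's reported revenue reflects deals signed one to three years ago, which is why RPO leads and recognized revenue trails.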

The path to profitability is really just a question of how quickly that deferred revenue and RPO amortizes into the top line, because recognized revenue is a trailing measure. RPO, which is essentially our backlog of revenue, is the leading indicator of growth. ACV plus royalties is the current state of the business, and revenue trails that by about a year. So the path to profitability is a catch-up question. If we're growing the top line at high teens to low 20%, which is what we've stated, and we throttle OpEx and spending growth down to roughly half that level, around 10%, just between friends, you automatically move toward profitability. It's a math question more than anything else, provided we don't start losing market share or something tragic like that, which we're not.

We see the catch-up point, where all this deferred revenue gets to where it matches OpEx and cost of revenue, around the middle of 2026. So I'd say it's a math question as opposed to a growth question, if that makes sense.
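The "math question" Nick describes, revenue compounding at roughly twice the rate of costs until the gap closes, can be illustrated with a back-of-envelope loop. The starting figures and growth rates below are hypothetical placeholders chosen only to show the mechanics, not Arteris financials:

```python
# Back-of-envelope sketch of the catch-up dynamic: if revenue compounds at
# ~20% annually while OpEx plus cost of revenue grows at ~10%, any starting
# gap closes on its own. All numbers here are illustrative assumptions.

def years_to_breakeven(revenue: float, costs: float,
                       rev_growth: float = 0.20, cost_growth: float = 0.10) -> int:
    """Count whole years until compounding revenue catches compounding costs."""
    years = 0
    while revenue < costs:
        revenue *= 1 + rev_growth
        costs *= 1 + cost_growth
        years += 1
    return years

# Revenue starting 10% below the total cost base:
print(years_to_breakeven(90.0, 100.0))  # 2 years
```

With a roughly 10-point spread between revenue growth and cost growth, even a double-digit percentage gap closes within a couple of years, which is consistent with the mid-2026 crossover framing.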

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

It absolutely does. Yeah, I appreciate all that color. It looks like we're just about out of time. Charlie and Nick, thank you very much again for joining us for our conference. We really appreciate it.

Charlie Janac
Chairman, President, and CEO, Arteris

Okay. Thank you very much, Kevin.

See you.

Kevin Garrigan
Senior Research Analyst, Rosenblatt Securities

Thanks. Have a good listen.

Charlie Janac
Chairman, President, and CEO, Arteris

Thank you.
