Update us maybe on some of the dynamics you're seeing in the semi-test market. Things haven't been so great. There have been some good things and some bad things, and mobility hasn't been very good during the past year or so. So can you just walk through what some of the puts and takes are in the SoC market, and how to think about them as we head into next year?
So I think the real driver in the market this year has been AI. That's the largest positive impact that we've seen in the semiconductor test market, and that's both for the SoC market and for the memory market. If you look beyond that, everything else is at relatively low rates historically. The mobile market is quite weak and has come through weaker than we expected for the year. The automotive and industrial market we expected to be weak, and it has turned out the way we expected. And the only sign of life in the flash memory market is really around solid-state storage for cloud. You know, as long as you have exposure to AI, then things are good. But exposure to AI is a little bit broader than just GPUs.
It includes bespoke silicon for hyperscalers, the VIPs. It is networking silicon, and it even extends into power management, power processing for these high-power servers. So, we're seeing that impact in our business in a bunch of different places.
So can we talk about the VIP part of the market? You've been getting about 50% of all the incremental business there. And the Technoprobe deal, although it's not having much of an impact yet, I think the strategic thought process behind it was that it would eventually help you.
Mm-hmm.
Do better in that market. So can you talk about VIP and sort of how big it is and how you see the market evolving? Do you aspire to get more than 50%, or are you happy with 50?
We always have high aspirations. First, just for anyone that hasn't made a habit of following Teradyne, we refer to a class of customers as VIPs, vertically integrated producers. That's hyperscalers that are designing their own silicon, and in some cases automakers that are designing their own silicon. They require a different sales approach, and they are a very important factor in cloud AI compute. That's a big part of the market. At the beginning of this year, we estimated that the VIP compute market was gonna be about $500 million out in 2026. It's turning out that in 2024, that market is already $300 million. We're on a steeper trajectory there than we expected, and we're now trending above that $500 million for 2026.
It could be 25%-50% higher. So we think it's gonna be a very important part of the compute market going forward. Now, as for our share aspiration: if you look at traditional compute, we have kinda 10%-20% market share, and it's almost all concentrated in the networking part of that business. We have very little share in the GPU or traditional CPU business. So we're pretty happy with a 50% share of these new entrants. That's a significant boost to our overall share. Of course, we would like more; we think we've got a great tester, and we deserve more share than that. But we think it's a prudent plan that we'll be able to achieve that kind of level.
I get the question all the time. You look at the performance of your competitor's stock and your stock over the past two years, and everyone just wants to own AI, and the perception is that your competitor has all this AI exposure and you don't have a lot of it, which I don't think is really true at all. But that's how people think about it. Some of that, in my opinion, is just a legacy thing.
Mm-hmm.
Can you talk about the evolution of why your share in the traditional compute world is low, and how it's 50/50 in VIP but quite low in the CPU/GPU world?
Yeah. If you look at the traditional players in the compute market, the largest are Intel, AMD, and NVIDIA. All of them have a tester platform strategy that was set, in some cases, more than 20 years ago, and they've all treated the investment required to bring products to market as more important than the actual cost of test of the devices. That led Intel to their own architecture, so they don't participate in the market. And AMD and NVIDIA have been on our competitor's platform for decades. If you look at our platform versus the competitor's platform, both are perfectly capable of testing this next generation of chips. We believe that we have certain advantages; I'm sure they believe that they have certain advantages.
But when it comes down to a comparison on the basis of features and benefits, it's an even fight. We haven't been able to win significantly higher share than we have simply because of that history and the entrenched base. In the ATE market, there generally needs to be a pretty powerful inducement to change your test strategy. That can come from a discontinuity, like a new technology coming into the part that your current vendor doesn't have a good solution to test, or it could be that your current vendor can't support your needs and lets you down in some way.
So typically, if you try to attack a customer without the ability to do something your competitor can't, without an actual unmet need that you're addressing, and the incumbent has high customer satisfaction, then generally the only way to win is to basically buy the business. And we're not really interested in cutting all of the margin out of the space. So we look for these entry points, and the entry points into more compute business really have to do with technology inflections. We see one of those coming with silicon photonics and co-packaged optics. That's something nobody has an installed-base solution for in high-volume production.
So we think that there's an opportunity to differentiate and a chance for us to gain share in traditional and new compute customers. And we think that there's also opportunities by working with Technoprobe to be able to expand test coverage, find new kinds of faults at earlier test insertions. And so we're leaning into that as well.
Is it the kind of thing where, I mean, I think of these two pieces of the market, one being a big mobility customer that you have?
Mm-hmm.
And the other being a big GPU customer that they have, although your customer buys more testers than theirs does from a dollar point of view.
Mm-hmm.
Typically. But would you consider the battle lines on those two customers drawn? Like, it's not worth it for your customer to switch, and it's not worth it for their customer to switch?
Well, it's never worth it for our customer to switch; they wouldn't wanna set that as a ground rule. No, I think the inflections are really the place where there's a motivation to change. Silicon photonics would be one. Another is if a chiplet-based strategy ends up putting a die that's tested on Teradyne and a die that's tested on Advantest into the same final package; now all of a sudden there's an opportunity to cross those lines. And we look for those opportunities and try to maximize our likelihood of winning.
Great. Can we just stay in semi-test? I mean, I could talk the whole time about semi-test, but can you zoom out? We talked about test intensity for years.
Mm-hmm.
And it was coming down and down and down in SoC, and it's recently leveled off and, in fact, begun to go up a little bit. Can you just talk generically about those factors? Test is a capacity business, but there's also some obsolescence happening under the surface. And when we're thinking about the size of the tester market relative to the size of the semiconductor market itself...
Yeah.
Should we expect test to start to grow faster?
So the way we think about what you're talking about is buy rate: tester revenue, or tester TAM, in a particular year relative to IC revenue. For sure, that was a really bad profile from, like, 2004 out through 2010; it was coming down. After 2010, when significant compute complexity got into phones, the effect of really high site-count test in memory was fully absorbed, and the complexity of the interfaces between the device and the tester reached a maximum point, that was when we saw an inflection, and the test intensity, the buy rate, recovered and actually started to track revenue growth much more closely. That was a great period of time, from 2010 out through 2021, 2022-ish.
With the decline in handset volumes, and the handset manufacturers looking to preserve their earnings against smaller deliveries, they've switched back towards trying to get more efficient. That's certainly one of the things that has impacted the size of the mobile market over the past couple of years: they've gotten significantly more capital efficient from a test perspective. I think that's a temporary thing, and I think it's mostly run its course. At the same time, complexity is still continuing to increase, so I would expect to see buy rate inflect back up.
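The buy-rate metric described above is just a ratio of two annual figures. A minimal sketch of the calculation, using entirely invented placeholder numbers (these are not Teradyne or industry data):

```python
# Buy rate: tester spend as a fraction of semiconductor (IC) revenue
# in a given year. All figures below are hypothetical placeholders
# chosen only to mimic the shape described in the discussion.

def buy_rate(tester_tam: float, ic_revenue: float) -> float:
    """Return tester spend as a fraction of IC revenue (same units)."""
    return tester_tam / ic_revenue

# Invented series: a declining buy rate through 2010, then tracking
# IC revenue more closely afterwards.
years = {
    2004: (4.0, 180.0),   # (tester TAM $B, IC revenue $B) - invented
    2010: (3.0, 250.0),
    2021: (6.0, 500.0),
}
rates = {y: buy_rate(t, r) for y, (t, r) in years.items()}
for y, rate in sorted(rates.items()):
    print(f"{y}: buy rate = {rate:.1%}")
```

With these placeholder numbers the ratio falls from 2004 to 2010 and then holds roughly flat, which is the pattern the answer describes.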
Great. I wanna ask this question in kind of a different way because you can look at buy rate, but you can also look at the size of the SoC test market relative to the size of the front-end equipment market.
Yeah.
For those relevant verticals, and you can do the same thing in memory. In memory, there's not really anything conclusive to draw; it's in the same range it's always been. But in the SoC world, the non-memory world, there's a huge disconnect.
Mm-hmm.
And some of that's been China. There's been a lot of China money that's gone into WFE, and of course you're restricted from selling to a lot of the Chinese companies. Does that explain the whole thing? Because you look at the same data I look at.
Mm-hmm.
What does that tell you? Do you say this cannot be sustained, that the wafer fab equipment market for non-memory, even if I exclude China...
Mm-hmm.
It's so far offsides. It's so much higher relative to the TAM of test.
Yeah.
That it seems like there's been a bunch of front-end equipment installed, and once that stuff gets turned on, you're gonna have to buy more testers.
Yeah. I think there are things that create a new normal. And by the way, I don't agree with your assertion that it's always been consistent in the memory space, because I can remember seeing an inflection in the front end associated with multi-layer flash. All of a sudden, the fab intensity for folks like Lam and KLA went way, way up, but the test intensity just went up as the log of bits. There was a much higher level of investment to create multi-level flash than in the test associated with it. And I think we've seen the same thing with EUV: the fab investment required to build stuff at 3 nanometers and below has had a more significant effect on the front end than it's had on test.
At the same time, I don't think that's a trend that will continue. I don't think you're gonna see them diverge further.
Mm-hmm.
But I'm not sure that you'd see a lot of catch-up. I used to think that there was a lot of dark capacity that hadn't been turned on; I don't know that that's necessarily the case anymore. But I do think that healthier end markets are coming.
Got it. Can we shift to the memory part of your semi-test business?
Sure.
Which has been a great story for you, particularly in HBM. You've gained a bunch of share, and you already do very well in NAND flash.
Mm-hmm.
So can you walk through some of the dynamics in HBM? You're getting qualified with one of the large vendors, and you're already qualified at another one. Can you just talk about how you see the dynamics in that market?
Sure. So, prior to the middle of 2024, our exposure to HBM was entirely at the die-level test. As of the second half of this year, we have a successful stacked-die performance test solution, and we've begun to sell that in volume. Looking forward, we expect it's going to be qualified by an additional customer, and that will lead to significant share growth as the industry shifts from HBM3E to HBM4. When it comes to wafer sort, the die-level test for HBM, I think that's gonna probably stick to existing patterns. In that part of the market there are a lot of competitors and less opportunity for differentiation. So we are far more focused on trying to establish a very strong position in performance test for HBM.
So when you talk about the HBM TAM, are you talking about just stacked test or the entire die test?
No, no. We're including the die test and the stacked die test for the HBM.
How does that break out? In the HBM market, is the final stack test becoming a bigger part of the HBM TAM, or is it still a smaller piece?
They're kinda evenly split. It's not exactly 50/50, but it's a relatively even split. Barring short-term wiggles, I'm expecting the overall HBM TAM to increase over the midterm. The performance test market is gonna increase, but I think the wafer test market is probably gonna increase at a slightly faster rate, because we're gonna go from a stack of eight to a stack of 12 to a stack of 16, and so the number of die that you need to test before you put them in a stack increases.
Got it. And just in terms of your share in that market.
Mm-hmm.
You're optimistic about your share. It seems to me, if I'm reading the comments you made on next year, that you're saying there was probably some pre-buying.
Mm-hmm.
This year for HBM. So maybe the TAM is not gonna go up next year like it did this year, but you're gonna gain share in a TAM that maybe doesn't grow next year.
Right. Our primary customers don't have a lot of extra test capacity, so we would expect them to continue to add capacity in 2025. We also expect to gain share at other customers. It's kinda confusing to think about. If you go 2023, 2024, 2025: the TAM in 2023 was $100 million, and our share was about 50%, but it was all concentrated in wafer sort. In 2024, the TAM ballooned all the way up to $500 million, and our revenue increased significantly in that space, but our overall share went down, because a lot of the buying was associated with wafer sort at a customer where we don't have significant wafer sort share, and we also weren't participating in performance test until the second half of this year. So we missed some of the inflection in 2024.
In 2025, we're fully qualified for performance test at one customer, we expect to be qualified for performance test at two more, and we have a reasonable wafer sort position at two of the three manufacturers that use external test equipment. So we think the TAM is likely to be flat, but our share is likely to go up.
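The share dynamic described above, revenue rising while share falls because the TAM grows faster, is easy to check with the quoted TAM figures. The revenue numbers below are invented placeholders for illustration (only the $100M and $500M TAMs come from the discussion):

```python
# Revenue can grow while market share shrinks if the TAM balloons.
# TAM figures ($100M in 2023, $500M in 2024) are from the discussion;
# the revenue figures are hypothetical placeholders.

def share(revenue_m: float, tam_m: float) -> float:
    """Market share as revenue divided by TAM (both in $M)."""
    return revenue_m / tam_m

rev_2023, tam_2023 = 50.0, 100.0    # ~50% share, per the discussion
rev_2024, tam_2024 = 120.0, 500.0   # revenue up 2.4x (invented figure)

assert rev_2024 > rev_2023                                    # revenue grew...
assert share(rev_2024, tam_2024) < share(rev_2023, tam_2023)  # ...share fell
print(f"2023 share: {share(rev_2023, tam_2023):.0%}")
print(f"2024 share: {share(rev_2024, tam_2024):.0%}")
```

With these placeholders, share drops from 50% to 24% even though revenue more than doubles, which is the "confusing" pattern the answer walks through.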
Got it. And then we've had this discussion on the SoC test TAM. It peaked at $2 billion and troughed at $800 million, and you've since come up a little bit to $900 million. We've all asked on these earnings calls, well, what do you think about next year? You started by saying it's probably somewhere in between, and as time's gone on, you seem to be zeroing in on what you think the market's gonna be. Tell me if I'm mischaracterizing this, but it seems like you're saying, look, it's gonna be up. Maybe not halfway between here and there, but up.
Yeah. It's gonna suck less.
Yeah.
I mean, I don't see an end-market driver that's going to drive a mobile boom, but the excess capacity has basically been soaked up. So we're likely to see incremental capacity buys, even if the trend in handset volume and complexity just stays at the pace it's been from 2023 to 2024. If you straight-line that into 2025, we'd see higher tester demand in that case than we saw in 2024, because there isn't all of that extra capacity to soak up.
Great. So I wanna shift a little bit to industrial automation. Via IA, and via the industrial part of the semi-test business, you actually have reasonably strong exposure to the industrial market.
Mm-hmm.
Period, which has been challenging to say the least. So, on IA, you and I talk about this a lot, and I've not said great things about that business in the past. However, I do think that with AI there's a ton of growth opportunity there.
Mm-hmm.
So how do you weigh where the market's heading? It's bad now because industrial activity's bad, but there are all these AI robotics startups getting funded at very impressive multiples.
Mm-hmm.
Which tells me that there's a lot of growth coming.
Mm-hmm. Well, I think there's a ton of enthusiasm for anything where people believe that AI could have a significant value add, and robotics is a massive AI opportunity. One of the key things that makes it difficult to automate a process using a robot is that it takes a lot of work to program the robot to plan out the job. You have to understand the process much better if you're trying to automate it than if you're just telling a human how to do it. With AI, you're really closing that gap: you're reducing the amount of work it takes to set up a robot to do an automated task, whether it's an autonomous robot driving around a factory or a work-cell robot doing some sort of assembly task.
So there's a real AI opportunity there, whether it's in logistics or manufacturing or pharma. People are very excited about that, and we're excited about it too. Right now, a mid-single-digit percentage of our robotics revenue is driven by AI, and we expect that percentage to go up significantly in 2025. Right now, it's all driven by partners of UR that are applying AI to solve problems, and most of that's in logistics. Say you need to pick something out of a box, but you don't know before you look into the box what the shape of the thing is or how to pick it up. AI is really good at making those decisions and allowing you to do that kind of control in real time.
So we are selling robots into those kinds of applications today, and we think those applications are gonna ramp pretty hard. We are becoming, over time, less dependent on the traditional consumers of robotics, and AI is a big part of that. I also look at some of this, especially humanoid robotics, and as someone who's been putting equipment into industrial settings for a long time, I think people are underestimating the difficulty of having that kind of technology work in a realistic environment. It comes down to simple things. With one of our robots, if something goes wrong and you push the red button, it just stops and it's stable. If you have a humanoid robot and you run into a safety situation and you stop it, it's gonna fall over.
It's dynamically balanced, so that in and of itself can create a safety concern. I'm sure there are ways that people are solving that problem, but it's those kinds of things that make people take a little extra time in bringing this technology into an environment. And we've been working that problem since 2008. I think we've got a much closer-in opportunity to apply AI to robotics than the more speculative startups.
Can we talk about the partnership you have with NVIDIA? You're using some of their software stack to help train your cobots, which is quite exciting. You're not the only vendor they're working with, obviously, but it seems to me like you're pretty far ahead in your partnership with them.
Mm-hmm.
At the same time, I read white papers from Amazon, and yes, they have their own robotics business, which I think is across the street from you, actually. But that's more macro robotics; it's not cobots.
Right.
And they have a white paper out that you can read; it's quite interesting. So how do you think about your partnerships, and what does the NVIDIA partnership, in particular, do for you?
Yeah. So the first thing you probably wanna do is think about UR and MiR separately. The main way that AI and a partnership with NVIDIA is going to help MiR is by allowing our robots to do more stuff. It's the key capability giving our pallet jack a higher mission success rate; the AI is built into the product. For UR, we're really more focused on providing a physical AI platform. Think about it as NVIDIA being the brains and UR being the muscle, and the particular application is something that our solution partners are going to build. So we're looking to do things that enable them to do that, and we've done two things in 2024 that are really important to support that.
One is a new version of our software: a modern tech stack, great cybersecurity, and an ability to let partners plug in their software in a way that protects their IP. The other is that we worked with NVIDIA to release an AI development kit called the AI Accelerator, which lets people unpack the kit and start working on their AI-based application right away, versus trying to assemble all of the bits and pieces first. There are so many entrepreneurs and researchers that want to apply AI to these robotics problems that having something that gets them 50 yards down the field is something we think will enhance growth.
That's probably realistically more 2026 and beyond, though, and next year is probably gonna be governed by the health of the industrial markets. Is that fair?
I think we were talking about this in one of the one-on-ones. If you look at our robotics business historically, we are overweight Europe and overweight automotive, and both of those are terrible right now; they're absolutely miserable end markets. Despite that, we are showing growth this year while most industrial automation peers are showing double-digit shrinkage. The reason is that we have effective new channels for UR, OEMs and solution providers, and we have great new products, the UR20 and the UR30, which this year are gonna represent 16% of our unit shipments. If we didn't have that new stuff, we would be right down where our peers are.
Looking forward to 2025, we don't expect that the market is gonna stay this bad, but we don't expect that it's gonna get much better, and we're not counting on that for our growth. What we're counting on is continued expansion of the OEM channel and the solution channel, plus new products, especially the pallet jack from MiR, which doubles the served market for our mobile robots. So we think a lot of next year's growth is under our control versus under the market's control.
Great. Well.
But you're right. In the near term, AI is a 2026 thing, more than a 2025 thing.
Got it. Okay. Perfect. Thank you, Greg. I appreciate it.
Thank you.
Thanks again.