Good morning, everyone. I'm George Gianarikas, one of Canaccord Genuity's sustainability analysts. We're very excited to have Mobileye with us on the second day of our 44th Annual Growth Conference. With us from the company is Dan Galves, Chief Communications Officer and an old friend. Dan, thank you for coming.
Thanks, George. Thanks for having me.
Really appreciate it. Maybe to start off with a softball: a quick overview of what the company's been up to in the last quarter, the last several months. If you could run through some bullet points, that'd be great.
Yeah, sure. We reported earnings a couple of weeks ago, and we're definitely experiencing some near-term challenges in the ADAS business. It's been a mix of macro and micro. On the macro side, global production forecasts have come down pretty significantly over the last three or four months, creating lower volume in our ADAS business. A lot of that's related to market share losses of some of our core customers inside China. We also saw some reduced orders from the local Chinese OEMs that are impacting the ADAS business, so a fairly significant reduction in the guidance for the EyeQ business.
There are also macro issues in terms of tariffs, which create less opportunity for some of our SuperVision customers to sell cars in North America and Europe. So the near term is definitely a little bit challenging, and a lot of it is isolated to geopolitics and China specifically. We're seeing quite a bit of strength in the remaining parts of the business, but some challenges in the near term. On the longer-term side, we talked about multiple positives that we're seeing.
The first thing I'd note is that incremental safety testing requirements, which have started to be publicized for three or four years from now, are creating an opportunity for a next generation of ADAS driving-assist systems that would be significantly more valuable per car to a company like ours. I think that's a really important positive. We're also seeing continued progress with our SuperVision engagements. These things don't come quickly, but we're passing milestones, getting more clarity on timing, and continuing to expect design wins in the back half of the year.
We also talked about our EyeQ6-based platform in general, and all of the new technologies being put into it. It's a really scalable set of technologies that goes everywhere: from single-camera base ADAS, to the Surround ADAS category, up to SuperVision and Chauffeur. That scalability is really valuable to our customers in terms of creating synergies across their product portfolios. We're also seeing orders-of-magnitude improvement in the performance of the higher-end systems, which is really encouraging.
If you look at what's on the road today with SuperVision, it's an EyeQ5-based platform that was developed three, four, five years ago. The EyeQ6-based platform has a ton of new technology that's showing significant promise, and that's in front of our customers today in samples. So that's an overview of what we disclosed a few weeks ago.
Maybe talk a little bit about the new segmentation that you defined on your conference call: four separate categories, with potential wins and some new ones like Surround ADAS. If you could go through all four, that'd be great.
Yeah, that sounds good. SuperVision and Chauffeur have always been seen by the automakers, in this first generation, as most relevant to fairly high-priced vehicles. These are aspirational products with the end goal of an eyes-off system, where for 80-90% of your commute you can be doing other things. That's the real goal, but it's targeted at aspirational brands and aspirational price points. On the low end, adoption rates for base ADAS driving assist are pretty high in the developed markets, probably 80-85%. China is maybe 65%, and then you've got markets like India, Latin America, and South Asia that are in the 10% range.
So there's a desire for, let's say, starter-kit ADAS systems, where price and performance are in balance but the performance requirements aren't up to where they are in North America and Europe. On the low end, automakers are looking for a somewhat lower-priced system for emerging markets, and we have a version of EyeQ6 that launches in the middle of next year where we could sell in the low $30s and maintain a 70% type of gross margin. So that's the high end and the low end.

What's happening in the middle is maybe the most interesting factor. Today, the majority of vehicles have a base ADAS system that supports automatic braking, lane keeping support, adaptive cruise control, things like this. The performance requirements continue to get more difficult, which means every generation you're putting more features into the vehicle, and that creates better performance for the customer. That's the bulk of the volume, in what I would call mainstream brands and mainstream vehicle price points. And then you've had these so-called Level 2+ systems: Ford BlueCruise, GM Super Cruise, BMW Drive Pilot, Nissan ProPILOT, there are lots of different names. Mobileye is involved in a lot of them, but really only from the front-facing camera perspective, with the OEM directing the rest.
Maybe there are some radars around the car, maybe a lidar, maybe some high-definition mapping. These systems have been challenged to scale, so they're only on a very small percentage of the market today, but consumers like them: you can get on the highway, in some cases take your hands off the wheel, and the car is steering for you in the lane and controlling your speed. So there's been a desire to create a more efficient, next-generation version of these systems that wouldn't have six, seven, eight suppliers involved, lots of different ECUs, too much cost. We've been working on an EyeQ6 High-based system for that.
At the same time, new safety testing criteria for 2028 have recently been indicated to the automakers, and they take the performance to a point where it can't really be handled by a single front-facing camera.
Mm-hmm.
You need to add additional sensors, and so it's creating a midpoint where you can do both of these things at the same time: you can create a more efficient Level 2+ convenience system, and at the same time protect yourself for late-decade performance requirements. This is creating an opportunity for us called Surround ADAS. So the four segments we're talking about are basically emerging-market ADAS; the typical base ADAS that we're producing today in high volume; SuperVision and Chauffeur at the top end; and then this new category of Surround ADAS that probably starts at relatively low volume-
Mm
... but could quickly become, you know, similar volumes to the ADAS business today over time.
I want to focus first, before we dig into Surround ADAS, on emerging-market ADAS, because I think what you said on your call is that much of the incremental weakness you saw was due to, number one, some market share loss within the Chinese OEMs?
Yeah.
Then, second, Western OEMs losing market share in China, right?
Mm-hmm.
So it was both. Do you think that this emerging market ADAS could help blunt the impact of market share loss in China? Could you win some market share back with this product?
Yeah, I mean, it's hard to say. That was part of the impetus to develop it, although I do think markets like Latin America and South Asia are also big opportunities; each of those is probably 3 or 4 million units of potential ADAS volume. What exists in markets like North America, Europe, Australia, Japan, and Korea is a balance of price and performance: the ADAS system needs to balance the two.
Obviously the automaker wants to pay as little as possible, but there are clear KPIs, based on the star ratings and other regulations, that require the system to support a certain level of performance. And on the other side, you have the risk of recalls. If you have a system that's driving down the highway and thinks it sees a pedestrian, but it turns out to be a piece of paper or a rock, the car just slams on the brakes in the middle of the highway; we've seen recalls driven by competitive systems because of exactly that. Those mechanisms don't exist in China. You have some level of safety ratings, but the tests are very easy.
Consumers don't really trust them, and there's really no recall mechanism. So that price-versus-performance balance in China has recently become all about price.
Mm.
And there are new competitors that would not meet the standards in North America and Europe, but are offering systems at a significantly lower price than ours, and we've declined to participate. So it's leading to some headwinds in China. With the newer chip, we can get most of the way there on price and still offer a similar level of performance. But it's hard to say where the pricing game will end, so it remains to be seen.
On to Surround ADAS?
Yeah.
What are the incremental features, which you touched on a little bit, that you get with this better product over time?
Yeah. The performance tests you have to hit are things like: I'm slowly progressing through an intersection, and there's a pedestrian in the crosswalk. Can I avoid that pedestrian if they jump out at the last second? A front-facing camera with a 100- to 120-degree field of view, mounted behind the rear-view mirror, is not going to be able to identify that pedestrian off to the side. But most cars are coming with these short-range, wide-angle parking cameras that have a different purpose; we can tie into those, and that creates the vision to the side. Same thing with perpendicular vehicles crossing in front of you through an intersection.
I'm coming to the intersection, I have a green light, and there's a car about to run a red light and hit me. Can you avoid that car before you get into the intersection? Again, that really can't be dealt with from the front-facing camera. Those are some of the performance requirements driving the need for more sensors, or for utilizing sensors that are already on the car, and creating more revenue opportunity. The other set of features is more about convenience.
On the highway, the car is not going to route you to your destination or position you for an exit, but it will steer inside the lane, control your speed, and make automatic lane changes. So it's more of a highway hands-free system that already exists today, but in very low volumes and at very high cost to the automaker.
You mentioned this on your call, but can you kind of compartmentalize for us the unit potential, when that starts to kick in, and the ASP impact to your traditional ADAS business?
Yeah. The RFQs that we're responding to today, there are four of them, and they pencil out to about 1 million units a year per automaker. And these are automakers that do 3 to 4 million units a year-
Mm
... in total, so that implies about 25% of their volume. And this is for a launch in 2027, which is before-
Mm
... these requirements really kick in, so you would expect that in 2028, 2029 and beyond, it could become a bigger portion of their volume. This is something like 3 or 4 times the volume we're seeing in SuperVision RFQs today. So those are the numbers. The opportunity is: could you get to the point where a third of your ADAS volume is at this price point, and maybe two-thirds is still at the base price point? That's the magnitude the initial interest levels are implying.
That ASP is 4x plus?
Yeah, anywhere from $175 to $225 per unit, versus today, where we're averaging around $50.
And what you're seeing from the RFQs, it's incremental to SuperVision, not cannibalistic of it?
Yeah, that's right. The motivation behind SuperVision is essentially to maintain competitiveness well into the future. Everybody sees what Tesla is doing, doubling down on the FSD technology. Porsche and Audi have adopted SuperVision with us for launch in 2026. The Chinese are moving very fast in this area. So everyone wants to maintain that competitiveness, and I think where they see the ultimate value of this type of technology is in providing the capability for consumers to do other things while they're on their commute.
And SuperVision creates a bridge to that type of technology, which the automakers see value in. It's aspirational, right? In the second generation, maybe it comes down in price point, but they're really seen as two completely different categories.
On SuperVision, I think you mentioned on the call that we should expect at least one announcement by the end of the year. Can you clarify those comments?
Yeah. We have multiple opportunities that are targeted for decision before the end of the year, and enough of them that there's a likelihood you see multiple come through. We're not in control of the decision-making process, but we've been passing milestones, executive approval points, with these different customers, and we're now into a more formal process that should have clearer timing than if we were earlier in the process. So we feel good about providing more visibility on that segment, and also on the Surround ADAS segment, where we should start to see some decisions made fairly soon.
By the end of the year, for that-
Yeah.
We get a question on Volkswagen a lot, just because of their new deal with Rivian. Volkswagen is obviously leveraging Rivian's electrical architecture and software. Autonomy is out of that equation, but some have speculated that it eventually could be in. Does that make sense to you?
Yeah, I think the Rivian-VW collaboration is helpful to us, and I'll tell you why. The architectures that we're on for SuperVision and Chauffeur with Porsche and Audi are really interim architectures, created because the eventual, somewhat ground-up architectures that VW was planning were delayed because of their internal software group that was in charge of developing those-
CARIAD.
- architectures.
Yeah.
CARIAD, right. And CARIAD is also our competitor for ADAS within the VW Group. The interim architectures that we're on, our view is that they're not impacted by the Rivian collaboration. But for that next generation of architectures, I think the intent is to use the Rivian architecture. So why does that help us? Number one, we think it creates more certainty that those architectures will launch on time-
Mm
... and be good. Number two, it creates a scenario where our competitor within VW is not in charge of those architectures-
Mm
... which helps us. So from a business perspective, I think it's a good move by Volkswagen and a good move by Rivian. And our understanding, from our contacts within the VW Group, is that it helps us quite a bit in terms of ensuring that we're not just on this interim architecture but also on the next-gen architectures, which will run for 10 years or whatever.
Actually, any questions from the audience? I'm going to focus a little bit on the technical debate that we've talked about quite a bit: Tesla, end-to-end neural net, cameras only. You have a completely different approach, and there was actually a question on your call that I want to clarify. Someone asked whether you had to accelerate the build-up of your own data center infrastructure, and whether your CapEx had to kink up higher. What was the answer to that question, just to make sure we understand it?
Yeah. In some ways our approach is similar to Tesla's, and in some ways it's not. We have a heavy reliance on cameras for the base technology, just like Tesla. We also believe that the compute architecture that runs your software should be designed in-house, so it runs more efficiently and you can create architectures that are purpose-built for the software you're using. After that, we diverge. Our view is that, yes, humans drive with their eyes and their brain, but the potential for a camera-only system to get to a human level of performance is far away, and we can't really predict when that would come.
Our North Star is always to create fully driverless vehicles, but we want the technology blocks that create those driverless vehicles to be able to be utilized in systems that aren't fully driverless. I think the other place where we diverge-
Mm
... is that Elon Musk has been pretty clear that their approach is: put supervised systems on the road, where people need to keep their eyes on the road; see where the interventions and the failures are happening; pull data from the cars to identify what was going on; then grow the data set to include data that would correct those corner cases, or create synthetic data; and keep iterating on that process. Now, what's happened over the last six months? This is according to a crowdsourced data website tracking FSD version 12.3, and believe me, it's hard to have crowdsourced data about your system. In some ways, I'm glad we don't.
I'm not saying this is easy, and the performance of FSD is actually really good. But the original version was basically an intervention every 200 to 250 miles. Six months later, they've come out with the next version and grown that to maybe 330 to 350 miles, roughly a 30% improvement. Our view, and the view of our customers, is that the human failure rate is somewhere between an intervention every 3 million miles and every 30 million miles, and that's the target we need to get to.
So, six months to get from roughly 250 miles to 350 miles. This is what we've argued is one issue with a pure, brute-force AI approach: how long is it going to take you to identify all those corner cases, and how much compute and how much data? It explodes exponentially as you find more and more corner cases. Our approach is to create different modules within the system that fail in different ways, so a corner case for one approach is not a corner case for the other-
Mm
... and you understand the failure rates. The easiest example: our Chauffeur system has a camera-based perception system and a radar/lidar-based perception system. Cameras have issues with sun glare and bad weather; radars don't. So a corner case for one is not a corner case for the other. But even within the camera-based system, we use end-to-end AI for vehicle detection, and if you encounter a vehicle that's not in the data set, like some weird-looking tractor, it may be ignored. Then you have to go find more data, data with that tractor in the lane, and retrain the network.
Instead of going down that route, we created a system where the different cameras are used to build a purely geometric 3D view of the world. You don't know what the object is, but you know it's big and has mass, and you want to avoid it. I think that's really the main reason why our data needs and our compute needs seem so much smaller than what Tesla is talking about.
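[Editor's note: as a rough back-of-the-envelope sketch of the gap Galves describes, one can extrapolate the miles-per-intervention figures he cites. The compounding assumption, that improvement continues at the observed six-month rate of roughly 250 to 350 miles, is purely illustrative and not a claim made in the conversation.]

```python
import math

# Figures cited in the conversation
start_miles = 250.0        # miles per intervention, initial FSD version
later_miles = 350.0        # miles per intervention, six months later
target_miles = 3_000_000   # low end of the cited human-level range

# Assumption (illustrative only): improvement compounds at the observed rate
rate = later_miles / start_miles  # ~1.4x per six-month period

# How many six-month periods to close the gap at that pace?
periods = math.log(target_miles / start_miles) / math.log(rate)
years = periods / 2

print(f"~{rate:.2f}x per half-year implies ~{periods:.1f} half-years "
      f"(~{years:.0f} years) to reach {target_miles:,} miles per intervention")
```

Even under this generous constant-rate assumption, the sketch lands in the range of decades, which is the scaling concern the answer above is raising.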
It's a great place to stop. Thank you, Dan. Appreciate your time.
Thanks, George. You, too.