Good morning, everyone. I'm Samik Chatterjee, and I have the pleasure of hosting Mobileye for the fireside chat, first thing here on day three. Dan Galves, Chief Communications Officer, is with us, and thank you, Dan, for taking the time to be at the conference. I'll kick it off with a few questions here, and we'll open it up to the audience a bit later for your questions as well. Dan, let's start off with a broader question about full autonomy. Mobileye was public last time around, then went private, and now it's back in the public markets. Over that time horizon, the auto industry's perception of how realistic full autonomous driving is seems to have evolved.
Mm-hmm.
Can you just start with outlining for us where we stand today in the auto industry relative to pursuing full autonomy, relative to partial autonomy use cases like highway driving?
Sounds good. Sounds good. Thanks for having me, Samik. Maybe I'll just start with a couple of basics for 30 seconds about Mobileye, and then I'll get into your questions. So most of you probably know the basics. For many years, Mobileye has had a leading position in driving-assist systems. Our system-on-chip, with software and hardware designed in-house, powers a front vision system that provides an additional safety layer and enables our customers to meet constantly tightening safety standards globally. This business is very profitable; it generated about $2 billion in revenue last year at a 70% gross margin. That funds our entire operating expenses of around $800 million last year, the bulk of which is R&D, plus substantial free cash flow on top of that.
I'd note that about 80% of our R&D is related to advanced products that are still low in volume and generating losses. Some of these advanced products, like SuperVision, are already in production. SuperVision represented about 6% of our revenue in 2023, but on only 0.2% of the volume. The power of these advanced products is that they carry selling prices that are 20-60 times higher than our current high-volume products. And this has been showing up in the annual bookings we report. For example, in each of the last two years, we've been awarded design wins from automakers that project to around 60 million units of future volume.
For context, we actually shipped 37 million units in 2023. Those awards project to around $7 billion of future revenue, against our actual 2023 revenue of $2.1 billion, so this kind of bookings-to-billings ratio is very high for us. The potential for success of these advanced products has been the main focus of investors since we've been public. The market's confidence level can move up and down based on the pace of design wins and other macro factors, and it's at a relatively low point now. But nothing has changed in our view that the market for hands-free products will develop into a very large new TAM, and that we have the right technology, the right cost, and the right relationships to take a leading position there.
Now, to your question, I think we have to think about the use case of autonomy. Autonomous vehicles can be deployed in networks of cars that move people around a particular city or move goods around, or they can be deployed in consumer products. If you go back five or six years, to the time when we were acquired by Intel, most of the attention in the industry was around the fleet-deployed, robotaxi type of vehicle. No one was really thinking too much about consumer-level autonomous vehicles, but we were.
So if you're building technology for a robotaxi fleet, it's okay if it only operates in one or two cities. You work on your system in that city, get it up and running, and apparently the thought was that the demand would come, though you then have to scale to different cities to build your business. It's okay if your vehicles carry $50,000-$100,000 of incremental cost versus a normal vehicle, because you're replacing a driver, which is very expensive, and theoretically you can get to cost parity with human-driven vehicles.
But the technology doesn't really translate to consumer-level vehicles, because with consumer-owned vehicles, you can't expect somebody to pay $50,000 extra for an autonomous vehicle. It needs to be in the single-digit thousands in terms of incremental cost. And obviously, you can't sell somebody a car and say, "We've got this amazing system, but it only operates in a few different cities." It needs to be scalable geographically. So our approach was always to think about a technology that would serve both businesses, and starting from that standpoint, you need to engineer your system for scale, both cost-wise and geographically. And that's really what we've been doing for the last six or seven years.
And I think the other aspect here is that our North Star has always been fully autonomous vehicles, but the way we approach the problem, these systems can also be used in semi-autonomous vehicles, which is the kind of system that we are in production with now. So I think a lot of the skepticism or negative press around autonomous vehicles is more in this robotaxi field, which really hasn't turned into a scalable business yet. It may someday. And in the meantime, automakers have become much more aggressive, or assertive, in terms of
wanting to deploy vehicles with increasing levels of autonomy over time, starting with highway point-to-point navigation, which is where the target is today. But the ultimate goal of the automakers is to get to an eyes-off system, where you can start giving people their time back, at least on their commute. So that's our perspective on the industry.
Okay, good. So maybe just then going into the products: how is SuperVision helping you bridge that gap when you think about going from L2 to L4? How critical is SuperVision and the capabilities that you're trying to deliver through it?
Yeah, it's very critical. So again, back to what the end goal is: at this point, where we see the most interest from automakers, or the most strategic thinking about how to profit from this type of technology, is ultimately being able to sell vehicles where, at least on the highway, people can do other things while the car is driving them. We call it eyes-off, hands-off. But to get there, the performance requirements are such that, in our view, you would have to be able to demonstrate and validate that the system is more accurate and safer than a human-driven vehicle, right?
And we've done a lot of work with our automaker customers on what that human accuracy level is, because you can think of it in terms of a significant crash or a fender bender, these types of things. What the industry is centering around is that you really need to have a mean time between critical interventions of about 1 million hours, right? So the performance requirements to get there are huge. SuperVision provides a bridge to eyes-off. SuperVision, the way we build it, is essentially 11 cameras around the vehicle and software inside the vehicle that interprets the data from those 11 cameras to create what we call a sensing state, which is essentially a picture of the environment.
We integrate our crowdsourced mapping, which boosts the accuracy of that view of the world by adding in information such as the common speed of a particular road, or which traffic light is relevant to the left-hand turn versus the straightaway, many different pieces of information within the map. Then you need decision-making software that uses the information from that sensing state to make decisions. So for this camera-only system that we call SuperVision, our target is to get to about 1,000 hours between critical interventions, and we're on the right path to get there. Right now, this is supported by our EyeQ5 chipset platform.
There's a significant amount of new technology within the EyeQ6 platform, which launches in the first half of this year, and in the lab and testing environment with samples of this new platform, we're seeing very significant increases in the mean time between failure. But that's still not good enough to allow a driver to disengage, or for the OEMs to take on the liability and the risk that would entail. So to move from SuperVision to Chauffeur, which is our eyes-off system, we add a second perception layer made up of radars and LiDARs that would have an equivalent mean time between failure.
If you have those two independent perception systems, then the chances of both of them failing at the same time go way down, and this is how we get to one million hours. Now, from a business and commercial perspective, the OEMs, by adopting SuperVision, are actually developing and validating most of the Chauffeur system, because really, all they need to do to move from SuperVision to Chauffeur is validate the radar and LiDAR system. So it creates this scalable bridge. It also, I think, protects in a way against potential regulatory delays, right? Because this is not the kind of thing where the regulators are gonna allow you to just check a few boxes and say, "It works." They're gonna want proof that these systems are safer than humans.
So if that takes longer, the automaker still has a really nice, high-functioning system for their consumers that they're profiting from. So that's really how we approach the problem.
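The "true redundancy" arithmetic Dan walks through can be sketched in a few lines. This is a rough illustration only: it assumes the two perception systems fail completely independently and that each hits the stated ~1,000-hour target, which are simplifications, not Mobileye's actual reliability model.

```python
# Illustrative sketch of the two-independent-subsystems argument.
# Assumption: each subsystem has a mean time between critical errors
# of ~1,000 hours, and the two fail independently of each other.

camera_mtbf_hours = 1_000       # camera-only subsystem (SuperVision target)
radar_lidar_mtbf_hours = 1_000  # independent radar/LiDAR subsystem

p_camera = 1 / camera_mtbf_hours          # per-hour critical error probability
p_radar_lidar = 1 / radar_lidar_mtbf_hours

# A critical event requires both subsystems to err in the same hour:
p_both = p_camera * p_radar_lidar
combined_mtbf = 1 / p_both

print(f"{combined_mtbf:,.0f} hours")  # 1,000,000 hours
```

Under these assumptions, two 1,000-hour systems combine to roughly the 1-million-hour figure cited as the industry target for eyes-off operation.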
Got it. Got it. And maybe we move a bit to talking about how you're supporting the OEMs and their aspirations to work through this roadmap. Obviously, some of the OEMs have their own in-house aspirations of what they can do themselves. What does a typical engagement with an OEM look like? How are you accommodating their in-house aspirations, and what do you offer them to make sure that they use most of your stack rather than theirs?
Yes. So I think, again, if you go back into history five or six years ago, where there was interest and investment in consumer-level AV systems, it was generally happening through self-development by the OEMs, or that was really their direction: "We need to become a software company, and this is a high-value area of software, so this is a good place to put our investments." You had, I think, probably some thinking of, "If Tesla can do it, why can't we do it?" And I think at the time, Mobileye didn't have a production-level system that we could offer, right?
We were still working on the SuperVision system. And I think also from a control perspective, the OEMs rightfully realized that with an ADAS safety system, where all you're doing is trying to avoid collisions with the car in front of you or avoid moving out of the lane and provide a warning to the driver, it doesn't really require much customization. But when you're thinking about a system where the car is gonna be driving for the owner of the car, then you have to think in terms of: is it gonna feel comfortable, right? Like, what's the braking profile? When should we stop? When should we start braking in front of a stop sign?
Should it be super gradual and start very early? Should it be more at the end? These decisions are, I think rightfully so, decisions that the OEM feels like they need to own. And because they needed to own that, they thought, "Well, we need to own the entire decision-making software of the car." And because the decision-making is very integrated with the perception, they said, "Well, we'd better own the perception, too." So I think this is the reason why you saw so much investment in self-developed software systems for semi-autonomous vehicles. It hasn't worked, right?
There have been many examples of multiple years of heavy investment without any real level of success, or a success in the form of a system that is too expensive, doesn't really scale, and doesn't have a path from eyes-on to eyes-off. Over the last couple of years, the sense of urgency about having this type of technology in vehicles has increased. Part of it is Tesla continuing to improve their system. That pressure really ramped up a few months ago when Tesla cut the price of FSD and started giving free trials, making it seem like more of a commercial strategy.
The Chinese automakers have also been successful in putting Navigate-on-Pilot-style intelligent driving systems on the road, and the Western OEMs understand that they're coming to Europe, so they need to compete. But we still had this roadblock of: where does Mobileye's role end and the OEM's role begin? And we've tried various things over time. We opened up the architecture of the chip to enable the OEMs to put their decision-making software on our chip, which would create better integration and lower cost. But I think the view is that they won't have decision-making software that can support these types of performance requirements.
So over time, we realized that it's really only the driving experience that they wanna control. What we've done is create an API where you can essentially take the universal parts of the software, the things that are related to safety or interpreting the environment, the things that consumers won't ever see. You take that from Mobileye, and then you essentially have tuning knobs on different parameters of the driving experience that the OEMs can code themselves. And then after the vehicle's in production, if there are complaints like, "Hey, this seems too reckless," or, "This seems too assertive," they don't have to come back to us and wait for us to fix it. They can do it themselves.
So we call this the Driving Experience Platform, and we feel like it's really the sweet spot: it enables the OEMs to have the control that they need, but they don't have to take the risk of trying to develop the core parts of the system themselves, and they can take advantage of our scale.
Okay. Okay. Great. So now if we go back to talking about the portfolio and the differentiation there: you start with basic ADAS, and on the other end is Chauffeur and mobility-as-a-service. The differentiation is clearly higher as you go towards Chauffeur and mobility-as-a-service, but there's also the threat of disruption, of other competitors coming in, though you do have an advantage on the cost side in basic ADAS. So when you think about disintermediation in terms of the wins that you have, where do you see the bigger threat?
So I think the basic ADAS business is very strong. I think we have significant competitive advantages. One is scale, right? We did 37 million chips last year, which means there's 37 million units of capacity to put the chip on a circuit board, buy the camera, and turn it into a system. This is all work that's done with the Tier 1s. And this is really a cost business, because it's not something that the OEMs make money on, right? They need it for safety ratings and compliance, so keeping costs low is extremely important, and so is not causing recalls.
We've never been the driver of a recall. There were a couple of OEMs that tried to second-source and diversify away from us a few years ago, and both experienced recalls in the last three or four months. So that's also a big advantage for us. I think in China there is some competition on the low end, for very simple systems on low-priced vehicles that probably didn't have ADAS a few years ago, and we're dealing with that competition. We always have some level of competition, but these systems won't work outside of China; they're not up to the standards of Western
countries. So we feel good about our positioning in ADAS, and we won 26 million units of new ADAS business in Q1. That's not gonna be a quarter-after-quarter type of number, but it shows how we continue to win business at a very high rate. Chauffeur, on the other hand, on the farther end, is something where we have not seen or heard of a real approach to get to the performance requirement of one million hours between failures. And we've been very visible in terms of our true-redundancy concept of having two independent perception systems.
And so take Tesla as an example, which right now is maybe at 10 hours between interventions, which is actually very good and creates a good product. To get from 10 hours between interventions to one million hours with only a camera system, only trying to find corner cases and incrementally improve the system, doesn't seem viable to us. So we feel like we have a very large competitive advantage in terms of getting to the performance requirements that are necessary for an eyes-off system. Now, in between, eyes-on is more competitive, right? Because there are entities in China, like XPENG and Li Auto, that have put systems like this on the road.
There are some suppliers in China that are pursuing this market. There are no specific performance requirements there beyond a system that works. Now, where we see ourselves benefiting in this area is that this type of system is inevitably going to be a balance of performance and cost, right? You want high performance, but how much are people willing to pay for a system where you still have to keep your eyes on the road? It's not gonna be $6,000-$8,000. So cost is gonna be very important. When we look at the systems on the road in China, we see multiple LiDARs and multiple radars, and the same number of cameras or more.
We see NVIDIA Orin chips being used, which have anywhere from 10-20 times the processing power of our chipset, which means cost, which means power consumption. And we know that some of these automakers, and these startups are not particularly strong financially, have 2,000-3,000 engineers working on this. So if you think in terms of $100,000 per engineer, that's $200 million-$300 million of spending on engineering for these systems. If you put that into 100,000 cars, that's $2,000-$3,000 extra per car. Our system is essentially $1,800 all-in to the automaker.
Our benchmarking would say that just the material cost of these systems is $3,000 or more, and then you have to add in the engineering cost as well. So we think we have a very large cost advantage that will play out over time.
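The cost comparison being made here works out as follows. This is a back-of-the-envelope sketch using the round numbers cited in the conversation (2,000-3,000 engineers at roughly $100,000 each, spread over 100,000 cars); these are illustrative figures from the discussion, not audited cost data.

```python
# Amortizing self-developed engineering spend across production volume,
# using the illustrative midpoints from the discussion above.

engineers = 2_500              # midpoint of the 2,000-3,000 range cited
cost_per_engineer = 100_000    # USD per engineer per year, as assumed above
annual_volume = 100_000        # cars per year absorbing that spend

engineering_spend = engineers * cost_per_engineer     # $250M per year
per_car_engineering = engineering_spend / annual_volume

material_cost = 3_000          # benchmarked bill-of-materials estimate, USD
mobileye_price = 1_800         # quoted all-in price to the automaker, USD

self_developed_total = material_cost + per_car_engineering
print(per_car_engineering)     # 2500.0  (the $2,000-$3,000 extra per car)
print(self_developed_total)    # 5500.0  vs. the $1,800 all-in system price
```

The point of the comparison: at these assumed volumes, amortized engineering alone exceeds the quoted all-in price of the supplied system before materials are even counted.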
Okay, interesting. Sticking with SuperVision on that front then, one of your big customers is ZEEKR. How should we think about the impact of the recent EV tariffs-
Mm
... that were put on Chinese OEMs by the U.S. administration in relation to growth aspirations for ZEEKR, and eventually their impact on SuperVision's growth?
Yeah. So the geopolitical situation is kinda tough to keep up with. I think it's one of these good-and-bad things, right? Our main customers are legacy automakers; they're not startups. And we've seen market-share shifts over the last few years from legacy automakers to startups or to domestic Chinese automakers. It's happened in China, but we're also seeing significant production growth in exports out of China. And we have a good position with these OEMs, but it's not like we have 100% of the business for most of them, right? So I think protectionism in North America and Europe, in some ways, helps us because it helps our main customers.
But it hurts us in a way, because we want this competitive pressure to lead to our main customers moving faster to deploy these types of technologies. The U.S. tariffs don't affect ZEEKR because they didn't have any ambition to come into the U.S., but they are selling cars in Europe, and there's obviously talk of protectionism there as well. So for now, no impact, but we would want our customers like ZEEKR and Polestar to be able to have success outside of China, because I think it helps push our other customers to move faster.
Okay, great. Can you give us an update on wins for SuperVision, and particularly, when do we start to see a bit more diversified OEM exposure that reduces the sort of customer concentration risk for SuperVision?
Yes. Yeah, good question. So right now, we have production agreements with four automaker groups. ZEEKR and Polestar are within the Geely group, and those are in production today; that's what's driving our volume in 2023 and 2024. We launch a system with FAW, which is a state-owned automaker in China, in Q4 of this year. That is expected to drive significant volume growth in 2025, but still mostly focused on China. We have a design win with Mahindra that should launch sometime in 2026; that'll be pretty low volume. And then the first global agreement we were able to sign is with Volkswagen Group, and that launches in the first half of 2026. So that's seven.
Right now, we have five vehicle models in production, and there are another four or five coming next year. The Porsche and Audi business inside VW Group is for 17 models launching over a period of a couple of years, so that should drive a lot of diversification, globally and customer-wise, and a lot of volume growth as well. And then beyond that, we're in what we call advanced discussions with 10 additional automakers. In total, those 14 represent almost half of the industry in terms of production share. So these design wins are really important to continue this diversification process, and also to generate more, maybe FOMO is a good word, competitive pressure to move fast and deploy these systems.
This has been a huge expansion of the pipeline, because if I go back to when we were IPO-ing in late 2022, I would have said we were in either production agreements or advanced discussions with three OEMs, and now it's 14. That demonstrates the impact of the lack of success in internal systems, more competitive pressure, and the proof points that we've provided by launching this system in China.
Okay. Okay, got it. Let me open it up and see if anyone in the audience has a question they want to ask. Any questions? Can we get a mic over?
Thank you. This is Mahfouz from Owl Creek. A quick question in terms of labeling and the impact of AI. From my understanding, you guys have a very strong lead because of the labeling component built up over time, and you have a team working on that. There's been a lot of talk of startups and gen AI being able to work on that auto-labeling aspect, and I think Tesla has spoken about it before.
Mm-hmm.
Do you have any thoughts on that?
Yeah, good question. So we've been auto-labeling for four or five years now. We still have a manual labeling team, but it's shrunk quite a bit over time. We've been collecting video data for 15 to 20 years now and have about 250 PB of labeled video data. The last time Tesla talked about their data set, they talked about 30 PB, so about 12% of what we have now; I'm sure it's grown. So I think auto-labeling is important, it has created more efficiency, and it's something that we're doing. On gen AI: our CEO and our CTO are also academics, world-renowned researchers in the AI space.
There are several other startups that Amnon has brought out in the last few years, some of which make very heavy use of gen AI, and so do we. We published a blog last week, because this question is creating a lot of noise; in some ways, it's making Mobileye look like the old school, right? And there's this false dichotomy of end-to-end AI or model-based, and nothing in between. And like everything else, the truth is in the middle, right? So I'd encourage people to go to our website and read the blog.
It's a little scientific, but essentially the takeaway is that even if you look at the most leading-edge gen AI developers, like OpenAI or Google, their latest networks are an engineered approach like ours, right? It's called compound AI, and the idea is essentially that you need to connect the different pieces of the system together in order to drive reliability, accuracy, and efficiency. So I'd encourage people to read more about how fast things are changing; the industry has actually already moved beyond this kind of pure end-to-end system.
And I would say it aligns very well with the approach that we've taken of creating this sensing state, of knowing exactly what the vehicle is seeing around it, and using that to make decisions, instead of just taking video and essentially making decisions based on a network. That works pretty well for a 95%-96% accurate system, but it would be unprecedented to get to the type of accuracy levels that you need for automotive.
Dan, let me ask you one on the financials. You've talked about the inventory digestion, with a large part of that playing out in Q1. Maybe just update us on how much more there is to go in terms of absorbing inventory through the year. And I know we've talked previously about 2025 being the year where you think about resuming normal growth-
Mm-hmm
... but the base number to work off is the one that's adjusted for inventory this year, so higher than what you're shipping. Is that still the thought process, that next year we just build off the inventory-adjusted numbers for 2024?
That's right. That's right. So our guidance this year for EyeQ unit shipments is 31-33 million, but we've also said that we'll be consuming around 6.5 million units of inventory that was in the system as of the start of the year. So if you put those two numbers together, 38.5 million is what we would say is the true demand in the market. And yes, we've been pretty open that for 2025, you should put a growth rate on that all-in number, not just on what we'll actually ship this year. We're very encouraged by the progress that we've made on reducing inventory, and by the validity of the numbers that we disclosed on January 4th.
In Q1, we only shipped about 3.5 million units, but 70%-75% of that 6.5 million of excess inventory was consumed during Q1. We know that through reporting that we're getting from our Tier 1s, and also by analyzing what was actually produced globally in the quarter, because we know what cars have our chip and what cars don't. And as we talked about on the earnings call, we saw a significant uptick in Q2 orders from the specific OEMs that had the most inventory. So by the end of Q2, we expect all of the excess inventory to be gone, and to get back to more normalized volume levels.
So we feel like we're on track with the inventory digestion this year, very much aligned with what we talked about on January 4th when we disclosed it.
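The inventory arithmetic in this answer can be laid out explicitly. A quick sketch using the figures cited (midpoint of the 31-33 million guidance; the 6.5 million excess and the 70-75% Q1 consumption are as stated):

```python
# Reconstructing the "true demand" and Q1 inventory-consumption math above.

guided_shipments = 32.0   # million EyeQ units, midpoint of the 31-33M guidance
excess_inventory = 6.5    # million units of channel inventory to be consumed

# Shipments plus inventory drawdown approximates underlying market demand:
true_demand = guided_shipments + excess_inventory

# Q1: roughly 70-75% of the excess was consumed despite low shipments.
q1_consumed_low = 0.70 * excess_inventory
q1_consumed_high = 0.75 * excess_inventory

print(true_demand)  # 38.5 (million units, the "all-in" base for 2025 growth)
print(round(q1_consumed_low, 2), round(q1_consumed_high, 2))  # 4.55 4.88
```

So roughly 4.5-4.9 million of the 6.5 million excess was absorbed in Q1, leaving the remainder to clear by the end of Q2, consistent with the timeline described.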
Just to follow up: as we think about basic ADAS, or the whole ADAS stack outside of SuperVision, how much of that volume through the year will be dictated by underlying production versus new model launches? Because I would assume that helps the inventory situation quite a bit as well.
Yeah, I mean, I think at this point, the adoption rate of this technology is pretty high, so a lot of it is our new launches. Maybe there's a new Nissan Altima-
Mm
... that's coming out this year. You know, the old Nissan Altima-
So sort of this-
Had an ADAS system from us. So we do have some level of vehicles that are going from zero ADAS to some ADAS, and that's really what creates this underlying five to seven points of volume growth per year that we should be able to sustain through the next five or six years.
Okay. I'll take this question that's come in online, and let me read it out. "Can you share any update about NZP in China, and any thoughts on the competition situation over the next six months to a year?"
Yeah, so NZP is what ZEEKR calls their eyes-on, hands-free Navigation ZEEKR Pilot system that is built on the SuperVision platform. Things are going well. ZEEKR refreshed the 001, which is the main model we're on, and it led to a significant uptick in orders that started to come through in volumes in April, and so far it's looking good in May as well. So that's very encouraging. There was a recent survey of ZEEKR users, and more than 80% were either somewhat or very satisfied with the system.
I think the numbers were something like 60% saying they would pay over $3,000 to get highway and urban NZP on a lifetime basis. So we're hearing very encouraging signs in terms of consumer demand and consumer satisfaction with the system, and we see ZEEKR as a really good customer for us.
Okay, great. We are up on time, so I'll wrap it up there. Thank you for coming to the conference, and thank you to the audience as well.
Thanks, Samik. Thanks, everyone. Yeah.