Mobileye Global Inc. (MBLY)

Barclays 42nd Annual Industrial Select Conference

Feb 20, 2025

Moderator

All right, thank you, everyone. As we continue day two of the Barclays Industrial Select Conference autos track, I'm Dan Levy. I lead autos and mobility coverage at Barclays. Very pleased to have with us Mobileye, a global leader in ADAS and autonomous driving solutions, represented by Dan Galves, who leads IR and communications efforts at Mobileye. We're going to do a series of fireside chat-style questions. Folks, if you have questions in the room, please raise your hands, and we can handle those at the end.

Dan Galves
Chief Communication Officer, Mobileye

Thanks a lot, Dan, for having me.

Moderator

OK, great. I want to start, if we can, with the outlook on 2025, because that's probably the best place to begin. We went through, I would say, some choppiness with the numbers. You're guiding to EyeQ shipments in the 32-34 million unit range, which I think is flat to down slightly versus the run rate you provided in the second half. Maybe give us a sense of how the numbers are playing out and the puts and takes behind them. Let's start there.

Dan Galves
Chief Communication Officer, Mobileye

Yeah, thanks, Dan. So yeah, it's down a little bit from the second half run rate that we shipped. I think the key point here is, as we said on the earnings call, the indication or kind of the assumption for Q1 in terms of volume is about 25% of the full year. And if it turned out that way, it would be unusual, right? Because you have kind of seasonality in the back half. Typically, you'd have 2% or 3% more volume in the back half than the first half. The reason we're taking a little bit more of a cautious stance is really, you know, I wouldn't look at it as overly conservative. I would look at it as kind of reflecting the reality of the market, where our core customers have been kind of underperforming the market on a fairly consistent basis the last couple of years.

We took a haircut to expected 2025 production, as expected by the OEMs themselves and by third parties like S&P and IHS, to reflect a worse environment in China for global OEMs than is currently being expected. The roughly one million unit gap between our assumptions and the indications from customers or from third parties really reflects more like a 20% decline in China volume for those core customers, which is approximately consistent with last year but worse than what the third parties are thinking right now, something like 7% or 8%. So a pretty clear one million unit gap. We would also typically get a bit of pull forward in Q4.

There are price bands, volume bands, that lead to lower pricing if you buy toward the end of the year. That happened last year; we're assuming it doesn't happen this year. But that's also a strong case for Q1 actually being a strong quarter relative to the rest of the year. So I think we're just taking more of our own view of the realities of the market.

Moderator

OK. Within the guide, there is, call it, 1.5 million units from share gain and just ADAS adoption, which from a practical standpoint, when we look at it, is to us either launch activity or uptake of better take rates on your technology. So can you just help decompose that? How much of this is launch driven? And if it's launch driven, are there key launches we should be watching for?

Dan Galves
Chief Communication Officer, Mobileye

Yeah, it's a good question. I mean, this kind of quote unquote share gain is related to four or five OEMs where we're still not the vast, vast majority of their business. We had one conquest business a couple of years ago that is now ready to launch, or probably some vehicles launched last year. So these are not, I wouldn't say, specific launches that are happening. It's new platforms and derivative products coming out with our technology, where the vehicle produced last year didn't have our chip in it. We can't identify any major portion of that 1.5 million that is coming from one or two key launches. So I think it's pretty solid; there's clearly identified volume of vehicles coming off platforms that have already launched.

Moderator

And I know you're powertrain agnostic, but I think in general, we've seen a confluence of premium content on EVs. Do we need to be watching EV take rates this year to determine whether you'll hit your numbers?

Dan Galves
Chief Communication Officer, Mobileye

No, because these are just base ADAS programs on kind of normal vehicles, right? So I'm sure that there's some EVs in there, but kind of in kind of a normal mix with combustion engines. So nothing to worry about there this year.

Moderator

OK, great. OK, let's talk about customer awards, and I want to start with maybe a broader question. I think that the point of the capital markets day, at least one of the themes you were trying to hit on, was the clear lead that you had in non-Chinese ADAS, right? That in Western ADAS plus parts of non-China Asia, you're dominating. So maybe you could just start, what are the key data points that you could point to that demonstrate you still hold this lead?

Dan Galves
Chief Communication Officer, Mobileye

Yeah. So I think, I mean, if you look at our overall volume, 33 million units this year, you're talking about a very high share of our core 10 customers, which make up about half the industry. Those customers make up about 41, 42 million units of production a year. There's a little bit of Chinese OEM volume in there, so we're generating maybe 31 out of that 42, a very high share. We've won 95%-plus of the RFQs from these companies that have been awarded over the last two or three years. I think EyeQ6 Lite is paying some dividends. This is a chip that, at the same price, adds features that are going to be required either this year or in the next iteration of safety compliance requirements. So we're continuing to innovate even on the base side.

We just recently won an award with a customer that we haven't had any new business with since 2016. So we're starting to kind of see the benefits of EyeQ6 Lite as well. But yeah, our entry base ADAS is just as good as it's ever been.

Moderator

In your CMD deck, there was a slide a lot of people are referring to, with some flags that show your progress or the opportunities on winning key awards. There was a Japanese flag, which I think a lot of people are assuming is Nissan, that you're close to on Supervision and Chauffeur. There's a European flag where it appears you're close on a Surround ADAS award; our guess, as we've written, is the VW brand. Maybe you can just talk to what it's going to take to close out some of these deals, and maybe the cadence of announcements we can expect in 2025.

Dan Galves
Chief Communication Officer, Mobileye

So yeah, I mean, if you study that chart, you'll see that we have advanced engagements with nine of our 10 top customers for advanced products, so beyond the base ADAS product set, and that continues to be the case. There are a couple of others on that chart that are outside of that top 10. Nothing's really changed; I don't have too much to report. It's probably worthwhile to talk a little bit about the difference between Surround ADAS and Supervision engagements. Surround ADAS is more of a normal RFQ type of process, and that's because it's a have-to-do, right? There's a safety compliance requirement: the 2028 Euro NCAP rules are getting so stringent that you really can't get a five-star rating anymore without multiple cameras around the vehicle.

There's also a huge need to create more efficient Level 2+ on the highway. I think Level 2+ on the highway is a feature set that every OEM knows they need to have, and they need to have it fast. The first generations of these systems were multi-ECU, multi-supplier, with the OEM in charge of integrating the system. I think it was a way for the OEM to create some level of control by splitting the supply base and using different suppliers for different pieces. We are almost always the front-facing camera on, I mean, I don't want to name names, but Super Cruise, BlueCruise, Drive Pilot, all of these different systems. But the fact that they're multi-ECU means too much cost.

The fact that there's so many suppliers involved creates kind of too many cooks in the kitchen and results in kind of relatively low-performance systems. So this Surround ADAS package not only hits the safety compliance, but it creates kind of a more efficient, low-cost, scalable Level 2+ on the highway, so highway hands-free. And so I think when you see awards coming here, these are kind of industry-first, consolidated ECU. It's very kind of in line with software-defined vehicle. They're all OTA capable. And it's actually very competitive, right? Because you have kind of traditional competitors trying to use this as a way to solidify their business. But we see many advantages for our solution. So I think you'll likely see announcements here relatively soon. And I think it's a more predictable process. Supervision is kind of visionary for the OEMs.

There's no real compliance or have-to-do component. There are definitely competitive aspects, where the OEMs understand that this is the next generation of driving and you need to have it. But when to do it, how to do it, how to create differentiation with your competitors, how to see it as a way to possibly move ahead of Tesla, that creates a situation where you have to get it right. Or that's the view of the OEMs. And maybe timing is not as important as making sure you're making the right decision. Here, you don't really have competition; the competition is more internal, internal development. Should we do that? Should we maybe wait and see what happens over the next year? It's more, how do we create uniqueness? Like, Mobileye has Porsche and Audi.

We need to have something slightly different than that. And we have ideas on how to bring that differentiation. But it creates less predictable timing in terms of the decision. The specific one you're talking about was involved in consolidation talks. In one way, it didn't really matter, because this is for a 2027 platform, and clearly this consolidated entity was not going to have consolidated platforms until later than that. But in another way, when you have noise like that in the background, it does lead to a little bit of paralysis in decision-making. I think they've moved past that at this point. So we're re-engaged, and we feel good about it. We feel good about many of these programs.

Moderator

Great. And I want to maybe just double-click here on the different products. At your customers, are the decision-makers choosing between Surround ADAS and Supervision the same set of decision-makers? And maybe, what gets an automaker to say, you know what, OK, we want this, it's about creating an experience for the customer beyond just compliance or requirements?

Dan Galves
Chief Communication Officer, Mobileye

So I think it's not really a one or the other choice. So kind of what we're seeing is most OEMs want a Surround ADAS, kind of highway hands-free type of product targeting maybe 30%-40% of their volume. And this is three to four times our normal ASP and yeah, can scale pretty quickly. Supervision is seen as more of a, in the first generation, maybe more of a 10% of their volume, probably on more premium vehicles. The decision-makers at the beginning are the same, right? You have kind of the product teams that are evaluating the technology, kind of making recommendations, doing testing. Like you have to kind of go green across all of these different criteria through this kind of due diligence phase. Then kind of purchasing gets involved. And so then it starts to diverge a bit, right?

Because Surround ADAS is, like I said, is seen as kind of a have to have. You need to make a decision. Kind of the leadership and execution that we've kind of shown over the last 10, 20 years is seen as like a huge positive. Our kind of cost efficiency, EyeQ6 High, having all of this new technology within it. So it's simpler. Supervision gets into more of, it's more political, right? Because you have kind of internal software groups that still feel like they have contributions to make. And you have kind of the CEO that could be affected by kind of noise in the media or the press. So it becomes, you need to have a lot of internal alignment within the OEM itself to kind of get to the decision.

And so we see ourselves as significantly in the lead across five, six OEMs, whatever it is. The other thing about Supervision: these are seen as aspirational programs, with 12 cameras around the vehicle, multiple sensors, and lots of compute in the vehicle, and they're seen as the first step toward eyes-off. Eyes-off is really seen as the ultimate solution, where you could give people the ability to do other things on 90% of their driving, like across most of their commute, even if it's just on the highways. So I think that plays into it as well. OEMs want this fully hands-free product, but it's more of a bridge to eyes-off, which means you have to show a pathway to the precision levels that are required for eyes-off as well.

Moderator

OK. A question on competition, which I know just keeps on coming up. Who are your competitors, and to what extent are you seeing the Chinese players, say, a Horizon Robotics as a competitor in the West?

Dan Galves
Chief Communication Officer, Mobileye

Yeah, I mean, we're not seeing them, or any Chinese suppliers, in any competitions right now. There are a lot of geopolitical barriers on Chinese-developed technology. But more importantly, those companies have not shown the ability to get to a five-star safety rating under North American and European standards, so that creates an unqualified situation. Our competition on base ADAS is the same as it's always been: Bosch, Continental, Denso. On Surround ADAS, there's some consolidation, with traditional competitors going to market either with a tech provider, somebody like Ambarella, or with computer vision through a startup, a variety of configurations there. So the competitive set is a little bit different.

But we see ourselves with cost and technology advantages, as well as the experience and ability to execute. And then on Supervision, like I said, it's really internal software groups utilizing a hardware platform like an NVIDIA or a Qualcomm. That is the real competition.

Moderator

OK, great. I want to pivot to AI, and in plain, layman's terms, please. You've talked about extremely efficient AI; this was one of the highlights of your CMD. Discuss it. How does it differ from Tesla? And then maybe you could help us understand, from a practical standpoint, how is this making, say, Chauffeur better? Is this actually going into Chauffeur?

Dan Galves
Chief Communication Officer, Mobileye

Yeah, so I mean, it's a complicated question. First of all, Mobileye has shown an ability to implement AI consistently over time. Deep neural network technology was publicized in 2011, 2012. We had a deep neural network-based, end-to-end component of the Tesla Autopilot system on the road in 2014. We used it for holistic path planning, so understanding what the route forward for the car was. So I think we've shown an ability to integrate new technology very quickly in the past, and that's what we're doing here. Transformer-based architectures, in comparison to convolutional neural nets, which were the previous state of the art, allow you to analyze a frame of video all at once instead of analyzing each pixel individually.

It does create a lot more efficiency on the training side and in the ability to understand an image as a whole. It reduces the number of steps you have to take to go from those pixels to an actual sensing state. We've been working on our transformer-based architectures for the last three years or so. In a testing environment, we're something like 100 times better on these videos than if you were using a vanilla transformer. So I think we feel really good about that part of it. When you think about what Tesla is saying publicly about monolithic AI and pure end-to-end and no glue, it implies that you don't really need engineering, or human-based engineering. We completely disagree with that.
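The frame-at-once contrast drawn above can be sketched with scaled dot-product self-attention over image patches. This is a generic NumPy illustration, not Mobileye's architecture; the patch count, embedding size, and identity projections are arbitrary assumptions made for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "frame": 16 patches, each embedded as an 8-dim vector.
patches = rng.standard_normal((16, 8))

def self_attention(x: np.ndarray) -> np.ndarray:
    """Scaled dot-product self-attention: every patch attends to every
    other patch in one step, unlike a convolution's local window."""
    q, k, v = x, x, x                        # identity projections for brevity
    scores = q @ k.T / np.sqrt(x.shape[1])   # (16, 16) all-pairs similarities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over all patches
    return weights @ v                       # each output mixes all 16 patches

out = self_attention(patches)
```

Every output row is a weighted mix of all 16 patches, whereas a single convolutional layer would only mix a small local neighborhood.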

So I think we are utilizing end-to-end networks as components of our structure. But we see challenges in terms of getting to the precision level needed to just use that. I think a good example is just kind of the long tail problem, right? So when you think about edge cases, if your plan is to identify edge cases by essentially putting the system out on the road, kind of finding the problems, kind of understanding the problems, you can do that, right? And then you either find additional data or create synthetic data and retrain the network and do this over and over and over again. Over time, the time between finding the edge cases is going to get longer and longer, right? And then you've got to retrain every single time. So it drives this need for massive amounts of compute.

So that seems very risky, and it's unclear where you would end up. If you look at the public data on Tesla, maybe their critical intervention is every eight, nine, 10 hours, something like that. When you talk to people in the industry, or OEMs, the lowest bar for an eyes-off system is 50,000 hours. Going from eight or nine hours to 50,000 hours by this process of finding edge cases, fixing them, and retraining, over and over again, doesn't seem realistic to us. The structure of our system is to essentially add redundancies that take big sets of edge cases and make them normal cases. A good example is that you would encounter many odd vehicles on the road.

They might be agricultural trucks or something like that that may not be in your data set. So what happens when you encounter one of the vehicles that's not in the data set? It won't be recognized. So in addition to our end-to-end computer vision perception, we have a backup here: a geometric system. We use the multiple cameras on the vehicle to create a 3D point cloud. We call it Vidar. It gives you a sense of, OK, there's something out there. I can't classify it as a vehicle, but it's big, it's above the ground, and I don't want to hit it. People think of redundancy as camera plus radar, and that's an important one.
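The geometric fallback described above, ranging an object without having to classify it, boils down to multi-view triangulation. A minimal two-camera sketch follows; the focal length and baseline are invented for illustration and are not Mobileye's actual Vidar parameters:

```python
# Hypothetical stereo geometry (illustrative values, not real camera specs).
FOCAL_PX = 1000.0    # focal length in pixels (assumed)
BASELINE_M = 0.3     # distance between the two cameras in meters (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Classic two-view triangulation: Z = f * B / d. A point that shifts
    by `disparity_px` pixels between the two images lies at depth Z,
    whether or not a classifier can label it as a 'vehicle'."""
    return FOCAL_PX * BASELINE_M / disparity_px

# An unclassifiable object with 20 px of disparity is 1000 * 0.3 / 20 = 15 m ahead.
z = depth_from_disparity(20.0)
```

The point is that depth falls out of pure geometry, so an unrecognized agricultural truck still registers as a large obstacle at a known distance.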

Even within the computer vision system, we have multiple levels of redundancy that effectively take edge cases and make them normal cases. And then the kind of the camera plus radar is really important. I mean, I live in Utah. Like when you drive through the canyon, you can come up on fog like at any point, right? And so if you have an eyes-off system, you can either try to wake the person up or get them out of their book or whatever. Or you can just have a radar system on the vehicle that isn't impacted by fog. So I think that these kind of multiple levels of redundancy, creating structure within the system, creating a system where every task that our system performs is done in three different ways.

And the way we've set up the system, for it to fail, two out of those three have to fail at the same time, which becomes very unlikely. It definitely adds some cost to the system, but we see it as giving us a much higher probability of getting to the precision levels you need for eyes-off. And that's critical for the OEMs to be able to demonstrate, because they don't want Supervision to be the last iteration.
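The "two out of three have to fail" argument can be made concrete with a short probability calculation. The per-subsystem failure rate below is an invented illustration, and it assumes independent failures, which correlated real-world failures would weaken:

```python
def p_system_failure(p: float) -> float:
    """Probability that at least 2 of 3 redundant subsystems fail on the
    same task, assuming each fails independently with probability p:
    3 * p^2 * (1 - p) covers exactly-two failures, p^3 covers all three."""
    return 3 * p**2 * (1 - p) + p**3

p_single = 1e-3                       # invented per-task failure rate
p_combined = p_system_failure(p_single)
# p_combined ≈ 3.0e-6, roughly 300x rarer than any single subsystem failing
```

Under independence, combined failure probability scales roughly with p squared, which is why stacking modest-quality redundant paths can reach precision levels no single path achieves.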

Moderator

Some have pointed to the significant resources Tesla is using here, like 90,000 or 100,000 NVIDIA GPUs, which is a lot of spend. Discuss the compute you've dedicated to AI and why it's not necessary to use such a heavy compute footprint.

Dan Galves
Chief Communication Officer, Mobileye

So we spend about $100 million a year on training infrastructure. Part of that is off-prem, so you're paying each year for the same resources, but it becomes very leverageable. And half of that is new training compute, mostly NVIDIA GPUs. We've been a resource-constrained company; that's our DNA. So we've always looked to create efficiency in whatever we do. Our view is that we need to first understand how much compute we need and then set up the infrastructure that way. I think recent developments, like DeepSeek, indicate that engineering ingenuity can make you very efficient in terms of training compute, particularly when you know the specific use case you're going after.

So yeah, we just simply don't feel like we're compute-constrained at all.

Moderator

OK. I want to wrap with a question on full AV, driverless applications. Maybe you could just remind us, because I know it's a spectrum of products, where Drive fits. There were some recent media headlines about Lyft's plans to bring fully autonomous vehicles to market in Dallas, Texas, powered by your system, with a fleet owned by Marubeni. Just discuss those efforts. How synergistic are they with the rest of your efforts?

Dan Galves
Chief Communication Officer, Mobileye

Yeah, I mean, I think one of the beauties of our approach is that the core technologies built for an eyes-on, hands-free consumer vehicle are the same ones built to achieve the precision levels required for a consumer product where you're going to allow the driver to fully disengage across a large set of situations, and they're also relevant for an inner-city, fleet-deployed robotaxi type of system. We employ more of a teacher-student model, where the core technologies are being built for everything, and then you have teams adjusting or refining the technology for a specific solution, whether that be Surround ADAS, Supervision, Chauffeur, or Drive. There's a ton of activity right now in robotaxi. I think our plan requires multiple players, right?

So we want to have a set of different types of vehicles that are engineered to integrate our Drive system and that are coming out of normal factories. The VW ID. Buzz is a good example: VW will be producing that vehicle in the normal ID. Buzz factory. It'll be reconfigured for robotaxi, with a different seating configuration, that type of thing. It's a purpose-built robotaxi, but able to be produced in large numbers efficiently. We have a 15-passenger shuttle being produced by a company called Holon. We have a two-seater by a division of Rimac called Verne. And we're looking for one more vehicle producer to do something more like the ID. Buzz, a standard six- or seven-passenger people mover.

You want those vehicle producers because, beyond the efficiency, they also have contacts to sell vehicles. Then you need an owner-operator; in the case of the Lyft announcement, that's Marubeni, a fleet operator from Japan. And then you need a user base, and that's where Lyft comes in, because they have tons of users and they can generate demand. So yeah, we're excited about this engagement. Lyft is very focused. Uber's taking more of a scattershot approach: hey, bring your robotaxis to us, whoever you are. And that's totally fine. Lyft wants to be, it seems, a little more precise and create these network communications between the vehicle and their system. So it feels like something that, once the technology is ready, could scale pretty quickly.

We see ourselves as kind of supporting an operating cost that is significantly less than human-driven vehicles.

Moderator

Great. OK. We'll leave it there. Dan, thanks.

Dan Galves
Chief Communication Officer, Mobileye

Thanks, Dan. Yeah, great questions. Thanks.
