
Goldman Sachs Communacopia + Technology Conference 2024

Sep 11, 2024

Mark Delaney
Analyst, Goldman Sachs

Okay, great, thank you, everybody. My name is Mark Delaney, and I have the pleasure of covering Mobileye for Goldman Sachs. With me today is Dan Galves, the Chief Communications Officer. Thank you very much for joining me.

Dan Galves
CCO, Mobileye

Thanks, Mark. Thanks for having me.

Mark Delaney
Analyst, Goldman Sachs

Yeah, I thought to kick things off, maybe it would help to level-set everyone on the key products and the rough pricing per product. A lot of our discussion, I think, will get into some of the offerings and ASP opportunities. Could you level-set people, please, on the base ADAS product and its rough price, what SuperVision is, what Chauffeur is, et cetera.

Dan Galves
CCO, Mobileye

Yeah. So the core revenue driver of the business today is base ADAS, where we go to market with a single chip with our software on top that supports a front-facing camera system. Typically, we're getting about $49-$50 per chip on average at about a 70% gross margin. Those numbers have been pretty stable for quite a while. The next step up is what we're calling Surround ADAS. This is a system intended to meet the late-decade regulatory drivers that we see developing in the U.S. and Europe, where regulations and standards are really stepping up significantly.

We don't have any design wins for this business yet, but it's a single EyeQ6 High chip supporting six cameras around the vehicle and anywhere from two to five radars. This chip can handle the processing for all of those sensors, and the pricing is going to be in the $175-$200 per chip range, again at a similar 70% gross margin. SuperVision is a significant step up because this is a system that supports point A to point B driving. Basically, put your address into the system, and the car will drive you there with as little human input as possible, but you need to keep your eyes on the road because it's a single-sensor system. It's camera only.

We support this with two EyeQ6 High chips that process all the cameras, integrate the mapping, and include the decision-making software. It's anywhere from $1,200-$1,500 per unit, but this is a little bit different. We're selling the chips on a domain controller, so the gross margin is a little bit lower, in the 40%-45% range, because there's a lot of third-party hardware within that system. The next step up is basically the SuperVision system plus a second perception system based on radars and LiDARs. This is called Chauffeur. It supports eyes-off driving in specific situations, certainly highways at high speed.

You add a third EyeQ6 High chip, and the price goes up to the $2,500-$3,000 range. And then we have what we call Drive, which is a complete self-driving system for robotaxis. This is a very different business because the operator of these robotaxis is generating revenue per mile instead of revenue per car. You're really supporting the elimination of a human driver, which is very expensive, so the system generates a lot of cost savings, and we're selling it for significantly higher prices. I won't get into specifics on that one, but it's in the tens of thousands of dollars.
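The product ladder above lends itself to some quick unit-economics arithmetic. A minimal sketch, using only the price and margin ranges quoted in this conversation; the midpoints are my own simplification, and the Chauffeur margin is an assumption (Dan does not state it, so I assume it resembles SuperVision's):

```python
# Back-of-the-envelope unit economics for the product ladder described above.
# Prices and gross margins are the ranges quoted in the transcript; midpoints
# and the Chauffeur margin are illustrative assumptions, not company figures.

products = {
    # name: (low ASP $, high ASP $, gross margin)
    "Base ADAS":     (49,    50,   0.70),
    "Surround ADAS": (175,   200,  0.70),
    "SuperVision":   (1200,  1500, 0.425),  # midpoint of the 40%-45% range
    "Chauffeur":     (2500,  3000, 0.425),  # assumption: similar to SuperVision
}

for name, (lo, hi, gm) in products.items():
    mid = (lo + hi) / 2
    print(f"{name:13s} ASP ~${mid:,.0f}  gross profit/unit ~${mid * gm:,.0f}")
```

The jump in gross profit per unit from base ADAS to SuperVision, even at the lower domain-controller margin, is the core of the ASP-expansion story discussed here.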

Mark Delaney
Analyst, Goldman Sachs

Very helpful. Thanks. Now, in terms of developing your product set, AI is one of the key technologies needed to develop the algorithms for partially and fully autonomous driving. Mobileye has published some work and blog posts laying out your views on the best way to develop this; you call it the compound AI systems approach. Maybe talk about what you're doing with AI development, and how that is similar to or different from an end-to-end approach.

Dan Galves
CCO, Mobileye

Yeah. This is a complicated question, and one that requires a lot of education. There's a lot of noise around it, and, as usual, it's more complicated than people make it out to be. The first thing I would say is that this idea of end-to-end, pure AI, data in, controls out, versus a rules-based system is a false dichotomy. I think no one is really doing either.

Like everything, the truth is in the middle. What compound AI systems means is essentially that you incorporate transformer-based, end-to-end AI architectures inside of an engineered system, and you may have other systems within that which create efficiency or more accuracy within the overall network. If you look at the research and the latest talks from OpenAI and Google and other researchers, large language models have moved beyond pure end-to-end AI and toward compound AI. And I think the reason is that if everyone uses the same approach, and everyone has access to the same data, like the internet, you're not going to be able to create any differentiation.

It's really the engineering that creates the differentiation. And it's the same for automotive. Automotive is even more challenging because it's an area where safety is paramount. You're not going to die from making the wrong prompt in ChatGPT, but you could easily die if a car makes a mistake. So I think this calls for a significant amount of engineering and fallback systems within the overall system, and that's really our approach. I'll give you an example. We very much believe in the value of new architectural approaches like transformers.

For our EyeQ6-based system, the core vehicle detection and many of the other detection and perception engines are based on transformer-based, end-to-end approaches. It's very helpful because in a convolutional neural network approach, you're actually analyzing each different object in the image separately, and then you have to stitch them together and eliminate overlap. There are a lot of steps to it, and it's not as accurate. In a transformer-based system, you're analyzing the entire image as one, which creates a much more efficient approach. But if the system encounters an object that's not in the dataset, it could easily just ignore it and run into it.

Thankfully, for us, we have legacy techniques based on geometry that essentially say, "If an object has size and mass, I don't care what it is, I want to avoid it." Having those two different approaches within the vehicle detection part of the computer vision system creates redundancy, because the different engines fail in different ways. We see that as the most efficient way to get to the high performance requirements these systems need.
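The "different engines fail in different ways" idea can be sketched as a simple fusion rule: keep everything the learned detector finds, and add any geometric obstacle the learned detector missed. This is a toy illustration under my own assumptions; `detect_learned` and `detect_geometric` are hypothetical stand-ins, not Mobileye's actual engines:

```python
# Toy sketch of fusing a learned (transformer-style) detector with a
# geometry-based fallback. Both detectors are hypothetical stand-ins.

def detect_learned(frame):
    """Stand-in for a learned detector: returns labeled boxes, but may
    miss object classes absent from its training data."""
    return [{"label": "car", "box": (10, 10, 50, 40)}]

def detect_geometric(frame):
    """Stand-in for the geometry-based engine: anything with size and mass
    becomes an 'obstacle', even with no class label for it."""
    return [{"label": "obstacle", "box": (10, 10, 50, 40)},
            {"label": "obstacle", "box": (80, 20, 120, 60)}]

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

def fuse(frame, iou_thresh=0.5):
    """Union the two engines: keep every learned detection, and add any
    geometric obstacle that no learned detection overlaps."""
    detections = list(detect_learned(frame))
    for obs in detect_geometric(frame):
        if all(iou(obs["box"], d["box"]) < iou_thresh for d in detections):
            detections.append(obs)
    return detections

print(fuse(None))  # the second, unlabeled obstacle survives via the fallback
```

The point of the sketch: the unlabeled obstacle that the learned detector never saw in training still reaches the planner, because the geometric engine fails in a different way.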

Mark Delaney
Analyst, Goldman Sachs

Maybe we can talk then around where you stand on some of the feature capabilities. SuperVision is your hands-off but eyes-on product.

Dan Galves
CCO, Mobileye

Mm-hmm.

Mark Delaney
Analyst, Goldman Sachs

So highway driving, but driver needs to pay attention at all times.

Dan Galves
CCO, Mobileye

Mm-hmm.

Mark Delaney
Analyst, Goldman Sachs

That works on the highway; you already ship it. You've hoped to roll that out to work on city streets. Where does that stand?

Dan Galves
CCO, Mobileye

Yeah, good question. China is where the SuperVision system is in production on the road, in automaker vehicles with Zeekr and a couple of other brands. We enabled the software a little more than a year ago in two different cities for highway only, and now it's operating in 160 different cities, still highway only. For urban, our view is that high-definition mapping is extremely important to create certainty around different intersections: where the crosswalks are, which traffic light is relevant to which direction, yield patterns, yield rules. All of this can be included in a high-definition map and would take a lot of the pressure off the perception system.

Our competitors, which are primarily self-developing auto OEMs that have done their own software, are also only on highway and have not been able to operate in urban settings. We attribute this to a lack of mapping, and high-definition mapping in China is very expensive because it's done manually with LiDAR-equipped vehicles. We see our crowdsourced mapping, called REM, as the trigger to create that certainty in urban areas, but we have some challenges in China with a lack of vehicles harvesting REM. So the bottom-line answer is we have not delivered the urban software yet in China.

The collaboration agreement that we announced with Zeekr last quarter is intended to work toward unlocking those roadblocks. There are very difficult data restrictions in China, and a lack of vehicles, so the intent of this collaboration is to progress toward a more robust REM map. What I need to say, though, is that in the U.S. and Europe we already have maps covering more than 90% of roads. So when we launch SuperVision systems in the U.S. and Europe, we see a much faster ramp-up, including urban streets as well.

Mark Delaney
Analyst, Goldman Sachs

Okay, and then within China, though, more work to do and no timeframe that you can share as to when you think that'll be done?

Dan Galves
CCO, Mobileye

The target is by the end of the year. So we're hoping to get to that.

Mark Delaney
Analyst, Goldman Sachs

Okay. And then the next level up is Chauffeur, which would be eyes off, and so on a highway, maybe the driver can look at their phone and also have their hands off the wheel. When do you think you'll be able to ship that type of a product?

Dan Galves
CCO, Mobileye

The main Chauffeur activity is with Audi, where the target launch date is late 2026. I think Audi is the perfect partner to have because we'll collaborate on this with their very robust validation processes. They'll rely on the other work within the VW Group for SuperVision, because it's really a scalable compute platform, so we can focus on the radar-LiDAR perception system as well as validating the system. We still feel like we're on track for a late 2026 launch of Chauffeur in Europe, and likely the U.S. as well. That launch will be able to take advantage of the fact that the REM maps are already available in those areas, so it should be a pretty fast ramp-up phase.

Mark Delaney
Analyst, Goldman Sachs

So within the VW Group, the orders you have are for vehicles to be sold not just in Europe, but also in the U.S., if I heard you correctly?

Dan Galves
CCO, Mobileye

For sure. And there's a plan for China as well. They're intended to be global vehicles, sold globally. That's another reason why the collaborations in China are very important, because we want to have the landscape in place to support those Western customers in China, as well as Chinese OEMs.

Mark Delaney
Analyst, Goldman Sachs

I think your VW Group win is a lot broader than just the Audi Chauffeur agreement you alluded to. You have SuperVision as well. As I think about the U.S. market, maybe help me better understand when we may see vehicles with SuperVision and with Chauffeur on the roads here in the U.S.

Dan Galves
CCO, Mobileye

The Polestar 4 should be the first SuperVision-equipped vehicle to be sold in the U.S. They are not delivering cars yet because the car is produced in China, so it's subject to a 100% tariff right now. The intent by Polestar, which has always been the intent, is to produce the car in Korea starting next year. They'd be able to get around the tariffs that way. So I would say by the middle of next year, you should be able to drive a SuperVision-equipped car. And then the Porsche and Audi, VW Group, vehicles start to launch in the first half of 2026 with SuperVision. It should be pretty concurrent between Europe and the U.S.

Mark Delaney
Analyst, Goldman Sachs

Okay, and as we think about that mid-next-year Polestar 4 timing, do you think the SuperVision software and mapping are ready to actually allow for the features?

Dan Galves
CCO, Mobileye

I think it's always a combination of us and the automaker, right? I don't think you're going to see many vehicles ever launch with the software for the full suite of capabilities available all at once. So I would expect that Polestar has a bit of a validation process after the vehicle launches, but I don't think it should take too much time.

Mark Delaney
Analyst, Goldman Sachs

Okay. Robotaxis have been very topical. We're in San Francisco, lots of Waymos. Tesla's got its robotaxi event planned for October. On your last earnings call, you talked about some incremental interest in Mobileye Drive, which is the Mobileye robotaxi solution. So maybe talk a bit more about the interest Mobileye is seeing in the Drive platform, and any sense on timing of when we could see Mobileye Drive-equipped robotaxis on the road somewhere?

Dan Galves
CCO, Mobileye

Yeah, robotaxis have been a crazy rollercoaster for the last 10 years. Just as the negative Cruise news peaked, and I think that's died down a lot, Waymo has made a ton of progress with adding cities and removing drivers. I think it's very encouraging for the whole industry to see that. We've experienced the same or a similar level of renewed interest. Our plan basically is to put together vehicle platforms that are purpose-built for that business: shuttles that have multiple seats and are configured a bit differently, engineered to integrate our self-driving system.

So, pre-engineered for where all the sensors are going to go, where the compute is going to go, with integrated APIs that can be used by whoever the network operator is, and visualizations for the customers. There are so many different aspects to this business that need to be in place. You really need a vehicle development partner to work alongside you. And so we have Volkswagen Commercial Vehicles. We have a company called Holon, which is a division of Benteler, that just raised a significant amount of capital for this specific business a few months ago. Those are our two main ones. Then we also have Verne, which is a division of Rimac, that's creating more of a smaller shuttle.

And so we have a good set of vehicle partners that have been working with us for the last few years to integrate these systems. We're the provider of the self-driving system, and then you need to connect the demand side. The demand side is probably best served by a transportation network company like Uber or Lyft, but also by public transit operators. We have an engagement with Deutsche Bahn, supported by specific German states, that's looking to replace fixed-route buses with on-demand autonomous shuttles. We have a relationship with Ruter, the public transit operator of Norway, which is very eager to reduce congestion in Oslo and other cities in Norway.

And there's Beep, another public transit operator in the U.S. So we're creating this ecosystem. What I would say in terms of progress is that the Deutsche Bahn testing activities, the Holon-Ruter testing activities, and the VW Commercial Vehicles testing activities, which are in Austin and Germany, are all targeted for closed-group testing starting by the end of this year. That would be essentially a group of people who have signed up to be users of this service. We think Waymo is the leader here. It's a different technology approach; we don't see it necessarily as scalable, but we're very impressed by their performance, and we feel like we're right there.

Mark Delaney
Analyst, Goldman Sachs

Okay, that's helpful. One of the sensor technologies on Mobileye's roadmap for Level 3 and robotaxi vehicles is LiDAR. You had some news out recently around a change in your LiDAR development program. You'd been working on a next-gen LiDAR in-house, and you've made a change going forward. Maybe talk a bit more about what you plan to do in terms of sensors, and then the financial implications.

Dan Galves
CCO, Mobileye

Yes. A few years ago, we started developing two different sensors internally. One is an imaging radar, and one is an FMCW LiDAR, which is essentially a next-generation type of LiDAR compared to the current technology, which is called time-of-flight. The imaging radar is very strategic to us because we see the potential for this sensor to create LiDAR-like output: basically as much resolution and as much range as LiDAR provides. And it's really all about cost. If we can reduce the number of LiDARs needed on a robotaxi or an eyes-off vehicle by using imaging radar, which is significantly lower cost than LiDAR, then that creates a more scalable, lower-cost system in general. The imaging radar is on track.

It's hitting all the performance targets, and it launches into production in the second half of next year. The FMCW LiDAR was a little bit of a hedge. What if the imaging radar doesn't really turn out? What if our technology roadmap requires a more sophisticated LiDAR than time-of-flight? We only saw one or two groups developing FMCW LiDAR, so what if we don't have access to that? So it was a bit of a hedge. We're now at the point where R&D on the LiDAR would need to start moving to the production phase, which leads to a ramp-up of expenses.

Because of significant progress on the computer vision side, significant progress on the imaging radar side, and also very steep price declines on time-of-flight LiDAR, we just don't see a scenario where we need this anymore, and so we didn't see the value of continuing to invest. But we still plan to use LiDAR on the Chauffeur and Drive systems for sure; we just plan to use third-party LiDAR, which we were already planning to use for the first generation.

Mark Delaney
Analyst, Goldman Sachs

I think there were some expense savings in the press release you put out.

Dan Galves
CCO, Mobileye

Yeah. Our adjusted operating expenses, which exclude stock-based comp, are around $940 million this year; that's what's incorporated into our guidance. $55 million of that is related to the LiDAR activity, so that's a cost that we will not need to repeat next year.
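The figures quoted here imply a simple spending baseline for next year. A back-of-the-envelope check; the subtraction is my arithmetic on the stated numbers, not company guidance:

```python
# Expense figures quoted above, as simple arithmetic: removing the LiDAR
# program from this year's adjusted opex implies next year's baseline.
adj_opex = 940   # $M adjusted operating expenses guided for this year
lidar    = 55    # $M of that tied to the internal FMCW LiDAR program
baseline = adj_opex - lidar
print(f"Implied ex-LiDAR opex baseline: ${baseline}M")  # prints $885M
```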

Mark Delaney
Analyst, Goldman Sachs

Okay. I wanted to talk about the competitive environment, including potential future design wins. On the last earnings call, the company said that for its advanced products, SuperVision and Chauffeur, you've either got wins already or you're in advanced discussions with fourteen global OEMs, and those fourteen OEMs represent over half of global auto production. Anything you can share around how those engagements are going and what it might take to get new wins for advanced products?

Dan Galves
CCO, Mobileye

We're continuing to make progress, right? There's been no change in the direction, the engagement, or the timeline of what we expected as of a month ago, when we talked about that, and really no change for the last four or five months. The expectation is that four or five of these new SuperVision engagements are targeted for decisions by the end of the year. What are the chances that all of them come to decision with no delays? Probably fairly low. But with that many opportunities, there's a good likelihood that we get decisions on some of them.

There are also four RFQs for Surround ADAS, some of which have a chance to be announced in the near term. So from a business development perspective, we continue to see more opportunities than we ever have in the past, and they build on each other. The Volkswagen announcement, for example, led to reach-outs from some of their competitors.

There are a couple of these new deals with very influential mass-market OEMs that would lead to greater urgency among others in that group. So we feel really good about the opportunities in front of us. We don't have full control over when they're announced or completed, but we're not seeing any roadblocks right now.

Mark Delaney
Analyst, Goldman Sachs

Okay. And just to make sure I understood those comments properly: the four to five awards slated to be decided in the second half of the year, that's specifically around SuperVision and Chauffeur?

Dan Galves
CCO, Mobileye

Correct.

Mark Delaney
Analyst, Goldman Sachs

But some of those may slip, because that's somewhat out of your control when-

Dan Galves
CCO, Mobileye

Yeah, and that's what we've always thought. But I think investors are rightfully looking for that next proof point on the advanced products. There's consistently a lot of noise in the automotive market, whether it's production headwinds or lower EV penetration that means less investment required. So there's a lot of noise around what the automakers are going to do. But we continue to see the same level of eagerness, progress, and engagement, and these are not the kind of engagements where you're just having a check-in call once every couple of weeks.

There's real demo work, validation work, physical testing of the products, roles and responsibilities, legal, commercial, and technical agreements. So there's a lot of work to do, but we feel like we're on track.

Mark Delaney
Analyst, Goldman Sachs

Okay. DXP has been part of your offering and central to the value proposition for OEMs that may want to use Mobileye for these advanced products. Maybe talk about how DXP is being received.

Dan Galves
CCO, Mobileye

Yeah. At a higher level, the one thing our automaker customers are unwilling to compromise on is having some level of control over the driving experience of the vehicle, right? They don't want a turnkey system that's the same as everyone else's, with no chance for them to differentiate. Early on, that led them to feel like they needed to control the software of the entire system.

But we've been arguing, or advocating, that there's a middle ground, and DXP is our way to find and enable that middle ground by creating an API where the automaker can take core technologies like computer vision perception and mapping, which the consumer is never going to see (they just want to know that it works), but then fine-tune things that are visible to the consumer. How aggressively do you cut in front of cars on the highway or in a roundabout? What's the braking profile in front of a stop sign? Do you stop right at the stop sign, or a little bit ahead of it?

All of these are things the automakers want to control, and DXP has found the sweet spot of how to do that without putting too much responsibility onto the automakers' developers, which creates risk.
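The split Dan describes, fixed core perception with OEM-tunable driving style, can be illustrated with a small configuration sketch. To be clear, these parameter names are invented for illustration; this is not Mobileye's DXP API, only the shape of the idea:

```python
# Hypothetical illustration of the DXP concept: core perception stays
# fixed, while driver-visible behavior knobs are exposed for OEM tuning.
# All parameter names here are invented, not Mobileye's actual API.

from dataclasses import dataclass

@dataclass
class DrivingProfile:
    """OEM-tunable driving-style parameters (hypothetical)."""
    lane_change_assertiveness: float = 0.5  # 0 = timid, 1 = aggressive
    stop_sign_offset_m: float = 0.0         # stop at the line vs short of it
    braking_comfort: float = 0.5            # 0 = sharp, 1 = very gradual

# Two OEMs tune the same underlying system to different brand characters,
# without touching perception or mapping.
sporty  = DrivingProfile(lane_change_assertiveness=0.8, braking_comfort=0.3)
relaxed = DrivingProfile(lane_change_assertiveness=0.2, stop_sign_offset_m=0.5,
                         braking_comfort=0.8)

print(sporty)
print(relaxed)
```

The design point is that the tunable surface is narrow and validated, so the OEM gets brand differentiation without owning the safety-critical stack.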

Mark Delaney
Analyst, Goldman Sachs

If they're going through the work of doing some coding themselves and using DXP, why not just go all in-house?

Dan Galves
CCO, Mobileye

Yeah, because it's too hard and too expensive. The last thing the OEMs want right now is to invest in a program that doesn't have a high probability of success. They've already spent time trying to do in-house development; they've been down that road, and it's very expensive. XPeng just disclosed yesterday that they spend $500 million a year on ADAS software engineering. So I think they're past the point of wanting to do everything themselves, but they're looking for ways to tweak the driving experience, and also for ways to make sure that if there is an issue, they can identify the root cause.

This gets back to the compound AI discussion. Pure end-to-end, data in, controls out, is really a black box. If there is a problem, your solution is to try to identify what was happening in the environment, generate more data, whether synthetic or real-world, retrain the network, and hope it fixes it. That's not going to satisfy regulators, and it's not going to be a good answer for an automaker's customer. So that's the other thing that is a deal breaker if you can't provide it.

Mark Delaney
Analyst, Goldman Sachs

Okay. Mobileye is assuming in guidance that SuperVision becomes optional rather than standard fit in the future for Zeekr. Can you give us a sense of how much of your SuperVision shipments over the last few years were to Zeekr, and how impactful this change to optional will be on your shipments going forward?

Dan Galves
CCO, Mobileye

I think it's definitely impactful. Just to level-set: the Zeekr 001 has been the main driver of SuperVision volume, and accounted for around 45,000 to 50,000 units in the first half, out of the 70,000 that we shipped. Every Zeekr 001 produced had the SuperVision system. Zeekr, for the last five years, has been developing their own internal system that's similar to SuperVision, and they have it on a car called the Zeekr 007 that they've been producing for the last year and a half or so. They made the decision in August to split the Zeekr 001 volume between their own system and SuperVision.

Unfortunately for us, we're going to have the minority of that volume, because SuperVision will only be on the lowest-priced version of the car, where it fits because it's significantly lower cost than their system. We're estimating that it'll be about 20%-30% of the going-forward volume. That was incorporated into our second-half guidance as of the Q2 earnings call, so this is not anything that's going to drive a change to volume, but it does create a lower run rate of volume to start next year. From the Zeekr perspective, there's a lot of investment they're making into this software. They're proud of it.

There are marketing positives in China to saying that you're in control of the software, you're using NVIDIA processors, and you have LiDARs on your car. So there were a lot of reasons for them to do it, not to mention that they really need to spread the R&D across more volume, since the Zeekr 007 was not generating that much volume. So it's something that we're fine with, obviously. But we also see opportunities to continue to collaborate with Zeekr, like I was talking about earlier, in productizing REM in China. That could be a big differentiator for Mobileye.

But I think we've kind of realized that we do need help from a partner. We've used partners in the past, but not very sophisticated ones. So I think this could be helpful, not just to generate more business with Zeekr, but to generate more business with other Chinese automakers.
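The volume shift described above can be sanity-checked with the figures quoted: the unit ranges come straight from the answer, and the percentages are just arithmetic on them:

```python
# Quick check of the Zeekr volume figures quoted above; the H1 share and the
# implied step-down are just arithmetic on the stated ranges.

h1_zeekr_001 = (45_000, 50_000)   # SuperVision units on the Zeekr 001 in H1
h1_total     = 70_000             # total SuperVision units shipped in H1
fwd_share    = (0.20, 0.30)       # SuperVision's share of 001 volume going forward

lo_share = h1_zeekr_001[0] / h1_total   # roughly 64%
hi_share = h1_zeekr_001[1] / h1_total   # roughly 71%
print(f"Zeekr 001 was {lo_share:.0%}-{hi_share:.0%} of H1 SuperVision volume")
print(f"Going forward, SuperVision keeps {fwd_share[0]:.0%}-{fwd_share[1]:.0%} of 001 volume")
```

So a model line that was roughly two-thirds of SuperVision volume steps down to a 20%-30% take rate, which is the lower run rate into next year that Dan flags.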

Mark Delaney
Analyst, Goldman Sachs

This year, Mobileye expects to ship advanced products for about 120,000 vehicles, incorporating some of the factors we just discussed. Any directional views you can help us with as we think about advanced product shipments in 2025 and 2026?

Dan Galves
CCO, Mobileye

I think it's too early to start talking about specific numbers. What we have is likely incremental volume from the Polestar 4 next year, and likely incremental volume from FAW next year. Then the big change to the business happens when the Western OEMs start to launch, like Porsche and Audi, in 2026. And the RFQs that we're pursuing with other SuperVision customers are targeted for 2027 launches and could be very meaningful; any one of them could make today's SuperVision business look extremely small.

Mark Delaney
Analyst, Goldman Sachs

Great! Well, Dan, really appreciate you coming in again this year. Appreciate all your time, and thanks so much.

Dan Galves
CCO, Mobileye

Thanks, Mark. Really appreciate it.
