Okay, great. Thank you everybody very much for joining. My name is Mark Delaney, and I cover Mobileye for Goldman Sachs. I'm very pleased to be hosting Dan Galves, the Chief Communications Officer of the company, today. Thanks for joining us.
Thanks, Mark. Appreciate it.
A lot for us to get into over the next 35 minutes, but to start, I thought perhaps you could cover Mobileye's key offerings and maybe talk around how the tech builds up, you know, starting from that base ADAS product and how that goes into SuperVision, Chauffeur, eventually Mobileye Drive. And I think the ability to have some scalability of the technology is pretty interesting, especially with what it might mean for your ASP. So can you talk a little bit more around how the product set plays out? That'd be helpful to start.
Yeah, sure. So our kinda core historical product is the software and chip that processes data from a front-facing camera that helps the car avoid dangerous situations. So if you're about to, you know, get into a rear-end collision, the car can identify that the situation is happening and send a signal to the brakes to brake the vehicle. And that's really kind of the core of our business. It's 90% of revenue today. You know, five or six years ago, we made the decision that we should pursue, you know, higher levels of self-driving, but we wanted to do it through kind of a building block approach, right? And so, you know, if you think of the target as cars that can drive themselves, you know, you can think of two different markets potentially developing.
Like, you could think of the robotaxi market, where you have completely driverless cars that are intended to replace human drivers and operate within a fleet, trying to operate many hours a day to drive high utilization. So that's one approach. And then you could also think of the potential for regular consumers to buy cars where, in certain situations, or maybe in all situations, you could disengage, do other things, and the car would take over the driving functions for you. So our desire was to create a technology that would work for both of those markets. That means it can't be, you know, $50,000, $60,000, $70,000 for the system. That might work for robotaxis, but it's not gonna work for consumer vehicles. It has to operate everywhere. It can't just operate in San Francisco.
You know, that would work for robotaxis, but not for consumer vehicles. So we've kinda tailored our approach to that. And the building blocks... Really the first step is to take the single front-facing camera and create a 360-degree computer vision system, so data from 11 cameras around the vehicle. You're not just monitoring the front, you're monitoring the entire scene around the vehicle. You know, we've done that. Then we believe that high-definition mapping is very important. If you think about us as human drivers, we're much better drivers in our hometown than if you dropped us into France or someplace like that, 'cause we're much more familiar with the signs, we understand the traffic patterns, the culture.
We have this kinda great position with front-facing cameras in vehicles, about 70% market share, which means we have a lot of front-facing cameras on the road. We're extracting data from these vehicles that we can turn into a high-definition map in the cloud. So with those two things, you have a really good model of the environment. Then you have to make decisions, so we created decision-making software. And then kind of the last step, I would say, is to get to the failure rates that would support better-than-human-level performance. And we feel like you can't really do that with just one perception system, so we add a second, independent perception system based on radars and lidars. So this is the approach.
The benefit is, you know, if you kind of set aside the radar-lidar perception system, you can create a complete hands-free driving system where the driver needs to monitor the environment, right? So the driver needs to stay eyes-on, but the car can take over the driving functions for you. We call that SuperVision. So our approach to autonomy gives us the ability to commercialize, along the way, systems that are not full self-driving. The SuperVision system, which is in production today, generates maybe, you know, $1,300-$1,500 per unit, compared to $50 per unit for our base system. For the fully autonomous systems, the selling price can go into the thousands of dollars for us.
That's a really interesting scalable aspect of your business. You're in discussions with a lot of different customers and potential customers around that. And maybe starting with the six OEMs that are currently signed up for SuperVision or Chauffeur, a couple of which are Porsche and Polestar, with recent announcements. Can you talk about the cadence and the ramp for SuperVision and Chauffeur with those six OEMs?
Yeah. So we started this year with one car with SuperVision, the Zeekr 001, which is an all-electric vehicle. It's sold in China, and they're starting to export to Europe, you know, in the next couple of months. We did about 90,000 units in 2022, so this was a car that generated good brand recognition, good volumes. But we started with one car. Starting in Q1 of 2024, we'll have five cars with SuperVision; that includes two Zeekrs, a Polestar 4, a vehicle from smart, and a vehicle from a fourth brand. So, you know, starting next year, we'll have five vehicles in market. We have a design win with Porsche that launches in the middle of 2025.
We also have a design win with VinFast, a small-volume program, you know, that would launch in 2024. So these are really kind of the production contracts that we have for SuperVision today.
And of course, you, you hope to build upon that. You said on your last earnings call, you're engaged with nine brands for potential new wins with these advanced products. What do you think needs to happen in order to be successful, and how long do you think it may take to know if Mobileye is selected?
Yeah. So I think there's a lot of competitive dynamics happening within the automaker community now that creates a lot of pressure to make decisions on hands-free driving. Clearly, Tesla has a system on the road that is, we think, very good. We have a lot of respect for what Tesla has done. You know, so there's competitive pressure from there. This kind of hands-free driving functionality has become very big in China. So in addition to our system with Zeekr, there are four or five other systems in China that I would consider good. The non-Chinese automakers understand that these brands are moving into Europe in the near term. So I think that there's a sense of urgency that's been created in the industry. We have also, I think, gone from...
You know, there's a big difference between having a system in production on the road and... I mean, we were much more than a PowerPoint, right? We could bring people to Israel and do demos and kind of show them the performance of the system, but actually getting it on the road, having consumers experiencing this, having positive feedback, it's really a huge proof point. We now have our maps covering more than 90% of North America and Europe, so we can essentially bring these cars anywhere in North America and Europe. If the automaker wants us to show up at a mystery location so they know that we can't do anything to train the system, we can do that. We've done 3,000-4,000 kilometer demos for customers. So I think it's all become much more real, and really, if you want to compete in, you know, the 2025, 2026 time frame, you really need to start programs now. So we feel great about these nine OEMs that have essentially told us, "We've chosen your technology." You know, now it's really about completing the contracts.
You brought up Tesla. On the one hand, the things Tesla is working on are perhaps a catalyst for other OEMs to consider Mobileye and these sorts of products. On the other hand, Tesla aspires to be a competitor, and they've said they're open to licensing their FSD technology to other OEMs. Have you seen that impact your discussions at all with OEMs, and are you seeing them as a competitor in the marketplace?
Yeah, we haven't seen any impact from that. I mean, it's not the first time that Tesla has said that they'd be willing to license or sell their self-driving technology to other automakers. It's funny, because this sense of urgency that's been created in the industry, we feel, is more related to Chinese automakers than Tesla. And, you know, maybe this is the fatal flaw of the world's automakers, underestimating Tesla, 'cause I think what they've done is very impressive. But I think the view of the automakers is, number one, our system has better performance, is more confidence-inspiring, fewer interventions.
I think number two is the scalability of our system: we have high-definition maps in place across most of the world, and we have a plan to scale the eyes-on system to eyes-off just by adding a second perception system. That's very appealing to the automakers. I mean, I think this is kind of common sense, right? A system that the consumer would have to trust, obviously, but one that would allow the consumer to do other things while they're driving can create significant value streams versus one where you have to monitor the system. So I think the automakers see a path with us to that eyes-off technology.
So yeah, we haven't really seen any kind of impact on that front.
Maybe we can stick on the market share dynamic. Some of the OEMs have spoken of wanting to do more custom solutions themselves as they consider bringing out eyes-off and hands-free types of ADAS capabilities, and they've talked about potentially working with Qualcomm or NVIDIA. One of those was VW, and you now have a win with the VW group for SuperVision and Chauffeur. But there's a number of other OEMs that are also, you know, considering going down that route. So do you think there's an opportunity to perhaps have some wins with SuperVision and Chauffeur with some of these other larger OEMs that have said they may consider working with Qualcomm or NVIDIA?
Yeah, for sure. So I think that... This is a big question, right? 'Cause I think in a lot of ways, our main competition in this field is our own customers. It's the internal efforts of the automakers that are using high-power processors from semi companies like Qualcomm or NVIDIA. And this has been going on for four or five years. I mean, NVIDIA started talking about their desire to penetrate into this market back in 2015, and they've been working on it. I think that the results, you know, have not been that great. If you think about it, the decisions here are gonna be made based on cost and performance, right?
Because we have software and hardware engineers under the same roof, our chips are architected, you know, with full knowledge of what type of workloads are gonna be running on these chips. So our goal is to have as little processing power as is necessary in the system, because processing power means cost and it means power consumption, both of which are bad things to automakers. What we've seen with these experimentations and internal efforts is that you end up with too much processing power, a lot of it unutilized, too much cost, and the performance has been de-scoped versus kind of what the original intent was. Now, one of the reasons why the automakers wanted to go down this road is because they do wanna customize their systems.
And I think the view was, "Hey, if we go with Mobileye, and 10 other automakers go with Mobileye, then we'll have the exact same system as everyone else." We anticipated that happening. With our single front-facing camera business, it's okay to be a black box, 'cause it's basically like you either see a car in front of you or you don't. You either tell the car to brake in time or you don't. There doesn't need to be any customization. But we knew that when you start offering systems that take over the driving, you know, Porsche would not want the same driving experience, the same feel, as Ford. They would each want it to be aligned with their own brands.
You know, a couple of things that we did: when we were acquired by Intel, one of the big positives was that we knew Intel were experts in creating open compute. So we leveraged resources from Intel, you know, back in 2018, and now have something called EyeQ Kit, where there's tools and libraries and SDKs and kind of open pieces of the chip that automakers can deploy their own code onto. We also created a system where you essentially have, like, 500-600 knobs where you can tune the driving experience. If you want more lane changes and overtakes, a more aggressive feel to the drive, if that's what you want for your brand, all you need to do is really tune the system to do that.
You can adjust even after the system is in production. So we feel like our system is very, very customizable. And I think, you know, between us getting SuperVision onto the road in a production form, us marketing and kind of evangelizing the openness of our system, and the poor results of the internal efforts, there's been sort of a turning of the tide over the last year or so.
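To make the "knobs" idea concrete, here is a purely hypothetical sketch of what per-brand driving-policy tuning could look like. The parameter names and values are invented for illustration; they are not Mobileye's actual API or knob set.

```python
# Hypothetical sketch of brand-level driving-policy tuning profiles.
# All parameter names and values are invented, not Mobileye's real knobs.
sporty = {
    "lane_change_assertiveness": 0.8,  # 0 = never overtake, 1 = very eager
    "following_gap_s": 1.2,            # time gap to the lead car, in seconds
    "max_lateral_accel_mps2": 3.0,     # comfort limit when cornering
}
relaxed = {
    "lane_change_assertiveness": 0.3,
    "following_gap_s": 2.0,
    "max_lateral_accel_mps2": 1.5,
}

def more_aggressive(a: dict, b: dict) -> bool:
    """Crude comparison: does profile a overtake more eagerly than b?"""
    return a["lane_change_assertiveness"] > b["lane_change_assertiveness"]

print(more_aggressive(sporty, relaxed))  # True
```

The point of a scheme like this is that one underlying driving stack can serve many brands: each automaker ships its own parameter set rather than its own software fork, and the values can be retuned over the air after launch.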
So a lot of interesting work underway there. Sticking on market share, can we speak specifically on the China market and what competition you may be seeing from Chinese semiconductor and technology companies, even if it's just around base ADAS functionality? And I think, you know, referring again to VW, right, they, as one example, talked about trying to work with Horizon Robotics. And, you know, not trying to get into that-
Yeah
... customer in particular, but, you know, just talk a little bit around the competitive landscape in China. And how much of a risk is there, with some of the geopolitical tensions, for your business in China?
So kind of on the driving assist, sort of the base ADAS business, we were a little late to enter China. We really started kind of the business in 2014, 2015, whereas the rest of the world was more like 2007, 2008. Bosch had most of that market at that point with radar systems, you know. So, at the time that we were acquired by Intel, we had maybe 25%-30% market share in China; now we have over 60% market share in China. So we've done really well on kind of the single front-facing camera, ADAS business.
And, you know, it's become a big chunk of our business, maybe 25% of revenue, something like that, which is very balanced relative to auto production. And we don't see any new competitive threats in that part of the business. Now, when you look at the hands-free driving segment, China is really where the action is. This is where the market is moving most quickly, this is where the media is most interested in these technologies and promoting them, and this is where, you know, systems are going on the road pretty quickly. So you have, you know, what we call eyes-on, hands-free, right?
You have systems from Li Auto and NIO and XPeng, and now Zeekr, and maybe there's one or two others. And the systems, you know, besides Zeekr's, are mostly internally developed, either using Huawei processors or NVIDIA, and they're pretty good. We think the best one is XPeng's, but when we look at the configuration of that system, you know, there's multiple lidars. There's 1,000 TOPS of processing power versus 48 for the Zeekr system. The system is working in two cities today; these are the cities that XPeng mapped. You know, we were told it was very expensive to manually map these cities, so maybe they've stopped doing that. But the point is not to kinda compare and contrast.
The point is to say that you've got some very good systems in China, but we feel like ours is better. The feedback since we've rolled out the full highway SuperVision software in China, you know, based on kind of media benchmarks and things like that, is that we have a better system. But I think China is a great market. It's very challenging for us, but the fact that it's moving so fast, we think, is creating this sense of urgency that, you know, is helping us globally.
More on the China market and the capabilities of your product. I think there's some news out just late last week around the Zeekr SuperVision system and some of the things it's able to do now. So maybe you could talk a little bit on what you enabled with an over-the-air software update.
Yeah. So the system in Zeekr cars has been, like, Level 2+ highway assist for the last, you know, six, seven, eight months. What that means is, it's kind of adaptive cruise control on steroids. The car can slow down and speed up. It can keep you in the center of the lane, but it's not really driving for you. What we launched, after a long validation period and after a beta group of more than 1,000 users had the system for more than a month, is the full SuperVision capability on highway, to all, you know, 100,000+ Zeekr owners. And the feedback has been great so far.
I think, you know, it's not that hard to create a system that can, in light traffic, keep the right speed, maybe make a couple of lane changes. But you can go onto the web and see videos of our system, you know, in really bad weather, recognizing construction cones that just went up and merging left where human drivers are getting confused, in front of complex intersections, again in construction areas, and our system kind of, you know, breezes through without a problem. We expected it to perform really well, but we're very pleased with how well it's performing.
The customers are super pleased, and in a lot of ways, until you actually get it on the road in tens of thousands of units, you don't really know. You know, I think the perception in China was that the Zeekr system, maybe it can get to be as good as the others, but how could it out-compete them with only 48 TOPS and no lidar? And I think the biggest takeaway from the media has been: Wow, this looks like maybe a leap ahead, and they're doing it, you know, with two hands tied behind their back in terms of cost.
In terms of what the SuperVision system can enable, you talked about hands-free highway driving, but what about within city streets and in certain, you know, cities? Where do you stand on enabling that?
Yeah. So the big constraint in China is mapping, because in North America and Europe, we just extract data from... There's about 2 million cars sending data. It goes onto our cloud, we have full access to it, so it's much easier. In China, no non-Chinese company is allowed to even see or touch any Chinese consumer data. So we needed to find a third party to be the owner of that extraction process, the owner of the cloud, and we really just started collecting data about a year ago. So the map is really the constraint. It's building out quickly, and as the map builds out, we'll be able to launch the system on urban and suburban streets.
Zeekr is gonna, you know, have a big sort of promotion and test period during the Asian Games, where the system will be working on suburban streets. But we're expecting it to roll out broadly, you know, towards the end of the year.
Okay. We'll be looking forward to that. You know, and then the next level would be L3, so hands-free and eyes-off situationally, and that's your Chauffeur product. Maybe talk about what's still needed from a product development standpoint and validation basis in order to get to that type of L3 capability, and when can investors expect that?
Yeah. So I mean, if you come to the Analyst Day in Israel, you'll experience Chauffeur, you know, in the crazy streets of Jerusalem and Tel Aviv. So the system operates. Like, there are no technological breakthroughs that we need. But with eyes off comes, you know, liability and more regulation, right? With eyes on, as long as you have a robust monitoring system of the driver, you kinda know that that human is responsible for the performance of the vehicle. But when you start telling people, "You can go to sleep in the back seat, the car will drive for you," then it creates risks for other people on the road, so the regulators get involved.
So I think it's more of a validation process that needs to happen. The intent is that if you can create two separate perception systems, each of which is capable of driving the car on its own, each of which has approximately human-level performance, which ours do, then if you put both of those systems on the same car, the chances of both of them failing at the same time go down exponentially. So the process right now is to convince the automakers that that is true, and that each of these perception systems is at approximately human-level performance.
So you wanna convince your customers first that you can get there, convince them enough that they'll move forward with the program, and Polestar was the first kind of Chauffeur design win. And then the second part of the process would be to actually prove it on the road, and that's gonna require, you know, lots of vehicles. So the plan would be to launch with eyes on, monitor the system in the background, generate enough miles and data to prove it to the regulators, and then launch with eyes off. So none of this is easy.
It all comes with some complications, but I think it's seen as a game changer for the automakers: to provide not only something like 10 times, 100 times better-than-human-level safety, but also, you know, the opportunity to do other things, to be more productive while you're driving.
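The redundancy argument here can be sketched numerically. Under the idealized assumption that the two perception systems fail independently, the probability of both failing at the same moment is the product of their individual failure probabilities. The rates below are invented purely for illustration; real subsystems are never perfectly independent, which is why the on-road validation described above still matters:

```python
def combined_failure_rate(p_a: float, p_b: float) -> float:
    """Chance both perception systems fail at the same moment,
    assuming (idealistically) their failures are independent."""
    return p_a * p_b

# Invented illustrative rates: each subsystem alone fails
# roughly once per 10,000 hours of driving.
p_camera = 1e-4
p_radar_lidar = 1e-4

print(combined_failure_rate(p_camera, p_radar_lidar))  # ~1e-08
```

With these made-up numbers, the combined system is about four orders of magnitude less likely to fail than either subsystem alone, which is the sense in which stacking independent systems improves failure rates "exponentially."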
Just to be very clear, the vision system, I mean, that is SuperVision, and then you add a second one, and that's what becomes Chauffeur. So that speaks to the scalability and modularity of your product set that you spoke to-
Exactly.
In the talk.
Yeah, Polestar is a really good example, because they adopted SuperVision for their Polestar 4 vehicle, but, you know, it's kinda easy because they're a sister company with Zeekr, they had access to the platform, and it's a carryover, so it wasn't a huge deal. They're launching that, you know, late this year, but now they're talking about launching a Chauffeur program in 2025. So you would think that to do eyes on versus eyes off, you would need a completely separate program and, you know, another three years of development. But you can leverage so much of the SuperVision system in launching a Chauffeur system. So, you know, you can get to market fairly quickly.
Maybe the final product would be robotaxis using Mobileye Drive. What sort of timeframe should investors have in mind for that?
Yeah. So here, again, everything's complicated in this business. But, I mean, we're in San Francisco; I've already seen 30, 40 Waymo and Cruise vehicles going around. Our desire, our business plan, is not to own networks of vehicles and run our own robotaxi network. It's really to be the supplier of the self-driving system. To do that, you obviously need customers, and you also need vehicles. And the vehicles shouldn't just be an off-the-shelf car that you kinda stuff a self-driving system into. It should be a car that's built for robotaxis. So we're engaged with Schaeffler and VDL in Europe, an engineering company and a vehicle producer. They're designing sort of a 9-person configurable shuttle that's pre-engineered to accept our self-driving system.
We're working with a company called BENTELER that's doing really the same thing. And Volkswagen Commercial Vehicles, they want their ID. Buzz to be kind of a platform for robotaxis. So we have three engagements to be able to supply vehicles with our self-driving system by the middle of 2025. And then we have relationships on the demand side. We have a relationship with Sixt Rental Car, which is interested in this, Deutsche Bahn, which is a public transit operator in Germany, a few others. But then also, these vehicle builders have their own distribution contacts and networks. So we really want multiple entities trying to sell these vehicles, multiple customers engaged. And the economics work.
Like, the customers are telling us that we can save them EUR 100,000 a year by eliminating the driver, and, you know, the system will cost much, much, much less than that.
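As a rough back-of-the-envelope check on those economics: only the EUR 100,000-per-year driver saving comes from the conversation; the system price below is an assumed placeholder, since no figure was disclosed.

```python
# Hypothetical payback calculation. driver_cost_per_year_eur comes from
# the discussion; the system price is an invented placeholder standing in
# for "much, much, much less than that."
driver_cost_per_year_eur = 100_000
assumed_system_price_eur = 20_000

payback_years = assumed_system_price_eur / driver_cost_per_year_eur
print(payback_years)  # 0.2 -> under three months to pay for itself
```

Even if the placeholder price were several times higher, the payback period would stay well under a year, which is why eliminating the driver cost is the central economic argument for robotaxi operators.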
One more tech one, if I could. I wanted to get an update, please, on where Mobileye stands with its most advanced chip development. And I realize it's not just about the chips, it's the whole solution you're developing, but maybe talk around timelines for the EyeQ6 and EyeQ Ultra chips.
EyeQ6 is really critical because the EyeQ6 Low will be our new sort of workhorse chip for kinda high-volume ADAS programs, and that launch is, like, Q1 of 2024. So we have several launches of kinda ADAS platforms with EyeQ6 that are happening, you know, that are on track for the spring of 2024. So that feels very good. EyeQ6 High is supposed to be ready for production sort of late 2024, and then there will be kind of a further validation process and, you know, industrializing it onto the domain controllers. And so the first launch of EyeQ6 High is the Porsche program in mid 2025. So we feel great about the chips. And then EyeQ7 or EyeQ Ultra is more like a single die, you know, with all the processing power needed for full self-driving.
And we feel like that's on track as well. But really, the EyeQ6 is pretty scalable, so we can easily put multiple chips together: SuperVision is two, Chauffeur on highway is three, Chauffeur everywhere is four, and Drive is four.
Got it.
So yeah, the chip roadmap is great.
Got it.
Okay. Maybe we could talk a little bit on end market trends, and you're tied to auto production, right? It could, you know, certainly outgrow auto production over time as more vehicles adopt ADAS capabilities. But could you comment on auto production trends and what Mobileye is seeing by region?
Yeah, I mean, I think it's been a good year. It started out with a lot of caution from the supply chain and the automakers, you know, a lot of concern around vehicle affordability and general economics, interest rates. And what's actually happened is North America and Europe demand has probably been better than people expected. China was quite weak in Q1, but then really started to pick up in Q2. So I think in general, auto production has been better this year than anybody expected. What we've experienced is that in April and May there was some concern around the back half, you know, questions like, "Can we move some shipment volume out of one quarter or another?"
The answer was, like, "Let's let this play out and see what happens, you know, in the next couple of months," and then all those requests went away. So we didn't really see a lot of upside from the better-than-expected production, but it felt like there was a little bit of risk there for a while that's kind of gone away. So we feel great about where we are for this year. You know, on a volume basis, we should be up, you know, double digits. So, you know, it's been a good year in a tough environment.
Great. Well, we've got time for one or two questions from the audience. You know, I can keep Dan here for another half hour, but I wanted to see if anybody has a question. Yeah, we got one in the front, so if you don't mind waiting for the mic. Right there.
Hi.
Hi.
I would like to ask, because you do a lot of business with China, about export controls by the U.S. government-
Mm-hmm
... because autonomous driving is also considered an advanced application of semiconductors, do you see it as a risk that the export controls will spill over to this area? And do you have a plan if that really happens? That's one side of the question.
Mm-hmm.
The other side is, you mentioned that you already gave up, like, storing the maps or information in China yourselves, like the cloud usage. But also in China, it is sensitive-
Yeah
... you know, in terms of autonomous driving, because, like, with Tesla, there are news reports that they are not allowed to go into government areas, you know, because they are taking all the camera information.
Yeah.
So, how do you prevent such, you know, scenarios in China when you do a lot of business with China?
Yeah. So I think on the semiconductor side, it's definitely a risk, right? I think what we've seen is that the types of chips that have been controlled so far are kind of in a completely different league than our chips in terms of data speeds and processing power, you know? Like, our chips are maybe at 10% of where that line is, where the chips are being controlled. There's also a lot of language around programmability. Our chips are not programmable in, like, a traditional sense. They're purpose-built for automotive. You can't just put them in a data center and start programming them for a weapons system or something like that.
So we feel like if our types of chips start getting restricted, then the whole world's got a really big problem, because our chips are not, you know... They're not super high-powered, I guess is what I'm trying to say. So they're far away from where those restrictions have come so far, and that makes us feel fairly comfortable. On the data side, you know, this was one of the areas where Intel helped us a lot. They've got good regulatory and government affairs assets in China, so they helped us to understand the data laws and create a framework that we feel is for sure compliant today and can also withstand changes to any regulations.
So we have a third-party partner that is the one that's officially extracting the data, the one that owns the cloud. We have self-validating algorithms in the cloud, so we never have to see or touch the data. So we feel really good about kind of how that's come together as well. I think it's key to have Chinese partners, right? And we've got great partners, you know, kind of on the mapping side. You know, we talked about a company, ECARX, that's gonna be involved with the Polestar, you know, Chauffeur program. So I think we're very kind of interested in making sure that, you know, we're always kind of partnered with a Chinese company, and we think that that'll help us to, you know, stay away from any restrictions.
Well, unfortunately, we're gonna have to end the session there. But Dan, really appreciate you joining us today for all the questions.
Thanks, Mark. Great questions. Thanks, everybody.