Hello everyone and good morning. Welcome to Needham & Company's 27th Annual Growth Conference. My name is Neil Young, and I work on the Semiconductor Research Team here at Needham & Company. It is my pleasure to introduce Innoviz Technologies. Innoviz Technologies designs and manufactures solid-state LiDAR sensors and perception software that enable the mass production of autonomous vehicles. Joining me from the company is CEO Omer Keilaf. Omer is going to take us through a presentation followed by questions that anyone on the Zoom may have. Omer, thank you for joining us. Please take it away.
Thank you very much. So hi everyone. Happy to be here. I'll give you a short introduction of the company, touch on some of the recent news over the last month or so, and maybe provide some more context to everything that we're doing. One of the earliest announcements we made during December was our collaboration with Mobileye, and I'll touch on that in more detail. But basically, Innoviz is working with Mobileye as their LiDAR partner for the Drive platform.
We are also their partner on the Chauffeur. Maybe I'll go one by one; it will be easier. So we're developing LiDARs for autonomous vehicles for different customers. Innoviz is a direct Tier 1 supplier, providing an end-to-end solution, including production and perception software. Our customers are launching: BMW already launched with the i7 in Germany, and we're still expecting a launch in China.
We have several customers from Volkswagen, and also Mobileye, who are expected to launch during 2026 and 2027 on different platforms, and I'll touch on each one of them. So, we talked about the BMW i7. This is with InnovizOne. We acted as a Tier 2 through Magna. Magna is producing LiDARs today in the US on their production line and shipping them to BMW. Since then, we became our own Tier 1, responsible end-to-end. Our first win in the market was with the VW Group; the first program we won was for the Cariad Group. Since then, we've added two new projects with VW. The first one is the ID. Buzz with Mobileye. The ID. Buzz is equipped with nine LiDARs from Innoviz: three long range and six short or mid range.
We also added another project with Mobileye for the Chauffeur, a Level 3 program that is going to be applied on Audi and Porsche. And as I said earlier, we've been selected by Mobileye to be their LiDAR partner for all of their customers who are going to use the Drive platform. Basically, the same platform that Mobileye is responsible for on the ID. Buzz at Volkswagen, they are going to use for different customers. I'll come back to that in a second. Another big announcement we made during December was locking down an NRE payment plan with our customers that amounts to approximately $80 million. Over $40 million is going to be paid during 2025; the majority of the rest is in 2026, with a small component in 2027. These are payments that are going to be provided to Innoviz as we progress through the programs and meet milestones.
This is a very important element in funding the company through SOP and beyond. These $80 million do not include any sales of LiDARs, and they are non-dilutive. Acting as a direct supplier is a very important asset for Innoviz, allowing us to receive these NREs and basically supporting our activities. We expect that during 2025 we'll continue bringing in more programs and additional NREs that will further strengthen the company's balance sheet. In terms of our collaboration with Mobileye: Mobileye already has several customers that have selected the Level 4 Drive platform. All of them are going to use the same set of long-range and short/mid-range LiDARs, a total of nine LiDARs per vehicle. The first one, obviously, is the VW ID. Buzz. Holon is a people mover. Verne is a two-seater luxury robotaxi. And there is a shuttle with VDL.
There are expected to be more programs that Mobileye will bring in as they continue to market the Drive platform. As I mentioned earlier, we are also working with Mobileye on the Chauffeur, the Level 3, and our expectation is that more opportunities would continue through that. The recent announcement we made relates to our partnership with NVIDIA. Basically, NVIDIA currently offers the Hyperion platform, based on InnovizTwo, to different customers, primarily non-Chinese OEMs. We expect that collaboration to yield additional important customers. Obviously, working with partners such as Mobileye and NVIDIA is a good opportunity for us to scale. Being picked by these two very important platform players to be the LiDAR of choice for their platforms is obviously a very important asset for Innoviz.
And of course, we continue to work with the different other OEMs and platform players: BMW, which already launched, and we were happy to show that at CES, and the VW program, which is using our long range in different Level 3 programs. Let me move forward. A bit on the timeline for the different products that we have: InnovizOne is already launched; the InnovizTwo long range has SOP on the first vehicle in 2026; and the InnovizTwo short/mid range has a similar timeline, with the ID. Buzz as the first program to launch, using both of them, followed by several customers of Mobileye. There are actually also other OEMs that are expected to launch in a similar timeline, where we are still in the RFQ process. The InnovizTwo Slim is targeted for 2027, based on a specific OEM. We just came back from CES.
I think it was a very exciting time for us. We were able to show the progress we made. Let me show you a bit of what we demonstrated during the show; I think you might find it interesting. First, we were able to show for the first time the short range LiDAR, which is included in the ID. Buzz but also in the different Mobileye platforms. There are three long range LiDARs on the top of the vehicle and six short range LiDARs around the vehicle. One of the key KPIs for a short range LiDAR is a very tall vertical field of view, up to 90 degrees, which we were able to demonstrate here; another is being able to see at a minimum range of almost zero.
In this demonstration, we could actually put a hand on the window and show that we can detect reflections. That's actually something quite challenging for LiDARs due to the recovery time of the system: once a laser is shot, it usually takes time for the system to recover, and most LiDARs are incapable of seeing in the first few meters. Next, we showed a very interesting demonstration whose importance I think most people are unaware of. And this is me, by the way, through the view of the InnovizTwo. You can obviously also get an impression of the high resolution and fine quality of the data. But what I've tried to demonstrate here is how our LiDAR keeps operating even in a very obscured situation.
Imagine that the car is driving over a puddle of mud and the window becomes dirty. You might ask, how should a sensor act in this situation? I can tell you that I'm not aware of any other LiDAR in the industry that is capable of showing such resiliency. And you might ask why that is important; I think it's an obvious question with an obvious answer.
People ask me: what's the difference between a Level 2 LiDAR, a Level 3 LiDAR, and a Level 4 LiDAR? And what's the difference between Innoviz and the others? For any car operating at Level 2, you might argue that you don't need a LiDAR, or that it's okay if the LiDAR shuts down every five minutes, because the driver has his eyes on the road, and if something goes wrong with the system, he's there to react.
When you go to a Level 3 vehicle, where the car drives autonomously on the highway and the driver's eyes are off the road, that is of course not permitted: the LiDAR should not reset, and it should not be affected by artifacts or noise, to avoid emergency braking. You might say it is okay or acceptable that it becomes dirty from time to time, because you have 10 seconds to bring the driver to reengage and take control of the vehicle, and you can apply the cleaning system to allow the system to go back to activity. When you go to Level 4, there is no driver. There's no 10-second grace period. You have five people sitting in the back seat.
Now the robotaxi or shuttle drives through a puddle of mud, or goes through a group of butterflies, and gets both the LiDAR and the camera dirty. When talking about redundancy between sensors, you need to work with sensors that have a very low correlation of degradation, meaning that whenever the camera degrades, you need to prove that the LiDAR does not degrade. You could argue that a splash of mud, or rain droplets, can land simultaneously on both the camera and the LiDAR. You want that in this situation, at least one of them is still able to operate normally. Surprisingly, this is a very difficult task for LiDARs, even when they are only partly covered by dirt.
You will start to see holes in the field of view because of the blockage of the window. We were able to show here to our customers, and of course to any visitor, how our LiDAR is unaffected by this. We also demonstrate this capability when we drive outside. By the way, there is no software manipulation here; it's not done by AI or anything like that. This is achieved optically, without the loss of any pixel, and it is one of our very unique capabilities, and I would argue a necessity for Level 4. At Level 3, you might argue it's okay to have a cleaning system, et cetera, but once you go to Level 4, for sure, when you have the LiDARs around the vehicle, not behind the windshield, even mounted low, you need this capability.
This is a very unique capability of Innoviz. This is just another video to give you an impression of the very high level of detail the LiDAR provides in real time. I can say that I did a trip around the hall, and I haven't seen any LiDAR that reaches this density and quality in the entire show. Of course, we did demos outside. We've done driven demos for different customers. You can see in this video the new feature we talked about just a few months ago, Ambient IR, meaning that the LiDAR provides not only the 3D data but also an image of the scene. Basically, what you're getting out of it is a camera and LiDAR fusion, which is pixel-by-pixel accurate at the same frame rate. There are no synchronization issues.
Of course, it comes for free; there is no additional hardware involved here. This is basically coming from the same LiDAR sensor and gives additional information to the perception software, a kind of camera-and-LiDAR sensor fusion in one box. Of course, we also did driven demos with our perception software here. We used the blockage demonstration to give people an impression of what we're capable of, showing how we are able to do object detection and classification of different objects and targets, including small objects, a very important element in an autonomous vehicle.
Going back to the short range: we're trying to provide a full set, like we do for the ID. Buzz, of both long and short range, or even ultra long range for a truck configuration, which we believe Innoviz as a one-stop shop can offer. I'll stop and pause here. Happy to take your questions.
Thank you. Yeah, it looks like we have a question here from the audience. Regarding the LiDAR and autonomous driving market, there have been quite a number of push-outs and cancellations of programs. Just curious if Innoviz could give their point of view on the current market.
Sure. I hope I don't look that busy. Actually, we sense a great deal of transition among the traditional OEMs, which are moving from Level 4 to Level 3. You see that with Ford and Argo AI going to Level 3 programs, and you see that with GM and Cruise recently announcing that they are going to focus on Level 3. And that goes hand in hand with our strategy to focus on early launches and on volume coming from privately owned vehicles. I would say there are several launches expected in the 2026 and 2027 timeframe.
So even though it's kind of behind the scenes, because the launches are in 2026, there's still a lot of effort. You can imagine that developing such a system takes time and effort; it's not done over months. But it's definitely still the main target of many OEMs. I like to use China as the crystal ball for the rest of the market. The Chinese OEMs move faster than the rest of the world, and adoption of LiDAR technology there is already quite vast. If we look at the Chinese market as a crystal ball, I think we should expect to see the same soon in the West.
Great. Thank you. It looks like we have one more: Tesla hasn't used LiDAR in their vehicles, and it seems like it's mainly because of the price. Just curious to get your thoughts on that statement and on how Tesla has been able to pursue L2+ autonomous driving strategies without any LiDAR.
Yeah, so actually, the answer is embedded in the question: it's an L2 system, and therefore it's not autonomous. People tend to compare Tesla to Waymo as if they were two players in autonomous driving. I would say it's a very odd attempt to group them together; they are offering two different products. Waymo is trying to develop a Level 4, you might say eventually Level 5, system, which has no driver, while Tesla is offering a system which requires the driver's attention on the highway. And there are several steps between the two. As long as you have the human factor, and actually the liability, you might not even need a radar. Basically, if any mistake happens while the car is driving, it's still the fault of the driver.
The only way you can actually allow the driver to take his eyes off the road is when the car takes responsibility. And in order to get there, by design, it means that for anything that might happen on the road, you need to have redundancy. You cannot have a safety failure because of a single point of failure. And I would argue that the splattering of a bug on the window, and it's not a software bug, is not even a failure mechanism; it's just something normal. And low-light conditions, direct sun, and rain are all very realistic scenarios in which you cannot increase the danger to the person in the car, and therefore you still need to prove that you have the needed redundancy.
Now, eight years ago, when Tesla had to challenge the rest of the market and run faster, there was no LiDAR. LiDARs were not available; they were very expensive, and production was not in the right place. They couldn't wait for a LiDAR, so they decided to go for Level 2 rather than Level 3. And they've succeeded, and I think they made the right decision. If they want to go to Level 3 or Level 4, at some point they will need to add redundancy. I think the assumptions that went into that decision at the time have changed: LiDARs are cheaper and more available, and I think they provide better capabilities than cameras, for sure, and there is no way around using a LiDAR for Level 3 and upwards.
There was also something quite interesting that happened two months ago at a Tesla investor day; something very interesting happened there and nobody noticed. Since the time I started Innoviz, there was always a threat hanging over the heads of the OEMs: I was told that one day, all of the cars that Tesla had sold would become autonomous. That's the biggest risk that car OEMs need to account for, and it created a sense of urgency, because if one day all of those cars become autonomous while the OEMs are only starting to sell autonomous vehicles, they are at a very significant disadvantage. At that investor day, suddenly that story no longer held, meaning that all of the cars that were sold would not become autonomous.
They are going to stay Level 2. He kicked the can two years down the road: two years from now, there will be a new, different car that will be autonomous, not the previous one. He didn't really try to explain why something would be solved by then, only that the promise that all of the cars that were sold would become autonomous through vision only is not going to happen. And I would argue that it will still not happen as long as he relies only on cameras, for the obvious reasons I mentioned before. So as long as he's still referring to Level 2 as Full Self-Driving, I think it's not a comparable solution to Level 3 and Level 4.
And therefore, I wouldn't argue that they need a LiDAR for Level 2. Great. I'm seeing no further questions from the audience.
So thank you for your time and thank you for presenting, and thank you everyone who attended.
Thank you.