Hello everyone, and welcome to the Innoviz Physical AI Webinar. Before we get started, I would like to remind you that our discussion today will include forward-looking statements that are subject to risks and uncertainties relating to future events and the future financial performance of Innoviz. Actual results could differ materially from those anticipated in the forward-looking statements. Forward-looking statements made today speak only to our expectations as of today, and we undertake no obligation to publicly update or revise them. For a discussion of some important risk factors that could cause actual results to differ materially from any forward-looking statements, please see the Risk Factors section of our Form 20-F filed with the SEC on March 4th, 2026.
Hey, Omer, nice to be with you today. My name is David. I'm Director of Industry Solutions here at Innoviz, leading the implementation of our LiDAR in non-automotive spaces.
Yeah, I'm Omer Keilaf, CEO and Founder of Innoviz, and happy to have this session today.
Great. I'm super happy to be with you. We're going to talk about the white paper that you've just recently released. We posted a request for questions, so we will go through some of those today and hear your take on Physical AI.
Yeah.
Great. Let's start with Physical AI and World Models. Why is it so important in your view, and why did you decide to deep dive into this topic right now from Innoviz's perspective?
Yeah, sure. AI has obviously been a very big topic over the last couple of years, and every company talks about how they utilize AI. I think it's really hard for people to tell the difference when companies just utilize language models, what they call Digital AI. Digital AI is where you take data such as text or images and use language models to generate the next pixel, the next word, and you generate a lot of content with it. Many companies have been utilizing those language models to do exactly that.
There is a term called Physical AI, which I think is very important to understand because it differentiates from Digital AI in a way that helps us explain what we do. Physical AI has been practiced for many years, and the first application that was designed to scale was autonomous driving. When you think about autonomous driving, you take real data in an unstructured environment: there is a car that needs to maneuver through many uncertainties, across environments and roads, and you need to utilize AI to make sense of everything you see around you and decide how to proceed. Now, on top of Physical AI, which is now growing into different domains, there is another term that is important to understand, which is World Models.
There are now companies building platforms for the next generation of Physical AI, where they intend to provide platforms that allow you to train models that learn how our physical world behaves. NVIDIA just released, about a month ago I think, a new platform called Cosmos, where they plan to allow developers to study and train algorithms that are meant to predict how the world will behave based on training data. Most of the platform they are developing is based on simulations, on emulations of how these environments operate. They talk about digital twin cities. When I think about digital twin cities, it's very exciting, because you can create a model that allows you to monitor, experience, analyze, study, and train.
When I think about it, it makes a lot of sense to me that these simulations — I call them static — are what you generate either by simulation or 3D reconstruction. When I see the capabilities our sensors can add to those digital twin cities, I see it as if we are bringing life to those World Models. You can use these sensors not only to run the application; you can use them to study. You can create a digital twin city in real time and use it to really analyze how the world behaves and build those World Models in a more accurate manner.
Why now? Why did you decide to deep dive into this topic, now?
You know, autonomous driving was the first application of Physical AI. While AI has developed tremendously over the last few years, there is a desire to use these capabilities in more applications. Take, for example, a very standard process like a production line, right? There you have a very strict process of how things work, robots that move things from one point to another. AI actually brings a lot of value where behaviors are unstructured, where there is a lot of uncertainty. We live in a world where it's very hard to predict how things will really happen. If you want to really be able to automate and optimize these applications, you need to create models that are trained on real data.
Now that there is a desire to take this AI to additional applications on top of autonomous driving, it's clear that there is a missing part — a missing part in the story of how those applications will eventually be developed and utilized. When I'm asked what Innoviz is doing and I tell people that we are developing a LiDAR, the first instinct or first reaction is, "Oh, I know LiDARs from autonomous driving." It's as if Innoviz is developing LiDARs that could only be used for one application, and that's obviously not the case. When we started to offer our LiDAR to other industries, it was very hard to even explain which industries we are working in.
I think Physical AI actually sets an umbrella under which you can explain that what Innoviz is developing is a technology that can grow Physical AI and can be an infrastructure for it. These are terms, not a new world — as I said, autonomous driving was Physical AI. It's a new term that allows people to better understand the applicability of such technologies to other industries.
Right. Yeah, of course, perception is a bottleneck to Physical AI, right? Why do you think 3D sensing is so critical to the space, and why do you think LiDAR is the solution here?
I'll start with the fact that even in autonomous driving, people refer to LiDAR as ground truth, right? Even when you are trying to train an algorithm for a vision-based system, you use LiDAR as ground truth. That's a very strong statement, because when you're trying to develop Physical AI or World Models based on inferred data, or develop your model based on simulations, you are obviously departing from the truth — you are introducing a bias into your model. When you use a highly capable 3D sensor, you are getting essentially error-free data, and you can create 3D models that eventually help you get to a better result.
I think the affiliation to LiDAR is obvious. It's also important for me, as the CEO of a LiDAR company, to be able to educate investors to understand the potential of the company and where things are going. The paper is split into two parts. We released the first part several weeks ago; it talks about the terms Physical AI and World Models, the size of the market, et cetera. In the second part, I'm trying to give a certain prediction on where things will be, because there are still several LiDAR companies and LiDAR technologies — fewer than in the past, but I believe there will be continuous convergence.
I'm trying to explain how LiDAR, which is going to be a very meaningful and important infrastructure of Physical AI, is going to shake out. The LiDAR space has changed quite dramatically over the last 10 years, and that's part of what I'm trying to explain in the second part.
Right. Tell us more about the space of automotive. I mean, Innoviz has been playing very strong in this space. How has this transformed over the past decade with respect to autonomy?
Yeah. Eventually, it's a long game, right? Over the last decade, there were probably 200 or maybe more LiDAR companies. In the paper, I'm trying to explain how the market evolved, because like any industry that went through disruption, there is a certain way these markets behave. I'm referring to the Gartner Hype Cycle, which I appreciate — I think it's a good way to study any industry. Every year, Gartner publishes a report that tries to pinpoint where each industry is in the Hype Cycle, understanding that it's a very traditional trend. There's always a certain disruption.
A disruption is an opportunity for newcomers to step in and take a big part of a very big market. This is where investors jump on the opportunity, because they want to put their chips on the players they believe will eventually become the meaningful ones. You can see it in many sectors, and you can see it in different Gartner reports. Looking specifically at the automotive space: in 2015, 2016, autonomous driving was a huge hype, because everybody knew this was going to change things — and it does, it is changing them. There were probably 200 LiDAR companies, with the understanding that LiDAR is one of the biggest challenges of the decade, and behind it a huge market.
Billions of dollars have been invested in hundreds of companies. As always in this Hype Cycle, there is a trend where you see a waking up to the reality that the market eventually cannot absorb too many suppliers. Especially in automotive, they tend to consolidate around very few suppliers. There was a race to get there, and obviously, when you see 190 companies fail, you have 190 CEOs going around the market saying, "It's not my fault, it's a really bad market," you know?
Yeah.
It's like you have a lot of people now trying to explain away the failure by saying, "There's no market, there's no opportunity." It's not true. Okay? There is a market — there just isn't a market for everyone. The automotive market has moved its target all the time. The requirements you got for a LiDAR in 2013 are very different from the spec you got in 2016, in 2019, and in 2026. Okay? Your customers always become more educated about what's important as they experience these programs for the first time. As long as the requirements — the knob — continued to move, you've seen fewer and fewer companies able to make the cut.
One of the things I'm also talking about in this white paper is that it's going to continue. I'm talking about the different phases of the automotive market. In the first phase, 2015, 2016, you had many startups with prototypes that were able to demonstrate certain capabilities, but many of them did not make the cut. The second phase, I think, was also shaped by macro situations such as COVID-19, the semiconductor situation, and the EV transition. There were many disruptions — bad disruptions — for the automotive space. But eventually there were far fewer LiDAR companies, and some of them managed to, I would say, get an award from an OEM.
Even myself — there were times, because the LiDAR space was super noisy and crowded, when I would say that the only real way for investors to appreciate whether a LiDAR company is successful or not is whether they have an award from an OEM. In a way, you can see it as good scrutiny by the customer, who says, "Okay, this is the right one, I want it." Today I'd say I was actually wrong. I don't think it's the right way to look at it, because now, with the perspective of the last five years, there were several LiDAR companies that got an award.
But failed. It's not enough to get an award. When you realize how difficult and challenging it is to get from an award to deployment, you understand that it's not enough to just flag, "Oh, I got one customer to get excited about what I do and believe my promises." It takes a bit more than that. To me, it helps to push back on some of the claims that I see, because you see a lot of press releases from many companies, but the reality is slightly different.
In the white paper — in the second part at least — I'm trying to explain a little bit about those challenges: what is really needed to get to serious production, real series production, not just getting someone convinced. I'll pause here for more questions.
As we see LiDAR being implemented and driven through the automotive industry, how do you see what happens in the automotive market impacting other, non-automotive Physical AI applications?
Yeah. Again, I think the automotive market is still setting the standards. There is still no other market that can provide any LiDAR company contracts at the volumes offered in the automotive space, simply because of the size of the market. On the other hand, it also sets a very high bar on the requirements, and those requirements, as I said, keep moving all the time. One of the requirements I'm seeing in automotive — which I know people are less familiar with, but which I see as very dramatic — relates to the LiDAR itself. It's not only range, resolution, and frame rate; those are the KPIs of 2019, you know?
It's like the old days, when you would compare digital cameras by resolution, right? That was the only way to compare, but it's obviously not the case — the quality of the image is also affected by many other things. When you talk about autonomous driving and safety, I usually like to talk about what I call the dirty secret of the LiDAR space. When I think about an autonomous car, I think about a car with a family sitting in the back seat, driving without any ability for anyone to intervene, right? That's what we are targeting. Now, the claim to fame of LiDAR in this space is that it provides redundancy to the camera, right? That's what we are all flagging.
Yeah.
We are providing safety by providing redundancy against all the failures of the camera. Now, where does a camera fail? It fails in direct sun, it fails in weather conditions, it fails with rain droplets, right? You'd be surprised how many LiDAR architectures are actually unable to work under those same conditions. To me, that was surprising. There is another element that is usually disregarded or not well thought through — this is the dirty secret. When you think about Functional Safety, it talks about the correlation of failure modes. If the camera fails in a certain failure mode, you need to prove that the other sensor does not fail simultaneously. That's the only way to get real redundancy.
Now, if the car drives over a puddle of mud and the mud is thrown onto the vehicle, it is very hard to prove, or even claim, that both sensors will not become covered by mud. How do you solve this problem? Because I'm not talking about an unrealistic situation.
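The correlated-failure argument can be sketched numerically. This is an illustrative back-of-the-envelope model, not Innoviz data: the per-sensor failure probabilities and the shared-event rate below are invented for the example.

```python
# Illustrative sketch of why correlated failure modes break redundancy.
# All probabilities are invented for the example -- not real sensor data.

p_camera_fail = 1e-4   # chance the camera is blinded in a given interval
p_lidar_fail = 1e-4    # chance the LiDAR is blinded in the same interval

# Independent failure modes (different physics): perception is lost
# only when both sensors happen to fail at once.
p_both_independent = p_camera_fail * p_lidar_fail

# Common-cause model: a single event (e.g. a mud splash) can cover both
# apertures. The joint failure rate is then dominated by that shared event.
p_shared_event = 5e-5
p_both_correlated = p_shared_event + (
    (p_camera_fail - p_shared_event) * (p_lidar_fail - p_shared_event)
)

print(f"independent: {p_both_independent:.1e}")  # ~1e-08
print(f"correlated:  {p_both_correlated:.1e}")   # ~5e-05, thousands of times worse
```

Functional safety analyses formalize exactly this point: a redundancy claim is only as good as the proof that the redundant channels do not share a common failure cause.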
Yeah, for sure.
Because it could be mud, it could be bugs, it could be many things. I'm not familiar with any LiDAR but ours that deals with it. To me, it's shocking for a Level Four system. We can argue about Level Three — there you have a driver who is maybe reading a book but can be called to reengage while the system is cleaning the sensor—
Mm-hmm
In a few seconds. But when there is no driver at all, like Level Four, it is to me a really catastrophic and alarming situation if the sensor is incapable of dealing with these conditions. The reason Innoviz was able to solve this problem is that we had already launched our first generation, InnovizOne, and we tasted the road, you know. By understanding real-world problems, we knew that in the second generation we needed to bring it to a different level. We redesigned the system so that even if you throw mud at it, it will still see very well. To most customers it's like black magic — they think we're doing things in software or AI or something, but it's not. It's really just optics. So I sleep well at night.
You know, seriously, when I think about I'm offering a product that I believe can actually work in Level Four, and I think that others can't. To me, it gives confidence that I have a robust roadmap, you know, to actually enable these applications. I don't know how others can do that, you know, because it's a very unique design that we've done, and I don't understand how anyone can use anything other than that.
Other than that. Exactly. Yeah.
You know, just to drive the nail in on this: there is a difference between a LiDAR for Level Two, a LiDAR for Level Three, and a LiDAR for Level Four. I think people are not very familiar with this differentiation, because they usually think that if the LiDAR has higher resolution it's good for Level Four, or if it has good range it's ready. It actually has nothing to do with that. The only thing that differentiates Level Two, Level Three, and Level Four is the availability of the sensor. In Level Two, you have a driver with his eyes on the road, ready to engage. Theoretically, you could say the LiDAR can basically not work at all.
Yeah, shut down.
It's kind of like having the driver as a redundant sensor. Even if the LiDAR only works sometimes, it's fine. You know?
Yeah.
It does not cause the vehicle to lose its Functional Safety. When you go to Level Three and the driver no longer looks at the road, you need a very robust sensor — a sensor that is highly available. The only case where the driver is allowed to be requested to reengage is when the sensor has a certain degradation — for example, if it's covered with mud, right? Then the driver is called to reengage, take the wheel, and let the sensor be cleaned. When you go to Level Four, that's done.
No option.
There is no option for any degradation, in a way. You need a highly available sensor. The level of quality between Level Two, Level Three, and Level Four is a quantum leap — several orders of magnitude. Maybe to close the brackets on the question: I believe autonomous driving is setting the requirements and driving the volume, and eventually it will help companies like Innoviz offer really the best technology to any Physical AI application.
Yeah. Right.
Because eventually if you talk about security, it's also safety, right? If you talk about logistics where you have machines that are moving in environments where there are people, it's also safety.
Certainly intelligent traffic systems.
When you talk about intersections — yeah, it's also safety. Functional Safety is not only autonomous driving, and the compliance we have for automotive meets the requirements of all those other markets. Of course, the higher performance, the higher resolution, and also the lower cost potential coming from automotive volumes will eventually drive the Physical AI market.
Yeah. No, you definitely managed to connect those two dots, right? Between the automotive and non-automotive.
I'm trying to remember what you actually asked.
Let's talk about technology. I know you're, you know, very strong in technology and there's been a lot of excitement about different LiDAR technologies over the years. Can you talk about the evolution of the various approaches and where you think we are now?
I love this topic, because at any point in time you will have an emerging technology that claims it will change the world and be the technology the whole market pivots to. Unfortunately, it didn't work out for many of those technologies. When I founded Innoviz — the first day I started the company was January 7th, 2016 — on that same day there was a press release by a company called Quanergy. At the time it was a different company. They claimed they had a solution based on a technology called OPA, Optical Phased Array. You can look it up on the Internet; you can find the January 7th, 2016 press release on a product called the S3.
There was a huge hype around OPA. It was the future of LiDAR. It was fully solid state. It was $250. I remember they said it could see an 8% reflectivity target at 200 meters. It was in mass production. It already had partnerships with Mercedes-Benz and so on. I remember looking at this and asking myself, "What am I doing?"
First day. Yeah.
It's like I'm only starting, and this company is already claiming they are in mass production with all of these companies, meeting performance that I can only dream of one day.
This is a totally different technology, right?
Yeah. Lucky for me, I was stubborn enough to learn a bit about what the technology is really about. I came to my conclusion very quickly — I think after a week it was already clear to me that the technology doesn't work. I talked to many professors and people who were actually involved in developing the technology. The technology did not work. Yet for three years, I was asked continuously by investors how I could compete with a technology that was already in mass production. I was really confused, like, "Seriously? Do your due diligence. What are you talking about?
The technology does not work." Obviously that product never came to life, even though they raised a lot of money, and it never really reached the public. After that, there was a different hype: 1550 nanometer. I remember five years ago I was asked daily whether 905 could ever compete with 1550 nanometer. Like you said, I'm technology-driven — I go into the details — and I couldn't understand how people could be that confused. I didn't know what to do other than give evidence: "Look at my product, look at the competing product, and make your own assessment. Put aside everything I'm saying about the technology." Now there's a new hype, right? There is always a new hype.
The problem is that people don't really spend enough time understanding the differences between technologies. In essence, automotive is a very difficult market. The requirements are very high, and as I said earlier, the more you turn the knob, the less margin you leave for less capable technologies. In 2016, due to the hype, any way you could imagine developing a LiDAR, you would find a company trying to do it and raising a lot of money around it. There are really very research-driven, interesting ways to look at it, but they have nothing to do with the requirements.
I always tell people, like, "Just see the product, see the size of the product, see the performance of the product, see the power consumption of the product." These are things that other companies don't really show. You know? I tend to show real footage and videos of our point cloud because I think it's really the only way. That's my product, I need to show it, right? That's the only way for me to give confidence to-
To show what you have.
To investors and customers to really evaluate. Like, you know, I can always say, "I'm the best," but look, here's my product. When I see others not showing their product, you know, I see them showing videos that are processed, which I have a very sharp eye to see when someone is
You have. Yeah.
Lying about the data they are showing. To me, it's the best evidence that they just don't have anything to really show. Now, the automotive market is going into a new phase — I call it phase III — and I'm talking about where I believe the market will eventually consolidate. Look, as I was saying, the OEMs have always increased their demands. They want it smaller, cheaper, et cetera, and there were different types of installations: the grille installation, the roof installation. They've always hated those options. They always wanted to put it behind the windshield, because aesthetically it looks the best. The problem was that to bring it behind the windshield, you need a very small device, because there is no room in that area.
You have a very small area. It takes all of the flux from the sun, so the device needs to consume significantly less power than the solutions that were available, and you need much higher performance because you need to overcome the attenuation you get from the windshield.
Mm-hmm.
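The attenuation point can be made concrete with a back-of-the-envelope link budget. This is a sketch under simplifying assumptions — a 1/R² extended-target return model and an invented 85% glass transmission — not Innoviz figures:

```python
# Back-of-the-envelope: how windshield attenuation shrinks LiDAR range.
# The 85% transmission value below is invented for illustration.

def range_behind_glass(free_air_range_m: float, transmission: float) -> float:
    """Received power scales as T^2 / R^2: the pulse crosses the glass twice,
    and an extended-target return falls off as 1/R^2. Holding the same
    detection threshold therefore shrinks maximum range by a factor of T."""
    return free_air_range_m * transmission

# A sensor reaching 300 m in free air, placed behind glass with 85% transmission:
print(range_behind_glass(300.0, 0.85))  # ~255 m
```

Put differently: to recover the lost range behind the glass, the sensor must win back the two-way transmission loss through more emitted power or better detection sensitivity, which is exactly the power-versus-performance squeeze described above.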
Those extra requirements, probably five years ago, would have killed anyone's ability to offer something. The OEMs were willing to compromise — put it in the grille or, sorry, on the roof — but the holy grail was always behind the windshield. I think today, stepping into our third generation, we can benefit from the fruits of the work we've done with InnovizOne, bringing Level Two to the road, and with InnovizTwo, a product that is going into series production at the end of the year with Volkswagen. That experience, from Level Two up to Level Four, gives us a very robust baseline that allowed us to take another step and develop InnovizThree — and you can compare it to InnovizTwo, right?
It's just totally incredible.
Yeah.
Size reduction.
It really is a very small product. Now, I believe that this phase has many implications. One, I think it will stop the madness. You know, even for us
Yeah.
Moving from one generation to another requires a lot of effort, and in terms of scale, you want to stabilize — you want to keep to one design that meets everything.
Yeah, not having to reinvent the wheel every time.
Now, because of the OEMs' desire to move behind the windshield, I believe it will stay there. It will no longer move from the grille to the roof. Once it's behind the windshield, I don't think it will move to behind the seat or something — I really hope not. I think this is where the market will eventually converge.
Mm-hmm.
I understand that these extra demands will likely leave behind, in the history of the LiDAR space, more companies that are not able to meet them, because their solutions are too big, they consume too much power, and their performance is not good enough. That's a good reason, at least for me, to believe the LiDAR space is going to shrink even further. Emerging technologies like OPA, 1550, and FMCW are all technologies that are not a good fit for automotive. The only technology with sufficient capability in terms of volume, price, performance, and power consumption is time-of-flight at 905 nanometers.
These are the lenses I wear when I really look at my competition.
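For readers unfamiliar with the term, the direct time-of-flight principle referenced here reduces to one line: range is the round-trip time of a 905 nm pulse multiplied by the speed of light, divided by two. A minimal sketch (function name and timing value are illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_range_m(round_trip_s: float) -> float:
    """Target distance from a pulse's round-trip time: the light travels
    out and back, so the one-way distance is c * t / 2."""
    return C * round_trip_s / 2.0

# A return detected ~1.334 microseconds after emission sits at roughly 200 m:
print(round(tof_range_m(1.334e-6), 1))  # 200.0
```

Note that the wavelength itself (905 vs 1550 nm) never appears in the range equation; it matters for detector material, eye-safety power limits, and cost, which is where the technology debate described above actually lives.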
Great. That's actually a great segue to talk about the competition. How do you see the competitive landscape today?
Okay. I tried to put the other companies into buckets by the technology they are using. To me, it's clear that 905 Time-of-Flight is the right technology for automotive. Through those glasses, or lenses, the competitive landscape looks very different — and I'm talking about automotive.
Mm-hmm.
I'm not talking about non-automotive. In non-automotive you have many solutions that were possibly not good enough for automotive, so they started to develop for certain other applications. When I write down which Time-of-Flight 905 sensors are offered today in automotive, in the West it's primarily us and Valeo, and then you have the Chinese sensors, right? Now, the Chinese vendors, I think, have gone through all of the solutions. They've learned very well from many suppliers how to bring up LiDARs. They've benefited tremendously from the fact that LiDAR is a strategic technology for the government of China and got a lot of funding.
And because the new electric vehicle companies had to leapfrog into the market and compete with the Western market, they had to take software that was premature and make it more robust by using L2 with LiDARs. That's the main reason L2 in China was really flourishing and provided a lot of business to the Chinese market. There is a separate problem in that all of those cars are sold at a loss, but putting that aside, it helped the ecosystem develop. Now there are several acts in the U.S. about the use of LiDARs from other countries — from China — due to national security. You can already see two acts being pursued.
One is the SAFE LiDAR Act, and there is a new act that came just a few days ago from the DOT. The reason is that a LiDAR is, eventually, a mapping tool. Going back to Physical AI: LiDAR is going to be the infrastructure for the perception of future Physical AI. You will have LiDARs not only in cars — you will have LiDARs in infrastructure, on security borders, in robotics, in drones. Eventually LiDAR, which is a 3D sensor, is going to be added to, or displace, any 2D sensor. At any point of perception, you will eventually have a 3D sensor.
Yeah.
The richness of the data and the ability to create those digital twin cities — live digital twin cities — obviously creates a lot of sensitive information. From that perspective, I believe — I cannot predict, but I believe — the market will stay split between LiDARs offered by Western suppliers and LiDARs from Chinese suppliers operating in China. When I nail down this competition, you can see that eventually there is not a lot of it. Of course, there are the emerging technologies.
In my point of view, they are likely not to be good enough for the automotive space.
Yeah. I think it's almost time to start wrapping up. Considering everything we've discussed so far about the need for perception, the requirements in the automotive space and LiDAR technology, can you try and tie that all together and tell us how you think Innoviz is positioned in the market right now?
Yeah. Well, look, eventually it's a long journey.
Yeah.
We are developing a new ecosystem. Okay. LiDAR is going to be a dominant sensor in the future. In the white paper, I try to estimate the size of the market — the TAM, the total addressable market, for LiDARs. The obvious one, where you'll usually find a lot of analyst coverage, is the TAM of LiDARs in autonomous driving, because, as I said, it's the obvious, almost only, application people associate with LiDARs. Those figures talk about $10 billion in the next decade. Now, the more difficult analysis is understanding the size of the market for LiDARs in Physical AI.
That's more difficult because it's less discussed. Most people are not familiar with the capabilities LiDAR can offer. I'll give just one example. We've started to offer our LiDAR to the security market. In the security market, the dominant sensors are camera and radar. The use of radar in physical security is hard for me to explain, because it's so poor — it just doesn't work well. But it's the only technology that's available.
Yeah. That is available.
When you see the ASPs, the average selling prices of these radars, it boggles my mind, right? It can be tens of thousands of dollars, in volumes that will blow your mind. Now, when we show these customers what our LiDAR is capable of in terms of range and resolution and the ability to solve the very obvious problems you want to deal with in physical security — a very simple example is that radar cannot see beyond a fence.
A fence.
How could it be that a security system doesn't see beyond the fence? I don't know. That's the reality. Now, in those situations, I look at the TAM of radars, and there is a lot of existing analysis because radars have been around for a long time, so you can actually see the size of those markets already today. If LiDAR is going to displace radar in security, which I believe will happen massively, you can actually get an estimate of what the size of the market for LiDARs in security might be. Now, I'm not saying it will displace all of the radars in those applications, but it will take a bite, a nice bite. Those markets are continuously growing. You see the same in agriculture, in smart cities, and in logistics.
Eventually, in my view, radar is a good technology for a space that is empty, void, in the air. If you want to see something in a clear sky, radar is a good option. Once you talk about a complex visual scene, radar is pretty useless because of its lack of resolution and its noise. LiDAR is excellent there. If you take a LiDAR like ours, which is automotive grade, resilient to dirt, and can see very long ranges, and we have things coming up in terms of improving our capabilities in those domains, it's actually going to be a huge success. Eventually, I see Innoviz's position in automotive as our base, our baseline.
The market will continue to consolidate. I said it's a long journey, so we are now entering phase III, and I expect phase III to leave behind fewer companies.
Yeah.
That will be our base camp from where we can start penetrating different markets. We started with security because I see it as a premium market; it demands performance that other LiDARs cannot offer. We tried to see who we'll compete with in the LiDAR space in security, and there's no one.
Yeah.
Because no one is actually able to meet those requirements, and that leaves us a nice premium. Nothing holds us back from offering our LiDAR to other, more established markets where customers are already familiar with the LiDAR space and already understand the disadvantages of the solutions they have. I see it as a long game. I don't check the score because the game hasn't ended yet. You know? I understand that it's a long journey. You need to make bets on your strategy. We made our bet on our position in automotive, and I believe it will be our strength in the non-automotive space.
Eventually, I see Innoviz as the market leader, and I expect that it will be a very fun thing to do.
Yeah. Moving forward, maybe just the last question on my side: what are some of the proof points that you're seeing in terms of customer wins and traction in the automotive space and beyond? You touched on that a bit; is there anything you want to expand on there?
Sure. Right now we are working towards the launch with Volkswagen for Level Four with the ID. Buzz. It's the first series-production robotaxi in Europe by an OEM. Then we are working in collaboration with Mobileye. Through that relationship, we also reached a commercial agreement with Mobileye where they are using our sensors on other programs that they are involved with. They are using the same set of sensors. We are using nine LiDARs per vehicle in Level Four, and they're using them on other platforms. For example, HOLON, another company that is going to SOP after the ID. Buzz. Then we have Daimler Truck, where we have been selected for the sensors, also with multiple sensors around the vehicle.
We are also working on a program with Audi for Level Three. We are continuously working with the industry on different RFIs and RFQs. I expect that this year we'll be able to get more awards, either for InnovizTwo or InnovizThree. It really depends on the customer. If it comes to windshield integration, then InnovizThree. But if it's a robotaxi and they want to launch tomorrow, then InnovizTwo.
Yeah.
InnovizTwo is a great product, an amazing product. I think it's the best in class today, and I think InnovizThree will definitely be the successor, a very sweet one.
Great. It's been amazing speaking to you, Omer. For me personally, one of the nice parts about selling LiDAR is the initial reaction you get when showing someone a point cloud for the first time; it's like showing them the future. I think your piece on Physical AI definitely gives us a glimpse of that future. So it's been really great talking to you today.
Thank you.
Thanks very much for your time.
Thank you.