Dear investors, good afternoon. I am Daisy from the IR Department. Thank you very much for attending today's Investor Day 2024 of Sunny Optical Technology. This is our 15th Investor Day, and we are celebrating the 40th anniversary of our company this year. Once again, I'd like to extend my warmest welcome and sincerest gratitude for your presence. Some investors are unable to physically attend today's session, so we have arranged a webinar so that they can join virtually. Today's communication will focus on the non-financial side of the company, highlighting the development of the optical market and technological trends. We hope this will give you a deeper understanding of our company and the industry.
First of all, please allow me to introduce the management present here today. They are Executive Director and Chairman, Mr. Ye Liaoning; Vice President and Joint Company Secretary, Mr. Ma Jianfeng; General Manager of Zhejiang Sunny Optics Co Ltd, Mr. Wu Jun; General Manager of Ningbo Sunny Automotive Optech Co Ltd, Mr. Qiu Wenwei; General Manager of Ningbo Sunny Opotech Co Ltd, Madam Wang Mingzhu; General Manager of Zhejiang Sunny Optical Intelligence Technology Co Ltd, Mr. Wang Zhongwei; General Manager of Zhejiang Sunny Smart Lead Technologies Co Ltd, Mr. Zhang Baozhong; Senior IR Director and Joint Company Secretary, Madam Wong Pui Ling; IR Director, Madam Liu Yanfeng.
Next, I'd like to introduce the agenda of today's event. On your table, we have placed an agenda covering two half-day sessions. Today's session has three parts. The first is the welcoming remarks from Mr. Ye, and the second is the prepared remarks. The IR Director, Madam Liu Yanfeng; General Manager of Ningbo Sunny Automotive Optech, Mr. Qiu Wenwei; General Manager of Zhejiang Sunny Smart Lead Technologies, Mr. Zhang Baozhong; General Manager of Zhejiang Sunny Optical Intelligence Technology, Mr. Wang Zhongwei; and General Manager of Ningbo Sunny Opotech, Madam Wang Mingzhu, will share with you the latest business updates and technological advancements in the vehicle, XR, robotics, vision, and smartphone sectors, as well as our technological deployment. As for the presentation materials, you can scan the QR code on the agenda form.
The third part is the Q&A with the management. After this session, we have prepared a dinner for you, during which you will be able to have more in-depth, face-to-face communication with our management. Tomorrow morning, we will have a tour of the factories: we are going to the Chengxi Industrial Base to visit the production lines and showrooms. After the tour, investors can have a brief communication with CEO Sun Yang and Vice President Ma Jianfeng, as well as the IR team. That is the brief arrangement. Next, I'd like to welcome the Chairman, Mr. Ye Liaoning, to deliver the opening remarks. Welcome.
Dear investors and friends, good afternoon. Thank you very much for attending today's Investor Day 2024. On behalf of all the colleagues of Sunny Optical Technology Group, I would like to extend the warmest welcome to all of you. In recent years, the company's development has decelerated due to various factors, including ongoing international geopolitical tensions, global supply chain restructuring, and industry cycles. As we move into 2024, the sluggish consumer demand driven by the global economic downturn persists, market competition intensifies, and the company continues to face significant growth challenges. However, the digital economy and AI technologies will remain the primary drivers of global economic development in the future, creating new growth opportunities for the optoelectronics industry.
Sunny will not be hindered by the macroeconomic situation or halt our progress. We will maintain strong confidence in our development, enhance the refined management of our existing businesses, and ensure positive net operating cash flow and high-quality profitability. We will refine our strategy to speed up the growth of our in-vehicle and XR businesses, aiming for significant improvements in our industry standing. We will strengthen our "famous supporting role" strategy to secure key projects by aligning the group's resources end-to-end with the needs of strategic customers. We firmly believe that by sticking to our strategic positioning, heightening our crisis awareness, expanding revenue sources, cutting costs, and maintaining our commitment to innovation, we will overcome the challenges ahead and achieve even better results.
We would like to thank all the investors for your continuous support for the development of Sunny. We value this opportunity to communicate with you and hope that this session will deepen your knowledge and understanding of Sunny. We also look forward to your valuable suggestions and opinions, which will support our more sustainable, high-quality development and help us create greater value for our shareholders. Again, I would like to express my sincerest gratitude to all of you for attending today's Investor Day, and I wish today's event complete success. Thank you.
Thank you, Mr. Ye, for your opening remarks. Next, we will have the second part. First of all, let's welcome the IR Director, Madam Liu Yanfeng, to introduce you to the latest business updates of the company. Welcome, Madam Liu.
I'm very honored to be the first speaker, and I hope I'll give you a great opening. My presentation has two parts: the first is a company overview, and the second covers market trends and opportunities. As you know, Sunny was established in 1984, so this year marks the 40th anniversary of our company. Looking back at our development, we have weathered the financial crisis, industry competition, and reshuffles among industry players. Regardless of the situation, we have always focused on our core optical business. Our goal has always been to become the leading optoelectronics supplier.
At Sunny, we are able to integrate the relevant technologies, mainly mechanical, electronic, optical, and computing, to support our product development as well as large-scale production. Across the whole group, there are three main businesses, and you care a lot about the market share of these segments. The first is smartphone lens sets, where we rank number one in the world by market share. In handset camera modules, we also rank number one in the world, and in vehicle lens sets, number one as well. These three number-one rankings are from the TSR December 2023 reports, with market shares of 26%, 13%, and 31% respectively.
For vehicle lens sets, our market share has continuously increased, and the gap between us and the second- and third-place players has widened. We have always held co-creation as our core value. In terms of profit distribution, we follow the philosophy that when money is shared, people unite. We have optimized our share incentive mechanisms, and the results have been quite good. We have received a lot of recognition from the capital market. On this slide, I have shown you some of the indicators, such as inclusion in the Hang Seng Index and some sustainability-related indexes; all of these demonstrate great results. In terms of business segments, there are eight of them. For example, handset-related products account for the largest part of our revenue: according to the 2023 financial results, they contributed 66.1% of total revenue.
The second part is vehicle-related products, including vehicle cameras and modules, LiDAR, HUD, and projection headlights. The revenue contribution from vehicle-related products has also continued to increase; in 2023, it was 16.7%. As autonomous driving moves to higher levels, we believe this share will continue to expand. For AR/VR-related products, in 2024 it is about 6%. We believe that with the further development of different industries, especially the emerging ones, the revenue contributions from these segments will become more even. Next, I'd like to introduce our corporate development strategy, mainly around three aspects. The first is the smartphone-related businesses: we will continue to explore new potential and strive to improve business quality.
There are three aspects to work on. The first is to benchmark against the industry-leading peers and strengthen the areas where we are still relatively weak. The second is to improve the product mix and optimize the customer structure, whether for the large overseas customers or the top customers in China; with all of them, we want better and more in-depth cooperation. The third is to deepen the joint cooperation capability of the smartphone-related businesses, because we believe that with the development of smartphone imaging, we need more integration of optical and optoelectronic technologies. With our advantages and expertise, we want a more coordinated development between optical and optoelectronic technologies and to improve customer loyalty.
The second aspect is the fields of vehicles, XR, and robotic vision. We will continue to enhance our competitiveness and focus on improving market positions. To be more specific: for vehicle lens sets, we will continue to explore our potential and expand our competitive edge. Secondly, for vehicle camera modules, we will continue to expand our target customers at home and abroad and optimize the customer structure so that we can expand our market share. Third, we will continuously break through key technologies and manufacturing processes and strengthen resource integration among business units to seize market opportunities in the XR and robotics fields.
The third part is management. We will strengthen management of capital input, enhance input-output efficiency, and optimize our operational excellence. Especially in some of the more established businesses, we will allocate capacity reasonably based on orders and market trends. In some of the emerging fields, we will strictly control investment risks; our goal is SOP-based minimal investment. The next part is about the market trends and opportunities for handset lens sets. I will give you a very brief introduction, and we have also invited General Manager Mr. Wu to attend the meeting.
We also welcome investors to raise questions in the later session. On this slide, you can see the industry forecast for handset lens sets from 2023 to 2027 and the trajectory of growth. Please take a look at those numbers; they come from a TSR report of December 2023. The 2023 figure was not final but only TSR's prediction, and the final result was a little lower. From this chart, you can see that the growth rate is higher than 1.2%.
All right. That was the situation. For the overall industry, with the end of COVID-19, the economy has started a long-term recovery. In 2023, demand for devices such as mobile terminals hit bottom, and helped by AI's fast development, we anticipate that smartphone demand will see a very gradual revival in the future. Consequently, as a key component, our handset lens sets will maintain a stable growth rate.
In terms of the changes in camera structure, as you can see on this slide, this is based on the introduction of our optical equipment and products, because I know that some investors focus on the fast recovery of the industry. With the launch of new products, we can see that high-end smartphones are selling pretty well, and their camera lens sets are quite high-end as well. So what we're focusing on is the main camera and the telephoto lens. As the size of these handset lens sets grows, the lenses themselves face more requirements in terms of image size. This trend is quite clear.
For instance, a few years ago, a 1/1.7-inch lens was considered a very high-end product. Now, what we're seeing is 1/1.3-inch or even 1/0.9-inch. With large image sizes, on both the manufacturing and design sides, producing a lens becomes more difficult, because molding and the other procedures require higher precision. The fabrication of the lens itself becomes more difficult as well, and the requirements for matching the sensor chip also grow. So on the lens production side, challenges arise. Another topic is our hybrid lens, that is, glass plus plastic (G+P).
With a glass lens, we can achieve a larger aperture and cope with more usage scenarios. The G+P structure also helps keep the phone from getting too thick, because the hybrid design uses aspherical glass elements. Reliability is better too: in high-temperature situations, it can still maintain stable image quality. For us as a company, the molding is the difficult part; our molds are in-house technology and manufactured in-house. How to bring these components together, keep them aligned on a single axis, and ensure the quality of the optical performance is what is difficult here. Testing is pretty challenging in this sense as well.
As you may know, some Android customers already use this hybrid lens in their main cameras, and the performance is pretty good. Beyond these Android players, we believe more and more players will equip their main cameras with hybrid lenses. For telephoto lens sets, generally speaking, in order to achieve better performance, we need a large image size plus a large aperture. We want the telephoto lens to adapt to different scenarios; for instance, we can use internal focusing to achieve tele-macro shooting, and we can have high-magnification applications and so on. With the multi-reflection periscope, we can make the lens structure smaller. We all know that within the industry, there are already customers using this kind of technology.
Besides that customer, Android players from China also take a positive view of this technology. Another point is our plastic prism, which has been used in our medium- to lower-tier products. It has many advantages; in particular, the cost can come down as the volume goes up, and as the scale grows, its yield rate and reliability improve. Because it's made of plastic, the prism is very light, so the motor consumes less energy and power to drive it. That was about bar-type phones. You may be more focused on foldable phones; their development is pretty fast as well.
But foldable phones have quite a different structure, so they have limited space for camera sets, and their selling point has not been the camera. That keeps foldable phones' camera modules out of the top tier, but the potential is really high. In terms of industry development, what we're seeing is that more and more customers would like to equip their phones with a periscope lens, and there are higher requirements for making it lighter and thinner. Next, I'd like to briefly talk about our strategic goals for handset lens sets. They're pretty straightforward: we want to secure our top market share in the world and continue to optimize our customer structure and product mix by relying on four pillars. One is our design and simulation capabilities.
The second is our molding and forming capability, including our hybrid G+P molding capabilities. The third is our coating capability, such as our ALD coating technologies. The fourth is our equipment development and testing capabilities: for this type of equipment, we have our own in-house team working on automation and upgrades, and we have our own teams with different capabilities. All in all, in terms of implementing the strategy, we have three perspectives: one is marketing, the second is technology, and the third is manufacturing. In terms of marketing, on the sales end, we want to implement the product strategy of deep and careful cultivation.
That is, to dig out the potential of our products, make our products more reliable, improve their quality, and improve customer satisfaction. Second, we want to firmly adhere to the innovation-driven technology roadmap to release new technologies and new products, and to enhance the core competitiveness of our "three-high" products: high spec, high quality, and high added value. The last one is the manufacturing end, where we want to deepen the roadmap of our lean and intelligent manufacturing system and, in terms of delivery, ensure high-quality delivery. That is all of my introduction to our handset lens set business. Thank you so much.
Thank you, Madam Liu. We all know that Sunny has been deeply rooted in the automotive field for more than 20 years, with our automotive lenses ranked first in the industry for several consecutive years. With the continuous development of the automotive industry towards autonomous, intelligent driving, the importance of various automotive sensors is increasingly prominent. What market opportunities will Sunny face in this industry, and what preparations have already been made? Next, let's invite Qiu Wenwei, General Manager of Sunny Automotive Optech, to share his insights with us. Welcome.
Dear investors, good afternoon. Here, I would like to talk about our business situation in the vehicle lens business over the past years. Our positioning is automotive plus optics, so our products revolve around a few fields: vehicle cameras, LiDAR, HUD, intelligent headlights, projection lights, and so on. Next, I will introduce these module by module. First of all, the automotive camera market. For global vehicle lenses, we can see that the market is pretty big. Based on third-party reports, the future growth trend is phenomenal: in 2023, the global vehicle camera market volume grew 12.6% compared to the previous year.
Sunny's vehicle lens set shipments increased 15.1% compared to last year, so we did slightly better than the overall industry. From that trend, we can also tell that ADAS applications are growing as a share of automotive lenses, and ADAS has always been a strong capability of Sunny. For ADAS lenses, we hold around 50% of the global market share. Across all lenses, compared to the second-place player, we have a very big lead, holding the number-one position in the world. People are focused on the new EVs: in the past, we did pretty well in the overseas and traditional markets, but now the new EV segment in China is growing very fast, and about a third of all automotive lenses are used on new EVs.
At Sunny, across all the new EV brands, we have done the calculation: our statistics show that more than 80% of new EV brands have adopted our products. This slide is about how, facing a fiercely competitive market, we are going to offer our clients and the market more competitive products. Since 2012, when we produced our first ADAS application lens, we have kept advancing; last year, we successfully made the breakthrough in 8-megapixel technology, and this year we are shipping these products to our end clients. In terms of G+P, we are in a leading position among our competitors. By the end of 2023, we had shipped more than 36 million ADAS vehicle lens sets.
On the high end, for the 17-megapixel application, we have already finished preparations for mass production, and very soon the 17-megapixel product can be launched to the market. Those are our lens sets. Now I'm going to talk about the LiDAR market. In 2023, the overall market growth did not quite match our forecast; it was slightly lower, actually, though the growth rate is still relatively fast. Although we hear people debating whether they should choose LiDAR or not, we remain optimistic. In terms of Sunny Automotive Optech's sales results over the past three years, the growth has reached 10x. As of today, about 15 projects are hitting SOP in 2024 and 2025.
Going forward, they will make a very positive contribution to the company's total revenue, and we are very optimistic about it. We have also done a statistical survey of all the LiDARs used on vehicles, covering 96 models; among them, 60% of the LiDARs use our products. This slide is about HUD. We mainly focus on AR HUD, because for TFT HUD, although the market size is big, there is not much we can do. For AR HUD, penetration is going deeper and deeper, with demand growing at over 100% every year. We have been adapting to this trend since last year.
In 2023, the AR HUD PGU business, which is the core projection module, grew over 100%. Recently, we have also received nominations from multiple OEMs. It's expected that our AR HUD PGU share of the global market will increase from 3% in 2023 to 6% in 2024. Next, let's talk about the smart headlamp, or projection headlamp. It also has a huge market size, and it's an established market as well. We focus on digital pixel applications. For the future growth of the digital projection headlamp, we believe there will be over 60% growth every year. Based on this momentum, we believe the 0.01-megapixel headlamp will see faster adoption. The megapixel ones are great products, but the cost is a bit high.
So we believe that the 0.01-megapixel headlamp will develop very quickly in the coming few years. Our company has also compiled some statistics: among the top-tier Tier 1 companies, we have won a lot of SOP projects. In 2024, we believe the digital pixel headlamp will gain a greater market share for us, growing from 1% in 2023 to 18% in 2024. That is my introduction. Thank you.
Thank you, Mr. Qiu, for your presentation. I believe we now have a better understanding of the capabilities, plans, and strategies of our vehicle lens sets. With the electrification, digitalization, and intelligentization of vehicles, the smart cockpit and smart cabin camera market, as a very important segment, is facing greater demand, bringing new opportunities for companies. Zhejiang Sunny Smart Lead Technologies has accumulated great experience and expertise in optical electronics, can respond very quickly to market and customer needs, and has established very professional teams. Since entering this field in 2018, it has been taking the lead in the industry, and its customer structure has also become more and more international.
Next, let's welcome the General Manager of Zhejiang Sunny Smart Lead Technologies, Mr. Zhang Baozhong, to share with us the opportunities and challenges we face in the industry, as well as our future strategy. Welcome.
Hello everyone. I'm Zhang Baozhong. Automotive is a key battlefield for Sunny, and Mr. Qiu has just shared the vehicle lens sets with us. For my part, I'd like to introduce some of our latest development and progress in camera modules for vehicles. Looking back at our development in vehicle camera modules, we went through an exploratory stage. In 2018, we clarified our strategy of entering the front cameras of vehicles. Back then, the Chinese market did not have very high spec requirements. After deciding on our strategy, we recognized that automotive components and parts require very high safety levels, and that we had to work with a great customer to improve the quality of our automotive parts and components. We had a long-term cooperation with a Tier 1 company from Japan. I still remember their request: 30 PPM.
After several years of development, we brought it down to just 12 PPM, which was very rare in the whole industry. Through our work with the Tier 1 company from Japan and our observation of the industry, we believed that the perception capabilities of vehicles would need to be greatly increased, and the pixel counts of the cameras along with them. In 2019, we deployed 8-megapixel camera modules for vehicles ahead of the industry. Back then, we mainly collaborated with a famous platform-based company in Israel. In 2021, the Chinese market quickly adopted 8-megapixel cameras, and we were one of the first companies in the world to mass-produce 8-megapixel products. We were the early mover in this market in terms of 8-megapixel solutions, and we still maintain a great relationship with that company.
We are still an important supplier for that company. In this process, we believe working with the international leading OEMs is key for us. We then obtained certifications from Japan. On the one hand, we had to meet the great demand from the market; on the other hand, we needed to look ahead. We believe China provides a great automotive consumption market, but the global market is even bigger and more appealing to us. Therefore, we successfully passed the Tier 1 certifications of South Korea and Germany, as well as the inspections of non-German European OEMs. At the same time, our surround-view camera modules, from VGA through 1.3-megapixel to 3-megapixel, and our 3-megapixel e-mirror camera modules, also hit SOP.
2022 was a year of faster development for Sunny Smart Lead Technologies: we not only covered the production of perception-related products in China but also built sound relationships with the domestic customers. In 2023, we also made great progress. We received nominations for software algorithms and integrated solutions, some from Europe, and we also got a nomination for perception-related products from the Japanese OEMs, so we officially initiated our perception supplies in Japan. We need to take care of not only the surround view but also the area under the car, and we provided undercar camera solutions that year. For LiDAR, we also made further progress, with support for 3D additive materials, so we are able to better integrate 3D with perception.
So in 2023, we saw initial progress in the overseas markets and got more nominations from overseas OEMs. This is a brief overview of our past development. With that, we can categorize our products into several groups: usually, we have the sensing-related products and the naked-eye (viewing) products. Among the sensing products, we have front and rear camera products, as well as window or in-cabin products. With this kind of categorization, we can design more intensively, better meet the needs of the customers, and have a deeper understanding of the manufacturing process. While further developing our products, we have also been reviewing how we can better meet the needs of customers, especially the OEM customers, and we analyzed that.
The automobile market is very different from the consumer market, because we really need an RFI or RFQ from the customers, and we must be certified by them. There are very high requirements; the larger the OEM, the stricter the requirements, and they have their own independent inspection or certification systems. On the next slide, I'm going to show you the status of our certifications. These are a must for us: QC 080000, certification for ESD, and also cybersecurity and software development certifications. Although a camera is a kind of hardware, in the handshake most of the software is ECU-related software for communication with the cameras. Safety is of paramount importance.
Over the years, we have been certified under many different systems, as you can see from this table. This is what we have done over the past few years: we established all these systems and have been certified by the customers. As for customer certifications, we have been certified by VW, BMW, Mercedes-Benz, and a series of other well-known OEMs across the world. In Europe, over 10 OEMs have already certified us; in Japan, over 10 OEMs; and several in America. We believe it's important for us to be certified by the institutions and customers. The ecosystem is also very important for us: surrounding our computing platforms, we have built a solid ecosystem. For image sensors, cameras, and end devices, we have in-depth interactions with the customers.
Based on their demands, we can better develop our products, so that when there is a project, we can quickly provide the solution to the customers, further shortening the development cycle and the lead time. This is why we are always favored by the customers. Our SoC platform partners include Mobileye, Qualcomm, NVIDIA, Horizon, and others. If we want to further improve our perception capabilities, algorithms are a must. Therefore, we have collaborated through projects with some of the leading sensing-algorithm companies, such as Zhongke Chuangda (ThunderSoft), as well as Momenta, Hongruan, and TuSimple, both domestic and international algorithm developers, so that our product development moves in a very high-quality direction.
What we have talked about is how we build interactions between the ecosystem and our customers. We are making more deployments in hardware, software, and products. In terms of camera and control, we are working on how to connect them together as we make further deployments. We have fully digitalized our company for the global market. In terms of the sensing products, the camera modules' pixel counts will increase up to 18 megapixels, and in the future this will continue to increase in terms of the image itself. We have also launched our own COB technology: we want to bring the COB experience from our mobile department into the auto industry.
Through our analysis, we think that current cameras' calibration and debugging approaches are not sufficient, so we have improved this type of performance. Our cameras also have self-cleaning capabilities: small bugs, dust, and so on can be cleaned off the lens. Resolution has also been growing. The communication between the camera and the ECU is important, and through cooperation we are improving this area as well. We think that in the near future, cameras will be deeply connected to their sensing capabilities, and different light spectrums, including infrared, will all be connected together. The second topic is our intelligent in-cabin products, where we have already reached 8 megapixels.
In the past, it started from 2 megapixels, then 5 megapixels, and now 8 megapixels. We need to make the cameras imperceptible, so we want to make them as small as possible; we need an ultra-compact design. We also have DMS and OMS, and these need to be combined into one camera. For the intelligent cabin, we also have RGB-IR processing technologies to ensure functionality in dark or nighttime conditions. For instance, the infrared light must not be perceptible to the human eye, so you need anti-red-glow designs and so on. We also need the ability to sense interactions between the human body and the car itself, so we need 3D sensing capabilities. Within the intelligent cabin, we will also include XR.
We are now making this type of exploration. For the future, we need to make all of these highly reliable, with high-precision and high-performance positioning, and we need strong simulation capabilities. The temperature environment is also very harsh and challenging. All these products must be of high quality in order to be installed on vehicles. The automation of production and the digitalization of our plants are crucial to us; we think all of these are required for our products. These are our capabilities in manufacturing technology. To support the group's further development, spreading our business across the world is something we need to do.
So we need to satisfy demands from all over the world. We have collaborations with more than 85% of OEM plants, and manufacturing capability is really important. We have already established a plant in Vietnam. Second, we need to provide on-site service to our customers, for example in Germany, South Korea, and Japan. For our overseas plants, we need to duplicate the capabilities we have here in Yuyao, Zhejiang, so that we can build the same capabilities overseas as we have at Yuyao, and the quality of the overseas plants can be assured. That will be all from me. Thank you so much.
Thank you, Mr. Zhang, for your excellent presentation. With the continuous iteration of AI large-model technology, and driven by various application scenarios and factors such as real-time performance and reliability, there is a growing trend toward edge deployment, bringing new transformation to a wide range of intelligent hardware. Vision plays a critical role in this process as a key input. Against the backdrop of AI empowerment, what are Sunny's perspectives on, and preparations for, the realm of vision? Next, let's invite Mr. Wang Zhongwei, General Manager of Sunny Optical Intelligence, to share his insights with us.
All right, distinguished investors. Now I would like to share the situation on my side. As we all know, since the beginning of this year, or over these past two years, AI has been a hot trending topic. We believe that AI will be the next boost for intelligent hardware. For instance, the apps we are using were trained on the cloud, and the logic has been run in the cloud. With the development of technology, the perspective now is that AI is going to migrate from cloud training toward the edge. In the future, reasoning and logical thinking will become a necessity in supporting this intelligent hardware.
The reliability of the internet will improve, but if everything stays in the cloud and your connection is not stable enough, then performance cannot be assured. So the edge will certainly be a trend for the future. Under that trend, considerations such as data reliability, data security and safety, real-time performance, and bandwidth-load optimization are going to cause a lot of change in our hardware. Although what we do is robotic vision, the AI I'm going to talk about today touches on mobile phones, automotive, XR, laptops and other devices, and robotics as well. From the perspective of neural network models, there are three main elements: one is input, the second is processing, and the third is output.
Processing is something we have been discussing recently: in the future, will it be centralized processing or distributed processing? People have different perspectives on that. Large models, computing chips, and neural networks are booming recently, and people have different opinions about them. Within the industry, we are discussing this as well, because robots do not have to replicate humans exactly, and distributed processing has its advantages. It's like airplanes: they don't need to fly like a bird flapping its wings; they have fixed wings. As for input, that is what Sunny is very good at, because there is a lot of visual content there, as well as audio.
On the video side, we have image recognition, video analysis, semantic segmentation, 3D point-cloud construction, and so on. On the audio side, there is speech recognition, semantic comprehension, and so on. Sunny has made a lot of deployments in these areas, including AR and VR displays, such as pancake optics for VR and waveguides for AR, and related hardware, as well as human-machine interaction. In terms of output, in robotics, localization, recognition, and navigation play an important role. So, based on the previous slide, once AI has video input and other types of input, such as audio, touch, and force, what is going to change if you add edge computing, edge AI, into the mix?
First of all, I would like to talk about the phone end, the mobile end. As you can see, this is very down to earth. Two days ago, we saw that the A company launched its own upgraded Siri model. For a present-day, traditional smartphone, the video input involves 2D and 3D image processing; the processing uses ISP and 3A processing, such as auto white balance and auto exposure; and the output is photography, videography, face unlock, and some AR applications based on the phone's display. That is what a normal phone does. Now, for an AI phone, the requirements for 2D and 3D images will probably be higher, because it needs to analyze them.
So it's going to escalate the demand for visual hardware, and it's going to need higher computing power for processing, beyond image quality itself, because it needs to do image recognition and semantic segmentation. In terms of output, this is what we are so excited about, especially for smartphones: the phone can actually become our everyday life assistant. It also creates a lot of alternatives in how we do our jobs, and large models provide us with a lot of professional suggestions, for instance content optimization. Two days ago, I had an exchange with a professor from Fudan University. They hold the opinion that in the future, apps will no longer exist, because all of this will be integrated into AI large models.
So our phones are going to change along with the trend of AI growth, and they will change from how they appear right now. This is going to escalate demand for visual hardware, and it will also trigger people to switch and upgrade their phones. In terms of smart car development, more and more high-computing-power chips are being installed on vehicles. Traditionally, the visual elements were for humans, but now we need to make the car understand; all the sensors will be connected together rather than standing alone. The next two parts are smart glasses and robotics. For AR and VR, there are three phases. One is on the B end, with professional applications such as education and training. The second is C-end applications, which are very common right now, including gaming and entertainment.
We believe that with the further upgrade of edge-side AI going forward, MR may become a daily tool for improving productivity, for example copywriting, disease therapy, training, simulation, and collaborative design. Of course, there are higher requirements for the hardware. If you want people to use it as a production tool, then users need greater confidence in the hardware. There will be higher requirements for the display, and for eye tracking, gesture recognition, and 6DoF. How the device feels when you use and wear it, and anti-glare performance, are very important: it must be very comfortable. With the development of edge-side intelligence, we believe that in robotics, there will be new product formats.
Just now, we mentioned centralized processing and distributed processing. Based on our communications with institutions and experts in this industry, we believe there is a higher possibility of developing distributed processing. It is very likely that we will have motion-oriented products in robotics. These motion parts need to receive signals transmitted by the brain, while the detailed sensing, such as how you smell, see, and hear, can be handled with distributed, more edge-side processing. We could also have autonomous motion control and make decisions based on that. Of course, these motion organs will include joints, servos, reducers, encoders, etc., and we will also add vision sensing to them. This is what we mean by organ-like smart hardware.
Why do we want to evolve in that direction? Because the processing load on the brain should be reduced, and the dependency on network bandwidth should be reduced if processing is on the cloud; even on-premise, bandwidth should be reduced, because if we route everything through the brain, there is too much to transmit. Also, the organ-like functions can be normalized or standardized. For example, we can have standardized chassis, limbs, hands, feet, etc. They should be friendly to maintenance and repair, and with large-scale production they should be highly standardized, so we are able to further reduce cost.
Based on current development trends, we believe that humanoid motion hardware with distributed processing may be a product form that develops over a long period. Lastly, I'd like to share some of our observations about products: their categorizations and trends. Right now, most of what we see is collaborative AI, for example smart glasses and VR, smart watches, and even smartphones, because they still assist the people using them. The second type is called functional AI. These are not general-purpose and not collaborative; they can fulfill some very specific tasks independently, such as AGVs, delivery robots, domestic robots, vacuum cleaners, autonomous mowers, etc.
We call this kind of robotics functional AI. Last but not least, as edge-side AI further develops, we believe that universal AI has the widest scope of application. At the moment, we cannot achieve that yet; it will still take some time. But going forward, we believe universal AI will be able to adapt to multiple scenarios, and that will be the trend. We already have some deployments in the first and second types, and for the third type, we are actively communicating with some potential customers. For functional AI, we already have very substantial deployments from the past few years, and in some businesses we already have over 50% market share.
In some other projects, we have over 20% market share, but because the total market size is not large yet, I believe investors are not paying much attention to it. For collaborative AI, we mainly focus on smart glasses and VR. As edge-side AI further develops, we believe it will drive traditional hardware to transform into smart hardware. A lot of hardware will gain wider applications in multiple scenarios. We should also dig deeper into the value provided by this hardware, and we can develop more types of products in various formats. That concludes my presentation. Thank you.
Thank you, Mr. Wang, for your excellent sharing. I believe we now all have a better understanding of edge-side AI development and its applications in vision-based solutions. Our Executive Director has just come back from a business trip. He also attaches great importance to today's Investor Day and came straight to our venue after landing. So let's welcome him. Next, let's welcome the General Manager of Ningbo Sunny Opotech, Madam Wang Mingzhu, to share with us the development of the smartphone market in 2024, the trend of product development, as well as our core capabilities, especially our process capabilities.
Dear investors, next I'd like to share with you the handset-related business. In 2024, the global economy is growing slowly, but there is a recovery in both the economy and the handset market. We predict that global smartphone sales in 2024 will be about 1.169 billion units, a year-on-year growth of 3.5%. In 2024, smartphone sales in China are expected to be about 276 million units, a year-on-year growth of 0.4%. Camera adoption on smartphones is expected to be similar to the previous year. Foldable phones are expected to grow by about 38.8% year-on-year. Flagship models are focusing on the telephoto imaging experience. In terms of the aesthetics of cell phones, further miniaturization of cameras remains a strong demand.
In 2024, competition continues to be ferocious, but our company has continued to optimize its product mix. Our goal is to further focus on integrated solutions, integrating modules, actuators, etc., and to enhance product profitability. With the further development of technologies for core components and devices, we can support large-image-size actuators, and we are the first to support large-scale production of long-stroke cameras. We can achieve further miniaturization and higher performance in our modules. The competitive edge of our premium products can be further enhanced, market share can be expanded, and product competitiveness as well as profitability can be increased. In terms of miniaturized packaging, we can develop variable apertures and periscope modules and take advantage of miniaturization.
For flagship high-end products, as well as foldable products, we are able to improve added value and further increase customer loyalty. Relying on our long-standing optical strengths in miniaturization and high-precision assembly, we have achieved the industry's first large-scale mass production of a monochrome micro-LED optical engine. In this area, we will continue to take advantage of our vertical integration, creating the smallest monochrome micro-LED optical engine to further widen the gap between us and the runner-up. Our Vietnam plant has already reached scale effects; we have cost advantages, and our global production and supply capabilities will be further enhanced.
Next, I'd like to introduce our product-upgrade trends for handset camera modules. The first is the continuous variable-aperture module, with image effects catering to all application scenarios: we cover scenes such as optical bokeh with large aperture and full depth of field, mainly used in the main cameras of flagship models. The second is the advantage of integrating actuators, modules, and lenses: we are able to achieve a low module stack height and ultra-miniaturization in the XY dimensions, so this technology can be widely applied to ultra-thin and foldable phones. The third is 1/4-inch large-image-format telephoto cameras: AOA technology can support separate lens-group movement, significantly enlarging the light intake and enhancing resolution.
This creates outstanding portrait image quality, and our multifocal telephoto with 5-centimeter macro ranks first in the industry. The fourth trend is continuous optical zoom modules: our 24-axis AOA technology supports full focal-length coverage to simultaneously meet the needs of portrait and telephoto photography, enabling lossless zoom and meeting users' needs in various scenarios. That is all about the upgrade of our module products. To support module upgrades, we also need actuator technologies. At Sunny, we have a very innovative guide-rod structure, enabling an ultra-long-stroke actuator with very limited tilt.
With it, we can improve the imaging performance of large image sizes, contribute to thinner handsets, and ensure both far- and near-focus resolution. For the 3-axis OIS actuator, we have exclusive, proprietary technology with a patented single-layer ball-type cross-slot design. We can achieve high-precision optical stabilization, providing highly stable, high-quality photographic effects while in motion, and we will continue to expand the scale of mass production. We also have the periscope actuator for large-scale telephoto applications: the multiple guide-rod periscope actuator has low dynamic tilt, and combined with AOA, we can reduce the size to achieve a high-resolution, miniaturized variable-zoom module.
Last but not least, we have the packaging technologies to support the micro-LED optical engine. The full-color optical engine can reach 0.4 cc, and the monochrome one can reach 0.2 cc; both of these products have reached mass-production scale. Having covered several product upgrades, I will now introduce our core platforms. One is our ultra-compact module and packaging technology: this not only lowers the z-height and XY footprint compared with conventional solutions, but can also be extended to extreme miniaturization fields such as XR. Then there is our high-precision optical assembly, the AOA technology: using active optical alignment systems, we can, for instance, optimize peak tilt and field performance along with other optical properties.
The third is our high-precision actuation technology, like the actuators I just mentioned; the module, the actuator, and the lens set can all be mutually beneficial. Next is our high-precision insert-molding technology: because we are miniaturizing all these different components, insert molding has become a key enabling technology and will become part of our core competitiveness. Now for our process capabilities: besides the CAD line for CCM, our actuator production line and our XR production line have also achieved industry-leading closed-loop assembly and testing.
So this all-in-one capability is not just about design: beyond the design phase, the molding, the module, and the production of the actuators can now be offered at the same grade of technology. That will be all of my introduction. Thank you so much.
Okay. Thank you so much, Madam Wang. That will be the end of our presentation-sharing session. Before we begin our Q&A session, I'd like to reiterate that our discussion today will not involve financial data but will focus on market developments and technological trends. If you'd like to raise a question, please raise your hand, wait for our staff member to bring you the microphone, and tell us where you're from and your name, please.
Hello, management. I'm from Huatai. My name is Huang Leping. My question is about AI and AI mobile phones, because AI is a really hot trend nowadays. For Sunny, where does generative AI contribute? When you're talking to your clients, is it about input, because more cameras are required? You also mentioned smart glasses, because in the past we talked about XR. Which of the hardware we have talked about will likely see some volumes next year? That's my first question.
Currently speaking, in robotics, we are doing both input and output. For input, we are still using the sensors we mentioned. For output, it involves navigation, positioning, obstacle avoidance, and the feedback outputs of the sensors. That's what we are touching on.
So what will the form of the hardware be like?
In terms of the form of the hardware, one part is what I mentioned in the presentation: for the future, we think we will be involved in some motion joints, and also in the chassis of robots. That's what we're doing. And on the vehicle-mounted lenses, what you mentioned is that we have now covered 80% of clients globally; there's a North American client that takes 20% of the global share, and I don't know if that's already your client, or probably that's what we focused on today. On the 80%, I think maybe there's some misunderstanding here: 80% of the new EV players are already our clients, not 80% of the market share. It's not 80% of the EV market share.
So if there are 10 EV brands and eight of them are using your products, then since the market is focusing on the largest EV player right now, are they your client?
I think we need to put some effort into it, and we are working on it. We will probably be there soon.
Hello, management. I'm Fu Tianzi from Guangda. Thank you for taking my question. I've got two questions. One is about the AI mentioned by the other investor. What I'd like to know is, since this is a topic the market is focused on, what upgrades can you make in mobile optics? Recently, I think there will be these kinds of auxiliary cameras assisting AI with input. Do the modules, lenses, and actuators all need to be upgraded?
Not necessarily the motor or the actuator, but for the auxiliary cameras, we will probably have some involvement; they will probably need some upgrades.
All right. Thank you. My second question is that last year the industry experienced pretty intense competition, and we look forward to seeing a recovery in your results. Thinking about next year, if the overall mobile phone market does not grow, what will drive this company's financial growth?
Well, today we are actually talking about technology and trends, and your topic goes a little beyond that. First of all, in terms of smartphones, we recently have two core topics. One is the balance of our international clients; I think we need to make more effort there. With our North American client, their share is still somewhat far from the target we set. In terms of modules, we will probably need more high-end and mid-tier projects to improve. But the core, as Mingzhu and Chris mentioned, is the upgrade of the lens: the constant upgrade and renewal of products, including the adoption of new technology. We need to reshape the product structure so that, overall, it can pull up the GP margin of these related products.
In terms of handsets, that will probably be the direction we take next year. In terms of the automobile industry, as Mr. Qiu mentioned and as we have told investors, both vehicle-mounted cameras and modules will be a major driver for the future. All right. Thank you.
Thank you, Mr. Ma. Can I ask another small question? In the past, the management has had a really objective perspective on the market. What do you see as the volume of AI mobile phones for next year? What is the volume going to be like?
In terms of growth, it will be there. But in terms of actual volume, and the connection between that volume and cameras, it's going to take some time, because all these mobile phones need to be custom-made, right? The industry chain needs to be prepared beforehand, and it will be quite difficult to run through that preparation cycle in just a year. But the future trend, I think, will be beneficial.
All right. Thank you.
Hello, everyone. I'm from SPDB International. I've got a few questions. The first is about edge AI, because what we're seeing is mobile AI on the one hand and smart-driving capabilities on the new EVs on the other, and your optical products are used in both. Between Sunny's capabilities on the mobile end and on the smart-driving end, are there any synergies? Or are these two areas different enough that you need to invest different resources in the AI capabilities of the two domains? I'd like to hear from you. Thank you.
I would like to talk about the smartphone first. For phones, we have limited space and size, and in the future they will become lighter and thinner. For this device to cope with AI, to help AI get its input, the key point requiring a technology breakthrough is making it even thinner and lighter; there is a certain need for the device to become lighter and thinner compared with the past. Then, on the automobile end, there are actually a lot of synergies that we can draw on.
First of all, on vehicles, we are also pursuing lighter and thinner components, especially miniaturizing them. The experience we have in handsets can be used in vehicles as well, especially within the cabin, because we want the devices to be imperceptible to drivers and passengers. Second, the volume of these vehicle modules is growing from small-batch toward large-batch production, and for that model we can learn a lot from the mobile end, for instance the high automation capabilities we have for mobile phones, the automation of the production lines, and the equipment capabilities. All of these will be beneficial to us.
As module volumes increase, what we see clearly is that the packaging capabilities for automobiles around the world are insufficient. Compared with the COB technology we have in handsets, the COB packaging and testing capability used in the automobile industry is insufficient; as I mentioned, this can be learned from the mobile side, so that we can enjoy synergies within Sunny, especially with our hybrid technology. You can feel it as well, right? You also need high reliability in automobiles: within automotive, we need to hit a 10 PPM defect rate, from project management all the way to production.
There are a lot of differences we need to address in the automobile industry; those are the differences between the consumer industry and the automobile industry. We also have different application scenarios: in vehicles, there are wider scenarios we need to cover. For instance, from -40 to 105 degrees Celsius, we need to ensure the performance stability of the optical and mechanical devices, and the cabin products need to be stable as well. The automotive environment is actually more challenging than normal environments, and there are also anti-wear and anti-UV requirements and so on. All of these issues are different from the consumer end. In automotive, we have higher requirements for quality. That is the key.
I have another question about vehicle-related products. The OEMs face heavy pressure to lower their prices, and this is passed upstream, so for vehicle camera modules and LiDARs there is also a trend of lowering prices. Which products give you the biggest pricing pressure from the OEMs, and for which products can you reduce the technological or project cost?
We do see very heated competition in the automobile industry, and this kind of pressure has been transferred from the OEMs into the supply chain. I believe that all of these products face such pressure, and OEMs do expect to further lower prices. For us, on some of the new, emerging products, we may face bigger pricing pressure in the short term, because everyone wants to make a breakthrough first and then pursue a higher margin. For the more established products, I believe there is less pressure on the profit side. In terms of operations, with technological innovation we are able to further reduce cost and become more competitive, for example with the hybrid technology and how we differentiate ourselves with it.
This is something we really need to work on in such an environment. Internally, how can we increase yield and efficiency and tap into our potential? From the customer side, we can also sense that pressure, because our customers are the camera companies, and they do have requirements for yield. It's the same for handsets and modules. So we not only need to improve our own competitive edge, for example our dust-control capabilities, which we need to do better than before, so that our customers' costs can also be further reduced. In this way, we are able to reduce cost across the industry chain. Everyone, upstream and downstream, is trying their best to reduce cost.
What we need to do is work together to drive R&D, solve the interface problems, blur the boundaries, and collaborate in R&D. In the end, we are able to reduce cost across the industry chain. This takes everyone's efforts, and we are all trying our best to reduce cost. Thank you.
Indeed, we have sensed the pressure from the OEMs, especially with the NEVs. Right now the NEVs are not very profitable, especially the Chinese NEVs; not many of them are making a profit. So reducing cost is definitely something we pursue, and we should approach it with the right awareness. According to Elon Musk, we need to price by the kilogram; that is the underlying logic. Cost is composed of several parts. The first is design cost. Although we have many application scenarios, we can reduce the design cost further, for example by using specific materials or processes. The second part is manufacturing cost, which is related to volume and the level of automation. With larger scale, we can reduce the manufacturing cost.
We do have some room to reduce manufacturing cost, but the key lies in increasing volume. For example, we do not want production lines dedicated exclusively to only a few models. We also need to reduce quality-related cost; this is something we should always pursue. If there is a large-scale recall, the quality cost is extremely high, and so is the compensation you have to pay the OEM, so you need to pay extra attention to that. We also need to take advantage of Sunny's expertise: the camera module is very important for us, and there we are able to reduce sourcing cost as well. By reducing these costs in different aspects, we can better help our customers control their cost.
Hello, management. I'm Liu Yanshuan from CICC. Thank you for hosting today's Investor Day. In our industry we sometimes hear rumors about your company, so I'd like to take this opportunity to confirm some information with you. First, will the customer from North America further upgrade its optical products next year? From the industry, we have heard they are considering variable apertures or hybrid lens sets; I'm not sure whether that is true. Also, you previously supplied mostly lens sets to the North American company, and now that they are going through the MPIs, perhaps there are some opportunities in camera modules as well. Is that true?
Let me answer this question. Regarding the technological roadmap of the North American company, we cannot disclose more details; that is the first part of my answer. The second is that our core task right now is to improve our handset lens set business. There is still a gap between reality and our goals, and we want to close that gap in the short term and increase the margin.

Actually, you did not answer the question, so let me go to my second one. It is more general, and maybe easier to answer. I have discussed with some of the other vehicle lens set companies, and they believe their advantage is that Sunny works on both the camera modules and the lens sets, whereas for conventional modules, Valeo and some other companies do more of that.
These other companies tell us that because you work on both lens sets and modules, the module companies do not want to share their margin with a competitor. When you are bidding for new nominations, do you encounter such a problem? What is your strategy? How do you define the vehicle camera module business, and will it compromise the development of the lens set business? Some of these companies tell us it is easier for them to win camera module projects. Maybe Mr. Qiu can answer this question.
Well, I would like to answer this question. This phenomenon exists not only for vehicles but also for handsets. How do we balance the lens set and module businesses? They actually serve different markets and applications. On some occasions there might be conflicts, but on most occasions they run independently. You need to balance the relationships with your customers and suppliers. It may look like a conflict, but if you handle it well, you can develop both businesses in parallel. What they say is just a stereotype. This phenomenon has long existed in the handset market, and we are confident in our ability to handle it.
My second question is that you introduced some successful products launched for AI, but you didn't talk much about AI glasses. Some other companies say that in the second half of the year, Meta is going to launch AI glasses, possibly using optical waveguide products. When will Sunny launch its AI glasses products?
You're asking about AI glasses; what is the specific question?
In the second half of the year, Meta is going to launch their AI glasses, and one company said that they had provided the dual optical waveguide products. For Sunny, when will we see terminal products launched with your components inside?
Well, to answer your question, you should first know that we have been deeply involved in the development of many such products, and Meta is a big client of ours, covering VR, AR, and AI. But I cannot disclose more details about the specific technology roadmap; we are deeply involved in this process, and that is what I can say. And I have not heard that Meta is going to launch AI glasses in the second half of the year. What is your information source?
It's public information.
So perhaps they will not have the SOP, as they said, but they will launch this product. And another company said they had been involved in a dual optical waveguide. Well, congratulations to that company, because we are not sure, and he already knows the answer. So we can certainly say they are ahead of us, because we have never heard of such a thing before.
Okay. Thank you for your answer.
Hello, management. I am Zhang Qiang, and I have one question regarding technology. On the one hand, we can see the optical upgrade cycle, and Sunny has deployed some very premium products. On the other hand, we see the rapid development of large AI models, with wider application on the edge side. Is it possible that going forward, with strong enough software capabilities, we will no longer need such premium hardware and can still achieve a great input and output effect? I'm not sure how you see this question. Also, what is your plan in terms of software and algorithms?
Let me answer your question; it is a great one. The development of AI has made everybody believe that hardware is no longer that important, but you need to approach this issue from two angles. From a static point of view, with the empowerment of AI, the role of hardware will no longer be as important as before, because software now comes in. But the hardware we need will not be static. I can give you two examples to illustrate that the evolution of hardware is a sure thing, because imaging technology is developing quickly in two directions. The first is from 2D to 3D, and in this process, without hardware upgrades, AI cannot play its role. This is quite understandable, right?
Therefore, in sensing, hardware is still very important: AI is responsible for processing, but hardware is responsible for sensing. Now let me introduce the second direction. The imaging we talk about right now is all about visible light, but the field is also moving toward invisible light. As we all know, IR and near-IR are developing very quickly, and the hardware for those scenarios is evolving as well, especially for near-IR, which involves both visible light and some non-visible light. We are now developing new types of sensors.
For example, we are able to recognize palm prints and vein patterns. So you must believe there is a basic logic to the development of hardware. Of course, AI will greatly support the development of the system and facilitate its upgrade, and at the same time it will create things that traditional hardware alone cannot reach. But they are not in conflict; theirs is not a relationship of substitution or replacement. I'm not sure whether I have answered your question.
Okay. Thank you for your answer. That is my only question.
Hello, management. I'm Cherry. I've got two questions. The first is a follow-up for Mr. Qiu. I think you mentioned that for a new product, the GP margin is not that high, while for the old products the GP margin is actually higher. Can you tell us more about that? What are we doing in the new products? You mentioned self-cleaning, 16-megapixel, and so on; do these carry a lower GP margin? Are you talking about Sunny's products or others' products?
Okay, this is for the new product applications, right? For instance, if you want to add something new to the automobile, like the AR HUD, you can take that as a new product, right?
Okay, understood. My second question is for Mr. Qiu Wenwei. This week, Elon Musk mentioned that in the future they are going to produce 11 prototype robots. From Sunny's perspective, for these kinds of humanoid robots, what kind of camera or sensing technology can be adapted into such products? You have some camera optics products, right? Can you expand on that and tell us about the humanoid robot opportunity?
Okay, thank you. I think there are three questions in there. One is that Elon Musk mentioned humanoid robots, right? I think they can become a multi-billion market in the future, but that is still quite far away; at present it is not going to be like that, and you can find more material on the internet. From Sunny's perspective, in terms of camera hardware, Sunny can play a certain role in the humanoid. As for AI algorithms, their capabilities are very good, so will they lower the requirements on the hardware? For specific functionalities, if your algorithm surpasses a certain level, the hardware requirements will indeed be lower.
But for general AI, the hardware cannot be downgraded, because it needs to be adopted generally, so the hardware requirements must reach a certain level. So alongside AI, what Sunny is doing is hardware: we can upgrade the hardware, for example from 2D to 3D. In terms of robot functionality, 3D binocular vision has now become a major trend. Another direction, as mentioned, is AR/VR-like products, for instance for maneuvering, sensing, and large AI model deployment. So for Sunny, one focus is the constant upgrade of hardware and AI model deployment, and another is vision and navigation. Sunny needs to find its value within these points, all right?
Are there any other questions I haven't answered? If we add all these things together, how much value is Sunny offering? A few years ago we started to do camera optics. The vision part alone, not counting the maneuvering or moving parts within the robot, is around 2.5%-5% of the robot's value. If we add the maneuvering part, the control part, and the multi-sensing, the value will probably go up. As for ASICs, we are now in a volume production phase in AIoT, though in terms of volume it cannot compare to our mobile and automobile businesses, all right?
Thank you, management, for the excellent sharing. I've got one question here. In the past two years, during the industry downtrend, Sunny actually held on to its R&D investment at a fairly large scale, for instance around CNY 2.5-3 billion, and all the management is here today. Can you share with us, over the past two years, from a technical perspective, in which areas the R&D investment has widened the gap between Sunny and its component competitors? And in which areas can we sense that competition is fiercer and the counterparts are catching up?
In terms of R&D investment, I think the management is wise enough to invest in the segments that have opportunities, but there is obvious risk in that, say, investing in 5-10 projects. Which projects can succeed in an earlier phase? Can you tell us that? Thank you.
In terms of R&D investment, our company has a plan. From our long-term strategic planning, we have already decided how much to invest in R&D, and we will not pull back that investment because of situations that appear in the company's operations. So for R&D investment, all our different BUs will still follow their own pace. These different business units have already shared their achievements with you today.
That is the result of our R&D investment. Besides the BUs' investments in R&D, the group itself is investing as well, in more futuristic, longer-term areas. Your question actually sheds a little light on that. How can we talk about it? I don't think we have enough time to cover all of it; if we hold a special day for technological development, we can probably talk about that. But in terms of the results or performance of the investments, I think you can tell, right? In handsets, it is not that we lack opportunities; with AI mobile phones, there are still chances for the future. In vehicle-based equipment, there are still opportunities.
For XR, we don't yet have a significant result popping up, and I think the humanoid is in the same category; we are, so to speak, in the dark before the dawn. The automobile business will certainly be good, and the mobile phone business will certainly be good; these are the two aces we have. As long as we invest enough in technology and build up the right capabilities, these businesses will still be future-proof and have prosperous futures, including AR, VR, robotics, and other optical applications. That is the thinking behind our R&D investment. I cannot tell you too much in specifics, right?
Okay. All right. All right. Thank you.
I'm Yang Haiyan. Since I'm wearing glasses today, I'm going to ask about glasses. The Vision Pro will start selling here in mainland China, and in May, Google demonstrated its Project Astra, which is a simplified version of optics plus AI. Sunny has many years of technological accumulation in this domain. Compared with the Vision Pro, this type of product is a bit more lightweight. Which one do you think will be the better solution for the future? I know you have touched on all these different optical solutions, so I would like to know your perspective and what the industry or the market is going to choose.
Your question is a fairly common one, and it reflects what consumers are thinking. They are choosing between light and heavy, full-function and light-function. But when we wear glasses, whether we like them is not really related to the glasses themselves, or to the eye; it is related to the purpose for which you wear them. For instance, Meta, Google, Microsoft, and even Apple have each made their own decisions. From what you see now with light glasses, I can basically draw the conclusion that light glasses will be the most welcomed product among consumers.
But why is Apple doing full-function glasses? They have their own resources, right? They have spatial computing capabilities and so on. It really depends on the applications. If we really need it as a productivity tool, for instance for design work, that can only be done with a heavyweight, full-function device. Or if it is fully equipped and improves efficiency, for example enabling a unified design process between the U.S., China, and so on, then we need that type of functionality. So the form of the glasses is determined by your applications. As for our deployment in the market and the industry, yes, we have made deployments in all the domains you touched on.
Whether it is MR, a lightweight solution, or a Ray-Ban Meta-style solution, we have thought about all of them and made our deployments; we are following them as well. I don't know if that answers your question. Yes, you need to satisfy different clients' demands, because they are still exploring, right? I think your glasses are pretty good, right? If you could put a camera on them, they would be even better, right?
My second question is that the management has talked about robotic vision, and 3D has now entered the consumer market, with some iToF functionality being added. But this year we are not seeing many products adopting it. Why is that, and how many years from now will we see it? Is it going to become standard equipment on smartphones in the future? If we are entering a 3D era, who is going to adopt it first?
I'll answer that question. First of all, will 3D become standard equipment, a widely adopted functionality on mobile phones with good performance? Leaving the timing aside, maybe not next year or the year after, but from 2D to 3D I think it is a certainty. The problem actually lies in the 3D content itself. If we shoot 3D content, how are we going to display it? We don't have a good way of displaying 3D content if we just watch it on a normal display; then what's the difference? So it is highly connected to the development of AR glasses. It's not just Apple that can shoot 3D content in the future; our phones will have this feature as well. And Canon, as I was saying, has a camera with two lenses on it.
It is for 3D content creation. When 3D content creation and 3D display are both mature, the whole industry will take off. 3D will certainly be the future development trend, because adding another dimension gives us a richer sense of the things we look at, and adding AI on top makes it even more valuable: for large models dealing with 3D information, it is more useful than 2D. That's the trend. Even if it is not widely applied right now, I believe it must be applied in the future. We want to bring that future nearer, we share the same vision, and we can work on it together; we must expedite that process.
I'm just back from Changchun, which is a hub of optical technologies and attracted more than 20 academicians to discuss optical technologies. The Secretary of Jilin Province also graduated in applied physics, specializing in optics; he understands this technology deeply and delivered a half-hour speech.
All right. Thank you, Mr. Wang. My last small question: we have very high expectations for the vehicle solutions. Your roadmap is human-eye vision plus robotic vision, and there is a new direction right now. The Chinese market is moving very quickly: first L3, which is very close now, and also L4 robotaxis; very soon we may see robotaxis on the road. The PGU actually serves the human eye, and in terms of ADAS market share you rank number one. So between these two markets, which one do you think will be larger? We believe that once ADAS enters L3 or L4, there is not much need for the human driver's vision. Have you considered that?
Let me answer your question first. I don't think this kind of human-eye vision has much to do with autonomous driving; it is more about assisted driving. In terms of scale, the camera module market is for sure much larger than that of HUD, because there are more cameras on each vehicle. I have something to add here: when we develop a product, we cannot look only at its name; what matters is the technologies behind it. It is not only about the AR HUD; it is also about the projector, and the technologies related to the projector are more important.
All right. Thank you for your answers. Very brief, but very on point.
Hello, management. I am Wu Liuyan, and I have two questions about ROI. The first concerns the consumer electronics industry chain in China, which still faces the issue of migrating to overseas markets. My question is about the layout and facilities of your overseas production bases: in the longer run, how can you narrow the ROI gap between the overseas production bases and the domestic ones?
Well, I can answer this question. Many of you ask about our overseas production base deployment. Right now, we still focus on the manufacturing basics: we build the factories overseas, but in those locations there are not many experienced people in this industry, so we need to cultivate the talent ourselves. After building the manufacturing capacity, we need to spend extra time, perhaps several years, to cultivate the local teams and talent. Eventually, we can hit our goals for overseas production.

Okay, my second question is related to CapEx, given the incremental innovations in handset lens sets, such as the upgrade of pixels and variable zoom.
In this kind of technological trend, what is the reusability of the production equipment? With the upgrade of lens sets for premium modules, do you need to adjust the CapEx per unit of capacity? What is the trend for that CapEx?

For handset lens sets, we do see the upgrade trend. On the one hand, there are more and more applications of glass aspherical products, and the manufacturing and testing for those differ from plastic products, so we do need to make some investment. On the other hand, when developing lens sets, we need to meet increasingly stringent testing requirements, so we have invested in testing as well.
Another thing is AOA, which involves the chipsets; pure optics does not involve the chipsets. We can do testing or modulation, call it what you like, and there is higher demand for that as well. One reason is related to the glass asphericals: for plastic asphericals there is a shear angle and you can match the angles, but for glass asphericals you cannot; you need absolute precision, so the requirement for modulation or adjustment is even higher. There is no standardized device for this; we developed this testing equipment independently, and we have invested in it. For some products, especially large image size products, there are higher testing requirements, such as group-by-group testing.
For those products, we need to meet new and higher requirements for assembly and testing. As you mentioned, there is a continuous trend of product upgrades, and we need to pay attention to it. We also have multiple apertures, which impose higher requirements on the product: the imaging curves vary across apertures, and you need to ensure that at every aperture the lens is in its best state and delivers its best performance. All of these are new areas for us to explore. This may not be the occasion to discuss the technical details further, but that is my brief answer.
I am Kuai Jian from Orient Securities, with two questions. The first is that some of you mentioned the non-visible light business and its expansion. We have seen some downstream companies expanding this kind of business; for example, companies that previously focused purely on optical sensing are now expanding into non-optical sensing. On the technology side, we discussed AI's impact on optics a lot, but AI can also drive multimodal sensing, given its strong processing capability. So for non-visible light, or in a larger scope, non-optical sensing, what is your plan? Will you deploy more in that direction?
Actually, I have answered part of that question. Invisible light is a key area for us in the future, and we already have an IR technology company developing very well in that direction; in IR, this company enjoys a great reputation, and on that basis we will develop further in IR and near-IR. We are also working on the integration of visible light with other sensing technologies, for example spectral sensing, and with other modalities such as in the robotics industry, where we can integrate it with audio technologies. So maybe 7% is about optics and over 10% is about acoustics; this will become a normalized spec for robotics. Previously what we talked about was multimedia, but now it is more upgraded and enriched.
For example, force feedback, multi-spectrum sensing, and understanding the properties of objects. So that is the situation; the brief answer is yes, we do have such plans.

The second question is about LiDAR. Previously, expectations for LiDAR were very high, but starting about a year ago, Tesla began expressing opinions that do not much recognize LiDAR.
So what I want to hear is your perspective on LiDAR. Are there any changes or updates?
About LiDAR, as far as we know, Tesla is taking a different route: they focus on pure vision. But most of the other OEMs believe pure vision should be integrated with LiDAR; some OEMs even use more than one LiDAR. The ultimate goal is the same, but different companies can adopt different technological routes, or even combine several. I think it is a question of diversity. We believe the future of LiDAR is bright; we are optimistic about it, and we still maintain normal R&D investment in LiDAR. I have something to add about LiDAR: it is more about depth sensing in vehicles, and there are multiple directions. For example, some companies have developed binocular solutions, and some have dToF solutions. We have invested in R&D in both directions.
The transmitting and receiving ends are closely related to optical technologies. For the transmitters, it is more about point arrays or phased arrays, and for the receivers there will also be some changes in format. We believe this kind of change is happening very fast, and for LiDAR and our LiDAR plans, it is very challenging.
Due to time constraints, we can take the last question.
Hello, management. I am Pan Jian, and I've got a question here. We have talked a lot about robotics, like the humanoid and Elon Musk. From your perspective, in terms of robot vision, both input and output, if you could pick the three most important technologies, which three would you pick?
Input and output; to pick three technologies. Based on optical technology, binocular vision will be a trend; these technologies actually develop in parallel. Then there are iToF and dToF, for which we have applications as well. Why has iToF not been used widely? Because there are some interference issues, and within robotics, we have spent many years on iToF optical solutions and can eliminate that problem. So for optical input, binocular vision is an important technology. And as Mr. Qiu mentioned, LiDAR can also have many applications in robotics, including rotating dToF and array dToF, that is, phased array. Structured light, for now, does not have many applications.
In terms of output, if we want to make it more valuable, we need to connect the training model between the optical parts and the moving parts. So it is about algorithms, vision, and motion control; it is not a pure optics issue but a multidisciplinary issue we need to tackle. I don't know if I've answered your question, all right?
Thank you.
As time is limited, we may not be able to address all the questions today. If you have any further questions, please feel free to contact our IR team anytime; thank you for your understanding. That concludes today's discussion session with the management team. Next, I would like to invite our online participants to take a few minutes to complete the survey and evaluate today's event. For those attending offline, we will provide the survey after the factory tour tomorrow morning. The Sunny Optical Technology 2024 Investor Day and Management Exchange session concludes here. Once again, thank you all for your participation. Lastly, I wish everyone good health and successful investing. Thank you.