I am Motoo Nishihara of NEC. Thank you very much for taking the time to join us today for NEC Innovation Day 2024, the first such event this year. Today, I, as CTO, together with Yamada, who also serves as AI Research Officer, will talk about how we drive NEC's growth through advanced technology development and expansion into new business areas. First, I would like to introduce NEC's innovation creation scheme and the strength of our research and development. Then I will cover our globally number one technologies and how we are using them to contribute to our existing businesses; those are the three points I would like to cover here. Finally, I would like to introduce the efforts we are making toward new businesses and our track record there. In our presentation, we will introduce several new product developments that we are announcing for the first time today. I hope you will enjoy them.
NEC's innovation creation scheme and the strength of NEC's R&D. This slide shows the organization. In the Global Innovation Business Unit that I am in charge of, I am responsible for corporate R&D, new business development, strategy and management of corporate intellectual property as a whole, and the healthcare and life science business as a new business. Our R&D, intellectual property activities, and new business strength underpin the competitive advantages of our current businesses, as shown on the left, including our IT services and social infrastructure businesses. They also contribute to the next generation of growth businesses, and we believe they will generate significant outcomes in the next midterm plan phase. In particular, as shown on the left side, it is very important that all of this flows through BluStellar.
As for the organizational structure, we have approximately 2,000 specialized professionals around the world. R&D in particular has many overseas bases, and almost half of the R&D team members are based overseas. In addition, we have an intellectual property management division, a new business division, and others in various places. I would like to introduce some objective benchmarks of our technological competitiveness. One is the number of papers accepted at top-tier international conferences, which is especially important in the AI field, as you can see in the table on the left. For the past 20 years, as shown at the bottom right, we have been in the top 10 rankings in terms of the number of accepted papers. I believe there are almost no Japanese companies that have stayed in the top rankings for as long as 20 years.
In addition, as you can see in the middle, we have an NEC Fellow who received a very prestigious award, the Medal with Purple Ribbon, and at CEATEC, which celebrated its 25th anniversary just the other day, NEC received the special award. The right side shows our strong track record in security and communications as well. Another point concerns patents. We are working to strengthen our patents in various areas, and here let me refer to biometric authentication, video recognition and analysis, and prescriptive AI. In these fields, we have the highest cumulative number of international patent applications. I have listed these three because they often become the foundation of the various social solutions we provide, and we have many more, as you know: generative AI, quantum computing technology, technology for visualizing the Earth through satellite networks, and life sciences.
Having been engaged with these technologies myself, I think it is fair to say that all of them are revolutionary, the kind of breakthrough that would not be surprising if it occurred only once in a century. The fact that they are all happening simultaneously in the 2020s means we are living in an amazing era. It is important to grasp the potential of these technologies, publicize our technology vision, hold discussions with external people, and predict their direction accurately without missing it. It is critical for us to be fully prepared, so we began announcing our technology vision in 2022 and continue our dialogues with external partners.
These cutting-edge technologies feed into BluStellar, which NEC recently announced. BluStellar covers the entire value creation process, encompassing consulting, development, product provision, delivery, and offerings, and you can think of us as the provider of its core technologies, the central pillar. Now, allow me to introduce our contributions to the existing businesses. Here I would like to expand on three points. The first is the transformation of system integration by incorporating AI agents. Let me start with a tricky quiz. Answer quickly: what is the color of the word on the right? At first glance, you may say it is red, since the word "red" is written there. This is called fast thinking, something close to our instinct. But humans also have a logical capability.
Logically, you can tell the quiz is asking about the color of the letters. Then you realize the answer is green, and this is called slow thinking. These are the two modes, intuition and introspection, as described in the book by Daniel Kahneman, who won the Nobel Prize in Economics. When you think about it, human intelligence combines both intuition and logic; we run both in parallel and come up with an answer. Large language models, or LLMs, have intuition that is extremely powerful and vast. They can read all languages, have effectively memorized almost all the documents in the world, such as documents on the internet, and can solve analogy problems, far exceeding human capabilities there.
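To make the fast-versus-slow distinction concrete, here is a small illustrative sketch in Python. The stimulus data and both functions are invented for this illustration; they are not part of any NEC system, and a real "slow thinking" layer would be a logical verifier over LLM output rather than a hand-written rule.

```python
# Sketch of "fast thinking" vs "slow thinking" on the color quiz.
# Hypothetical stimulus: the word "red" rendered in green letters.
stimulus = {"word": "red", "ink_color": "green"}

def fast_answer(stimulus):
    """Intuitive response: just read the word, as an unverified model might."""
    return stimulus["word"]

def slow_answer(stimulus, question):
    """Deliberate response: check what the question actually asks first."""
    if "color of the letters" in question:
        return stimulus["ink_color"]
    return stimulus["word"]

question = "What is the color of the letters?"
print("fast:", fast_answer(stimulus))            # plausible but wrong
print("slow:", slow_answer(stimulus, question))  # logically verified
```

The point of the sketch is only that the deliberate path inspects the question before answering, which is the verification step today's LLMs largely lack.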
On the other hand, as for slow thinking, logical verification is not well incorporated at the moment, so LLMs are rather poor here. With these points in mind, how do we provide this generative AI technology? Conversational capability, the LLM itself, comes first. NEC announced its LLM brand, cotomi, in the last fiscal year, and we are happy to have many customers using it. As a next step, we need to improve logical thinking, and by this we do not mean logical thinking in a general sense but industry-specific thinking: if it fits a specific vertical, we assume it can be used readily. We also need multimodal support, not only text, for it to be used in the real world.
We need to be able to incorporate the information we see and hear; we need those multimodal capabilities. Of course, security and addressing regulation are critical, and we need to be eco-friendly: generative AI is said to consume a tremendous amount of energy, so we need technology that reduces power consumption. We believe the products we are announcing this year meet these requirements and that we can support mission-critical operations. The next step is enhancing autonomy, and we will announce our new AI agents here. Yamada will talk about this new announcement later, but the important point is not automation but autonomy. The first release may take the form of automation, but autonomy is what matters after that.
Actually, this builds on the AI orchestration, the AI Orchestrator, which I announced at last year's Innovation Day; this is a diagram from last year. If you properly break down a task in various complex business systems, you can plan and execute it task by task, dynamically, by combining it with various AI technologies, of course in coordination with a large language model, not statically but dynamically. I said then that orchestration is important for performing different processing dynamically every time, and we are pleased to say that NEC has made it happen in one year. I would appreciate it if you think of NEC's AI agent as a dynamic IT system that continues to evolve and grow. There are several important points about AI agents. First, we can improve the customer's workflow by redesigning it. Ordinary IT systems, once they are created, are fixed; they simply preserve their process.
With our approach, the workflow is redesigned and continuously improved, which is the key point. The second point is the interface. Conventional IT systems are very fixed, routine, standardized; that is what they require. But when you talk to people, the way you talk may differ every time depending on the situation, right? The agent can collaborate with people just as flexibly. The third point is very important: it automatically implements missing functions. If a function is missing, that part is automatically designed and incorporated into the implementation. Conventionally this is a long and time-consuming step, but here it becomes an evolutionary process. The AI agent NEC is aiming at is a dynamic IT system that continues to evolve and grow. Many different AI agents are appearing these days, but this is the kind of agent we are aiming for.
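The plan-and-execute loop described above can be sketched in a few lines of Python. Everything here is hypothetical: `llm_complete`, the tool names, and the canned plan are stand-ins for illustration only, not NEC product APIs, and a real orchestrator would replan dynamically rather than run a fixed plan.

```python
# Minimal sketch of a plan-and-execute agent loop (illustrative only).

def llm_complete(prompt: str) -> str:
    """Stand-in for an LLM call; returns a canned semicolon-separated plan."""
    return "search_documents; summarize; draft_report"

def search_documents(state):
    state["notes"] = ["relevant passage A", "relevant passage B"]
    return state

def summarize(state):
    state["summary"] = " / ".join(state.get("notes", []))
    return state

def draft_report(state):
    state["report"] = "Report based on: " + state.get("summary", "")
    return state

# Registry of available tools; a dynamic agent could extend this at runtime
# when it detects a missing capability.
TOOLS = {
    "search_documents": search_documents,
    "summarize": summarize,
    "draft_report": draft_report,
}

def run_agent(goal: str) -> dict:
    """Ask the LLM for a plan, then execute each step against shared state."""
    plan = [s.strip() for s in llm_complete(f"Plan steps for: {goal}").split(";")]
    state = {"goal": goal}
    for step in plan:
        tool = TOOLS.get(step)
        if tool is None:
            # In the vision described above, the agent would design the
            # missing function here; this sketch just records the gap.
            state.setdefault("missing", []).append(step)
            continue
        state = tool(state)
    return state

result = run_agent("create a customer proposal")
print(result["report"])
```

The `missing` branch is where the third point above, automatically implementing missing functions, would hook in: instead of recording the gap, an evolving agent would generate and register a new tool.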
Specifically, in comparison with various IT systems, the value provided will expand and improve, whereas current IT systems deliver a fixed value. Also, as I mentioned, the interface will be similar to the way we give instructions to people. Conventional IT systems were differentiated by quality and cost. In the future, what will matter is the technology that supports the AI agent and how we can create business workflows with it together with our customers; the strength of those processes will be what counts. In that sense, there is no doubt that this technology will be used in various industries. As for the technology behind it, the key elements are the evolution of the language model, the strengthening of the AI model, and the NEC AI agent itself.
The core technologies for creating these agents are multimodal support, addressing AI regulations, security resilience, and eco-friendly generative AI, and we will announce these products and introduce the detailed menu this year. Okay, now Yamada-san, please.
I am Yamada, responsible for the AI business in the Digital Platform Business Unit. Up until last year, I participated in this event as the head of R&D under Nishihara, but this time I would like to take about 10 minutes to speak as the person in charge of the entire AI business. Nishihara has just talked about the direction of NEC's AI technology, and I would like to share with you where we stand now. As you know, AI is still, in a sense, in an exploratory phase, especially in the Japanese market.
Based on the BluStellar concept, we have a business model of finding problems together with our customers, developing ways to solve them, and delivering the solutions. With AI, we co-create with our customers, but at the same time we ourselves serve as the test case: under the term Client Zero, we try out various technologies within the company and offer suggestions on how they can be used. Speed is important here. Last year, we declared that we would increase the speed of delivery by conducting research and development and business development in parallel. This year, we have taken this idea a step further, and in August we launched a new business division that is responsible for everything from R&D to business development, delivery, and even operations in a seamless and integrated manner. There are two core concepts in this new organization, the AI Technology Services Business Division.
The first is focus: how can we automate the highly specialized business processes of our customers? We are developing our business with that focus. The second is NEC's unique differentiation point, protecting crucial safety and security: with NEC, you can be sure that your work will be carried out without fail. With these two points, we are currently developing our business in the form of a service business and a license business, in addition to NEC's main system integration business. Now, allow me to talk about the service business. We have long provided standard work methods in various industries as software packages, as industry solutions, and we will promote automation step by step by introducing AI technology into these solutions.
The first product, MegaOak/iS, an electronic medical record system, was released this month, and by streamlining document creation in electronic medical records, we hope to help address the labor shortage that hospital management faces. Last week, we also announced that we will use AI in so-called PLM solutions in the manufacturing field. In the licensing business, we announced our collaboration with Sakura Internet on a generative AI platform business just the other day, on Monday. We will provide cotomi as the LLM on Sakura Internet's cloud service dedicated to generative AI. We are very pleased to be able to bring our LLM business and technology to Sakura Internet's many customers. The key to supporting these businesses is talent development.
We are developing experts with advanced AI technology knowledge in various fields, regardless of job type. For this purpose, we have developed a dedicated training program; so far, 450 people have completed the course, and we plan to expand this to 1,000 people going forward. I have briefly explained the overall mechanism; from here on, as Nishihara also mentioned, I will expand on three new products. The first thing I would like to introduce today is the strengthening of our AI model with a new version of our core LLM, cotomi Version 2. The second is the announcement that we will start shipping AI agents, which are at the very height of the current trend.
The third point is that we have already been engaged in many activities, including multimodal capabilities, and we will expand those initiatives even further to address the business challenges our customers face. I will go through them one by one at a high level, and you can see a demo in the next room. Allow me to talk about the new version of cotomi. cotomi, which we announced in the last fiscal year, is among the world's smallest LLMs while achieving top-level Japanese processing accuracy and speed, and we will start shipping a further refined new version next month. While cotomi is of course a core product, we believe that when developing and using an LLM, it is pointless to discuss the performance of the LLM alone.
Specifically, we will add capabilities for pre-processing, for post-processing, and for the platform that operates them. First, as for post-processing, we announced in September a mechanism to suppress hallucination, which is becoming a problem. In addition, we will introduce a self-learning prompt extension function to reduce the difficulty of prompting. The amount of electricity used by generative AI is also becoming an issue, so we will introduce a new infrastructure that significantly improves power efficiency. The second point is about AI agents. Nishihara explained earlier what NEC's AI agents are all about, and we will ship a new agent in which the LLM itself can autonomously think about how to proceed with the work and execute it, without having to be taught various rules in advance.
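As a rough illustration of what a post-processing hallucination check does, here is a deliberately simplified Python sketch. This is not NEC's announced suppression mechanism; it only flags generated sentences whose content words have weak overlap with the source passages, whereas production systems use much stronger entailment checks. All names and the example data are invented.

```python
# Illustrative post-processing check for unsupported (hallucinated) claims.

def support_score(sentence: str, sources: list) -> float:
    """Fraction of the sentence's content words found in any source passage."""
    words = {w.lower().strip(".,") for w in sentence.split() if len(w) > 3}
    if not words:
        return 1.0
    pool = " ".join(sources).lower()
    return sum(1 for w in words if w in pool) / len(words)

def flag_unsupported(answer: str, sources: list, threshold: float = 0.6):
    """Split an answer into sentences and flag those with weak support."""
    sentences = [s.strip() for s in answer.split(".") if s.strip()]
    return [(s, support_score(s, sources)) for s in sentences
            if support_score(s, sources) < threshold]

sources = ["The new model ships next month and improves Japanese accuracy."]
answer = "The model ships next month. The model won an award in 1995."
for sentence, score in flag_unsupported(answer, sources):
    print(f"possibly unsupported ({score:.2f}): {sentence}")
```

Here the first sentence is fully grounded in the source and passes, while the award claim has no support and gets flagged for review or removal.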
In a nutshell, in the example shown here, if you ask the agent to create a proposal for a customer, it will look into the customer's issues, look into the actions of other companies in the area, look into the assets our company has, and collect all of this information to create a proposal. This is the process a human would normally follow, and an LLM agent will be able to execute it. The first step is the so-called enterprise search area, and we have a demonstration available in the next room. The third product is the expansion of multimodal capabilities. Last year, we talked about various multimodal combinations, such as video recognition with LLMs, voice with LLMs, and acoustic signals, which we have been strengthening.
This year, we are focusing on more basic multimodal targets such as charts and tables. You may think that AI can already understand charts and diagrams, and it is certainly becoming easier for it to understand the words written on them. However, charts and diagrams carry meaning hidden in their layout. For example, in this simple diagram, the horizontal axis is screen size, which implicitly means that the position of each written word represents the size of the screen. It has been very difficult for AI to understand such things until now. Another challenge is understanding a document as a unified whole. We will provide this as a new multimodal extension. Many customers who have used RAG to process internal documents have told us that their documents contain many charts and diagrams that the AI cannot understand well.
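The screen-size example above can be made concrete with a tiny Python sketch: when the horizontal axis encodes a quantity, each label's x-coordinate carries information that its text alone does not. The element data and axis interpretation below are invented for illustration and have nothing to do with NEC's actual implementation.

```python
# Sketch: recovering implicit meaning from diagram layout.
# Hypothetical OCR-style elements from a chart whose horizontal axis
# implicitly encodes screen size (left = smaller, right = larger).
elements = [
    {"text": "Tablet",     "x": 420, "y": 300},
    {"text": "Smartphone", "x": 150, "y": 300},
    {"text": "TV",         "x": 780, "y": 300},
    {"text": "Laptop",     "x": 600, "y": 300},
]

def order_along_axis(elements, axis="x"):
    """Sort labels by position so the axis ordering becomes explicit text."""
    return [e["text"] for e in sorted(elements, key=lambda e: e[axis])]

ordered = order_along_axis(elements)
# The layout now reads as structured knowledge rather than loose words:
print("smallest to largest screen:", " < ".join(ordered))
```

A text-only reader sees four device names; only by combining the words with their positions does the ranking emerge, which is exactly the kind of layout semantics a multimodal model has to capture.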
We would like to address this particular issue by providing services built on our new technologies. Finally, a word about AI regulations. Globally, AI regulations are becoming stricter in various respects. In Japan, the Ministry of Economy, Trade and Industry has issued the AI Business Guidelines, which show the minimum level of compliance you have to follow if you provide such services; these are the items of the guidelines. At NEC, we fully support all of these aspects in our technology and our development process, thereby providing our customers with an environment in which they can use AI with peace of mind. That is all I have prepared. Today, I have presented the current status of our AI business, with cotomi at its core, along with our new products.
We are creating a system to rapidly provide value-added services and then continuously deliver new ones. We are also demonstrating cutting-edge practices both inside and outside the company, and continually introducing new functions to accelerate business process reform. Through these efforts, we aim to realize a more livable society by bringing AI into every system in society, with NEC AI agents everywhere. That's all from me. Nishihara-san, back to you, please. Thank you, Yamada-san.
I would like to continue with other topics on existing businesses. Let me first go through this number two, which is advancing biometric technologies for use anytime, anywhere. We have some new announcements here.
NEC's biometric authentication technologies have been ranked number one in benchmarks by NIST in the United States, and as recently as January 2024, NEC was ranked number one in the world for facial recognition. We are also very proud of having been number one in both iris and fingerprint authentication for many years. NEC's biometric authentication technologies are used in a variety of areas, for example identification for immigration control and national IDs. We have also introduced biometric authentication for office entrance and exit and stadium entrance, as well as the gateless biometric authentication we presented last year, an example of instantaneous authentication that handles a huge number of people at once. Now, what are the directions for this technology? There are three things to highlight. First, we will maintain the world's highest level of performance; second, we will pursue miniaturization and weight reduction.
The third is robustness, meaning that it works properly even in a variety of complex environments. What is great about this is that biometric authentication can then be used in areas where it has been difficult to deploy facial or iris recognition in the past. The first product is a facial authentication technology based on an integrated imaging sensor solution, developed in collaboration with Sony Semiconductor Solutions. This is a compact, lightweight facial authentication device. What is unique about it is, first, that it is small and lightweight, so it can be installed anywhere, and, as shown on the right, it adapts to ambient light in real time. Since the accuracy of facial authentication drops under backlight or in strong late-afternoon sun, it has not been easy to install this technology in such places so far.
But with this technology, the device automatically adapts to its environment, which is a superior capability, and we hope to start commercializing it next year. Now, let me play a video endorsement from Sony Semiconductor Solutions. I am Yanagisawa from the System Solutions Division of Sony Semiconductor Solutions. We are proud to announce that we have started a strategic collaboration with NEC in the field of facial authentication and recognition solutions. Our image sensors continue to hold the world's number one position in the market thanks to our technological capabilities, and we are working to provide value through innovative edge AI technologies based on these image sensors.
Through this collaboration, bringing together NEC's superior biometric authentication technology and extensive knowledge of system development with our own technologies, we are very excited to expand the market where edge AI technology can be utilized and to take on the challenges of social issues together. Thank you very much. Next, let's talk about iris authentication. We are also involved in iris authentication, and a year ago we announced iris authentication used for national IDs, domestic and international interfaces, and immigration control. What we showcase today is iris authentication using a small standard camera that can only take low-resolution images, connected to a tablet equipped with our software. This enables on-the-spot iris registration and authentication, and you can find this technology in the exhibition room. This allows for iris authentication in various places.
Iris authentication can be used to identify tens of millions of people, so it can be used for financial settlements and the like, which means our technologies have the potential to be used in various fields. We are introducing this technology in the exhibit next to this venue. Biometric authentication technology can, first of all, verify a person's identity, but it can also be used to understand an individual's circumstances, surroundings, and background and to optimize for them. By linking the data, it can lead to major solutions that bring greater convenience to people's lives; in other words, biometric authentication serves as the foundation of the entire process. The third theme is visualizing the Earth with satellite communications and advanced image analysis, from disaster prevention to adaptation. As you all know, COP29 was held through last week.
I also participated in COP29 and had the opportunity to speak in two sessions. One of the main themes of the COP was adaptation to climate change, which is becoming more serious, and NEC introduced a specific case study applying a combination of telecommunications and AI, which attracted a great deal of attention. People from the financial sector told us they are interested in working with us. As you know, huge investments will be made to deal with climate change through 2030. Now, let me introduce some of the technologies we have proposed. First of all, to adapt to climate change, we need a clear understanding of what is happening on Earth, and this technology for precise analysis can be leveraged in many places.
The example on the left shows what happens when a disaster occurs. This technology will be used by government officials and police officers who are not necessarily engineers. They want to know the extent of the damage, how to prioritize their actions, how to rescue people, and other simple questions. The system first collects topographical data as RGB imagery and combines it with NEC's highly advanced image processing technology, which can analyze images at a resolution of roughly 30 centimeters, together with generative AI, which provides the flexibility to respond to natural ways of speaking. We created a system that combines all three. By doing this, we can understand specific locations. In the example on the right, for instance, you want to know which houses were damaged and the scale of the damage.
The system can analyze the scale of the damage for each house; we are showing this in the exhibit as well. The prototype system that handles everything from the point where the image is taken through delivery is almost complete, so this technology can be made available by investing in integration with a system that delivers these images. We are hopeful this will be a breakthrough for disaster response around the world. In disaster response, as shown here, we can integrate the granular data shown in red and green on the upper left. For insurance, it is possible to take post-disaster photos of how much damage has been done to each house and properly assess the damage on the spot. This is how we can respond to disasters.
This will have a great impact because it will enable us to see every disaster on Earth, and it gained a great deal of attention at COP29. Next, I would like to introduce the possibilities in the area of new businesses. First, let me explain our intellectual property licensing business. We consider intellectual property licensing in a slightly broader framework. The first element is licensing patents from existing businesses to various industries. The next is building startups in North America; we have been able to build various startups and can expect capital gains and license fees from them. Then there is AI drug development: the drug development business itself is also an IP licensing business, which I will go through in more detail later. The fourth is advanced technology consulting services, which we announced just five days ago.
We believe we can monetize the intangible capabilities of our researchers as intangible assets, as intellectual property. First, let me share what we do with licensing. I cannot share the actual figures since they are highly confidential, but during the five years from FY2021 through FY2025 of the current midterm management plan, we have more than doubled our patent revenue compared to the previous five years; roughly speaking, we have gained tens of billions of yen on average. To strengthen this further, we have patent pools, as shown in the table on the right. We have only two pools at the moment, but we would like to increase that to six by expanding into other areas and drive patent revenue.
Secondly, we are working on getting North American startups to use our technology, earning equity and enjoying capital gains, as well as license fees once they grow their businesses by leveraging our technology. We started this in 2018 and have built a network of about 6,000 entrepreneurs and startups. So far we have successfully commercialized 20 startups, and in the next midterm plan we would like to increase that number to more than 100. Now, AI drug development. We have actually been involved in pharmaceutical R&D for more than 20 years. In 2019, we amended our articles of incorporation, obtained a license for drug development, and acquired a wet lab through M&A to gain a clinical development function. The bottom of the slide shows the flow of the AI drug development business.
We do the fundamental research and design the drug, then conduct the preclinical phase followed by the human clinical phase, so we cover the first half of the process. In other words, we design the drug and then conduct clinical development; that is, we actually administer the drug and see whether it has value. We cover up to Phase II, after which we hand over to a pharmaceutical company. We gain license fees in each of these phases, which is how the business works in this industry, so we consider this a licensing business. Now, why is NEC strong in this field? As shown on the left, from cancer-specific gene expression up to the final step of antibody production, AI is involved in all of these processes.
There are many competitors with AI processes in these areas, but I think we are definitely ahead of them. Using this, we can develop the personalized neoantigen cancer vaccine TG-4050 for head and neck cancer, which is the first example. We have been working on this with Transgene and got the first results just a month ago, showing its effectiveness. As we have already announced, this disease usually has a 30% recurrence rate, but the group of patients who received our vaccine remained recurrence-free for 24 months. We are now moving on to Phase II and beyond for full-fledged trials with larger numbers of patients. Lastly, the creation of new growth businesses, which I will touch on briefly.
These include dotData, data-driven DX, financial services, SaaS businesses, agriculture, and FonesVisuas, a medical service business that uses blood proteins. Although we do not show any figures here, they are all making steady progress. Finally, let me summarize what we have covered. As for competitive advantage in current businesses, there is the transformation of system integration by AI agents, which Yamada and I introduced earlier, the new biometric authentication technologies, and the satellite vision I shared, which attracted a great deal of attention at COP29. We will continue to contribute to society by utilizing communications and other technologies. In our new businesses, we will also use NEC's strong intellectual property, although the business models and the domains where these technologies are applied will differ.
Next year, in FY2025, we are aiming to achieve a business value of JPY 30 billion, which I mentioned two years ago, and I believe the value already building up will top this number. We also aim to contribute 10% of profits in the next midterm management plan. I talked about our intellectual property earlier, and considering where our numbers are heading, I believe we can achieve this target. That is all from me. Thank you very much for your attention.