Hi, everyone. My name is Charlie, and I'm the CEO and founder of OODA AI, decentralized AI apps and infrastructure. Before I tell you more about OODA AI, let me spend a few minutes talking about my background and who I am. I've been in tech for 30 years, and I started coding when I was seven years old. I'm from Sweden, but I have been living in Munich for the past five years. OODA AI has headquarters both in Stockholm and here in Munich, Germany. I started my first company when I was 14 years old, a 3D software development kit. It was used by BBC Round One for TV quiz games, by Lunar Lander, Maximum Football, and more. After that, I became a serial entrepreneur and built multiple companies, and today I'm focusing all my energy on the future of technology, which is artificial intelligence and Web3.
So let's talk a little bit more about that. Our company's purpose is to make AI more accessible and widely used by decentralizing it and distributing it through blockchain technology. What exactly does that mean? Let's dive in and talk in a little more detail. We saw several problems with centralized AI: security and reliability, for one; monopolization and innovation, two; and last but not least, scalability and performance. Looking at the security and reliability issues, we have the problem of a single point of failure with centralized systems. This is solved by decentralizing and distributing them through blockchain. Imagine putting your assets in multiple safes instead of one. Obviously, that becomes more secure. Second, data privacy risks. Your data is better protected using a ledger on blockchain, where the data is immutable and cannot be changed. Next, monopolization and innovation.
Today, you need to rely on models, tools, and setups from large cloud providers with complicated, hard-to-get configurations, often with pre-installed models such as GPT by OpenAI, which for many companies and many industries is not compliant with regulations. Centralized AI also stifles innovation. It closes it down to a select group of people and companies with big enough resources. Last but not least, as I mentioned, scalability and performance issues. Centralized AI systems face limitations in computing power, storage, and bandwidth, especially for large AI models, and those models are growing fast. There are also cost inefficiencies: centralized infrastructure can be very expensive, because the big chips needed to run big models are themselves expensive and hard to get. This requires substantial investments in hardware. So what are the solutions to these issues? Well, first of all, as mentioned, blockchain enhances security through decentralization, while it also increases accessibility and innovation.
So how can this pair with AI? Well, actually, it's a match made in heaven. It can decrease costs by distributing the inference of existing AI models across nodes, meaning running AI on multiple smaller chips instead of bigger ones, and using blockchain and a network to coordinate that and make it secure and safe. Ultimately, we work very hard to make this accessible through a one-click install by launching OODA.ai, our AI app store. We have procured hundreds of open source AI applications, which are easy to install through one-click installs, something most people today know from the App Store and Google Play. We also work with these open source projects, providing them a revenue share for linking to their app on our AI app store with an "Install on OODA" button, similar to a "Get It on the App Store" button.
Last but not least, we are much more cost efficient. Users pay a fraction of the price for these kinds of AI applications. Instead of paying per seat or per user on a monthly basis, you pay for the size, more similar to a hosting package. We also have our own productivity suite on our own app store, ODASH, similar to Apple Music on the Apple App Store. It's an AI-powered business intelligence platform already used by several companies. It is GDPR compliant, as it runs on our own infrastructure, obviously, as do all other apps on OODA.ai, and that infrastructure adheres to GDPR and security laws in Europe.
We support over 300 different data sources to connect to, and using a simple and intuitive interface, ODASH allows you to talk directly to your data, build graphs and dashboards, export reports, and set alarms and notifications so that you can make better decisions for your business. AI inference is a very hot topic today. You might just need simple access to an AI. You're not interested in a fully built, finished app for customer support or business intelligence, or any of the many other things we have in our store. You might be building your own AI app, and you need a large language model to run inference against.
Today you have a couple of alternatives, such as GPT and Claude, but as I mentioned before, they are both expensive, and they do not adhere to many of the regulations we have here in Europe. We have many of the biggest open source AI models available on our infrastructure, such as Llama 3, Llama 3.1, Mistral, Mixtral, BLOOM, and more. Again, the open source models linking to us will also receive a revenue share of any revenue we make on the app store from selling inference against their model. Pricing is not token-based and not user-based. You pay a fixed price for a certain size, we throttle what you can access and how you query, and based on your needs we can move your package up or down, ultimately making it a fraction of the price of other AI inference APIs out there.
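As a side note for the technically curious: a fixed-size, throttled package like the one described here could be enforced with something as simple as a token-bucket limiter. The sketch below is purely illustrative, assuming a hypothetical `PackageThrottle` class and invented package limits; it is not OODA's actual implementation.

```python
import time

class PackageThrottle:
    """Illustrative token-bucket limiter: a fixed-size package allows a
    fixed request rate, refilling continuously rather than per token or
    per user. All names and numbers here are invented for illustration."""

    def __init__(self, requests_per_minute: int):
        self.capacity = requests_per_minute
        self.tokens = float(requests_per_minute)
        self.refill_rate = requests_per_minute / 60.0  # tokens per second
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill according to elapsed time, capped at the bucket capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A hypothetical "small" package allowing 60 requests per minute:
small = PackageThrottle(requests_per_minute=60)
print(all(small.allow() for _ in range(60)))  # within the package limit
print(small.allow())                          # the 61st request is throttled
```

Moving a customer's package "up or down" then just means swapping the rate, with no per-query metering needed.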
So how does our network work, the OODA AI network? As I mentioned before, we're running on smaller GPU chips. The NVIDIA H100 is currently one of the strongest and largest chips out there. You would need two of those chips to run, for example, a Llama 3.1 70 billion parameter model at floating-point 16 precision. What we do is put together a network of smaller gaming-grade GPUs and slice up the model into multiple parts. Each slice of the model goes into one GPU's video RAM. Then we use blockchain and zero-knowledge proof-carrying data protocols to make sure that the inference is coordinated and working correctly, and that no data has been changed as it passes through each GPU's part of the inference. This way, we can keep it compliant and secure.
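To make the slicing idea concrete, here is a minimal sketch of partitioning a model's layers across gaming-grade GPUs by VRAM capacity. The function and the partitioning strategy are assumptions for illustration, not OODA's actual scheduler; the 80-layer count comes from Llama's published 70B architecture, and 48 GB is the NVIDIA A40's VRAM.

```python
import math

def partition_layers(num_layers: int, model_vram_gb: float, gpu_vram_gb: float):
    """Assign contiguous layer ranges to GPUs so each slice fits in one
    card's VRAM. Returns a list of (start, end) layer ranges, one per GPU."""
    vram_per_layer = model_vram_gb / num_layers
    layers_per_gpu = math.floor(gpu_vram_gb / vram_per_layer)
    shards = []
    start = 0
    while start < num_layers:
        end = min(start + layers_per_gpu, num_layers)
        shards.append((start, end))
        start = end
    return shards

# Llama 3.1 70B: 80 transformer layers, ~140 GB of weights in FP16.
# An NVIDIA A40 has 48 GB of VRAM.
shards = partition_layers(num_layers=80, model_vram_gb=140, gpu_vram_gb=48)
print(len(shards), shards)  # 3 [(0, 27), (27, 54), (54, 80)]
```

Three A40-class cards cover the whole model, which matches the three-GPU setup described in the cost example later in the talk. Verifying that each node actually ran its slice faithfully is where the proof-carrying data layer would come in.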
Data is immutable using blockchain, and we also know exactly which GPU did what computing in the network. The business model, when you put all this together, is actually quite straightforward. We have customers who buy applications on OODA.ai, and customers who buy AI inference through OODA.ai as well. For this, the customers pay a recurring monthly fee, just like with any other software-as-a-service company. Our OODA blockchain middleware coordinates the deployment of the applications bought and the inference on our AI network that powers those applications. Now, the app providers get a revenue share paid in our utility token, which is connected to the blockchain network I mentioned before. The same goes for the computing providers. OODA currently has access to over 250 NVIDIA A40 chips.
We can also mix in AMD and other chips in the future. Now, we welcome more providers to join our network and provide us with GPU and computing time, for which we will pay using Soda, our utility token to be launched on the blockchain. This way, we sit at the inflection point between Web2 and Web3. Our customers do not have to learn how Web3 or blockchain works; it powers the backend. They purchase AI applications at a fraction of the cost, by normal means, using typical payment methods such as PayPal, company credit cards, et cetera. Who else has done something similar? Well, for one, Filecoin. We are doing for AI what Filecoin did for storage.
If we look at Filecoin, which distributed and decentralized storage, and compare Amazon S3's price of $20 per month for 1 TB of storage with Filecoin's $0.20 per month, we can see the drastic difference in pricing for the same type of service when provided using blockchain, decentralization, and distribution. Now, you can imagine in this chart that OpenAI is on the right side like Amazon, and OODA AI is on the left side like Filecoin. What does this actually mean in raw numbers and cost? Let's take that example I was talking about before.
If you want to run a Llama 3.1 70 billion parameter, floating-point 16 precision large language model centralized, with two chips in the same location, in this case two NVIDIA H100s, you would get a total of 160 GB of video RAM. The purchase price for these chips alone would amount to $80,000. The monthly rental on popular cloud services such as AWS or Azure would amount to $8,200 per month. Now, looking at what we have built, running it decentralized in different locations around the world, it's enough to combine three NVIDIA A40s or AMD W7900s, giving 144 GB of total video RAM. These would cost $12,000 to purchase, and the monthly rental would be $1,200.
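These VRAM figures are easy to sanity-check: FP16 stores two bytes per parameter, so the weights alone for a 70-billion-parameter model need about 140 GB, which fits under both setups' totals. A quick back-of-the-envelope check (weights only, ignoring KV cache and activation overhead):

```python
# Back-of-the-envelope VRAM math for the Llama 3.1 70B FP16 example.
params = 70e9          # 70 billion parameters
bytes_per_param = 2    # FP16 = 16 bits = 2 bytes per weight
weights_gb = params * bytes_per_param / 1e9
print(weights_gb)      # 140.0 GB just for the weights

centralized_vram = 2 * 80    # two NVIDIA H100s, 80 GB each -> 160 GB
decentralized_vram = 3 * 48  # three NVIDIA A40s, 48 GB each -> 144 GB
assert weights_gb <= centralized_vram
assert weights_gb <= decentralized_vram
```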
Now, let's break that down into the benefits of decentralized and distributed AI, which is what we have built. We have a higher availability of the smaller gaming-grade GPUs. Big chips are quite hard to get, as a lot of the big companies have bought most of them up for training. Now, we don't do training; we do inference. That's what we focus on. So gaming-grade chips working across a network are more than enough for great output and great performance. We can also run floating-point 16 precision, which is the highest precision commonly used in large language models today. Llama 3.1 70 billion needs 140 GB of RAM, and three of these gaming-grade chips amount to more than that, so we can check that off.
Now, looking at the cost comparison, we can start to see the real benefits here and where we can build in the business model. It's 85% cheaper hardware cost running on OODA AI, and 85% cheaper rental cost running on OODA AI. So I think the numbers say a lot. Another interesting thing to mention is that the decentralized network, blockchain, and DeFi space has grown and matured immensely over the past 10 years. Here is just an example of a couple of companies that many of you might know, with their current market caps. Most of the companies on the left were founded 10-15 years earlier than the decentralized network and blockchain companies on the right side. So what does that tell us?
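The 85% figures follow directly from the prices quoted in the example, as this quick arithmetic check shows:

```python
# Savings implied by the hardware and rental figures quoted above.
def savings_pct(centralized: float, decentralized: float) -> float:
    """Percent saved by the cheaper option relative to the expensive one."""
    return 100 * (1 - decentralized / centralized)

hardware = savings_pct(80_000, 12_000)  # 2x H100 purchase vs 3x A40 purchase
rental = savings_pct(8_200, 1_200)      # monthly cloud rental comparison
print(f"hardware: {hardware:.0f}% cheaper, rental: {rental:.0f}% cheaper")
# -> hardware: 85% cheaper, rental: 85% cheaper
```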
It tells us that decentralized network and DeFi companies are growing fast and are something to be taken seriously, so we're very excited to be working in this space, with AI combined with blockchain and Web3. Last but not least, to summarize: what does OODA do? What is it that we built that's so special? Well, we built an AI infrastructure platform that takes expensive AI and makes it affordable, and takes less secure AI and makes it more secure. And with that, I want to thank you for listening, and I hope we talk again soon. I wish you all a great continued day. Thank you so much.