Wow, wow, wow, wow. I'm just a little blown away by how big this has all gotten. I feel like I'm up way higher this time. Those of you who were here last year will recall it was definitely a smaller room, and it's much bigger this time. Attendance at this event is up 70%. The show itself was way up, and I just have a few slides, because I know that as much as you like hearing from me, there are more interesting things coming. So you've seen it everywhere: AI changes everything. And it's true. It's in the signage. It's in the presentations. It's everywhere, except for this slide. So, kidding aside, I'm not going to read out the whole slide, but it's important to remember that we will be making forward-looking statements. These statements do come with material risks and uncertainties.
So please refer to our 10-K and 10-Q filings with the SEC for more information about those. You'll also see us using non-GAAP financial measures. A couple of the presenters today will be using those, so just a reminder that they're intended to be used in conjunction with GAAP measures, which can be found in our financial statements, of course. And a number of our execs will be talking, for informational purposes, about where we're going from a technology and service standpoint. Just a reminder: it's not a commitment to deliver any material, code, or functionality that's being discussed today. So you're going to see these slides. Each of the presenters will have some or all of them, and they'll just make reference to them, as in, "Here are the slides Ken referred to."
These three slides are what they'll be referring to. So let's talk about the agenda. We're going to start with Clay in a moment, and he'll be talking about OCI and Oracle infrastructure, followed by Mike, who will start with a discussion of the applications business. Mike will then continue with a deeper dive into Oracle Health and our embedded finance businesses. That will take us to lunch, at which point Larry will take the stage. He'll be talking about AI and database. Then Doug will come up with the financial outlook, and we'll do a Q&A with Larry, Mike, and Clay. We should end up somewhere around two or three o'clock this afternoon. So with that in mind, let me get out of the way, and let's bring up Clay. Thank you, sir.
Thank you.
Thank you, Ken. Ken still has a minute and 37 seconds, which I was really depending on. Okay, now I'm late. I don't know what to do. Okay. Thank you all for coming. I really appreciate it. To give you a quick overview of how I'm going to go: I'm going to talk for what's just going to feel like far too long. We'll then have a short section where Mark Hura will come up and talk for another stretch that feels like quite some time. Then I'll give a bit more detail on some financials, and then we'll hand it over to Mike. So we've got a few minutes. At the end of this, there are a few things that I want you to walk away with.
I want you to understand the overall growth trajectory of the OCI business. Obviously, we've got myself, Mike, Seema, a lot of people, and Larry's coming; we're talking about a huge number of our businesses. I'm going to focus right now on the cloud infrastructure business. I'll be talking about the different segments that make up that business. These segments are the ways in which we think about the business: how each one grows, who the customers are, what products they want, and how they consume. So we'll explain that to you. I'll explain what's driving the growth in each of these segments, and also why those customers are choosing to invest more and more with Oracle.
The main goal is that at the end of this, you'll understand why we're so excited about where OCI is and the great growth prospects that lie ahead for it. Okay. So these are the slides that Ken talked about; you can read them. Those are the slides. And one more. And one more. Now we're done. So, I mentioned different segments. There's this very simple equation: enterprise plus distributed cloud plus cloud natives plus AI infrastructure equals hypergrowth. I'll go into each of these segments specifically and talk about why we break our business up this way. The thing to get at a high level, though, is that all of these already have very good growth rates today, and they're actually accelerating.
And a big part of the reason these different segments are all accelerating is that they're symbiotic. Let me take a minute to explain what I mean by that. Say a customer shows up, and they're an enterprise customer, the traditional kind you think of when you think of Oracle. They have a long history with us. They're a database customer. They also have some of our applications. They might move those things to our cloud. As they do that, one of the things they want is, obviously, the great functionality that we provide as Oracle. They also want other ISVs. Maybe they want some security products. Maybe they want some new advanced networking features. Well, as Oracle, we don't provide all of that, but we have great partnerships.
I'll talk about some of those. And maybe an ISV shows up, and they're much more of a cloud-native company. The demand from those enterprise customers brings those ISVs to our platform, right? Those ISVs join, and they drive more overall compute, storage, and networking consumption growth. And they're happy, because they provide services that our enterprise customers consume. The same thing is true for our distributed cloud customers. The fact that we offer not just the public cloud we have today, but also our Dedicated Regions and our Alloy business, actually becomes an accelerator, because our ISVs want that reach.
So the fact that we have individual customers, say, in Japan, building their own clouds that they operate and sell to their customers — that reach also drives things like cloud natives. When our AI infrastructure customers show up, they typically start by consuming raw infrastructure, but they quickly move up the stack. They start consuming more of our compute, storage, and networking products, and then they move on to things like consuming our applications. So as you look through these segments, we're experiencing accelerating growth across all of them, because as each segment starts to grow on its own, it actually helps the others grow faster. Okay. So let's take a look at the enterprise segment. I think you already understand what we call enterprises.
Here's a great set of logos across the screen. But, as I said, these are traditional companies, and most companies have some Oracle, whether it be database or applications or middleware or industry applications. Those customers are very pleased when they run on top of OCI. So you see the current growth rate: 33% year-over-year growth for this segment, from Q1 of FY 2025 to Q1 of FY 2026. However, contrast that with the more than 1,500% growth rate in our multi-cloud database business. The way to think about this is that, up until very recently, you could only get the best of our data platform on one cloud, which was OCI. And we like that cloud. We think it's a great cloud. But we have extremely popular data platform services, and as of today, you can get that data platform on all of the cloud providers.
And that's why we see this very rapidly growing multi-cloud database business. Okay. A bit more about why there's a margin range here. When I give 65%-80% as a range for gross margin, the reason is largely the mix of services those customers are consuming. As you can imagine, some customers end up being very networking-heavy, say. We think our networking is quite good, and we have a lot of IP in it, but not nearly as much IP as we have in, say, our database services. Based on the differentiation we have in our services, we price them at different margin profiles, so there's a range depending on the workloads a customer brings to us.
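One way to see how service mix alone produces a margin range like that is a consumption-weighted average. This is a minimal, hypothetical sketch: the service categories, revenue splits, and per-service margins below are invented for illustration and are not Oracle's actual figures.

```python
# Hypothetical sketch: a customer's blended gross margin is just the
# consumption-weighted average of per-service margins. All revenues
# and per-service margins below are invented for illustration.

def blended_gross_margin(services):
    """services: iterable of (revenue, gross_margin) pairs."""
    total_revenue = sum(rev for rev, _ in services)
    total_profit = sum(rev * margin for rev, margin in services)
    return total_profit / total_revenue

# A database-heavy workload mix skews toward the top of the range...
db_heavy = [(70.0, 0.85), (20.0, 0.65), (10.0, 0.58)]  # db, compute, net
# ...while a networking-heavy mix lands lower, on the same price list.
net_heavy = [(20.0, 0.85), (30.0, 0.65), (50.0, 0.58)]

print(f"database-heavy blend:   {blended_gross_margin(db_heavy):.1%}")
print(f"networking-heavy blend: {blended_gross_margin(net_heavy):.1%}")
```

Same services, same per-service pricing, two quite different blended margins: that mix effect, not different pricing per customer, is what produces the range.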
And then I briefly want to talk about what I mean when I say "contract to scale." There's a process that enterprises go through when choosing a cloud. Typically, they do a POC. They try things out for a while. Then they move forward, and there's a contracting process. Once that's done, there's also an implementation phase. It takes time; people don't move their most mission-critical workloads overnight. The reason I bring up contract to scale is that the process of moving from initial product launch to pipeline, then to committed contracts, and then to showing up as revenue takes a certain amount of time for these types of customers, given the criticality of the workloads they're bringing to OCI. Okay. There we go.
So a little bit about why they're choosing OCI. The most obvious reason is the combination of our ecosystem. We have cloud infrastructure, but we also have the world's best database with Oracle Autonomous AI Database. We also have amazing applications, both horizontal and vertical, and we'll hear more about that in a few minutes. But even beyond Oracle workloads, OCI is the best platform for enterprise workloads. We built OCI to make it very seamless for you to take an existing enterprise application and move it to our platform. That's very different from building a platform designed only for new applications written in the cloud. You know, we've been talking a lot about AI this week.
We've been talking a lot about the AI data platform, which brings together the great work we're doing with our GenAI models, our new AI database, and our new agent service, all packaged together. That being available in all of the clouds makes it very easy for customers to pick Oracle as the place to put their data. And then, of course, enterprises care about performance, and they care about price. OCI is by far the best performance at the lowest price. Okay. So here's a specific customer example: Nasdaq. A few years ago, they moved RegCloud over to OCI, and it runs exclusively there. Long before that, obviously, they'd been a database customer for a long time.
But more recently, they adopted Oracle Database at AWS, which allows them to bring RegCloud to their different environments and rely on Exadata across their entire cloud ecosystem. Okay. So let's now take a minute to talk about our distributed cloud. When I talk about distributed cloud here, I really mean our Dedicated Regions and Alloy. With Dedicated Region, as of yesterday, you can get the entire OCI environment in just three racks and put it in your own data center. A great example is someone like Vodafone, who bought six of our Dedicated Regions to take our full cloud experience and run it right next to their network, on-premises. The other part of this business is Alloy.
As an example, take someone like NRI. They bought a Dedicated Region to begin with, but they're a technology company in Japan. They take our cloud, add their differentiated IP to it, and then operate it in a sovereign way, providing it to their customers in Japan. As you can see, it's a very good growth rate: 77% year-over-year already, with an average deal size of $67 million. Now here, our gross margin is in a range between 40% and 60%. And that's not because these offerings are fundamentally less profitable. It goes back to what I said about the enterprise segment: it's all based on the mix of services those customers are consuming.
When you have, say, a Dedicated Region that's almost exclusively used for database, the margin ends up being quite high. But a lot of our Dedicated Region and Alloy customers are consuming a lot of essential infrastructure services, and those have a lower margin profile in general. And when I say there are 60+ Dedicated and Alloy regions globally, we also have a very large pipeline beyond that. And look, when someone shows up and wants a Dedicated Region, there's a time period involved: time for us to get it to them, time for them to put it in their data center, time to ramp it up. But what we've seen is that we're planting a lot of these seeds.
And as you can see from this growth rate, this part of our business is growing very rapidly, and the customers keep coming back for more. Okay. Why are these people picking our distributed cloud? It's really quite simple. Technologically, their needs are straightforward: they're typically in a regulated industry, they need sovereign control, or they have things that they either won't or can't move to the public cloud, and they want the benefits of the cloud right next to those things. Our distributed cloud offering is actually the only offering that solves these problems. If you go anywhere else, you get a subset of the cloud, and you have to pay a lot of money upfront. With us, you can get it in a very small form factor.
You sign a Universal Credits commitment that gives you flexibility on which services you're going to consume. We can deploy it, like I said, in a small footprint very, very rapidly. And no one else offers a turnkey solution. The fact that Oracle is in both infrastructure and applications is something no one else can match. That's what actually enables Alloy: we could not build Alloy and enable someone else to operate and run a cloud if we didn't have all the great work that Steve and his team had done with Fusion. So when we talk about Alloy, it's not just OCI. It's OCI plus HCM plus our ERP system plus our quoting system plus Service Cloud. That way, we can provide a truly turnkey solution.
And then, of course, another part of the reason people like our Dedicated Regions — and you'll hear Mark talk about this with some customer examples later — is that when they buy a Dedicated Region or Alloy, it isn't just our infrastructure. You can get a Dedicated Region and then also run our differentiated applications, both horizontal and vertical apps, on that same infrastructure, wherever you need it. Okay. So here's another example. This customer started a few years ago with a Dedicated Region. They then expanded to a second Dedicated Region for disaster recovery. They expanded again by adding GPUs to those Dedicated Regions. And most recently, they expanded once more by contracting for an Alloy so they can serve sovereign workloads in the UAE. That's an example of how these customers start small.
We make them extremely happy, and then they continue to grow and scale with us. And as we continue to add more and more of these customers, like I said, this business starts small, but the customers are seeing so much value that they all keep growing and growing. Okay. I promise you I'm not that bad at clicking. All right. Cloud natives. When we talk about cloud natives, the way I think about it is that these are typically workloads and customers with a relatively small number of applications that consume a lot of infrastructure. ISVs typically fall into this segment, but so do companies with workloads that are very focused on rapidly scaling up and down and taking advantage of all the benefits of the cloud.
This segment of our business is growing very rapidly as well, with 49% year-over-year growth from Q1 FY 2025 to Q1 FY 2026 and an average deal size of almost $100 million. Again, there's a gross margin range of 40%-60%. A large part of the reason comes down to both the service mix these customers choose and the size and scale of their business. As you can imagine, this group can range from relatively big fish to very, very large fish — what you might call whales — and with that size come different expectations around overall pricing. One thing to note, though, is that these customers tend to ramp faster. They typically have a platform, whether on their own on-premises infrastructure or in other clouds, that they can move very quickly.
You know, Zoom was an example. I don't know if anyone remembers, but COVID hit, and Zoom moved from their existing infrastructure, as well as from other clouds, and was up and running on OCI in nine days. These are applications that can ramp very quickly when there's motivation to do so. So when these customers find out about us, they tend to be very large, and they scale very fast. Okay. So why are these customers choosing OCI? Primarily, it's about the best price and performance. The reason that matters so much to them comes down to the shape of their IT spend: for, say, smaller companies, the largest expense is labor cost, and overall infrastructure spend tends to be a tiny portion.
For these types of customers, by far and away the biggest expense is the actual infrastructure. That's why having the best price performance and being the most secure matters to them. We also work with these customers through engineering partnerships, and Mark will explain a little more about this in a minute. It's not enough for us to just say, "Hey, here's what's on the truck." Given their scale and their unique businesses, they might have features they need, or customizations of hardware we haven't considered before. And as Oracle, we can go work with them and figure out how to implement all of that in a standardized way.
Because what's important when we do this is that we don't fork our cloud, where there's an Uber cloud that's different from a TikTok cloud that's different from an OpenAI cloud. We need one cloud that everyone can use, and it's our job to put those features and that functionality into the base product. So there are really important reasons why those customers are choosing us, and it's really about those deep engineering relationships. Okay. An example of one of these companies is Cybereason. As I mentioned, they picked us because of our focus on performance, efficiency, and security. Cybereason is a security company that focuses on endpoint protection. They have an amazing platform. They tested ours, and then they quickly migrated a significant chunk of their infrastructure from GCP over to OCI.
The reason is that moving to OCI saved them more than 40% on their overall infrastructure spend, which goes directly to their bottom line given the type of business they have and the scale at which they're growing. Okay. Now I want to talk a bit more about AI infrastructure. AI means many different things; here, I'm really talking about companies that want accelerators for either training or reasoning. Both fit into this category. We have a lot of these customers. When we say there are more than 700 of them consuming on our platform, that doesn't mean there are only 700 customers using AI, just to be clear. You'll hear Mike talk about it: we have thousands and thousands of customers using AI on our platform.
These are people who show up and say, "No, I want some type of accelerated computing," whether it be an NVIDIA GPU or an AMD GPU or, as we roll out different accelerators, any type of accelerated hardware. That's what we're counting in this category. This, as you can imagine, is growing very rapidly: 117% year-over-year growth in overall annualized consumed revenue. The margin range here is different, and the reason has less to do with product mix, because in general these customers are pretty well-defined about which products they want. A lot of it comes down to location and how efficient we can be. Sorry about that.
As an example, as we continue to get more and more efficient — as we design better networks, optimize our data center buildout, and do things like power capping — that allows us to earn a higher margin. In some cases, the other part of the reason for the margin range is, just like with cloud natives, that there are customers of different sizes. Typically, these customers are relatively large to begin with, although there are a lot of very small ones too, and they can scale extremely high, as I think all of you know. Okay. So, why are they picking OCI? Part of it is that we move very quickly. We move quickly once things are stood up, but we also move even more quickly before things exist.
So our ability to rapidly deliver the latest accelerators and networking matters a lot. Fundamentally, when people are buying a cluster, yes, the actual accelerator matters, but all of these workloads operate in a clustered fashion. That's true especially for training, but even inferencing happens at the cluster level. Then there's the fact that we have extremely efficient data center designs that let us optimize our power usage, which translates into either more power available for customers or lower costs. And it's not enough to just be really good at compute and networking. Storage is also critically important, because these clusters are doing something with data, whether for training runs or for inferencing. They need access to a huge amount of context for these models to be valuable. All right.
This is a slide of an example deal, and I'm going to walk you through the details. As an example, let's imagine somebody shows up and wants to buy approximately a gigawatt of GPU accelerators. They sign a six-year contract at $10 billion a year. That's a $60 billion TCV. Okay. Now, I've broken our costs into two sections. One is what I'm calling land, data center, and power. That's the buildings, the actual power generation, and the people cost to run that portion of the business. That ends up being about 35% of the cost to deliver the service.
Now, if you add up compute, networking, and storage — all the things you put inside that data center — that ends up being about 65% of the cost. Okay. So when this deal is contracted and a data center has to be built from scratch, we align through our contracts and our overall delivery schedule so that we don't pay for land, data center, and power until it's actually delivered to us. So let's say it takes a year to build the data center. During that year, we're not paying for anything. Then there's a point, highlighted in the section before year one here, where the data center is working, and it's our job as Oracle to put all of the compute, storage, and networking inside it.
All right? During that time period, we're paying for something, but we're not making revenue yet. And as we reduce that time — going from three months to two months to one month — the amount of time we're paying for something we can't yet provide to a customer goes way down. So for this example $60 billion deal, I've written down a 35% margin, right in the middle of that 30%-40% margin range. But the thing to understand about that 35% margin is that it doesn't just cover each year after the site is running. That 35% includes the fact that there's a startup cost upfront.
Now, obviously, if you put a demarcation line before year one starts, we have an expense with no compensating revenue yet. And that can be happening across many different sites; we're not building one of these things, we're building 10 or 20 of them at a time. Based on the delivery schedule and our ability to ramp and put capacity in place, you've got a waterfall across many different pieces of infrastructure, where some are just starting up and some are throwing off a ton of cash flow. Part of our job is constantly layering those together to understand how the overall business works. So, key takeaways from this slide: we spend a lot of our energy aligning our revenue and our expenses.
The ramp-up time for this stuff has costs, but they're minimal, and it's something we're continually optimizing. The expense of that ramp-up is included in the gross margin. So when I give the gross margin range for one of these customers, we're not ignoring the ramp-up expenses; the gross margin includes the assumptions around them. Okay. Next. I've been talking to a lot of different customers and investors and reporters and analysts about how we actually accelerate and scale this buildout. Fundamentally, there is no problem here. The way I see it, there's really just an opportunity for us to grow faster.
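The arithmetic of the example deal can be put into a tiny back-of-the-envelope model. To be clear about what's assumed: the $10 billion a year, six-year term, and ~35% margin are the illustrative numbers from the slide, while the 3-month install window and the assumption that costs accrue at a steady annualized rate are mine, added purely to show how shortening the ramp feeds into whole-deal margin.

```python
# Back-of-the-envelope sketch of the example deal: a six-year contract
# at $10B/year ($60B TCV) whose gross margin is computed over the whole
# deal, so the pre-revenue ramp-up period is baked in. The 3-month ramp
# and steady cost rate are illustrative assumptions, not real figures.

YEARS = 6
ANNUAL_REVENUE_B = 10.0                # $B per year once the site is live
TCV_B = YEARS * ANNUAL_REVENUE_B       # $60B total contract value

def deal_gross_margin(annual_cost_b, ramp_months):
    """Gross margin over the whole deal: costs accrue both during the
    ramp (gear installed, no revenue yet) and during the live years."""
    total_cost_b = annual_cost_b * (YEARS + ramp_months / 12)
    return (TCV_B - total_cost_b) / TCV_B

# Calibrate a cost rate so that a 3-month ramp lands on a 35% margin.
cost_rate = TCV_B * (1 - 0.35) / (YEARS + 3 / 12)   # roughly $6.24B/yr

# Shortening the ramp means fewer months of paying for idle gear,
# which shows up directly as whole-deal margin.
for months in (3, 2, 1):
    print(f"{months}-month ramp -> gross margin "
          f"{deal_gross_margin(cost_rate, months):.1%}")
```

Under these assumptions, cutting the install window from three months to one lifts the whole-deal margin by nearly two percentage points, which is why the ramp gets optimized so aggressively.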
So when I think about where a huge amount of my time and energy goes, it's into how we enable ourselves and the industry to grow faster, given the unprecedented demand we're seeing for infrastructure like this. And the way we're growing faster is that the entire industry is pulling together: data center providers, energy providers, hardware providers, and capital providers. This is already happening all day, every day. You see constant announcements of new business models being created, or different ways to minimize risk and grow this business faster. That's what's enabling the current growth, and I think you're going to continue to see new models from us and others created to help grow this business faster and faster.
And so, to understand what happens when all of these partnerships come together, let's take a look at a video of what's actually possible. Okay. What I'm going to do now is walk you through a few pictures from the video we just saw to give you some more context. This is a site in Abilene, Texas. It's 1,100 acres across eight total buildings. What you're looking at right here — that quad arrangement with the structure in the middle — is what we call one building. I know, to me it kind of looks like five buildings, but it turns out that apparently we just don't have enough numbers.
Each of those individual pieces is a data hall, and they're all part of one building. At this location, there are more than 6,400 construction workers on site every day, and once the campus is complete, it will provide nearly 1,700 direct jobs and even more indirect jobs. On site, we have grid power: there's an existing substation as well as a new one-gigawatt substation going in. In addition, there are 300 megawatts of gas turbines installed to provide power to some of the initial buildings and to serve as partial backup power for the site. Okay. We started building this in May of 2024, and the entire campus will be completed in the middle of next year.
The workloads for this site's customer, which is OpenAI in the beginning, went live less than a year after construction started. Each of these buildings is mechanically isolated, but they're all networked together. Across that giant campus, a massive set of interconnections — both within the buildings and over very secure fiber between them — allows the entire cluster to function as one giant supercluster. Once this is finalized, as of right now, it will be the largest supercomputer ever constructed. Okay. I don't know how many of you are familiar with the details of data centers, but liquid cooling is something that's existed for a long time, and it's gone from zero to a hundred very, very quickly.
With liquid cooling, these big pipes carry the cooling liquid that goes in to cool the accelerators. Sometimes people ask, "Well, what about the water? Does this use a lot of water?" The reality is no. This water is in a closed-loop system. Each of those buildings you saw, which is quite large at 100 megawatts of power, actually consumes less water per year than a single-family home in the same region. Obviously, you need a lot of water when you first start it up — you have to fill up the swimming pool — but because it's a closed-loop system, the water isn't evaporating all the time. Okay. So here's an example of a customer choosing us for AI infrastructure: Modal Labs.
They provide a developer platform for AI, ML, and inferencing, enabling customers to easily fine-tune models and do batch processing for AI workloads. They chose us because we have amazing infrastructure. We have the best bare metal compute available, and they really appreciate the price-performance advantages we offer, combined with our excellent networking. What I want to do now is take a few minutes and bring Mark Hura up to the stage. He's going to talk a little bit about some of our customers and why they're choosing OCI across all these different segments. At the end, I'll give a little more detail on where we see OCI going over the next few years. Mark.
Thanks, man.
Thanks, bud. Are we good? I'll go middle. You go left. All right.
We didn't arrange that yet.
No.
Okay. I did my job, so this is very easy. Hi, Mark.
Hey, Clay.
We have many segments of businesses.
We do.
Let's talk about enterprise. Why are enterprise customers choosing OCI? And give us some examples of what it's like working with us.
Yeah. So, you know, you gave some examples before of why enterprise customers choose Oracle, and that has been our bread and butter for so many years: customers that have deployed Oracle applications and Oracle databases throughout their entire businesses. But then think about OCI, and how we've defined, developed, and deployed a differentiated cloud that serves many different types of workloads. For enterprises, we think about three major categories.
One is enterprise customers that have been using Oracle applications and databases, serving either our applications, third-party applications, custom-built applications, or large-scale enterprise data warehouses. Those customers take advantage of the price performance and capability we have in OCI, and I'll give some examples there. The second category is customers looking to exit their data centers and move their workloads. What we've found is that many other clouds require you to redefine and rebuild those applications to fit onto their cloud, and that becomes a constraint for customers looking to take advantage of what a true cloud can do for their business. The third is bringing pure infrastructure — compute, storage, and networking — to enterprise customers that want raw infrastructure capabilities.
Those are three distinct areas of why we're winning, how we're winning, and how we're engaging with our enterprise customers all around the world. If you think about that first category, an example is Emerson. They've looked to move their entire Oracle EBS infrastructure while they're migrating to Fusion, and in doing so, they brought the enterprise data warehouse and all of their Oracle databases to our cloud. What's unique, and what we've found, is that all of the boundary applications that support that supply chain and that financial system come into OCI as well, allowing them to exit their data center and get the performance capability while they transform to Fusion at the same time.
And so it's a great example of how we support our customers, bringing together the power and capability of OCI to improve performance and speed for their end users around the applications they're serving. Clarivate is another example: they were living under the constraint of another provider telling them they had to replatform before they moved to the cloud. The cloud that you designed, Clay, is bare metal from the core, with off-box virtualization and flexible compute, so our customers can pick up the VMs they have and exit that data center without having to replatform, allowing them to focus on the things that are most important to them: delivering services to their customers.
And when we think about just bringing pure infrastructure to our enterprise customers, we have so many examples, but one that may be different and unique is Goldman Sachs, taking advantage of high-performance computing capability to do risk analysis in their business. You know, we weren't their original provider, but we are a critical provider of services to them now. They took the opportunity to look at how we execute and deliver compute capabilities in OCI compared with how they were used to doing it before, and the results were staggering in terms of the price, performance, and capability they were able to get. And they're not running any Oracle databases in that environment.
So there are multiple examples across enterprises: native VMs and entire data center migrations, Oracle applications and our entire data platform, or pure infrastructure.
Okay. Well, Mark, I agree with you. Great examples. I also explained a bit about our distributed cloud customers and used a few examples already, but I think there's probably an example that is more surprising and also a bit different than the traditional telco or MSP type workload.
Yeah.
Tell us more about some distributed cloud customers.
I mean, I think there's an obvious point, which is sovereignty and security: highly regulated industries, banking, healthcare, utilities, can take advantage of a distributed cloud that we bring to them, in terms of the industries they're serving.
But what's also unique about our distributed cloud is that it is our full cloud, which means our customers can take advantage of running our full suite of capabilities. Meta is an example. Before we get into infrastructure at scale and training and inferencing, Meta is a user of Fusion: we've deployed a dedicated cloud at Meta's facilities where they run Fusion to run the back office of their business. So we have the capability to bring not only infrastructure to our customers, but the ability to run our full suite of industry applications as well as data layers. It's a different example, a unique example of what our customers can take advantage of wherever they deploy.
Okay. So we've got enterprise, distributed cloud. Sure. What about cloud natives?
Cloud natives.
I mean, cloud natives have been a wonderful spot to truly bring the capability of a next-gen cloud to the industry. And you brought up an example around Cybereason. What we have found is that these patterns of success have continued in a variety of different industry segments for cloud natives. Cybereason is one example, but many security companies are running on OCI, right? Whether it's Palo Alto, CrowdStrike, Commvault, SentinelOne, and on and on. And there's a reason why: as you mentioned before, there are certain types of workloads where we perform at a higher level than the industry.
Whether that's the networking requirements of a security company protecting others, or the compute intensity they leverage to scale up and scale down to run their businesses efficiently, it's a great example of the thinking: well, wait a second, if the security companies are running on OCI, OCI must be the most performant and the most secure cloud out there. That makes a lot of sense. In addition to that, we see a lot of ISVs that come to OCI as well. It's an opportunity where we've become a critical part of the cost of goods of the services they provide.
And the technical differentiation that we have and the partnership that we bring to our ISVs is unmatched: we work with them to optimize their workloads for OCI, allowing them to focus on delivering the services they bring to their customers, so that we are a critical part of their success. Most importantly, it's a highly engineered engagement with the ISVs, and it's unmatched in terms of performance.
Okay. So I also had Peter from OpenAI on stage with me yesterday. He was very complimentary to your team. And then we just talked about Modal. Give everybody here a little bit more perspective on why we're doing so well in AI infrastructure and why those customers are choosing us.
Yeah.
AI infrastructure. When we really started in this space many years ago, I recall a distinct conversation when we were working with NVIDIA, around NVIDIA needing to run their workloads in a cloud, engaging directly with Jensen. And what he said is, "Wow, I never thought I would see this, but Oracle, you are fast, you are highly technical, and you are incredibly capable from an engineering perspective." And he said, "Never lose that advantage." It wasn't necessarily how he had thought of Oracle in the past. And for us, it's something we just continue to operate with for all of the customers taking advantage of the AI infrastructure we're building. Our teams are incredibly fast. We know what we're really good at.
We're highly technical, and we provide a white-glove service for our customers that is a 24/7 engagement. What we constantly hear over and over again is that we are the best to work with and the easiest to operate with. And this is a pretty tight community, so it spreads very quickly. The early-stage companies we started with were telling our story to others, and it's just continued to snowball from there. And obviously we have the largest of the largest AI companies in the world running on our infrastructure today. So again: highly technical, we don't waste anybody's time, we know what we're really good at, and the speed is unmatched.
No, Mark, thank you. I think it's a great summary of why people are choosing OCI. Really appreciate it. Thank you.
All right.
All right. So this is the long-range plan that we provided on our September 9th earnings call, just a few weeks ago, well, I guess it was last month technically. The thing to understand is that when I talk about this business, all of these different segments are included. This projection about future revenue includes all of the business segments I'm talking about. The other thing I want you to understand is that when we talk to you about these types of projections, they're projections we believe in. We have very real reasons across each of these segments, across all of these plans, that add up together to provide this kind of projection.
The other thing to understand, and I'll go into it a bit more in a second, is that some segments of our business really are supply constrained, not demand constrained, specifically AI infrastructure, and I'll talk about what that means and why you're seeing the changes we're making in the industry. To give you some context: in 30 contiguous days during this quarter, we contracted for $65 billion of additional commitment across infrastructure contracts. Now, that was across seven different contracts from four different customers. None of those customers are OpenAI. I know some people are questioning sometimes, "Hey, is it just OpenAI?" The reality is we think OpenAI is a great customer, but we have many customers. One of these customers is Meta.
And as I share these numbers with you, this is not all of our contracting. We didn't just add up all of the deals. This is literally seven deals from four customers, all of them other than OpenAI. It shows the diversity of our customers and the size of their interest in our business. Okay. So, as I mentioned before, most people are assuming that we are demand constrained, but we're not. We're actually supply constrained. And the thing to understand here is that a large part of what I spend my time doing is securing what I consider to be good supply. I'll give you an example of an email I received recently: there was a very nice person who lives in Nebraska who has a cornfield.
They said they think it would be a really good place to put an AI data center. You're probably laughing, but you cannot imagine the amount of reach-outs that come my way and my team's way. It's actually quite a hard job to filter through all of the opportunities and understand which ones bring together the energy you need, the hardware capacity you need, the land you need, and the capital you need. Our job is to put together all of those pieces, and only then does that result in supply. What we find is that when we actually have good answers around supply in a reasonable timeframe, customers then contract very quickly.
So, you know, when we talk about the business that we're doing, it's not that I have this infinite supply over here of options, and I spend all of my time going to customers asking, "Hey, would you possibly want this?" Instead, the customers have come to us saying, "We would like a lot of stuff." And our job is to go find actual good supply that we can accelerate and deliver on the timeline that they need. That is what we do all the time. And once we have that done, suddenly the customers contract very quickly, and we're contracting with all of the different suppliers to bring together these pieces so we can execute as quickly as possible. So, as part of that, we are updating our long-range plan between now and our fiscal year 2030.
Again, the reason we're updating this is exactly what I showed you two slides ago. As we find ourselves able to ease our supply constraints, we have customers that want that capacity. That's part of the reason you see the change in these numbers: it's a little bit easier for us to find supply, not this year or next year, but in subsequent years. As we're able to find that supply, customers contract for it, we see immense demand, and then we go about delivering to those customers. Okay. The other thing I want people to understand is that, as I talked about these different segments, the numbers we just discussed were the all-up number for OCI, but we're also extremely focused on our AI database and our AI data platform business.
We are very confident in our ability to grow this very rapidly, and there are a few reasons for that. One of them is that, up until recently, as I mentioned before, there was only one cloud selling our AI database and our AI data platform, and now we've transitioned to all of them. The other reason we're very confident in this projection, and you'll hear more about it from Larry later today, is that the huge investments we're making into AI and into building out our AI data platform only make the previous investments we and our customers have made into our database that much more valuable. So, over the next few years, we're rapidly accelerating our cloud database and our AI data platform business. And it's amazing for us to see, right?
A business that is such high margin, that's so valuable to our customers, grow so very quickly. So, with that, I'm out of time, and I'm gonna hand it over, I think, to Mike, and he's gonna talk to you about applications. Thank you very much.
Please welcome to the stage, Mike Sicilia, Steve Miranda, and Mark Hura.
Okay. Hello again. So, we're gonna talk about applications here. As a reminder, we'll do so under Safe Harbor. We will talk about roadmap and our future product direction. You've heard us say throughout the event here that AI changes everything.
And I think what's really important to understand is that for all the reasons Clay just described, for everything we're doing in OCI, our ability to deliver AI to our customers embedded inside our application stack, with just unbelievable time to value, is, we think, unmatched. I think we've gotten to a point, as Steve and I have been discussing this week, where in our applications suite across the board, when you think about our Fusion applications, our front-of-the-house applications like CX, and our industry applications, there isn't a difference between the AI version and the non-AI version. It just doesn't exist. And we'll go through some examples of what we're seeing, particularly in healthcare.
We just went generally available with a new EHR. It's all AI; you can't choose not to consume AI to run the application. Now, if we hadn't made all the investments we've made, if Clay and Mark weren't out there attracting all these wonderful models to OCI, we would not be able to deliver that as a packaged service. That's one of the things we see in the application market when people say, "Well, there's no ROI on this stuff." Actually, I think it's because they're trying to stitch too many things together. They've got too many vendors involved, and they get right back into the old mantra of very large, time-and-materials implementations, which frankly, customers are losing patience for.
That's not at all what we're doing. We're actually winning more customers because we're shifting our focus to outcomes. We're winning retail customers in Germany, displacing SAP, because we have an all-in platform of Fusion merchandising plus our retail applications. Our customers are achieving more. A customer in healthcare, and I'll go through this example again, went live with our AI agents, and within three weeks, the time spent with the system decreased by 50%. It wasn't small. It was cut in half. And the fact that we're delivering that as a service has been, we think, game-changing. Along the way, as we rely on all of the OCI tech, we've also been doing quite a bit with the AI Data Platform.
And to make a long story short, good things happen when all the data's in one spot. And that has expanded our ecosystem such that we're not just talking about automating entire industries, but actually automating how those industries communicate with other industries. And we'll go through some of those examples in just a bit. But to dig in, Steve, if you could maybe take us through all the great things in Fusion that we've got with AI to start us off.
I can do just a couple of the great things in Fusion. Mike, thanks. So first off, let me just give you an update on our customers.
So I think you'll find, and hopefully you've seen this week just by walking around, we have a who's who of the top customers, the top brands, globally, by industry and by what I'll call heritage, meaning people who've moved off of E-Business Suite, PeopleSoft, J.D. Edwards, Siebel, SAP, and a variety of other third-party applications. I used to show this slide to build some credibility that Fusion is real. I think it's now unquestioned that Fusion is a market leader in cloud-based applications. The reason I show this slide now is twofold. First, the growing trend is that customers are going to phase two, three, and four of their projects. They started in ERP; they're adding HCM. They started in ERP and supply chain; they're adding CX.
They started in ERP and supply chain, and they're adding industry applications; or they have industry applications and they're adding others. So it's tremendous leverage for us. But more than that, it's the referenceability and the case studies we have in these customers. Just this morning, I was talking to a major airline that owns a variety of subsidiaries with dozens of ERP systems. What they wanna do is consolidate, but they're essentially a corporate headquarters with federated businesses: while they wanna bring this system together, they still wanna have different lines of business. I was able to quickly point them to FedEx, which did that in a very similar industry, except FedEx centralized it all into a shared service. But they said, "Well, we don't want a shared service." No problem.
That's exactly what DWP in the U.K. government did: brought a bunch of government entities together but kept federated ownership. As it turns out, this airline had already met with DWP at this conference, shared their use case, and is now gonna connect with them offline and follow that same program. So this is not just a customer base that's buying more products. This is a customer base that's talking to each other, helping us expand and helping others expand and succeed. It's just a tremendous success story for all these customers, including AI, including Oracle AI. Now, Safe Harbor aside, and Mike said we're gonna talk about futures, everything I'm gonna talk about here in Fusion is not futures; it's today. So as one example of AI being today, we start first, as always at Oracle, with our own use case.
And we have, as you guys all know, a world-class finance organization, and we talk a lot about our efficiency of close, our speed of close, and reporting. Well, we've just made that much better. Our internal finance group is rolling out our ledger agent. They're rolling out our payment agent. They're using AI to take an already extremely efficient, fast group and make it more efficient. We've implemented agents across HR, even in functions we never really had, like HR benefits, using AI to improve goals and functions there. And then, for our support business handling support tickets, all of our external customers now have AI agents, both on the front end to deflect SRs and questions, and as assistants on the back end so that our support agents can better find answers and service our customers.
We've just rolled out those agents, and we're already seeing faster time to resolution, less people intervention, meaning less cost to us, and more accurate resolutions, and it's resulting in greater customer satisfaction on the SR surveys. Per Safra's mantra, we're paying less and we're doing a better job of it through the AI agents. And those are just three examples at Oracle. Now, what I talked about at this conference last year is that we would have 100 AI agents in Fusion. We actually have 600 AI agents: 400-plus in Fusion, 200-plus in the industry verticals. But that is a massive understatement, because what we've built is an AI ecosystem across Fusion. The first part is the agents that we built internal to Fusion.
Second, we have an Agent Studio that allows our customers to modify the agents we give them, build their own agents, and integrate with third parties. And at this conference, we just announced an Agent Marketplace, allowing you to extend and expand them. We have over two dozen partners in the marketplace, each of whom has already contributed about half a dozen agents. So the six hundred are agents that we've built; that does not count agents our customers are building through the Agent Studio, and that does not count what our partners have built.

Steve, I think what's really interesting about that, as we discuss it, is that you have this age-old question in procurement cycles of build, buy, partner.
I think our answer is, well, how about all three?
Right. Right. You can build your own agents, you can buy partner agents, or you can buy our agents, but you're doing it on the safety and security of a single platform, right? It's still not this idea of stitching a bunch of bespoke things together. And I think that last bit is a significant difference, because in theory, could you use a third-party agent framework and build agents on top of Fusion? Absolutely. However, when you build Fusion agents on our agent platform, or use ours, the context is there. So it knows Mike Sicilia, it knows your role within Oracle, it knows what you can see and what you can't see.
I had no access. I tried to.
Now you have all access. But let me pick a different example.
When you're Steve Miranda, you have somewhat more limited access. But take the benefits agent example. If I log into Oracle, we have the benefits agent, and I wanna ask it a question: I'm traveling in Europe and I want a prescription; is it covered or not? Someone on my team has a leave of absence and wants to extend it; do they need to extend COBRA? There are all sorts of benefits questions. Because the agent's native to Fusion, not only can our customers configure it in all the ways you see here, but it's embedded, contextual, and secure. It knows the person's role, it knows their HR status, so it knows their tenure, it knows what healthcare plan they've selected, it knows what country they're in. So it knows exactly the benefits policy on which to answer the questions.
If you had a bespoke, build-your-own agent, you could certainly do that, but if you did it totally outside of the Oracle ecosystem, you'd have to code in the security, code in the context, and keep that updated, always, for every agent you do. It's a much different time to value, to your point, and that's where we're seeing the difference in our customers adopting this quickly and moving forward. By the way, we now have over 32,000 people certified on our Agent Studio. Actually, I know that number is already dated; I think it's almost 40,000 at this point. This is Oracle internal, Oracle Consulting, and our partners. To show you a list of the partners, these are the two dozen I talked about with the Agent Studio.
These are partners large and small that have already been certified and have already delivered at least half a dozen agents each, validated by our development team, to the Agent Marketplace. Every one of these agents is available to every one of our customers at no additional cost. These are agents contributed to make implementations easier, make industry functionality easier, and make things better across the board. And I said it was a massive understatement: the six hundred that we built doesn't count what our customers have built and doesn't count the marketplace. We had a hackathon on Monday at this event. Mike and I were just talking about it; Mike got a chance to poke his head into it.
We had a little over 165 different partners and customers, all sitting around tables in rooms like this. For perspective, last year I promised 100 agents in Fusion. On Monday, not from Oracle, but from partners and customers in a hackathon, they built 109 agents, which will go into the marketplace. So when you talk about the speed and progress across the applications, and the speed and progress of consuming AI and all the great things that Clay talked about, that's just a perfect example. Okay. So let's take a look at industries.
Okay. Thanks, Steve. All right. So today, in our industry applications, whether they're edge applications or more sophisticated applications like core banking, we have 2,400 customers that are already leveraging embedded Oracle AI inside these applications.
They are live across a large variety of industries. And again, as Steve mentioned, we have over 200 AI features and agents live today just in our industry portfolio. We'll eclipse that very quickly in the coming months in terms of total agents to market, and I'll go through the numbers in just a bit. We've heard a lot of feedback asking: aren't applications in the future really just a collection of agents? What's your stance there? Our view is: you're exactly right, and that's exactly what we're doing.
Our applications stack, whether industry applications or Fusion applications, is quickly becoming a collection of AI agents, and our ability and our time to market is second to none. It's not just interesting that we have 2,400 customers live or 200 new AI agents; if you go back 18 months, both of those numbers were zero. Zero. We had zero AI agents live in our industry applications, and we had zero industry customers live on any industry AI. 18 months later, we're at 2,400 in very sophisticated, highly regulated industries like healthcare. So far, the longest go-live we've had with an AI agent has been three weeks. That's been the longest. The shortest has been a matter of days.
The amount of professional services dollars spent by the grand total of every customer, and there are over 250 customers live on the AI agents in healthcare alone, is zero. Zero dollars. Self-directed, self-implemented. The amount of dollars spent on training, collectively, by all of those customers is zero. Zero training; it works out of the box. And keep in mind, these things are working in highly sophisticated clinical applications. These are running inpatient rooms with no training. Yeah. Just one point on your question about whether SaaS is going to change. Especially in industry applications, but even take the finance example: Maria Smith's team at Oracle, their role isn't to use our user interface and type invoices into the system.
Their role is to report our earnings, to pay our suppliers, to invoice our customers, to collect. Our IP was never our UI to do that. Our IP is how efficiently we organize that and get data to our customers. What we've done with these agents, the 600-plus, is allow customers to do that more efficiently. That's what we've always done, and now we have it in a much, much more accelerated way. Yeah. And of course, Steve, it's all much easier when you're a custodian of all the data you need to build these agents, right? And the fact that all of this runs on the Oracle AI Data Platform makes this engineering cycle, this innovation cycle, just so much faster.
And there's the ability to roll this out as a quarterly update; we're quite excited by our customers' ability to consume these updates. So, as you know, we operate in many different industries, and I'll go through a couple of examples of agents that I think are going to be very popular. The first is the embedded AI agent for retail. This is an intelligent inventory agent, designed specifically to help customers bring together data from CRM, from merchandising, from Fusion inventory, from Oracle Analytics Cloud, from Oracle Xstore, which is our point-of-sale system, and from our procurement and order management systems in Fusion. The agent is not just a point of intelligence.
It's not just helping retailers decide which product to put on the shelf and what the impact is on forecast demand, planning buys, moving goods around, selling across channels, the omnichannel strategy. All of that, of course, is an output of the agent; the intelligence is built into the agent to figure that out. But it's also a very interesting point of integration. Think about how much money organizations spend today on integrating applications across their stacks; agents do it mostly for free, with the ability to traverse data from multiple sources across multiple systems. And by the way, the CX, the merchandising, the inventory, the analytics, the Xstore, procurement, and order management don't necessarily have to all be Oracle systems. That's not how the agents are designed.
The agents are designed to pull from multiple systems and put together points of intelligence, in this case giving the retailer very high visibility so they can flag potential inventory issues and translate complex data into very clear, transparent recommendations. I'm sure those of you who have covered retail understand that inventory management is one of the costliest and most difficult problems, and if retailers can avoid liquidation of inventory, it's real money back. So, a lot going on in retail. I did mention a bit about healthcare. It's important to know that we've got dozens of AI agents live across our health ecosystem today, with many more planned. We're looking at chart review, care navigation, clinical decision support, patient risk prediction, preventative care, and many more.
In fact, our next-generation AI EHR is now generally available, and it is also approved by the regulators. This is another interesting AI story. We've got customers running this in beta, and Seema is gonna talk a little bit more about this as we go forward and why we're so excited about it. The feedback has been absolutely tremendous. Remember that these customers are coming from literally a Windows 95 Citrix experience to agentic AI. That's the leap of technology we see in healthcare. And the feedback is, it's not just a great business, and it's not just gonna be a great growth engine for applications; it's actually an emotional piece as well. I mean, the feedback we get from providers and patients is, "This changes my life.
It changes the way I practice medicine because you've actually given me a tool that helps me, not burdens me." And we've got a lot of competition in that market. But if you break it down and think about AI as powering the EHR, the one question I ask of our competitors is, "Well, how many fuel cell power plants are you building onsite right now?" 'Cause if you're not doing that, you're probably not gonna have as good a chance to be closely provisioned to a large language model and apply reasoning models and all the things you actually need to make this work at scale, to automate an entire hospital. The regulatory cycle for approval of electronic health record software today usually takes about two to three years.
You get the self-fulfilling prophecy: because the approval cycle is so long, people end up running very old technology for a very long period of time, because the pain of switching is very high. We were able to get through the full regulatory cycle in the United States for this in six months. The reason we got through the regulatory cycle so quickly is because we actually used AI to generate a lot of the documentation. Now, we cross-check it against a baseline. We make sure it's right. We make sure it's accurate.
But it's actually saving us a tremendous amount of time and putting us in a position to be able to bring these products to market, particularly in heavily regulated industries like healthcare and banking and utilities, where there's a very large burden of documentation to actually bring products to market. 'Cause in healthcare, you've got federal government regulations, but you also have state-specific regulations. California's different than Texas, for example, in terms of some of the things you need to do for Medicaid compliance. We're actually able to automate all of that process across our industry applications and bring these products to market. As such, we think that regulators are actually happier with that as well. They're actually happier to receive more timely information.
They're actually happier to reduce the burden on them in terms of having to review an incredibly complex set of documentation. I'll switch gears now to banking, which is a really, really interesting business. So, you know, Steve, I know the Fusion applications are a very popular choice among big banks. They have core ERP systems, HCM systems, and the banks have largely had an appetite to move those to the cloud over the years. Now, when we talk about core banking systems, for lack of a better way to put it, the stuff that moves money around inside the bank, there's still a lot of mainframes, still a lot of minis. There's still a lot of very big iron running these applications.
And if you look at core banking applications moved to the cloud, let alone moved to AI, just think about the first step: you're in single-digit percentages in terms of what has been moved to the cloud. But as we said this week, AI changes everything. So the future of what we're building, and of helping banks move away from large on-premises bespoke applications, is an agentic approach. And we see this as a tremendous opportunity to unlock our core banking installed base and actually to win new customers in the shift to the cloud. Again, it's very, very difficult in a highly regulated, very sophisticated application space like core banking, unless you're doing all of it.
It's gonna be very difficult for competitors to come in, we think, with piecemeal applications and stitch stuff together to make a core banking AI solution. So in the next year at most, just in the core banking space, we'll have 125 agents live. And I'm gonna go through some of those here in just a minute. Steve, last year, made a prediction of about 100. We ended up with 400. I think we have a good chance of being in a very similar situation here, where we're even surprised at how fast we can go. So these are across a wide variety of banking and insurance spaces. They're gonna include things for human interactions. We'll go through some of those in a bit.
Domain agents, and those that are really focused on improving accuracy and processing time. Our agents and features are designed to automate the entire bank and/or insurance company. We're talking about corporate banking, retail banking, revenue management, billing, insurance policy administration. We've just got so many terrific things coming out, so I'm gonna start with one of them, which is a huge pain point. Again, I'm not gonna go through all 125. I'm just gonna pick two in the interest of time: financial crimes and compliance. This is a huge pain point for the banks as we talk with them. Many of you and your colleagues, for those of you that are with the big banks, spend an incredible amount of time on this.
It's very costly. It's very time-consuming to investigate financial crimes. Unfortunately, financial crimes are not going away, and one industry study last year said firms spent $155 billion in 2024 just on the investigation process. Tier one banks have thousands of people staffing investigative teams and are spending hundreds of millions of dollars a year. Leveraging AI, we can improve accuracy and drastically reduce the investigation time. If you take a look at these three representative banks here in the chart, names withheld, look how much they've disclosed that they're actually spending on this. We believe that our agentic approach, which we've been beta testing with the banks, can save up to 60% of the work. Just in phase one of the agent, we should be able to knock out 60% of the work.
So let's take a look at some of it here. This is the output of a case that's already been fully investigated by our financial crimes AI agent. Without the agent, the investigator would've spent hours, days, weeks opening this case, digging through the details, understanding the parties, their professions, and whether they had prior activity with the bank. With the AI investigator, the initial groundwork is already done. Before the investigator even opens the case, it's done. And hopefully you can see on the slide that this individual is, in this case, a retired nurse on a fixed income, and that she's had prior cases where suspicious wires were sent to unknown parties and were blocked. Now, the AI investigator is gonna give the investigator a total overview of the case. It's showing patterns.
It's showing activities and typologies, things like funnel behavior, rapid repeated wire deposits, and withdrawals to high-risk jurisdictions, and the investigator has the ability to see all of that over time. The investigator runs through this and decides whether or not they agree with the AI's findings, and if they wanna dig deeper, they're certainly able to dig deeper manually. But the interface is gonna show them all the relevant risk factors, and it provides a very clear narrative of what was observed versus what wasn't observed. If the investigator agrees with the AI agent, they can finalize the decision. If not, they can intervene. The system's gonna guide them through all the various views so they can quickly see what happened and why. Now, the UI is built for the investigator to feel confident about decisions.
This is exactly the same way our AI works in healthcare: not replacing the doctor, not taking away from the provider, but actually showing the provider, or in this case the investigator, exactly how the decisions were built and exactly where they came from. The investigator is always in the loop. The person still signs off on this. If you could do your job 60% faster as a result, I think we'd all be happy. This is a really powerful shift. We think it's a perfect application of AI today in large banks. This is all generally available, and it's really much easier for our customers to absorb because we're supplying the whole stack as a service here.
It's OCI, it's the database, it's the core banking applications, it's the analytics, it's the generative AI data platform, and it's the large language model integration, which wrote all the narratives here. Again, if you're the custodian of all the data, it's so much easier to bring these types of agents to banks very, very quickly. All right. So I'll show you another one. Number two of the 125 is our retail banking agents. We're gonna use AI here to improve both the customer and the bank experience and streamline a very critical business process. Ordinarily, the process of providing a personalized service to an individual customer inside a bank can be quite complex. I mean, banks, like all businesses, wanna provide as much personalized service as they possibly can.
But with AI, we think we can actually help the banks get to that more personalized consumer experience, where they can position exactly the right services, like loans, even before the customers may know they need them. So we've got a video here that we're gonna play to show this agent.
Meet Grace, a retail banking customer who wants her bank to quietly have her back every step of her day. And Davis, her banker, determined to make every interaction meaningful. AI connects them in ways they don't even notice. Grace checks her AI-powered dashboard to review her net worth as she researches personal loan options. Based on her preferences and spend pattern, AI identifies that she will have a cash deficit in the near term.
Accordingly, the product recommendation agent scans through all eligible products and offers the most relevant loan products which suit Grace's requirements and has lined them up for her to review. The personalized product agent chooses offers, helps compare products, and draws up a product summary for easy consumption. With a few taps, the smart application agent lets Grace upload her documents and initiate her application. The personalized repayment plan agent draws up a personalized repayment schedule based on her cash flows. It identifies additional cash availability in November, increases the corresponding installment, and she checks and accepts the plan. The application lands with Davis, the banker. Davis logs in and looks at his AI-powered dashboard where his day is organized and the proposed schedule is planned out for him. He returns to reviewing Grace's loan application.
The application tracker agent clearly categorizes the loan application by risk, completeness, timeliness, and approval probability. The credit decisioning agent also helps improve his due diligence with the responses to a qualitative scorecard based on Grace's application. The agent auto-approves the personal loan based on Grace's credit profile and with a risk-based pricing discount in minutes. Davis uses an AI-first ecosystem of experience and domain agents to streamline tasks, manage risks, and deliver personalized services, reinforcing his role as a custodian of trust. AI also integrates the bank into Grace's lifestyle, offering conversational engagement and tailored services supporting her financial wellness.
Okay. 123 more such agents are planned for the next year. So we'll have a lot more to say about banking again.
It's an industry where I think agentic AI applications are gonna be a tremendous wedge to move a very large, very dated installed base, both applications and infrastructure, to our cloud and AI solutions. So you see some of the stats across other industries. Again, I won't go through all the industries. I spoke about the clinical AI piece for hospitals. In hospitality, we released an upsell AI agent, which helps hoteliers understand where the upsell opportunity may be, what's the sweet spot for engaging with customers, how many weeks prior to check-in, what services these guests like, and how to position offers.
Just in year one of rolling it out, to just a small part of our install base for hotels, hoteliers generated $350 million in upsell revenue. That's what they reported back to us from this AI agent. I spoke about the financial crimes AI investigator. We also just rolled out a brand new agent for energy optimization in our Opower stack. It's very early days, but we got immediate feedback from one of our very large energy customers that it saved them $2 million instantly in reduced calls to their call center. So this story goes on and on, across the board.
You know, what's really interesting about the banks too, if I go back to that before I ask Mark for his comments, is the whole stack advantage. I think it's really interesting in banks. We're having conversations not just about how you take all these very old, custom, bespoke applications and everything that's bolted onto them, and how we use AI as an unlock mechanism to move them, but also about what you're gonna do with all of your infrastructure. And we think it's different than our competitors. Now, you've heard us say many times before that we have the widest portfolio of applications, of technology services, of cloud services, GPU, CPU on the market.
And that's been something that we've all been quite proud of, and it's let us be so good in all those industries. One of the things that's sometimes a little suboptimal in that model is that you have a lot of salespeople, a lot of people who are calling on customers about many different pieces. And in the on-premises days, and even in the early cloud days, that worked, because a lot of the buying was by pillar. But now we're seeing a shift in customer buying patterns. And we're seeing customers, as we position the whole stack, really wanting to talk about outcomes and less about all the parts that make up the outcome. In other words, we're selling and positioning to the customer an outcome.
We're selling a 70% reduction in financial crimes. We're selling a 49% reduction in paperwork for healthcare. And in order to do that, we've made a lot of changes about our go-to-market strategy. And I think that those changes are just as important as everything we've done in the product stack so that we can accurately communicate that to customers and engage with customers at the senior level. So, Mark, I'm gonna ask you to kinda walk us through what we've done to address this whole stack advantage.
Perfect. Thanks, Mike. I think AI changes everything. It changes how we develop products. It changes how we deliver products. It changes how our customers consume our AI agents embedded in our applications and our AI database and the capability that exists. It also changes how we go to market.
It's given us a unique opportunity to really transform how we engage our customers and bring the power of our entire portfolio to the industries and the customers that we serve all around the world. There are three key things that I wanna make sure we all walk away with. First, we are making Oracle easier to understand and easier to do business with. We have simplified and unified our go-to-market teams: how we bring the industry suites of applications with embedded agents, the ability to build and extend on our agent platforms, the ability to deploy our database anywhere, and the ability for our customers to take advantage of the AI database with AI-native capabilities in our data platform, or of that infrastructure itself.
And we've done that in a way that allows our customers to work with us in a more seamless fashion. In addition, as Mike was describing, there are the capabilities that come across the entire platform. When we work with our customers to bring our infrastructure, our data, and our applications together, that one Oracle advantage, at scale, creates an incredible opportunity for customers to consume solutions and get outcomes that deliver real value. They're not focused on integrating solutions. They're not focused on deploying differentiated capabilities. They're not worrying about where their data is moving or whether it's secured. When we bring that entire value, we truly bring an advantage to our customers: they can focus on the things that matter most to them, which is the services they provide to their customers or their employees.
It also gives us a unique advantage in helping our customers accelerate their AI transformations with the capabilities we have throughout the portfolio. So let me start with the simpler, unified approach to the go-to-market. At the top level, it's bringing the Oracle brand to our customers. It's Oracle to our customer. It's Oracle to governments. And in some cases, it's Oracle to countries, in how we bring the full capability. That may mean, in certain instances, bringing our cloud to the customer with the full stack capabilities that exist. If we really think about our commercial teams, AI didn't lead us to say we're gonna reduce the commercial teams. It allowed us to redeploy our resources so that we actually have more sellers in front of our customers, bringing meaningful solutions to them.
In our applications teams, we bring our industry suites and our Fusion suites together for our customers. If we're going into a retailer, we bring the entire merchandising suite, capabilities around inventory management, and the back office around financial consolidation and close and human capital. But we also have the ability to bring other solutions. A retailer may be online but also have stores deployed not only in the United States but all over the world. There's our engineering and construction platform, because they need to build out those locations and need project management resources. We bring capabilities around financing solutions, loan and lease applications that these customers need. And they may have a call center. Well, guess what? Our communications business has call center capabilities that allow that customer to take advantage of the inherent AI capabilities.
So when we bring one resource to our customer around our full application suite, they can take advantage of the integrated capabilities that exist across that entire platform, which is a unique differentiation; no other competitor in the industry has that capability. In addition, think about our AI data platform. You'll hear from Larry later about the innovation around native AI capabilities in the Oracle AI database, and we announced and launched the AI data platform to allow our customers to take advantage of all that enterprise data. You know, our applications are data generators. They bring immense value around the incremental data that is generated when a customer buys something, or around an employee and how they work. Well, that data needs to go somewhere.
And in that data platform, the ability to bring the latest large language models to inference off of that data, securely, in the environment where the customer wants that data to be, allows us to help customers transform. In the past, when customers bought on-premise, we had a person selling to them on-premise. We had a person providing and selling them hardware. We had a person trying to sell them cloud databases and capabilities. No longer. It's one person that brings the entirety of our data platform to our customer, giving them choice of where they want to run the database: on-premises, in OCI, in our multi-cloud partner capabilities, and also cloud at their facility as well. In our infrastructure business, it's not just good enough to bring those Oracle workloads to OCI. We've developed a differentiated cloud.
We've gone from an underdog in the industry to a disruptor. Bringing compute, storage, and network capabilities at a differentiated level allows us to be the destination cloud for many customers that have critical, high-networking, compute-intensive workloads. We also become the destination cloud for training and inferencing, where we now have the ability to bring those models to the customers wherever they choose, securely, to inference off of their private data. And again, the opportunity to bring the entire stack to our customers is differentiated; no one else has the capabilities of this entire stack. We've never truly taken advantage of bringing it all together, and we are now.
This is an opportunity for us to continue to execute at scale across the globe in how we engage with every customer across every industry that we serve. When we bring the one Oracle to our customers, we have incredible scale and advantage. Not only do our customers benefit from the integrated solutions and the capabilities, but Oracle also benefits. Let me give you some examples. In our customer base, as you can see here, 55% of customers own a single product pillar. Let me explain that. Let's call that 1x of spend. That may mean they own our database capabilities, or they may have a Fusion application. They may have ERP, or they may have HCM. They may be an infrastructure user and not own the other products. And that doesn't mean they're using all the products in that pillar.
They spend 1x with Oracle. Then customers move down the path and have two of those three pillars: they may actually be using our database on OCI, or using database with Cloud@Customer, or they may be a Fusion user that's using OCI for integration capabilities to other applications. When they use two of those pillars, they spend eight times more than that 55% of the customer base. As you move down the path and go to three, where they're utilizing HCM and our data platform with Cloud@Customer, as an example, they spend 25 times more than the average base. And that doesn't mean they're using everything out of the suite. It doesn't mean they're using the retail solution with merchandising and warehouse management and Fusion across HCM and ERP and CX and supply chain.
It just may mean they're using one of those products. But as you go up the scale, where customers are truly taking advantage of the entire suite, where a utility is taking advantage of our operational solutions that cut across their customer care and meter data management, operating the grid, taking advantage of our energy efficiency solutions, managing their financial capabilities, their customer experience, their human capital, running it on OCI, and taking advantage of our data platform across their business: the 2% of our customers in that category spend 150x on average. That's on average. That means some customers are spending three, four, even 500x when they truly take advantage of the entire suite. That is why we have changed our go-to-market to simplify how we're engaging with our customers and executing.
And it's not only how we're engaging; we're also transforming how we allow them to consume our products. You know, we started this with OCI, and we have a Universal Credit, because customers don't know exactly what they're gonna use out of the products. They may flex in their compute. They may need more network. They may have some different types of storage. Well, we just announced our multi-cloud credit capability for our customers to deploy their database anywhere. It's one price no matter where they choose to deploy. And we're gonna continue to modernize and do the same with our applications, making it easier for our customers to consume and move up the stack and get more value out of our solutions. And these are just some examples.
Yeah. Sorry. I think, Mark, just to clarify too, the customers are actually spending less.
They're just spending more with Oracle.
Correct. Because of what Mike talked about, the single stack advantage and all the engineering putting all the pieces together, instead of customers having to stitch it together and spend the money in between, they get two components, three components, and more, as Mark shared. And in any one of these categories, on a standalone product basis, Oracle is still rated as a top product and capability in that area. So bringing it together does not sacrifice the customer's capabilities; it truly takes advantage of all of the integration that exists across the platform, where customers can blend together different solutions and take advantage of capabilities across industries.
And whether that's in financial services, along with our Fusion platform and our data platform, it allows them to spend their time, energy, and effort focusing on the things that matter most to them: their transformation, their AI deployments. They're not worrying about integrations, not worrying about patching, not worrying about continuing to make sure things are gonna work together. What cloud is it in? Where is it? How does that need to work? We're allowing them to take advantage of the capabilities and the work that we are doing. And here's just a smattering of examples, which doesn't cover every industry that we're in, but you can see it covers a variety: energy and utilities, transportation, logistics, healthcare, financial services, hospitality, communications, high tech. It cuts across every one of these industries.
Whether customers choose to start with the entire platform, like Exelon, and Mike, you had the CEO of Exelon with you on stage on Monday. Or it's AtlantiCare, focused on deploying our latest AI-enabled health applications with a data strategy running on OCI, where they're focused on providing outcomes to patients and not worrying about securing all of their applications and managing that infrastructure. Or it's Navy Federal, deploying the SaaS core banking solutions as well as our Fusion suite, whose CEO talked about the need to have a foundation of data across their entire business running on OCI. Or Avis. You know, Avis has deployed Fusion across their platform. They recently brought their entire data stack from another proprietary environment into the Oracle AI database running on OCI: a trillion rows of data.
Now they're deploying AI agents embedded in Fusion and extending those with the agent platform and, in a very short period of time, realized the importance of how they can quickly tap into that trillion rows of data that they have in the Oracle AI database to deliver outcomes for their operations, their employees, and their customers to transform their experience, all running on OCI. That is the Oracle advantage when our customers take advantage of every layer of the stack.
And whether that's in OCI, where we've gone from an underdog to a disruptor, where we train and inference the largest large language models in the world, with the most secure, scalable, and performant platform in the industry; or our AI data platform, where we have the AI database with native AI capabilities that allow our customers to transform all of their data to be enterprise-ready for AI; or our most complete suite of industry applications. It is a simpler, easier approach to the marketplace. It is truly differentiated; no one else has it. It allows us to accelerate our growth. But more importantly, it is to help our customers transform on their AI journeys. It truly is an unprecedented time, and AI changes everything, including how we go to market.
Mark, quick question.
Early days in rolling this out, what's the feedback been from our customers?
Steve, I know you mentioned you heard some of it as well in terms of our new approach to customer engagement. The feedback has been amazing. The conversation quickly transforms to, "I didn't know that you had all of those capabilities because it wasn't necessarily brought to me in the past, because it was so many different people." Now I can see where you've developed, how you develop, and more importantly, how you can help me drive outcomes. It's amazing to see our customers, even many of the customers that were here speaking this week, learn so much more about what we have to offer and by doing it in a much simpler approach. And we're seeing our pipeline grow.
We're seeing customers choose more of the full suite of our capabilities, and they're selecting to run it on our infrastructure as well.
Perfect. Mark, Steve, thank you very much.
I've just heard that feedback about simplicity too. So, yeah, it's great. Great.
Thanks very much, guys.
Thank you. All right.
Thanks. Thank you.
Okay. So, we're gonna take a deeper dive into health, and then also at the end talk about what health has to do with our banking business, which I think is quite interesting. So to talk more about healthcare, I'm pleased to welcome Executive Vice President and General Manager of Oracle Health Globally, Seema Verma. Welcome. Thank you. So, Seema, healthcare continues to be front page news everywhere in the world, usually not for great reasons.
You have an interesting perspective as the former administrator of the largest payer in the world, which is the United States government, for healthcare. And for several years now, you've been engaging with both our commercial customers and our government customers around the world. You know, what's going on? What's top of mind for everybody in terms of the challenges and problems with costs and, you know, all the other things we read about?
So, you know, I've been in healthcare for a long time, but I think this particular period is becoming more challenging. And let me just start with some high-level issues that I don't think we contemplated, not in this country or anywhere in the world.
First, it starts with the aging population. People are living longer, which is a good thing, but we really haven't prepared the system. We don't have the workforce, so that's becoming very challenging, and I think most of us would say, "Hey, it's hard to get a doctor's appointment," so just getting into the system is hard. The other part of it is we're seeing more disease. We're seeing it in terms of mental health; we're seeing cancer happening at younger ages. So the system in and of itself is very stressed. At the same time, we're seeing costs just continue to rise, and for governments, this government and all governments around the world, this is just becoming unsustainable. If we look at healthcare costs for states and a state budget, Medicaid is actually their largest budget item.
For the federal government, it's, you know, a third of the federal budget. And these are entitlement programs. So costs are continuing to go up. And I think there's a fatigue in government around how do we solve these problems. If you look at the history, just in the United States, of healthcare legislation and how government has tried to solve these problems, they've thrown money at the problem, right? They've expanded programs, they've added more services. Think about the Part D program, the prescription drug program, adding more people to our existing entitlement programs. I think we're getting to a tipping point of fatigue with how the system operates.
And I think, for the very first time, what you saw with the legislation that passed, the big, beautiful bill, is kind of a pushback, right? For a long time, we've been adding, and this is the first time there's sort of a pushback, where I think the government is saying there's fatigue. If we look at the U.K., it's the same thing with the NHS, right? It's like, we can't continue to sustain these programs. And so we think about what's going on and what the underlying problem is here. There are lots of reasons for what's driving costs. But one of the things I think we could all agree on is that there's a lot of inefficiency in the system.
And that inefficiency ties back to the fact that we have a lot of people. This is a very people-oriented business, doing a lot of work, and a lot of that work is very manual. It's very redundant. And if we go back and ask why it's manual, why it's redundant, I would venture to say a lot of this is a data problem. Healthcare is inefficient because the data that we have is not right: sometimes it's too much, sometimes it's too little, and we don't have data in the right place at the right time. So I'll give you a couple of examples, like prior authorizations, right? In this country, if you wanna get a service, you need to have the insurance company agree that this is an appropriate service.
And the back and forth that goes on takes a lot of time, a lot of energy. So we're spending about 25%-30% of our healthcare dollar just on administrative costs. It's just getting to a point where it's unsustainable. Just this past year, I think we're hearing that premiums are at an all-time high in the United States, one of the largest increases. So for all of the money that government has put into it, we're still seeing poor health outcomes, we're seeing more disease, and the system is just becoming more and more unsustainable.
Well, lots of problems.
Yeah. Yeah. Yeah.
Long, long list of problems. But as I say, you can't fix the problem unless you understand the problem. So maybe you can give us a little bit of inspiration.
Right.
And dive into how you think AI can help, both in health and life sciences, I mean, because both have similar challenges, and why you're so excited about the work that you've been leading for our team around AI, why you think it can help solve these problems.
Yeah. So, you know, that is kind of a grim view of what's going on in healthcare. But I'll tell you what, I couldn't be more excited for where we are in healthcare, and that is the opportunity to use AI to help solve some of these challenges. I think we're at a point now where government can't do it, and they really do need the private sector. So we think about, you know, great moments in history, right? So we think about what shapes history.
A lot of it is government, a lot of it is war, but, you know, it's also technology, and technology has an opportunity to really reshape the future here for health. And I can't think of a better industry that can really benefit from AI because AI can help. We talked about a lot of the people problem, the data problem, and AI can really come in and help with a lot of those tasks, right, with agentic AI. But even just the understanding of the data and having that real-time data, and not just a ton of data, but understanding what it means, right? So one of the issues, if we think about it in terms of healthcare costs, is drug costs, right? Drugs are so expensive because it's very difficult to conduct a clinical trial.
And we have what, 1%-2% of the population even participating in clinical trials. Getting people, finding people, matching them to a trial is an enormous task. It costs the system a lot of money. And that's why drugs cost so much money. There's a ton of money in development. But what if we could use AI to match patients, right? So the doctor looks at the patient and knows right away this patient qualifies for a clinical trial, and we can easily enroll the person in a clinical trial. Same thing with prior authorization. The doctor decides they wanna prescribe a certain test or a medication. That whole process can be completely automated.
So I think there's this incredible opportunity to automate a lot of the manual work and also to make sense of the data that we have and bring intelligence to the bedside.
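As an aside, the trial matching described here can be sketched as simple eligibility filtering over structured record data. Everything in this sketch is hypothetical — the record fields, the trial names, the criteria — and a real system would combine rules like these with model-based extraction from unstructured clinical notes:

```python
# Illustrative sketch only: matching a patient's structured record against
# a trial's eligibility criteria. All field names and criteria are made up.
from dataclasses import dataclass, field

@dataclass
class Patient:
    age: int
    conditions: set = field(default_factory=set)
    medications: set = field(default_factory=set)

@dataclass
class Trial:
    name: str
    min_age: int
    max_age: int
    required_conditions: set
    excluded_medications: set

def eligible(patient: Patient, trial: Trial) -> bool:
    """True when the patient meets every structured criterion."""
    return (
        trial.min_age <= patient.age <= trial.max_age
        and trial.required_conditions <= patient.conditions
        and not (trial.excluded_medications & patient.medications)
    )

def match_trials(patient: Patient, trials: list) -> list:
    """Surface candidate trials at the point of care, for clinician review."""
    return [t.name for t in trials if eligible(patient, t)]

patient = Patient(age=58, conditions={"type2_diabetes"}, medications={"metformin"})
trials = [
    Trial("T2D-GLP1", 40, 75, {"type2_diabetes"}, {"insulin"}),
    Trial("CHF-BETA", 50, 80, {"heart_failure"}, set()),
]
print(match_trials(patient, trials))  # → ['T2D-GLP1']
```

The point of the sketch is the shape of the workflow: once the longitudinal record is structured and in one place, screening every patient against every open trial becomes a cheap automated pass rather than a manual chart review.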
Yeah. No, I certainly share your enthusiasm. I mean, I've had the great honor of being with you in many of these discussions with governments throughout the world, as well as large healthcare organizations, well, and frankly, small healthcare organizations too, in the United States, and you know, I think you're right. I think at first there was some skepticism around AI, as there always is in clinical settings.
But now, just with the, you know, the proof points that we've delivered and the things that we've delivered, even in early days, the line of people lined up to consume the AI is longer, and has formed more quickly, than we could have ever imagined. And maybe you could talk specifically about some of the products that we're bringing to market and, again, why we think they have such a dramatic impact on this huge inefficiency and cost problem in healthcare.
Right. I think you're exactly right though, right? People, I think in the healthcare system, they're getting to a point where they understand that the problems that they have are not going to be solved by government anymore, right?
There's not gonna be this big bolus of cash that solves the problem, and they need to become more efficient, and there's a lot of enthusiasm around AI and what AI can do. I think there's also recognition that the technology that they've had in the healthcare system isn't going to cut it, right? You can't take a 1990s database and think that you can bolt on AI and that it's gonna work well. So the industry's been so excited about all of the applications that we're bringing to the market. At our recent healthcare conference, we had the CEO of the Mayo Clinic, the CEO of the Cleveland Clinic, and just a lot of enthusiasm about what we're building,
and by the way, those are some of our competitors' two biggest customers.
That's exactly right.
I think that our competitors' customers are starting to understand the power, the value of AI, and that they need a platform that can allow them to use AI to build their own agents, to bring in other agents, and the current products that are out there in the market are not quite cutting it, and they're not able to do it. I think the other thing that I really appreciate, and we can talk about the products in a second, but I think that our approach and Oracle's approach to AI and how we are using AI just in the last few months here has become very apparent that what we're bringing to the market is way superior. So you can probably tell them more about how we're building the AI and how that's better.
Yeah.
Well, we're gonna show a demo here maybe in just a minute. But as I said earlier, you know, in a way, you've gotta have the data in one spot to actually have you know any kind of an AI strategy, let alone in the clinical setting. And what you said here was really important. I think it's important to underscore that there actually isn't more money to put into the healthcare system. This is true globally. I mean, you know, governments and payers are running out of money to put into the healthcare system. At the same time, we need to deliver higher quality services to our customers. And the providers need to deliver higher quality, better outcome services, better outcome-oriented services to their patients.
But they gotta take a bunch of money out of the system at the same time. It's an incredible challenge. There really is no practical solution to that except for AI. I mean, we're hundreds of thousands of people short in terms of clinical providers just in the United States. We're not gonna manufacture, with the birth rate where it is. I don't think we're gonna manufacture enough people very, very quickly here to solve that problem. So we've been thinking long and hard about how do we solve this problem? How do we deliver better outcomes? And how do we actually take cost out? How do we help our healthcare customers spend less at the same time? And that's an important part of what we do. So we started. I'm gonna show a couple examples.
Yep.
Both the patient example as well as the provider example. This is a patient example. Here I am as a patient, and I just got my cholesterol report. What's the first thing you do when you get a lab result in isolation? You go to your doctor, you get a test, it comes back. First thing you do is paste it into some search engine or paste it into GPT or some other large language model and say, "What does this actually mean?" That's the first mistake, right? Because you're only pasting that particular lab and that particular metric in isolation and none of your longitudinal record.
So the lab result, while it may not be normal, might still be okay clinically when you consider everything else you have going on and your other factors over time. So what we're doing is we're actually putting patients at the center. And this is another problem. This has gotten even worse because a lot of states have these prompt notification laws where you actually get your test results sometimes before your doctor does, which can be a little scary, right? You know, because if you're not a doctor, you don't know how to read this, it's a little scary. And then, you know, we got this situation where people are Googling this stuff. And what happens is that the doctors' call centers are completely overwhelmed as a result.
So when we're actually doing this, and this is based on our integration, we announced this at our health conference recently in Orlando. This is based upon, coming back to what Clay said, it all starts with OCI. If we didn't have OCI and we didn't have the large language models actually running in very close proximity to our health applications, we wouldn't be able to do this. But when you ask a question about a result, we're actually considering your entire longitudinal health record. We're considering everything about you, and we're giving the providers guidelines as to what they're comfortable and not comfortable communicating to customers. And it's all in plain English. None of it is in medical jargon. So you can see that with these embedded engagement tools, the patient portal really becomes a basic hub for healthcare services.
It's a trusted partner in care because it's automated using large language models. It's automated using RAG, taking your longitudinal health record and making the searches contextual, but it also has all the guardrails of the clinician in it, so that we're not gonna get into a situation where we're communicating something clinically that a doctor would not wanna communicate themselves. And I can speak to it. I can talk to it. It's all completely AI powered. You can see here, I started to type, I started to talk, and I'm asking very simple questions. How does this compare to my last test? Questions in English, not questions about LDL versus HDL versus VLDL and all these other things. What does this metric mean? What does that metric mean?
Just speak in English. This is very helpful for patients, but it also saves the providers a ton of time and a ton of money in answering direct questions from consumers in isolation. As I move forward, the integrated clinical assistant is also gonna guide the patient on the content that matters most. So you get this first, you wanna figure out what's wrong, if anything at all. The next question you have is, what should I do? What should I tell my doctor? What am I supposed to ask? You ask a very simple question: What should I ask my doctor? And the generative AI is gonna help the patient understand and actually make recommendations to say, this is how you should communicate with your doctor. These are the questions you should ask.
Particularly helpful sometimes for patients who have this situation where you go into a provider, the provider has 10 minutes or 15 minutes, and you get out and say, I forgot to ask that. I meant to ask this. I meant to ask that. So all of this is done before the visit. And then it actually drafts a note and says, would you like me to draft a note to the doctor so you don't forget about this when you go into the provider? When you go into your next appointment, the provider has all of your questions ahead of time, has all of your contextual information, and patients are in a state of far less worry because they've got this whole process automated.
Again, we rolled that out, and without all the infrastructure, and without the, you know, the GPU infrastructure and the large language models running on top of it, we wouldn't be able to deliver that whole thing as a service.
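The RAG-over-the-longitudinal-record pattern referred to here can be sketched roughly as follows. This is a toy illustration, not Oracle's implementation: the record format, the crude term-overlap retrieval, and the guardrail text are all assumptions standing in for embeddings, a governed LLM, and clinician-configured policy:

```python
# Minimal sketch of the retrieval step in a RAG pipeline (hypothetical
# record format). The idea: a patient question is answered against the
# whole longitudinal record, not a single lab in isolation, and the
# prompt carries clinician-set guardrails.

RECORD = [  # simplified longitudinal health record
    {"date": "2023-01-10", "type": "lab", "text": "LDL cholesterol 142 mg/dL"},
    {"date": "2024-01-12", "type": "lab", "text": "LDL cholesterol 128 mg/dL"},
    {"date": "2024-01-12", "type": "note", "text": "Started statin therapy"},
]

GUARDRAILS = ("Explain in plain English. Do not give a diagnosis; "
              "refer clinical questions to the care team.")

def retrieve(question: str, record: list, k: int = 2) -> list:
    """Rank record entries by crude term overlap with the question."""
    terms = set(question.lower().split())
    scored = sorted(
        record,
        key=lambda e: len(terms & set(e["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, record: list) -> str:
    """Assemble the grounded prompt an LLM would receive."""
    context = "\n".join(
        f"{e['date']}: {e['text']}" for e in retrieve(question, record)
    )
    return f"{GUARDRAILS}\n\nPatient history:\n{context}\n\nQuestion: {question}"

print(build_prompt("How does my cholesterol compare to my last test?", RECORD))
```

The key design point from the transcript survives even in the toy: the model never sees the question alone, it sees the question plus retrieved history plus the clinician's communication policy.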
Yeah, and I think it's an important piece, right, so we're looking at this portal, and let me kind of go up one level higher, right? Which is to say the engagement by us bringing these tools together, by bringing AI directly to the patient, right? What we're gonna see with our portal is engagement at a level that we don't normally see with most portals. Today, most of us have four or five portals on our phone because our healthcare data is in many different places. So, you know, the point that you made about bringing together the longitudinal record, that is not happening today.
We have our data in all these different places. So by Oracle bringing it all together and then also bringing AI, what we expect to see in our patient portal is a high level of engagement. And then we don't wanna just, you know, keep that to Oracle and our providers' customers, but we wanna open that up as well. So we're gonna allow the healthcare ecosystem to be able to connect directly with our portal as well. So we sort of see in the future that this portal becomes your go-to place for all of your healthcare information. You should be able to connect it with wearables, different devices, maybe Weight Watchers, whatever it is that you wanna connect your healthcare data with. So it kind of becomes a one-stop shop for the customer.
And we think, you know, we've been talking to a lot of the other healthcare companies out there, and, you know, they wanna partner with us because they wanna be able to have a place where they know patients are gonna be engaging. One of the biggest challenges for healthcare providers is that patient engagement. So I think that's the power of the portal that we're bringing.
Yeah. It's been stunning to see the engagement, you know, before and after, with this new breed of AI technology. And on the same token, we're rolling out the same thing for providers. Right. So if we flip the script now and you're in your doctor's office, one of the first questions the doctor asks is, well, what can I help you with today?
And you sort of feel like saying, well, aren't you really supposed to know? I mean, shouldn't you kind of know that before we show up here? And oftentimes a doctor has the same problem. There's an overwhelming amount of information. They can't read everybody's chart. They can't read everything. And they actually wanna have that same consumer-like experience with the application. They wanna ask a very simple question. What should I know about William before I walk into the room? And that's the simple question that they ask. And now they get a summary here of the patient's last visit, of things that may be happening with social determinants of health, you know, not just medical records, but everything we know about that patient. Briefly summarized so the doctor walks into a far more engaging interaction with this customer.
And this is a preview of, well, this is actually generally available now. This is our new electronic health record. And you can see, just like we saw in the financial crimes investigation agent, where you have these timelines, that at any given point, I can click on this and see what's happened with this patient over time in a very easy-to-use screen. This all runs on tablets. It runs on mobile phones. The form factor automatically resizes itself. And again, it's all built with AI built in. There is not a non-AI version of this. The AI is the underlying mechanism that powers the entire EHR. We just, as I said, went through regulatory approval. And you're gonna see the same benefit that the consumer gets. The doctor gets the same as well.
The doctor wants to know about A1C trending over time. They have the same problem where you've got a stack of PDFs and a stack of papers that are very difficult to get longitudinal information from, and therefore it's very difficult to have an intelligent, contextually aware conversation with patients. All of this is now available, and if you'd like, as a provider, all the screens are really not necessary because all of it can run in the background with a listener, and that's how most of our providers have actually chosen to adopt the new technology. You simply place the listener in the room. The listener can be a mobile device, and the whole thing is automated. The whole conversation between patient and provider is automated. The orders, the labs, the summaries, the patient discharge notes.
And just like our financial crimes investigation AI agent, at the end, the doctor still has to sign off on it. It's just an orders-of-magnitude different experience for patients and providers with healthcare systems. And, you know, we're just.
Yeah.
Hugely excited by it.
I think it's a game changer for the industry, right? So let me just go dig in a little bit deeper. You talked about the patient summary, right? And that's a short summary. But think about what it's like for the doctor. They're, let's say, dealing with a 70-year-old patient, right? There's multiple comorbidities. We think about the Veterans Administration that we're working with, right? And they're gonna extend the Oracle EHR across their entire enterprise. And this is really important because you can think about the complexity.
And they're having to go through sometimes thousands of pages of healthcare information. And a lot of times things get missed, or we're not addressing something. The data's there, but the physician may not see it. They're working in a very short period of time. So the ability to have that summary is a game changer. Yep. And like you said, the SDOH, all of those different things. And the automation, I think the other piece that's important is autonomous coding, right? So the other thing on the screen there is that it'll recommend which codes. And this is an important piece 'cause, if we think about it, for the healthcare industry, we're spending 25%-30% on administrative costs. A lot of that is around billing, and the cost to collect for providers is about 5%.
So all of that money just to be able to get the bill paid. And we're automating that entire process. So if we think about it, the whole billing process, at least in the United States, was built off a paper system, right? And then when we went to digital records, we just basically took that process and digitized it. But with AI, we don't need to do all those things. We don't need all of the different middlemen to be able to get this claim paid, right? We can actually do all of that with AI. And so it's almost a misnomer to call our EHR an electronic health record, 'cause it's really a system of intelligence. And it goes way beyond just the patient-doctor interaction.
We started there, but we're trying to address all the problems across the entire ecosystem, so that autonomous coding piece is really solving a lot of the friction between payers and providers. Not only are we gonna try to make, you know, claims payment easier, but even just the prior authorization, all of those things that happen between the payers and providers, we're introducing those solutions, and it's ironic that it's actually the payers that are coming to us and they're saying, look, you know, after the big change event, there's a lot of concern about security, and so being able to have the Oracle Cloud in the middle of this helping with those transactions is really important to the industry. It's safer, it's more secure, and also we can use AI to automate a lot of those processes, so that's one piece on the payer side.
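The autonomous coding idea mentioned here — suggesting billing codes from the visit documentation, with a human signing off — can be sketched very simply. The code table, the note text, and the phrase-matching approach below are all hypothetical; production systems use trained models over the full ICD/CPT vocabularies, but the workflow shape is the same:

```python
# Hedged sketch of "autonomous coding": suggesting billing codes from a
# visit note for human review. The lookup table and note are illustrative.

CODE_TABLE = {  # illustrative fragment of a diagnosis/procedure code lookup
    "type 2 diabetes": "E11.9",
    "hypertension": "I10",
    "annual wellness visit": "G0439",
}

def suggest_codes(note: str) -> list:
    """Return (phrase, code) pairs found in the note, for coder sign-off."""
    text = note.lower()
    return [(phrase, code) for phrase, code in CODE_TABLE.items() if phrase in text]

note = "Annual wellness visit. Hypertension stable on lisinopril."
print(suggest_codes(note))  # → [('hypertension', 'I10'), ('annual wellness visit', 'G0439')]
```

Even this toy shows where the administrative savings come from: the expensive step today is a human reading the note and picking codes from scratch, and automation turns that into reviewing a pre-populated suggestion.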
And then there's a whole other piece with clinical trials, right? So if we think about it, most clinical trials and a lot of research are only happening at big, large academic centers. So if you're lucky enough to go to one of those, you may get connected with a clinical trial. If not, you know, you never know what's gonna happen. And so what we're able to do with AI is the matching into clinical trials. But where we're going is to be able to use that EHR to actually do the clinical trial right inside the electronic medical record. Today, with clinical trials, what's happening is that they have to go to an entirely different system. So they actually physically move the data from the electronic health record into the electronic data capture for the clinical trial.
So you can imagine a lot of the back-and-forth work and the retyping, and you gotta have somebody check in all the information. So, you know, our electronic health record is not only providing payer solutions, it's also able to help clinical trials and research on that end of the spectrum. So it's doing more than just, you know, helping the doctor and the patient. It's a much broader view.
Yep. Well, that's terrific. I mean, and thank you for giving us hope and for sharing the solutions because I really do agree that, you know, bringing this whole consumer-like experience to healthcare, both for patients and providers, is what is going to be game-changing.
And obviously your passion and knowledge in the space has been a big driver in helping us get to this point. You know, something else interesting happened while we were bringing all this data together for healthcare providers. And that is, we actually started to realize that there was more utility in the data, in the collection of data on the AI data platform that we've used to power these healthcare systems than just for the providers themselves. But actually, as we've become, you know, intimately close to these providers, we realized, particularly in the United States, that a lot of healthcare providers have cash flow problems. And, for those that are publicly traded, when they report out their cash flow in days.
I mean, report out their available cash in days, not weeks, months, years, days. That's how much cash they have on hand. And they've gotta make decisions around payroll, you know, equipment for the ICU. These are real decisions that very big healthcare systems have to make. We looked at the data that we had, and we thought, you know, this might be interesting, and we asked questions about how they finance these systems. And they said, well, the banks don't really know a whole lot about what we do. They don't know about our receivable flow. And it's not that they had a receivables problem; they actually just have a cash flow problem.
It's not that they're not gonna get the money, but when they're gonna get the money is an acute problem, no pun intended, for healthcare systems. So we see the same kind of transactional flow happening across the board in our whole applications. You can see some of the stats there, but we particularly leaned into a couple of industries, and we're gonna talk about a few of them here, where the traditional financing models are very complex, and they really are complex because the bank, the financer in this situation, does not have access to real-time data, and the data that they do have access to has very little qualitative information associated with it.
So through the course of conversations with the healthcare providers, the construction companies, we're gonna talk about a few different examples here, and the banks, we actually found out that the AI data platform is a wonderful vehicle for banks to learn a lot more about their customers and for customers to have a much better relationship with their banks. So we're gonna talk about embedded applications and why they benefit the banks and also the business, or in this case, the healthcare provider. So with that, I'm very pleased and honored to have Lia, managing director and global head of payments and embedded finance at JPMorgan, and Jeff, managing director and head of global lending for trade and supply chain finance at Bank of America, come talk with us more about this exciting platform.
Hi, Mike.
Lia, Jeff, welcome.
Yes. Thank you. Thank you.
So thank you, Lia. Thank you, Jeff, you know, for being here. I gave a very brief introduction to embedded finance and, you know, a little bit about healthcare, but I know there's more than healthcare involved. Jeff, maybe just go down the road. I'll start with you. Can you give us a little bit of a preview as to why this is so exciting, you know, and what we're actually doing here together to better serve our collective customers?
Yeah. So, Mike, as you alluded to before, if you think about healthcare providers, they're clients as well.
And, for years, you know, they've had working capital issues, and they come to their bank and say, "Look, we've got a working capital issue." We look at the model and we say, "Okay, it's not a payer problem. There's no shortage of customers," and what you realize quickly is what you have is just a working capital problem or a cash flow problem, and if you think about the way that banks help our clients solve these things, you know, each industry becomes kind of a little bit different because it's got its own characteristics, and we think about healthcare. Healthcare, the receivables can be hard, so the banks can do certain things, certainly have credit lines. You can, in some cases, hospitals can access the municipal bond market.
But some ways that we traditionally help to free up cash flow and accelerate working capital are difficult for the banks because we don't have what we need in terms of like, "Okay, who's gonna pay this and when and how much?" And for years, we chased that data, right? And, you know, Seema talked a little bit earlier about the data that's being followed to get a better health outcome. Well, we at the banks are sort of chasing that same data. Somebody could think, "Well, what does the bank care about a certain procedure and how much of that gets paid and when?" And the reason that's important to us is because if we've got this receivable and we're trying to accelerate cash, we need to know what's gonna be paid and when.
And it became clear to us as we went through this process, and Mike, you alluded to this earlier, that some of the data that you had aggregated, consolidated, and then were very good at being connected to was exactly the same data that we would need to provide additional working capital options. And as we thought about that, we said, "Okay, well, if we could do that, then we could layer in some of the AI predictive models on top of that, to be able to say, 'Okay, well, now we think we know with a bunch of procedures that happened what the expected payment will be and when,' right?" And any of you that have experience with healthcare know that the original invoiced amount has very little relationship with what's actually paid at the end of the day.
So we needed a way to get from point A to point B. And it became clear to us also as we saw that the availability of data that was coming in that Oracle had access to, not only that, but also you were very good at layering on that AI at the end of the process to say, "We're gonna give you a predictive model that tells you what's expected to be paid." And that solves our big problem, which is we don't know how to get from point A to point B in terms of what's invoiced and what's paid. And by doing this, we're able to help accelerate cash.
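The predictive step Jeff describes — estimating what a receivable will actually pay, and when, given that the invoiced amount has little relationship to the final payment — can be sketched with simple historical realization rates. All numbers and payer names here are hypothetical, and a real model would be trained on claim-level history rather than plain averages:

```python
# Sketch of expected-payment prediction for healthcare receivables
# (hypothetical data): for each payer, historical (paid/invoiced ratio,
# days-to-payment) pairs drive an estimate for a new invoice.
from statistics import mean

HISTORY = {  # payer -> list of (paid / invoiced ratio, days to payment)
    "payer_a": [(0.42, 38), (0.45, 41), (0.40, 35)],
    "payer_b": [(0.65, 60), (0.70, 55)],
}

def expected_payment(payer: str, invoiced: float) -> tuple:
    """Return (expected paid amount, expected days to payment)."""
    ratios, days = zip(*HISTORY[payer])
    return round(invoiced * mean(ratios), 2), round(mean(days))

amount, days = expected_payment("payer_a", 10_000)
print(amount, days)  # → 4233.33 38
```

This is the "point A to point B" gap in miniature: the invoice says $10,000, but what a financer can actually underwrite is the predicted realization and timing, which only falls out of transaction-level data.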
Thank you. Your perspective, Lia.
Yep. Perfect. Great to be here.
So, you know, at JPMorgan, we are partnering with Oracle to embed payment services and financial services directly into the Oracle ecosystem, so that, you know, in Oracle solutions like the industry applications, Oracle can provide those payment services and relevant financial services to their clients directly, in a very seamless, integrated, and scalable way. And I think for the banks, in my view, it's a very meaningful evolution of how financial services are accessed and distributed, right, really through a platform approach. And by doing so, I think there's tremendous benefit to our clients and to Oracle's clients as well, right? Think of it as a very streamlined, automated workflow, so that the end clients have ultimate data and visibility and transparency into where the payments are.
And then to Jeff's point, also, the data, you know, plays a huge role here in delivering those benefits, especially in our strategic partnership. As I think about it at JPMorgan, we, in the payments division, right, we process more than $10 trillion of payments every single day. So we have a tremendous amount of data around, you know, who pays whom and when and how and where, right, in which currency. And that, coupled with the enormous amount of data in the Oracle, right, industry applications, with that partnership, we know a lot about every single client. And I think that is a very powerful way to serve our clients, to have that intelligence to serve our clients.
Yeah, well, that's very helpful. Thank you.
You know, Jeff, when we first started talking about this partnership, you know, you gave me some stats about how much money was sitting on the sidelines in each one of these industries waiting to invest. But the problem was that, you know, the people willing to invest didn't know enough about the industry and couldn't get enough of the qualitative metrics to feel comfortable about this investment. So this changes that, we think. But ultimately, how does it benefit you as a bank? What's the ultimate benefit to the bank? How does it change your unit economics? How does it change your investing philosophy?
So we can kind of look at it along two vectors. So the first is we've got clients who are looking for better working capital management, so we can help them with that, right?
I think that's the first thing, right, as we serve our clients. And we think that's a good thing because, as you know, as you both have noted, you've got hospitals in the United States alone that are struggling financially, right? And it's not because of lack of customers. It's not because of lack of high-quality payers. It's for free cash flow and working capital reasons often. So we can help. The second thing is part of our business is connecting our clients to investment options, right? As we think about this, we think about the opportunity to take some of what is out there, predictable receivables in a way that hadn't been there before with high-quality payers. We can translate that into investment opportunities for the capital markets.
Now, I can tell you the capital markets have, to your point, Mike, been waiting on the sidelines for great investment options that are always looking for things that are, you know, all over the risk profile. And for us to have ways to be able to wrap things like receivables and say to them, "Look, not only do we have this investment option for you, but it's coming from a bank in a regulated environment with a risk management profile under a regulated environment, and we've got it, and we're putting it out into the market," that means something to an investor, right? It means, okay, so there's been work that has been done behind the scenes to get us to the point where we think we have an investable asset.
And that work, again, I think the reality was five years or six years ago, there just wasn't enough out there. We were chasing data in a thousand different ways. And people underestimate, I think, what I always call the mechanical side of that. You know, we can do credit analysis pretty easily, but what we can't always do is say, like, how will the mechanical parts of this, where is the receivable being created, where is it housed, how will it get from wherever it sits to the bank system so that we can then do something with it and wrap it and send it to the markets? All of that stuff is impactful to an investor, right?
A smart investor looks at all of that and says, "Look, I don't wanna take my chances that this receivable becomes something else in the course of its lifetime." And I think as we track just historical data, performance data, you know, all of the data that becomes available, all of that starts to become solved, and then we've got investable assets. So we've got on one side of it, we're helping our clients be better payers. And on the other side of it, we're creating investable assets for our clients.
So is it, are these, are these investment products that you can create as a result, are these products that you previously couldn't create or you had to create with a different rating or different risk profile associated to them?
You know, it's a really, it's a really good question. So we're in the business of pricing risk, right?
So part of that is, as I said, the credit risk, which is easy, but the other part of it is just as important. And that's, you know, people underestimate that side of it. And I think the reality is if the receivables were really hard, the banks just kind of sat on the sidelines. So as you think about just some industries like healthcare, like construction, where there was uncertainty of payment from a bank's perspective, 'cause remember, by the time the data got to us, we only had access to small portions of it, right? So all of it just looked like it was very unreliable. So in the context of a regulated, risk-managed bank environment, there just wasn't a way we could get comfortable with the predictability of outcomes.
So for us, you know, we parked that part of our toolkit, and then we moved on to what we thought we could do, right? Create a facility or a municipal bond market or whatever it might have been. But that's the difference between what we had before and what we have now.
Yeah. I think as we've talked, you know, Jeff, it's been quite interesting to understand the bank's philosophy on all this. And we're gonna talk about construction and retail and more than healthcare in a second, but it's not that the bank is necessarily interested in whether Medicare is going to reimburse the provider for a hip replacement.
What you're more interested in is making sure that person doesn't get readmitted to the hospital in the next 30 days with an infection as a result, because that actually changes the reimbursement from Medicare to a value-based arrangement. That's the level of data that you need in real time. Obviously, for one patient that's easy; for millions, hundreds of millions of patients, scaling that over time is very difficult to do without an AI data platform, you know, constantly monitoring and scanning for anything that could change the risk profile as well.
This is to your point, you know, people might think to themselves, "Why would a bank be interested in that?" We're very interested in that because it's predictability of outcome. It's the predictability of us getting repaid and an investor getting repaid. So it's very important to us.
So, Lia. Mike, I was just gonna add one more thing, right? Go ahead. So before we leave healthcare: the government in and of itself is now moving more towards those value-based payments, more capitated payments, where providers are not getting paid in real time; it's more after a year in which you've taken risk. And so for providers, the need for cash flow has really increased dramatically. So I know our providers are very happy about this solution as well.
Yeah. We could go on for a very long time just on healthcare alone; there are just so many interesting dynamics. But, Lia, I'm gonna shift gears for a little bit. Retail, you know, restaurants.
Yep.
Construction, all of these other industries where we're managing all of these transactions, whether it's point of sale or payment or money changing hands, using the Oracle platform technologies today, across so many different industries. How do you see the different needs in embedded payments across these different vertical industries? How do you see that evolving, and why is the automation so important to you?
Yeah. No, great question. I think, across all the industry verticals, there are common themes, right? No matter which industry vertical, we want the payments to go out in a very secure way. And we want, or Oracle and the clients want, the transparency: where is my money? If something goes wrong, they wanna know where the payment is stuck, right?
So I think there are foundational basic needs of security, transparency, and optimization of the payments; those are universal. Then to your point, for different industry verticals there are nuances. We talked a lot about healthcare. Seema, you mentioned it's a very complicated ecosystem with payers and providers and regulation and HIPAA compliance, so that really requires a lot of care in the payments handling, right, for data privacy and whatnot. Think about another vertical, consumer retail, which you mentioned: it's evolving really fast and is already omnichannel. In-person, in-store experiences, in-app purchases, online. How do we deliver that payment experience in a holistic way? And today we talked a lot about AI, and people talk about agentic commerce.
And in the future, maybe the agents will do all the purchasing and whatnot on your behalf. Then we need to think about fraud and authentication and liability shift. And there are all these very interesting business models, current and future, that are evolving very rapidly. And I think with the power of data combined, you know, in our partnership, and with the AI models on top of the data, that's what I meant earlier by payment intelligence, right? How do we deliver that insight to our Oracle clients so that they know the fraud prevention, they know how to position cash, how to really mobilize all the transactions to add value to the ecosystem?
Yeah. And so in some ways, this automated embedding into the applications becomes another distribution channel for the bank, a compliant distribution channel. We know a lot about the customers. We can automate KYC, all the things that need to be done. We've got all that data available as part of our industry applications, so it just becomes a wonderful distribution channel, and frankly, far easier for consumers to consume because it's just built into everything that we do.
Totally. It's like a pre-built, enterprise-grade banking and payments core that's already compliant and extensible. Yeah. So I think that is really where we're headed in co-creating this model.
Well, we appreciate it. So I'll ask you both the same question.
We're not the only technology vendor in the world. Why partner with us? Why Oracle? What's special about this relationship between Oracle and the banks?
You know, look, you go with what works, right? And as I said, we've spent a lot of time chasing data to try and predict outcomes. We're in the business of pricing risk, and predictability is a big part of what we do. And I can tell you we've burned a lot of calories on this in the banking world trying to figure out how we can get from point A to point B in certain industry verticals.
You know, one of the things, as I said earlier: when each sort of financing opportunity grows up in a certain industry, it becomes bespoke to that industry to a large extent. And it's hard to move from one to the other, right? You have to have a level of expertise in each industry. And what we realized, I think, about Oracle is you're doing a lot of the really hard work. We don't always need to know everything; we talk about healthcare, and following data to get to a better health outcome is harder than what it is that we need to do. At the end of the day, we just need predictability of payment. That's a part of it, but it's not the hardest part of it, right?
So you guys are doing the hardest work, but a lot of that is stuff that's really, really useful to us. And what we found is it doesn't look that different in healthcare than it does in construction or in retail. So not only have we been able to solve and unlock some of these industry verticals, we've actually been able to migrate from vertical to vertical, because we realized that predictability of outcome in an Oracle-built model often looks the same across industries, and it has the same usefulness to us.
I think it's a terrific point, Jeff. It's like we're not building a financing vehicle that is specific to an industry. We're building a platform that has industry context built into it.
It's gonna give you the data that you need, but it actually scales across all of the industries that we serve, right?
And that's not what we expected at first, right? We expected to kind of stay industry by industry. And it was only after going down this road a bit that we realized this is more scalable and malleable than we thought it was, which I think is gonna be a really good outcome, right?
I think for us, the partnership with Oracle is really critical in that we share a lot of common values, right? I think, Mike, you talked about that in your keynote yesterday, at the beginning of the conference: scalability, security, full-stack advantage. Those are things we look at on JPMorgan's platform; we share the same values.
And then we think that by partnering with Oracle, it's a force multiplier, right? The scale is just gonna be that much greater: even more global, more scalable, more extensible. And the security is a really critical, critical component. I think you also mentioned that whenever you add a new component into the system, there's risk. By integrating the two global platforms in a very digital, API-driven, modular way, I think that creates this tremendous platform and infrastructure layer to serve our clients even better and more seamlessly.
Yeah. Well, that's much appreciated. And we do resonate with the shared values. I've visited both of your organizations in person many times now, and even been on the trading floors.
I can tell you that the passion for this access to real-time data is very contagious. So, much appreciated. To be clear, we're going to market together right now in construction and restaurants and healthcare, and we certainly plan so much more to come. I'll finish where I started: good things happen when the data's all in one place. It's our ability to aggregate operational data, both qualitative and quantitative, and to deliver that as a service to you, so that you can deliver it as a service back to our collective customers. We can't thank you enough for being a part of that. We can't thank you enough for pushing us and inspiring us and helping us to find this product. I'm so excited about what's to come. Thank you, everyone.
Thank you, guys.
We will now take a short break. Lunch is located next door. Our programming will resume momentarily. Please take your seats at this time and silence your mobile devices. Thank you. Please welcome to the stage, Larry Ellison.
Hi, everybody. Let's see. Okay, so everyone is very excited about AI. It does extraordinary things. In fact, my son and I were just consulting it on a legal matter this morning. There was an interesting dispute between a couple of investment bankers as to certain rules about how much stock foreign corporations could own under certain circumstances, and we quickly went off and asked a multimodal AI model and got an answer almost immediately. They're quite extraordinary.
They certainly know all the laws in the U.K., all the laws in the United States, the rules of the New York Stock Exchange, the SEC, all of that. They're trained on all of that data, and all that data is publicly available. What is less common, and what people want to do but really can't do very easily right now, is use their AI models, whether ChatGPT, Grok, Llama, or Anthropic, to reason not on publicly available data, where the models have been trained on all of the internet, but on private data: the private trading records of an investment bank on Wall Street, or the private genomic data of a genetic engineering company that's analyzing genomes. And genomes are an enormous amount of data, and they'll do gene sequencing on plants.
I'm actually going to show it in a minute because I think it's such an interesting example. We're working on a lot of plant genetics, and I'll describe a couple of the projects we have ongoing. But people don't realize how big plant genomes are. The human genome is right around three billion base pairs, actually a little short of three billion. The wheat genome is 15 billion base pairs. That's because wheat's been around a lot longer than human beings, and wheat's been evolving all that time. And when you sequence wheat, you get this massive amount of information. And by the way, there are all these different varieties of wheat, which I didn't know a year or two ago, grown all over the world based on variations in soil and climate.
They all do things slightly differently. They do photosynthesis slightly differently. The genomes are different. What you want to do, if you want to optimize wheat, if you want to increase yield or make it drought-resistant, is really look around at all the varieties of wheat and analyze and understand their genomes. You've spent a lot of money to sequence all those wheat plants. That's your proprietary information. Your business is to produce this new variety of wheat that is going to be drought-resistant, higher-yield. That's your business. You don't want to share that information with other people. So how do you do that? How do you do that? Well, Oracle ran a project internally. The first thing we did was take all of our customer data, all of our proprietary customer data.
We have a lot of customer information. We have a lot of people using Oracle in the cloud every day, where we keep track of what they're doing. We're curious about what features of Fusion Applications they use more frequently, what features they tend not to use as frequently, and what features are not so easy to use, the ones people tend to make mistakes in. So we monitor all of this. And if we find someone's making a bunch of mistakes using a particular feature, the feature is not easy enough to use, and we want that insight so we can go ahead and fix it. But back to the basic problem.
The basic problem that has not been generally solved is this: how do you take these fabulous reasoning models and allow them to reason on your private data, whether it's plant genomics or customer usage in the cloud, what features are easy and what features are error-prone? How do you take your private data and make it available to AI models while keeping that private data private? And can we make that easy to do? Because everybody, everybody wants to do that. We've been working on this problem for some time, and we have a new version of the Oracle Database called the Oracle AI Database. We didn't name it the AI Database just because AI is fashionable, as I said the other day.
We did it because our new database has a lot of new features to solve this problem, this exact problem, this one problem: how do you make all of your private data accessible to AI models for reasoning while keeping that data private, without compromising data privacy in any way? We've done that, and we call it the AI Database. And by the way, it's just the latest version of the Oracle Database. This is not an all-new database, and it's not an AI database in the sense that Pinecone is, which is just a vector database for AI. By the way, we added vectors to our database too, but it is still the full Oracle Database. It has all the Oracle security features, all the high-performance features, all the recovery features, all of that.
But we've added all of this AI capability to the Oracle Database. So it's highly secure, highly reliable, highly scalable, very fast, and it makes your private data easily accessible to the AI models. Okay. And we also decided that if we're going to make the data available, we should make it easy in the Oracle Cloud for users to pick their preferred AI model. We use a variety of AI models. Again, Anthropic tends to be pretty good at code generation, so if you're doing programming, Anthropic is pretty good at that. ChatGPT is a phenomenal legal expert. These models are somewhat different, and depending on the application, you might use one model or you might use a different model.
So we decided to make all of the popular AI models available inside of our cloud. If you go to OCI, you can get ChatGPT 5.0. You can get Google Gemini. You can get xAI's Grok 4 Heavy. You can get the latest versions of Llama from Meta. And for the Oracle AI Database to be an AI database and do reasoning, Oracle itself doesn't do the reasoning; you need the model on top of Oracle to do the reasoning. So when you configure the Oracle AI Data Platform, you pick one of these models, and we then give you a private version of that model sitting on top of the Oracle Database with your private data in it. And so it's all there. You just configure the model you want; basically, you click on the name of the model you want.
We configure it and put it on top of your database. And then the Oracle Database will make all of your private data that you authorize available to the AI model. How does it do that? Well, it uses a technology called RAG, retrieval-augmented generation. It simply allows the AI model to read the database; there's an MCP server. AI models are designed to be able to go and read anything on the internet that's publicly available, but they can also read private data, whether it's in a file system or in a database, different kinds of databases. They can go out, look at that data, read it, and actually understand it. So that's what we did.
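The RAG flow just described, vectorize private documents, retrieve the ones closest to a question, and hand them to the model as context, can be sketched in a few lines. Everything below is illustrative: the bag-of-words embedder stands in for a real embedding model, the prompt assembly stands in for the call to the AI model, and none of the names are Oracle's actual API.

```python
# Illustrative sketch of retrieval-augmented generation (RAG).
# The embedder and prompt assembly are stand-ins, not a real product API.
from collections import Counter
from math import sqrt

def embed(text):
    # Stand-in embedder: bag-of-words counts. A real system would call an
    # embedding model; the retrieval logic below is the same either way.
    return Counter(word.strip("?.,!").lower() for word in text.split())

def cosine(a, b):
    # Vector similarity: the "vector distance" the database computes.
    dot = sum(a[w] * b[w] for w in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(question, documents, k=1):
    # Vector search: rank the stored private documents against the query.
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def augmented_prompt(question, documents):
    # The retrieved private data is spliced into the model's context window;
    # the model is never trained on it, so the data stays private.
    context = "\n".join(retrieve(question, documents))
    return f"Context:\n{context}\n\nQuestion: {question}"

docs = [
    "Customer Acme renewed its database license in March.",
    "Feature X in Fusion shows a high error rate for new users.",
    "The cafeteria menu changes on Fridays.",
]
print(augmented_prompt("Which Fusion feature causes user errors?", docs))
```

In the system being described, the database's vector index does this ranking at scale and an MCP server mediates the model's reads, but the shape of the flow is the same.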
I'm going to go into a little bit of detail on how it works. Obviously, we had RAG capability, but now the Oracle Database can vectorize all of your data in the database. It can vectorize text, images, and videos, and vectors are the format of data that AI models understand. For example, a very famous AI search is to vectorize a particular movie and then ask, show me movies that are similar in content to that movie. That's a vector search: find me a movie whose vector is similar to that movie, or to other movies somebody watched. They use it for recommendation engines, but also for gene sequencing. In genetics, there are gene sequences to do with photosynthesis.
You find a set of gene sequences that do photosynthesis, and you say, "Across all the other genomes I have, find me the gene sequences that are involved with photosynthesis." And we can find that part of the genome; searching through the 15 billion base pairs, we can immediately zoom in on that part. And then you can say, "Show me the differences between how this plant does photosynthesis and how this other plant does photosynthesis." You do that with something called vector search, and there's a whole bunch of vector mathematics behind it, vector spaces and vector distances and all of that. But that's what we've done to make this an AI database: allow you to vectorize all of your data, all the different types of data.
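As a toy illustration of that kind of similarity search over sequences, one could vectorize each sequence by its k-mer counts and rank by cosine similarity. The fragments and the k-mer encoding below are invented for illustration; they are not Oracle's DNA data types, and real genomes at billions of base pairs would use a proper index, but the principle is the same.

```python
# Toy vector search over DNA-like sequences via k-mer count vectors.
from collections import Counter
from math import sqrt

def kmer_vector(seq, k=3):
    # Count every overlapping substring of length k (a "k-mer").
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[m] * b[m] for m in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_similar(query, genomes):
    # "Find me the stored sequence most like this gene region."
    q = kmer_vector(query)
    return max(genomes, key=lambda name: cosine(q, kmer_vector(genomes[name])))

# Hypothetical fragments: two share a repeated motif, one is unrelated.
genomes = {
    "wheat_variety_a": "ATGGCCTTAGGCCTTAGGCCTTA",
    "wheat_variety_b": "ATGGCCTTAGGCATTAGGCCTTA",
    "unrelated":       "CGCGCGCGCGCGCGCGCGCGCGC",
}
print(most_similar("GGCCTTAGGCCTTA", genomes))
```

The same idea scales up: once sequences are vectors, "zoom in on the photosynthesis region" becomes a nearest-neighbor query instead of a linear scan.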
Once you've done that, look at what we were able to do. As I said, the first project doing this inside of Oracle, the first of our private data that we decided to make available to an AI multimodal model, was our customer data, because I'm not sure there's anything more valuable to us than our customer data. And then we started asking questions once we vectorized it. You'll see what I mean by valuable. Which Oracle customers (this is the first line under the second bullet) are likely to buy another Oracle product in the next six months? We'd like to know who they are. And we'd like to know, second line, which Oracle product they are most likely to buy.
And those are the kinds of questions you can ask using a reasoning model, and it tells you. But you can also have agents associated with this; it doesn't have to be just a one-step question.
You can actually ask it, "Okay, well, let's send emails to all of those prospective buyers, and let's show them the three best Oracle references in their industry and their country that have bought the same product and used it successfully." So as you look at this, if you were a company building CRM software, sometimes called CX software, customer engagement software, this is what you'd want to enable your customers to do: build a CX application suite where you could do this kind of AI marketing, if you will, AI reasoning on top of the customer data, and qualify leads and then pursue leads with agents. You qualify leads through the reasoning process, and then through the agent process you go ahead and pursue those leads.
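The two-step flow described above, reason over the customer data to qualify leads, then let an agent act on each one, might look like this in outline. The customer records, the scoring rule, and the reference list are all invented for illustration; in the system being described, an AI model does the reasoning rather than a hard-coded threshold.

```python
# Illustrative two-step lead pipeline: qualify (reasoning), then act (agent).
CUSTOMERS = [
    {"name": "Acme",    "industry": "retail", "usage_growth": 0.4},
    {"name": "Globex",  "industry": "health", "usage_growth": 0.1},
    {"name": "Initech", "industry": "retail", "usage_growth": 0.0},
]

# Hypothetical reference customers per industry.
REFERENCES = {"retail": ["ShopCo", "MartInc"], "health": ["CareOrg"]}

def qualify(customers, threshold=0.3):
    # Reasoning step (stubbed): flag customers whose usage is growing fast,
    # a stand-in for "likely to buy another product in six months."
    return [c for c in customers if c["usage_growth"] >= threshold]

def agent_follow_up(lead):
    # Agent step: draft outreach citing references in the lead's industry.
    refs = ", ".join(REFERENCES.get(lead["industry"], [])[:3])
    return (f"To: {lead['name']}\n"
            f"Similar customers in {lead['industry']}: {refs}")

for lead in qualify(CUSTOMERS):
    print(agent_follow_up(lead))
```

The point of the structure is the separation: the reasoning step produces a qualified list, and the agent step turns each entry into an action, which is the "agents connected by workflow" pattern mentioned later.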
That's how you'd go about building the next generation of CX applications. And that's what we're doing. And we do all of that while maintaining the strict privacy of your customer data, or your genomic data, or whatever it is. Medical genomic data is very, very sensitive, so you have to keep it private. The Oracle Database security model, which we've worked on for decades, is what we rely on to keep this private, because the vectors are inside the Oracle Database, and we use a security model that we've been working on for a long time to keep your data private while making it accessible to AI models. Next slide. Oh, I'll press my clicker button. Okay, there. Great. Okay. So as we're building the next generation of CX products: we are the owner of Java.
The Supreme Court might disagree with that, and Google probably would too, because they won the case, but we bought Sun, who developed Java. We are the primary maintainer of Java, and it is the world's most popular programming language to this day. However, in the age of AI, you can use something like Anthropic or ChatGPT or Grok to write code. You can just declare your intent, in other words, just say what you want the program to do, and the AI will generate the step-by-step process to do it, so it won't be conventional coding. It's called vibe coding, an interesting term, I guess a very modern term for coding, but it's really just: what do you want the program to do? You just tell me what you want the program to do, and don't worry, I'll generate it.
Now, we have been doing code generation for a while, and there's a big debate inside of Oracle, with people on both sides. I'll tell you which side I'm on. One side is programming in English, in other words, declaring what you want the program to do in English. But English is a notoriously imprecise way to communicate. It's not like mathematics; there's a lot of ambiguity in English, and making English perfectly clear, perfectly precise, is very difficult. So we think you could have an alternative declarative programming language, designed specifically to declare the intent of what a program should do. You could create that language with great precision, and then with great precision you could generate the code. Now, the jury's out.
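To make the contrast concrete, here is a tiny, hypothetical example of declaring intent in a precise form rather than English: the "program" is generated from an unambiguous spec. This is a toy of my own construction, not the intent language under discussion, which isn't described here.

```python
# Toy "declare intent, generate code" example: an unambiguous declarative
# spec fully determines the generated program's behavior.
SPEC = {
    "name": "validate_order",
    "fields": {
        "quantity": {"type": int, "min": 1},      # must be an int >= 1
        "price":    {"type": float, "min": 0.0},  # must be a float >= 0.0
    },
}

def generate_validator(spec):
    # "Code generation": build a validation function from the spec. Because
    # the spec is precise, there is no ambiguity about what to generate.
    def validator(record):
        for field, rules in spec["fields"].items():
            value = record.get(field)
            if not isinstance(value, rules["type"]):
                return False
            if value < rules["min"]:
                return False
        return True
    validator.__name__ = spec["name"]
    return validator

validate_order = generate_validator(SPEC)
print(validate_order({"quantity": 3, "price": 9.99}))  # satisfies the spec
print(validate_order({"quantity": 0, "price": 9.99}))  # violates the minimum
```

The English version of that intent ("orders should have a sensible quantity and price") leaves "sensible" open; the spec does not, which is the argument being made for a precise intent language.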
People are declaring intent in English and generating code, and people are declaring intent in specialized languages designed to be precise and designed specifically for code generation. I've done both. I'm a great believer in a more specialized language. I think once you learn the specialized language, you're much more productive in generating code, and in our experience these are huge, huge improvements in productivity: 10x productivity if you generate the code rather than writing it. That's one of the reasons we felt we could take on something like Cerner. We knew we were going to have to rewrite all the Cerner code, all of it, and it took decades for Cerner to write that code. We thought we could rewrite it all in a few years because we weren't going to rewrite it.
We were going to generate it using AI. And that's what we've done, and we've built very complicated and interesting agents using AI. And in fact, in the future, what are computer programs? Computer programs are a collection of agents connected by workflow. That's what they'll be. Okay. This is a big leap from code generation and making your private data accessible to AI models. This is a greenhouse that we have developed, version three of a greenhouse that we're developing. Actually, it's Danny Hillis' team. Oracle made an investment in what's called Applied Invention. Danny Hillis, by the way, founded Thinking Machines when he was at MIT. Actually, Danny and I used to be competitors. I was working with a Caltech team on a computer called nCUBE, which was massively parallel computing.
It was, if you will, the precursor to NVIDIA and vector processing in computing: doing a lot of calculations in parallel, which obviously is very important now. Unfortunately, as Danny and I have discussed, we both failed; nCUBE failed and Thinking Machines failed, and we both lost a lot of money and a lot of time on this. We learned a lot, but we were, I was going to say, 20 years too early, but it was more than that. I don't even want to do the calculations. Anyway, we bought Danny's company, and he's now responsible for, amongst other things, robotics inside of Oracle, because robotics is a very special case of AI. The leading AI model for robotics is very easy to figure out.
It's owned by Elon Musk, but it's not Grok. It's the AI model he built at Tesla. Now, yes, there are robots already: robots that assemble PCs and laptops and desktops and a lot of electronics and iPhones, tons of special-purpose robots. But the first one I would call a kind of general-purpose robot is the self-driving robot that Tesla has created. And Elon is working on his second group of robots, humanoid robots, so he's going from four-wheeled robots to two-legged, two-armed, general-purpose robots using real-time AI models. We're not creating those AI models from scratch; we're using those models to automate a variety of things.
This is a greenhouse, by the way, that has no people in it. That yellow thing over there, they're very large. There are a few reasons there are no people in it. One is the atmosphere inside the greenhouse. By the way, there's no structure; this building is held up by air pressure. There's positive air pressure inside the building, and that holds up the roof, which is ETFE, a kind of plastic that lets light through. It's the most transparent material in the world; it lets through more light than any other material, more than glass or other forms of plastic. If you asked Danny, "Well, what do greenhouses do?" Well, basically, they convert sunlight and CO2 into food.
We don't let people in here because people can contaminate the plants, and people aren't going to like all the CO2 and humidity we have in here. The robots move the plants from the growing area into the harvesting area. AI decides when you harvest the plants; there are cameras, and AI decides. We can grow different crops in here. It's the only greenhouse where you can have lots of different crops inside the same greenhouse, because it's completely computerized. The nutrition, if you're growing strawberries, is very different than the nutrition when you're growing lettuce. As for heating, we heat the plants from the bottom; we don't heat the whole building. Anyway, I'm not going to give you all the details. Each area is carefully climate-controlled and atmospherically controlled, and the hydroponic nutrition is all computer-controlled.
And it's designed to produce food at a much higher quality and a much lower cost than other current forms of indoor growing. By the way, when you grow indoors, you use 90% less water, which is truly incredible. And by the way, this is also a habitat for Mars. So if you think about an interesting use for a greenhouse, well, I don't think it's a huge market; I don't think our Martian market's going to be gigantic. It's not in our numbers, by the way; none of the Martian consumption of our technology is in any of our numbers. That's all upside. So what does the greenhouse do? The greenhouse grows food. And if you're going to have people living on Mars, you need to grow food. But it also converts CO2 into oxygen, doesn't it?
So if you're going to have people living on Mars, they're going to have to breathe, and you can either ship the oxygen from Earth to Mars, or you can create the oxygen from CO2 on Mars, and the people can create the CO2 that the plants will consume. So this is a kind of combination. It's a tent, by the way. For our first market, we've really focused on Earth, but just notice that this thing would be a perfect habitat for Mars or other places you want to go. As I say, it's not really in our business plan. This is what the whole building looks like from the outside. The green areas are the harvesting areas. The robots move the plants into the harvesting and packaging areas, and again, there are no people in the growing areas at all. It's kind of interesting.
As long as we're talking about plants, I want to give you an example of the kind of things we're doing with our database to make it work better with AI. In addition to being able to vectorize all of the data, which makes the data easy for AI to consume, we've also created special data types. Let me get my verb tense correct: we are in the process of creating special data types for DNA and DNA searches. By the way, that slide is wrong, and it's my fault. It says the wheat genome is 15 million base pairs; it's 15 billion base pairs. A human being is 3 billion base pairs. And the wheat genome has all of the historic genes, all the genes that became obsolete during the hundreds of millions of years of evolution of wheat on the planet Earth. It's just a grass.
The whole history of wheat on the planet Earth is recorded in that genome. That's why it's so large. We don't delete; when genes stop being used, they don't get deleted. So there's a whole history of how the wheat evolved. It's very interesting: when you gene-sequence wheat, or plants, or animals for that matter, you capture their evolutionary history. And another team, not Danny's team, is looking at wheat photosynthesis. If you can improve wheat photosynthesis, and they have, they have improved wheat photosynthesis using AI. They've changed the genome. They've used CRISPR-Cas9, a gene-editing technology, to improve photosynthesis. When you improve photosynthesis, you convert more CO2 and more sunlight into food. So the yield per acre of this wheat they created, I didn't have much to do with it, is actually 20% higher. We're producing more food in the same space, and that's already working today: increased yield by using AI to figure out how to improve photosynthesis, and once you've figured out how, you use CRISPR-Cas9 to actually do the gene edits. The thing we're looking at right now is not just converting CO2 into food, but also converting CO2 into calcium carbonate. I'm sure all of you will be fascinated by the fact that coral reefs are made up of calcium carbonate. The organisms that live in coral reefs actually secrete calcium carbonate, their own skeleton. Well, we have skeletons too, right? They also have a lot of calcium in them.
We are now engineering a version of wheat that will convert CO2 from the atmosphere into calcium carbonate, an inert form of carbon. You could have varieties of wheat that take vast amounts of CO2 out of the atmosphere and deposit it in the ground as tiny, microscopic grains of calcium carbonate sand, and you could manage the level of CO2 in the atmosphere to pretty much whatever you want it to be, at essentially no cost. The world right now is working on the problem of what to do with CO2 in the atmosphere, and there are all these interesting ideas, including getting rid of all fossil fuels, which is very difficult. But there are other ways to tackle the problem that might be much easier and much more cost-effective.
And then, by the way, if you're a farmer, you can use AI and satellite imagery to look at your wheat fields and figure out exactly how much CO2 you're taking out of the atmosphere, and then trade that for carbon credits. It's an interesting, and initially much simpler, approach to managing CO2 in the atmosphere than what we're currently doing. Another interesting example we're working on, again with plant genomics, using DNA data types, vector search, and all these AI capabilities: right now, almost all the crops we fertilize in North America and Europe get huge amounts of nitrogen fertilizer.
That's despite the fact that nitrogen is the most common element in our atmosphere. We breathe it in with every breath; we just don't use it. But some plants, soybeans for example, actually get their nitrogen directly from the atmosphere. Well, we can engineer corn or wheat or other plants to get their nitrogen from the atmosphere, so we don't need fertilizer. And fertilizer does incredible environmental damage: during rains it runs off into rivers and lakes, and you get these big algal blooms. Plus, a lot of farms can't afford to buy fertilizer, so their yields are half what they should be. Well, we can fix that.
We can fix that with versions of corn and these other grains that fix nitrogen from the atmosphere. So that's my presentation on what we're doing with the Oracle database. The primary thing: make it easy for people to use AI models on top of their private data while keeping it private. And then there are all of these advanced technologies we're adding right now, so that if you are in the business of genomics, and by the way, every pathology department, all of medicine, and all of agriculture are in the business of genomics, two enormous businesses, your database has to fully understand DNA and have operators that can work on these enormous genomic data types. That's what we've been doing, and what we're currently doing.
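The "vectorize all of the data" idea mentioned above can be sketched generically. This is a toy illustration of vector similarity search, not Oracle's implementation; the hand-written embeddings stand in for what an AI model would actually produce:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query_vec, table):
    """Return row ids ordered by similarity to the query vector.

    `table` maps a row id to its embedding; in a real system the
    embeddings come from a model, and the search is done by the database.
    """
    return sorted(table, key=lambda rid: cosine_similarity(query_vec, table[rid]),
                  reverse=True)

# Toy "vectorized" rows: three documents embedded in 3-D space.
table = {
    "doc_wheat":    [0.9, 0.1, 0.0],
    "doc_coral":    [0.1, 0.9, 0.1],
    "doc_nitrogen": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]       # an embedded question about wheat
print(nearest(query, table))  # doc_wheat ranks first
```

In practice the database stores these embeddings next to the relational data, so an AI model can retrieve the most relevant private rows without that data ever leaving the database.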
I think, again, fully ignoring Mars, there's enormous upside to our database business over the next five years. We think it's going to be one of our fastest-growing businesses, and, we're happy to talk about this later in Q&A, we don't think we have a lot of competitors left in the database business. It's fascinating. Our newest competitors in database, Databricks and Snowflake, don't even do transactions. They're query-only systems. There aren't a lot of new database technologies being invested in. We're kind of the only game in town. We think that gives us an enormous opportunity to increase our franchise in the database business over the next five years, at the dawn of the AI era, if we can merge our database technology with the latest AI technology.
Thank you very much.
Please welcome to the stage, Doug Kehring.
I think Larry used the magic word, upside. So I'll be presenting our updated financial outlook, which I know I think everyone's been anxiously waiting to hear. But let's start off with the usual exciting stuff that Ken went over earlier. It's funny now that they're asking me to certify the financials. This actually means a lot to me. So please pay attention. Okay? This one reminds you about our forward-looking statements. And this one is reminding you that we'll be using some non-GAAP financial measures in my presentation. Okay. You've heard the strategy throughout the day from Larry, Clay, Mike, and the rest of our management team. We have an unbelievably strong and deep set of enterprise technologies for both cloud and AI. And we have the vision, leadership, and experience to execute this. It's now time to see how all of this impacts our financials.
Our long-term expectations can be boiled down to a few simple steps. First, we work extremely hard to turn the customer momentum we're seeing, evidenced by the amazing enthusiasm here at AI World, into an accelerating RPO backlog. Second, we deploy our operational expertise to provide capacity to customers, which turns that backlog into accelerating revenue growth. Third, we leverage the growing footprint, scale, and utilization rates of our data centers to turn that larger revenue into profit growth. The result is that we are raising our long-term financial outlook again. I'm going to quickly walk you through how we arrived at these new figures. The best way to think about Oracle these days is as a hyper-growth company. Our remaining performance obligations, or RPO, are the clearest indicator of the revenue that's about to come.
On our Q1 earnings call, we highlighted two things on this topic: first, that our RPO balance now exceeds $455 billion, up 359% year-over-year; and second, that we expect RPO to likely exceed $500 billion. In fact, in just the first month and a half of Q2, we've signed several additional large contracts, as Clay mentioned during his presentation, which put us over the $500 billion mark. The demand we are seeing is really hard to comprehend. To put it in perspective, our RPO balance is up nearly 10x since fiscal year 2022. Clearly, customer demand is strong, but it's also enduring. The biggest impediment to growth right now isn't finding customer opportunities; it's executing on them by converting that demand into revenue as soon as possible.
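As a back-of-envelope check, the stated RPO figures imply the earlier balances. The numbers below are arithmetic consequences of the quoted growth rates, not disclosed figures:

```python
# Stated: RPO now exceeds $455B, up 359% year-over-year,
# and up nearly 10x since fiscal year 2022.
rpo_now = 455.0                    # $B, stated
prior_year = rpo_now / (1 + 3.59)  # 359% growth implies ~$99B a year ago
fy2022 = rpo_now / 10              # "nearly 10x" implies ~$46B in FY2022
print(f"implied prior-year RPO ~ ${prior_year:.0f}B, FY2022 ~ ${fy2022:.0f}B")
```

Either implied starting point makes the same qualitative case: the backlog has grown roughly an order of magnitude in three fiscal years.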
As our data center operations engine has revved up and we've become more experienced at it, we are bringing on capacity faster and faster and with more efficiency, as Clay discussed earlier. The ramping has already started, as you've seen over the last couple of years, with cloud growing as a percentage of total revenue from 20% in FY 2020 to 44% in FY 2025. As cloud crosses the 50% mark as a percentage of our total revenue, the revenue growth rate is further accelerating, as evidenced by the forecasted 16% growth rate for fiscal year 2026. To put this growth rate in perspective, the last time Oracle grew this fast organically was over 15 years ago.
As well, when you look at expected revenue growth rates over the next 12 months for companies in the S&P 500 with more than $50 billion of revenue, fewer than five are growing faster than Oracle. And we aren't even close to the peak growth rate yet. As our revenue accelerates, so does our operating income growth. The reason is that our pricing discipline, coupled with scale efficiencies, enables us to gain significant profit leverage as revenue grows. As the utilization rate of each data center increases, it contributes more profit. And the rest of our operating expenses have grown much more slowly than revenue, further helping profit growth. Now, before I turn to the updated financial outlook, I want to revisit the figures we presented at last year's financial analyst meeting.
As you may recall, last year we announced our expectation to reach over $100 billion in total revenue by FY 2029, nearly double the revenue from FY 2025, while simultaneously accelerating our profit growth. But that was last year. Before I dive into this year's outlook, I want to explain what guides us as we work to deliver the figures I'm about to share. First, every customer prospect is put through the lens of both a revenue opportunity and a profit opportunity. I've read a lot of stories speculating that Oracle is chasing revenue for revenue's sake. Let's be crystal clear: we only pursue opportunities where we have a clear line of sight to attractive margins that reward us for our intellectual property and what we bring to customers.
Second, as we build capacity, we are pursuing a range of financing options to support our growth. We pay careful attention to our cash flow, our debt ratings, our debt capacity, and the various funding mechanisms at our disposal. These all factor into how we strategically grow revenue and profits faster. Third, we work diligently to match our expenses to our revenue as it ramps in our data centers. This operational discipline is critical to delivering profits to our shareholders faster. Fourth, our cost focus on every aspect of the company will help us deliver higher profits from our revenue base. That has not changed, and it will not change. And finally, if all of this works in harmony, as we expect, the result is superior returns for our shareholders. So here goes.
Clay showed you earlier our updated infrastructure revenue targets, which are even higher than what we shared on our Q1 earnings call. Building on that, along with the RPO backlog we've already signed, the customer opportunity pipeline, and the strength of our competitive differentiators, we see much more revenue upside over the next five years than we did just a year ago. Our updated target is to reach $225 billion of revenue by fiscal year 2030, a CAGR of over 31% over the next five years. Our pipeline is very deep, and we could see more large-scale opportunities signed over the next 12 months, which could change this outlook further. In terms of profit growth, we forecast reaching $21 of EPS by fiscal year 2030, a CAGR of 28% over the next five years, consistent with my comment that revenue and profits are symbiotic.
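The two CAGRs can be sanity-checked with a quick sketch. The FY 2025 baselines below (about $57 billion of revenue and about $6 of EPS) are assumptions for illustration, not figures stated in this presentation:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing `start` to `end`."""
    return (end / start) ** (1.0 / years) - 1.0

# Assumed FY2025 baselines (illustrative, not from the talk).
revenue_cagr = cagr(57.4, 225.0, 5)  # target: $225B revenue by FY2030
eps_cagr = cagr(6.00, 21.0, 5)       # target: $21 EPS by FY2030
print(f"revenue CAGR ~ {revenue_cagr:.1%}, EPS CAGR ~ {eps_cagr:.1%}")
```

Both land close to the stated "over 31%" and "28%" figures, so the dollar targets and the quoted growth rates are internally consistent.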
These figures are stunning, with both revenue and EPS growing nearly 4x over the next five years. Now, before I turn it back to Larry, Clay, and Mike for Q&A, it's important to note that these figures are as of this moment in time. If we see additional demand that enables us to grow revenue and profits faster, we will accelerate near-term investments in order to capture additional market share. As Larry recently said, "AI is a much bigger deal than the industrial revolution, electricity, and everything that has come before." We are extremely well-positioned for this opportunity, and we will pursue more growth so long as it fits our profitability expectations. Thank you.
Please welcome back to the stage Larry Ellison, Clay Magouyrk, and Mike Sicilia.
Give me a moment to recover from those numbers. Okay. Oh my God, this is nuts. Okay, no questions. Thank you all very much.
Thank you. Oh, sure, I'll stand up. Hey, guys. Jackson Ader at KeyBanc Capital Markets. Thanks for doing this. Yeah, this is great. All right, let's start with those numbers. I guess first, I'm curious, Clay, when you gave the gigawatt illustrative example, right? Like we're going to make $60 billion, it's going to cost $39 billion, something like that. Is that purely illustrative? Is that an average? And then your largest customers, whether it's Meta, OpenAI, or what have you, are they even close to that type of gross margin, or do they come in below that? Thanks.
Sure. So first, it is illustrative. But the reason why it's more illustrative than exact details is because a gigawatt is changing very quickly, right? Is it a gigawatt of H200s? Is it a gigawatt of GB200? Is it a gigawatt of GB300? Are we talking about MI355? What's the mixture of the number of GPUs to the amount of storage compared to the amount of general-purpose compute? So obviously, it doesn't sound like a big difference, but that can be plus or minus 10%-20% of revenue both ways. In terms of the margin profile, no, it's very illustrative of even the very largest customers. So we are very committed.
When I gave you that range of margin, it wasn't like, "This is the margin, and there's an asterisk, and this is only for the customers that aren't driving all of the revenue growth." That would be counterproductive for me. It'd be counterproductive for you. That absolutely is illustrative of even the very biggest deals that we're doing. I feel like you're confused.
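The illustrative gigawatt economics cited in the question ($60 billion of revenue against $39 billion of cost, the questioner's figures) imply a gross margin of roughly 35%:

```python
# Illustrative gigawatt figures from the question, in $B.
revenue_b, cost_b = 60.0, 39.0
gross_profit_b = revenue_b - cost_b        # $21B of gross profit
gross_margin = gross_profit_b / revenue_b  # 21 / 60 = 0.35
print(f"gross profit ${gross_profit_b:.0f}B, margin {gross_margin:.0%}")
```

As Clay notes, the mix of accelerators, storage, and general-purpose compute can move the revenue side by 10%-20% in either direction, so this is a midpoint sketch rather than a contract-level figure.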
Microphone back. No. Okay, no, that is very helpful. And then I'm curious, Larry, maybe this makes more sense for you. Just if we think, and I'll stand up again, I'm sorry. If we think about what AI can do to your internal operations, we're looking at $21 in EPS in five years. What does that imply your internal usage of AI does for your operating expense growth in that kind of timeframe?
Yeah, I think we've underestimated the positive impact of AI for internal use, and I don't think we have firm estimates yet, but I'll let the guys comment on that. Clearly, our programmers are going to be more productive. We're going to produce more product. We're going to be more ambitious. We're going to write more programs. We're going from industry suites to, if you will, entire ecosystems, where you look at the healthcare ecosystem as hospitals plus government regulators plus pharma companies plus individual patients: entire ecosystems. And we'll be able to do that. So a lot of the productivity gain will be having more comprehensive suites of software. But I would still say we've not fully accounted for the scale of the productivity gains. I'll let these guys respond, because they actually run the business.
Yeah, a couple of relevant examples. Internally, as I shared earlier in the healthcare section, these regulatory cycles usually take two to three years to get through approval. We got through a regulatory cycle in six months because we were able to automate all the documentation required for the regulatory process. Now, that doesn't really displace any people, but it allows us to get to market far more quickly and capture revenue much earlier in the cycle than we would have. To Larry's point, we've started to look at certain functions across the company, in our engineering space, and we do have some internal estimates that we've been going through, and frankly, they keep changing for the better. Code generation, QA, support ticketing. All of these are large, global-scale operations.
At this point, we're really focused on productivity enhancements: how can we make people more productive so that we don't have to bring on a lot more labor to get the same thing done? It's early days in terms of where we'll go with the labor force in general, but I would say we're certainly optimistic. For industry development in particular, I can tell you very specifically that we've hit a cost baseline that scales dramatically: we don't need to add labor cost to get not just the same amount done, but about two or three times as much done. That's the kind of scale factor we're looking at.
Let me add one more thing. We can't just look internally at how AI is going to change Oracle and make us more productive. It's also going to change our customers. It's going to change the FDA. What happens if the FDA can get through clinical trials in half the time it used to take? What happens if pharma companies can design drugs in half the time and at a quarter of the cost? It's not just us changing internally. The entire ecosystem called planet Earth is going to become more efficient, and there are going to be incredible effects on the entire economy. We will be a much more prosperous and wealthier world because of artificial intelligence, robotics, drug design, and government agencies and regulators making use of these technologies.
As they say, AI changes everything. It's going to make us much more efficient across the board.
All right, thank you.
Wonderful. Thanks so much, Rishi Jaluria, RBC. Really appreciate the session. A lot of great detail, and obviously amazing to see these sorts of numbers. One question a lot of us have been weighing on the architecture side, and maybe this is an overly simplistic way to frame it: as the balance of power shifts from training to inferencing, fine-tuning, reasoning, et cetera, how easy is it to repurpose the architecture and eke out the most efficiency from everything you're building? If you could walk us through that, it'd be helpful. Thank you.
Sure. Well, look, in my keynote yesterday, Peter from OpenAI, I think, did a good job of addressing that exact question. So for an even better answer, we can send you the video link. But one of the things he described is that especially for these large model providers, it's critically important that they have flexible architecture. And the reason for that is that there's this concept that, hey, models get trained, and then they get copied off somewhere, and then reasoning happens. But actually, models are constantly being updated. And also, as a provider, you need the flexibility to shift back and forth, right? You're doing research on new models. You're doing training updates on existing models, and you also have customer demand. Let's say that you're a provider and you have a viral event where suddenly you get a huge amount of demand.
You need the ability to say, "Okay, let's not do that training run and use that capacity for inferencing." So obviously, it's better if the infrastructure is flexible. Well, it turns out you can build it flexibly, and that's exactly what we're doing. You build for the maximum requirements, which is really the most demanding training workloads. It takes a huge amount of effort around optimizing the power and data center design and optimizing the networking design, but if you do it well, that infrastructure is both very capable of doing the reasoning and very cost-effective at it.
Great. Thanks very much, Tyler Radke from Citi. Congrats again to you, Clay and Mike, on the well-deserved promotions. Clay, I wanted to go back to one of the slides from your opening presentation, the AI database forecast. I think you're expecting $20 billion of revenue in FY 2030, and I was wondering if you could unpack or stack-rank the key drivers of that. Is it traditional database migrations to the cloud? And what is it going to take for the AI startups that are buying your infrastructure to start using your databases?
I don't know if you saw it, but I signed this up for $20 billion.
I saw it.
Okay, good. Well, honestly, I'd say: Larry, why don't you start? I think it's actually a great question. Why are you so excited about people adopting our AI data platform?
Yeah. Again, I think there's a lot of hype. It's an incredible number; $20 billion implies a pretty solid growth rate over the next five years. That said, we think everyone is going to want to do reasoning, and by the way, I'm using reasoning and inferencing interchangeably. The AI models actually do more than inferencing: they do deduction, they do inference, they apply rules, they do mathematical calculations. So reasoning might be a slightly more modern word than inferencing, but it's applied AI. It's actually using the models to reason. I don't know who's not going to do that. Now, the question is how quickly we can transfer the technology to our customers. Again, it's the same problem. It's not going to be a demand problem, that is for sure.
It's going to be: how quickly can we transfer this technology to our customers and help them be successful and get started using it? Now, the good news is that customers are very familiar with the Oracle Database. They don't have to relearn it; they just have to take advantage of its new features. And we've got great distribution of the Oracle Database right now. You can get it in Azure, obviously. By the way, multi-cloud alone is going to drive a huge amount of adoption. It wasn't long ago that the only cloud you could get the Oracle Database in was OCI. And OCI is a great cloud, but not everyone is in OCI.
We're building all of these data centers. We started with Azure, and we have quite a few data centers in Azure. If you unpack our numbers, that 1,000%+ growth rate we have in multi-cloud is being driven by Azure, who signed up for multi-cloud first, and we got those data centers built and running for them. We're nowhere near at scale at Google, who came second, or Amazon, who came third. Just scaling out those data centers, just the normal migration you asked about, the migration enabled by multi-cloud, is going to, I think, easily get us to $20 billion. Believe me, I'm not trying to raise those numbers at all.
But Clay set it at $20 billion, so I'll take it. I think multi-cloud alone might get us there. And if multi-cloud alone gets us there, what about the use of the database as a vector database for AI? It's very hard to extrapolate because we have no data points; we really don't know how big that's going to be. But I think it's going to be everybody. And it's not just an accelerating business on its own as a platform service. Keep in mind, it's the same platform that we build all of our applications on top of as well. So the $20 billion is just the revenue we'll collect from that platform service alone.
We're also very optimistic that this continues to propel our application growth, because all of the agentic AI agents we spoke about earlier today with Steve and Mark and Seema are built on top of the AI platform as well. Without that, we wouldn't be able to grow our applications business as quickly as we are. So it all works together.
Yeah, I think what Mike is saying is so interesting, because you ask: who are the first users of the AI data platform? Our internal application groups are our first users. One of the advantages I've always thought Oracle had, and you could also say one of its disadvantages, and I'll come back to that, is that we do applied technology. We do applications.
We use our tech, and we also build the tech. We build data centers. We train models. We build databases. We build code generators. We do all this to enable the creation of applications, and then we actually create the applications. None of the other three cloud vendors do that. They are primarily tech platforms; they don't build large-scale applications. We get all of these insights from building large-scale applications, where Mike's team will say, "Gee, it would really be nice if I could do this more easily." Not that anyone ever asks for more around here, but the database guys get asked for additional features, more ease of use, more automation in things like backups. That's how we got so much better at the tech.
We had our very large-scale captive applications team putting demands on the tech and helping guide some of the new features we developed. So as we build the AI data platform, we also test it on our own applications. What our customers get is something we've already used successfully, rather than being guinea pigs very early on. And then it's continuous improvement: we improve it, we use it, we get insights, we improve it again. So the database is now inextricably linked to our AI strategy and inextricably linked to our application strategy. All the pieces at Oracle are fitting together.
Some people thought this was a negative. They said, "Oracle's trying to do too much. Oracle really needs to focus: either stick to tech and spin off the application business, or just do applications," because supposedly no one can do both. Well, so far, we're the only one doing both, and one is more than zero, so we think it's working out very well for us.
Hi there. This is Patrick Colville from Scotiabank. Great to be here and really exciting time to be part of the Oracle story. I've got one for Clay, please, and one for Larry. So Clay, I think one of the big standout announcements this week from AI World was the AMD partnership. So my question is, I guess, what was the logic there for that AMD partnership? And then also, what does it mean for Oracle's Nvidia partnership? Because that's been just tremendous for both firms. And then, Larry, if I may, love the long-term targets. One of the questions we get from investors is, what happens beyond fiscal 30? And I guess the reason we get asked.
I missed the funny part. We couldn't hear it up here.
He said, "What's beyond fiscal 30?" So if you could just make up like a 31 projection and a 32, he would love it.
Oh, yeah. Well, wait till we talk about 40, man. That's going to be awesome. I'm looking forward to that.
So do these AI labs, the tier-one AI labs, start insourcing infrastructure beyond the midterm, or do they lean even more heavily on neoclouds, and can Oracle accelerate share gains beyond the midterm?
Sure. Okay, well, let's start with the question. So look, I'm also very excited about the AMD announcement. And you said, "Okay, so what does it mean for AMD? What does it mean for NVIDIA? What does it mean for us?" Well, first, look, we have an amazing relationship with NVIDIA. NVIDIA, I think everyone would agree, did a really good job of getting this industry started. And as a tech industry and as AI, we wouldn't be where we are today without the work that NVIDIA did both on the accelerator side and on the networking side. Now, I'm glad that AMD is doing a good job because I think that we don't have a shortage of demand. We have a shortage of supply.
And so the reality is, the biggest thing I find in conversations with people across all areas is that they're using a scarcity mindset in a world that doesn't have that problem. It's not, "Oh, if AMD does well, does that mean NVIDIA won't?" I look at it like: imagine there was infinite AI demand. What if we had options and choices that allowed us to scale even faster? Our AMD partnership is amazing. The reason we're doing it is that they've been a great partner across our CPU business and our networking business, they make a good product, and our customers want it. So it's good for all of us. But then you asked a second question, and then I'll let you finish, Larry.
You can make it think longer about 2031 and 2032. The reality is, there's this perspective that these AI companies are only coming to us because they're temporarily out of luck, and that they don't want to be doing this. But we do a really good job, one that complements and supplements what these people are doing. I would advise you to go talk to them. You saw Peter; go watch my keynote from yesterday with Peter on stage. He doesn't need more problems. He has a lot of work to do to give all of his researchers the infrastructure they need and to meet all the demand for this technology. Everyone wants help.
So I don't think there's a shift at 2031 or 2032 where suddenly people go, "Oh, there's no more AI infrastructure business for companies like Oracle." I think it's more interesting to consider what we just talked about: we have so much unexpected demand from inferencing and reasoning, which then goes out and makes everyone's lives better, and those uses are going to need computers too. So no, I don't see a step shift. The reason we give the forecast we do is that we can only see so far.
I think I'm going to let Mike talk about the future, because to understand what the world's going to look like, you have to understand automation and how the world's going to change. This comes back to the earlier question: Oracle using AI internally, how is that going to impact our financials? Well, let's look at our medical business, and then I'll let Mike go into detail. We need to work with the FDA to make it more efficient to get a drug that works approved more quickly. We need to do a better job of delivering healthcare all over the world, working with companies building modular hospitals everywhere. In some countries, healthcare is considered a human right. In other countries, healthcare hardly exists.
We have a chance to democratize a lot of these technologies and make governments more efficient. Governments are big organizations, and big organizations might not be optimally efficient. AI is going to help that. Governments will be more efficient. Poorer countries will get food supplies. They will get energy supplies. They will get hospitals, and they'll get this next generation of smart hospitals. Mike, maybe you can go into how we benefit as the world gets better.
What does it look like beyond 2030? I think we're rapidly heading toward self-implementing, self-learning, self-healing systems. You can certainly get your head around what that means for GDP if you look, to Larry's point, at something like healthcare and hospitals, and I'll expand on that in a second. What does it mean for Oracle? It's hard to say. I can't imagine there's anything negative in it, because I like our chances of being able to deliver that full-system ecosystem. But let's talk about clinical trials for a second and compare and contrast. COVID-19 was not that long ago. We were involved as the technology provider for basically every COVID-19 vaccine and every therapeutic coming to market. And here's how it worked: we were shoulder to shoulder with our customers.
We had people who would help gather the documentation at the pharma company, and they literally rode caravans of buses to Washington, D.C. with stacks of paper. A bus would be half filled with people and half filled with paper, and the people had to read the paper on the way to make sure it was complete. That's how it worked during COVID-19, from the most sophisticated pharmas to the startup biotechs, all of whom were trying to create vaccines and therapeutics. Now, fast forward to today, and we're working with regulators across the world, not just in the United States, to accept electronic documentation, so that we can take proof points, efficacy points, and safety points from these clinical trials and get drugs to market far faster.
What that means is they're actually cheaper, because there's a lot of time and money that's wasted in very long latency processes around documentation, around reading, around all this stuff. The next phase of it, and we're not that far away from this, is: how do we have clinical trials that rely on just real-world data? Right now, the gold standard for clinical trials is double-blind placebo-controlled trials. And that's not a bad thing. It's led us to some wonderful therapeutics and some wonderful vaccines along the way. But what if you could capture data in real time from everybody who's taking a particular pharmaceutical? Not only do you change how fast you get to market, but you actually have a clinical trial that never ends. Because right now, clinical trials are thought of as a project.
It's a project, and it ends, and once the project ends, we actually stop tracking those people. We actually don't know what the contraindications are of those people when they start to take other medications or they start to develop other pre-existing conditions down the road. I think that that's an example of not just changing the way an industry works, but actually benefiting humanity in a way that is really just, it's hard to put a monetary value on what that means just yet, but I will tell you that we are very, very quickly approaching the spot where the technology is not the barrier to make any of that happen, but there's one key. You have to have all the data in one spot. You have to have an AI data platform to actually make that work.
You've got to be able to get all that data very quickly. You've got to be able to collect it from multiple endpoints in real time. And I like our chances to do that. That's just one example, Larry, from one industry. I could go on about all 22, but.
No, and the fact is, what you've done impacted us. Hospitals suddenly have access; we've connected hospitals to banks. Now the banks can look at, if you will, a bunch of receivables and see: are those receivables likely to be paid? Is the payer, the insurance company, likely to pay the hospital, and can I safely make a loan to the hospital? So again, as these ecosystems get connected, there's the liquidity you're providing to the medical business. Maybe just a couple of minutes on that, and I love the example. There's an Oracle example that directly affected our quarter.
Absolutely.
It's so interesting.
Can I give that? I'll give that.
Yeah, yeah, you got it.
All right. So we had the banks here earlier. We had JP Morgan and Bank of America here earlier, and they went into quite a bit of detail about the ecosystem. So I think to your point, Larry, we're not just talking about automating the healthcare system. We're actually talking about creating an ecosystem that didn't exist before. That ecosystem is the automated, real-time, autonomous connection between the banks and the providers. As we mentioned, lots of very big hospital systems have cash flow problems. Not necessarily a long-term receivables problem, but a point-in-time problem: at any given point, they don't have the cash to pay certain bills, including sometimes our bills, money that they owe us. So we actually connected one of these banks that was up here today with a hospital system.
And based upon real-time access to the receivables position, the quality of the payers, all the things I talked about earlier (how many readmissions we had for hip infection, a lot of stuff), all that data, which they never had access to before, they actually created a municipal bond offering for that particular hospital. And for the first time, they were able to actually rate it. One of the hard problems the banks have is that they don't have a way to rate debt here, because it's a little bit of a crystal ball to say: what's the timeliness, what's the repayment factor, and what's the risk profile associated with it? That customer, in the quarter, was able to pay a bill that they owed to us.
Now, more importantly, they were able to pay the doctors and nurses and everybody else that was in there. But as a side effect, they were also able to pay us a bill that was, shall we say, rather late. So these types of ecosystems benefit everybody, and the bank actually got a customer that they didn't have before. That's the kind of situation that works out in terms of these automated data platforms.
Yeah. I mean, think of the liquidity that information is able to provide: the banks can use AI to consume it, to come up with bond ratings, to decide what the interest rate should be on the receivable, figuring all of that out. With all the information on the receivable, you can charge a lower interest rate because it's more likely to be paid.
I showed you a picture of greenhouses in the food business, but there are really interesting modular hospitals. Again, we're not building these, but partners are, and the people behind the modular hospitals are coming to us and saying, we'd like to incorporate all of the Oracle technology into these modular hospitals, which will be built all over the world, for people all over the world. Healthcare is a huge market. A lot of people can't afford it right now. As the world gets wealthier, more people will get good healthcare. The global economy is going to get much bigger. That will help us make our numbers.
Siti Panigrahi from Mizuho Securities. Larry, I want to ask you about enterprise application software. How is that going to evolve in this AI world? Do you expect it to be rewritten, like we saw in prior architectural shifts, when client-server and three-tier cloud applications were rewritten? Do you think in this agentic world, enterprise applications have to be rewritten again? If so, how is Oracle positioned?
Yeah, we're constantly rewriting our applications. I remember I gave this speech. Every three months, we come up with a new version of our applications, which is just unbelievable if you think about what it used to be like. And SAP customers would upgrade every 20 years, even if they didn't need to. Well, actually, sometimes they wouldn't upgrade in 20 years; 30 is fine. Because you know what some of those SAP implementations cost? In excess of $1 billion to put in SAP. You don't want to do that every three months. So we're continuously rewriting our applications.
We have 600 AI agents live right now across our portfolio of applications. And we have 2,400 customers that are already consuming those AI agents. That's across the industry applications and the fusion applications. As I mentioned earlier, in banking alone, within the next year, we'll have an additional 126 agents live, just in banking, not counting everything else that we do. And by the way, if history is any teacher for us, and in the AI world, history is not that long ago, Steve, right here on this stage, last year predicted that we would create 100 new fusion applications over the last 12 months. We actually created 400 and had 400 go live. So that's the pace and scale that we're moving. Will applications become a collection of AI agents? Yes. And that's exactly what we're doing. That's exactly what we're doing.
And certainly, again, at the expense of repeating myself, much easier to do when you've got the underlying infrastructure and the AI data platform on top of it. And we are the custodian of the world's most valuable data. So our agents are incredibly rich, and they're very easy to adopt. They're just part of what we do in our quarterly release. There's not a special AI agent release vehicle and a special implementation plan. You just take it out of the box, and it works.
Larry, you've historically seen value in software companies when the market has been skeptical, and you've never been afraid to price software for the value it delivers. Now we're in a time when the market's questioning the terminal value of application software and is worried about the seat-based model. Do you think AI will shift value away from the application layer to other layers? And how do you think software should be priced?
Interesting question. Do I think the seat-based model works? Yeah. I mean, I think it's a combination. There really are two models for pricing applications right now. There's the supply side, which is basically CPU consumption: how much CPU did you use? And then maybe storage, but primarily CPU; how much CPU and GPU did you use? And then on the consumption side, how many different people used it? Those are the two models. And I think we'll continue oscillating between those two models. I think for certain kinds of AI, very complex reasoning, we're going to go by the supply-side model: you're going to pay for GPU usage. For some of the more agentic things, like automatically filling out my expense reports for me, something as mundane as that.
Give me a couple of choices for a doctor's appointment in the NHS in the U.K. Show me the soonest appointment I can get, and show me, with my preferred doctor, how soon I can see them. I'll decide between those, or you figure it out, decide between those, and schedule it for me. That might still come down to how many individuals are using it, versus what the costs are of actually supplying the service. And I think, again, we've been constantly adjusting that. This is nothing new. We've been adjusting between these two models for many, many years. What I don't think makes sense, by the way, is people saying, well, here's my application, and here's the really cool AI stuff you buy separately. I wouldn't even know how to build an application like that.
If you don't buy the AI stuff, your application won't work at all. I mean, it is your application. So I don't understand the separate charge stuff because the AI is going to be so dominant. I think that's a transitory model. That model will disappear from most of the application companies who are using it right now during the introduction of AI. We're more in the middle of this converting our applications to AI where they don't work without the AI parts.
All right. Excuse me. It's John DiFucci from Guggenheim. And Larry, thanks for that answer. That was actually my question, that last part too. I really appreciate it. But another question I had was: AI changes everything, and at Oracle, some of the stuff you showed us today is changing the world of healthcare and maybe agriculture. Oracle's always run a business for profit, but I see something more here, to me anyway. Having covered Oracle for 26 years, it's Oracle sort of doing good for the world. Not that you haven't before; technology itself does it. But I guess going back to running a business for profit, because there's a lot to do with Cerner: you acknowledge you have to rewrite everything, and you're doing that.
But when should we expect something like Cerner, which is huge, to get to the profit margins we'd expect for a software company, if ever? And how should we think about that?
Well, I'll give you a very high-level answer, because I work at a very high-level space. And he's the guy who builds those applications; I'll let him go into the detail. But I think we will have largely finished the rewrite of all of Cerner next year. Everything will be new. And we will have something comprehensive, and it's not just Cerner that we rewrote. We're writing agents for payers. We're writing agents for clinical trials and all of these other things. So it's going to be much bigger than Cerner ever was. Cerner automated hospitals and clinics. This is automating the entire healthcare ecosystem, down to individual patients and individual doctors, individual nurses. The code will be in place next year. And I really don't understand how a small company can compete with what we're doing in healthcare. But Mike, just take that.
So in terms of doing good for the world, you're absolutely right. I mean, and I'll tell you that there's nothing more inspiring, and there's nothing that'll tug at your heartstrings more than walking around a VA hospital and working with people, which we do every day, who are caring for those who serve our country, the United States. And that's a mission for us that will continue to be a mission. And we will deliver for all of those people and actually all hospitals worldwide. So you're absolutely right. There is something here that's a mission, and we've got people rallying around that mission. In terms of thinking about how the business evolves profitably, I don't think we should stop and just think about Cerner. I think we should think about the entire ecosystem of what we built.
The CEOs of two of the largest customers in the world of Epic, our primary competitor in the EHR market, were on stage with Siemens just three weeks ago at our Orlando conference, and they were talking about the AI data platform. We're not just talking about the EHR. We're talking about all of it, and it's going to take all of it to fix at least the American healthcare system. There's no more money to go into the American healthcare system. We said earlier, to be successful, we have to take money out of the American healthcare system, but I think we can make more money doing that, because the only way to take money out is to automate the entire process. Just the EHR is not enough. It's got to be supply chain.
It's got to be HR. It's got to be finance. It's got to be the banking relationship. It's all of that. And when you look at our Fusion business, healthcare is actually one of our most popular industries for HCM and ERP applications. In fact, we're doing wonderfully well in healthcare with our Fusion business. When you look outside the United States, we are the largest provider, and our growth profiles are excellent, in just the pure EHR market before you even count all the rest of it. We just won a deal yesterday in the Middle East that was signed here: displaced SAP with Fusion, with our Oracle Health EHR, with OCI for all their bespoke workloads. People are buying all of it. So we no longer think about the Cerner margin, so to speak.
In the first couple of years after an acquisition, obviously, we keep those things separate, but we don't think about it as Cerner anymore. We think about it as the Oracle Health ecosystem, and when you put that all together, we're actually already operating at a clip that is not dilutive to the company's margin. And to the point that Larry made, I just don't see how the competition is going to keep up with what we're doing. Because I go all the way back to sort of first principles, and this is what I said to some analysts at our health conference. How many fuel cell power plants is Epic building? That's the first question I asked. How many large language models, generative AI models, are positioned on their cloud stack? Second question I asked. Then I stopped.
And I said, you get the point. Unless you're going to do all of it, you're actually not going to be able to do that. And I think the healthcare system in the United States is tiring of point solutions. They're tiring of very large system integrators. They don't have the money. They just don't have the money. The only answer is complete system automation. And for those reasons, I think we're going to be just fine, if not wonderful, in healthcare. And to your most important point, we're going to do well for the world in the process.
Let me just add one thing that I've said several times before, and that is: what is our model for all of this? My model, at least in my head, is Musk's law. I don't think Elon ever created Musk's law; he didn't write it down, but he just did it. And if you look at Tesla, he had to build an electric car ecosystem. The problem was not building an electric car. If you build an electric car, you've got to be able to recharge that car in Sweden and Norway and Vietnam and all these other places. How do you build a global charging system? How do you do that? Building an electric car is really easy. Rivian can do it. A lot of people can do it. I can build an electric car in my basement.
But how do you build 10 million of them, 20 million of them a year? How do you build those factories? The largest building ever built in human history is the Tesla factory in Austin, Texas, filled with robots. You've got to build robots. The robots he's building, the Optimus robots: the first inspiration, the first use of those Optimus robots, is in Tesla factories. The back of the Model Y is one piece of steel; it's lighter, so he had to create all new stamping machines for doing this. Different machines, different factory automation, supply chain, battery technology, battery science. He had to do the whole ecosystem for an electric car to change transportation. But he did it. And it works pretty well. Those cars are rather inexpensive, considering they have these incredible computers in them to drive them.
And he had to solve one of the hardest AI problems of all time, which is self-driving, real-time AI. So you take what he did at Tesla, and that's how, at least, we look at the ecosystem for healthcare. It's not simply a hospital or a clinic. It's the entire ecosystem, down to patients. How do patients make appointments? How do payers decide to authorize that hip replacement or not? What's the process they go through? How does a regulator approve a new drug? How does a pharma design a new drug? That's the entire health ecosystem, by the way, a good deal more complicated and larger, and a bigger opportunity, than the electric car market, even if the electric cars are fancy electric cars and fully robotic. So the way we're approaching these problems, the way Mike's teams are approaching these problems, is to look at automating the entire ecosystem.
Our HR teams changed when we bought Cerner because we now had to train doctors, train nurses, schedule people differently. They were partially gig economy, partially hospital employees, supply chain, inventory, all the shipping. Keeping track of inventory in hospitals is unbelievably complicated. I'm not going into all of the details, but we had to design RFID tags and RFID readers. There's security in hospital systems. It's doing the entire ecosystem. If you do the entire ecosystem, you get a much better result. It's a much, much bigger opportunity.
Thank you. I just wanted to say congrats to Mike and Clay. Obviously, you guys are going to be great partners with Larry. It's the only time I'll say this, but I kind of miss Safra a little bit. She's here. She's here.
Unless you folks, Larry, Mike, Clay, unless you want to keep going, this will be the last question. But if you want to keep going, they'll stay here all day with you. So your call.
It's up to these guys. I've got nothing to do. I'm retired. Actually, not true. It's really not. Just Saf was just shaking her head. No, no, no. No, no. Yeah. OK. So you get to retire. I don't. OK. Something's very wrong with that. I know you're not retired. But you're sitting there and somehow I'm still sitting here. OK. All right. I'll figure it out in time. I think you guys want to keep going a little bit? Up to you guys.
Yep.
OK.
Hey, guys.
Fire away.
Awesome. I'm over here. Larry, Brad Zelnick with Deutsche Bank to your left. I actually have two questions. Larry, my first one, I've been coming here for many years. And every year, I look forward to seeing you and the team. Every year, I get a year older, and you seem to stay the same age.
If it were only true.
My first question, if you have one or two tips for how we all stay young, I would love to hear it. My second question.
Oh, my God!
For you and for Clay.
I'll send you the research papers.
OK. As we think about Stargate, can you talk about how strategic AI is to governments around the world and how Oracle is working with them hand in hand to make it all a reality? Thanks.
Yeah. Well, I mean, I think if you imagine for a minute that you were a government, and let's imagine that what we've all been saying is at least mostly true, you would want to make sure that you have that technology available inside your country and to your citizens. Otherwise, I think in the same way that people have for the past few hundred years, well, how do I manage my food supply? How do I manage my energy generation? How do I make sure I have the right telecommunications? AI is technology that we think is even more valuable than that. You need to have access to it. And so we're in constant conversations with different governments in multiple different areas. I think one aspect of it is around how do you make sure that your citizens are getting access to the AI?
That's about bringing access to those models to those different geographies. There's the conversation about, well, how do we get things like sovereign AI? Part of the way in which we solve that: if you think back to when I was talking about our distributed cloud, we have a set of technologies that enable us to actually deploy regions globally, not just for individual customers but also for local sovereign operators. As an example, the one I used earlier about e&: they not only have a sovereign Alloy that they're using to serve the needs of the UAE government, they also have access to the latest and greatest GPUs inside that environment.
So I think it's critically important for all of these governments to have that technology, not just from an infrastructure perspective, but also because if you don't have the infrastructure, how can you then deploy the latest and greatest EHR on top that makes use of all that AI? And as Larry said, we don't have a non-AI version anymore. Now, specifically to the question about Stargate: lots and lots of countries are very interested in how to get that sovereign AI. And we have a good relationship with OpenAI; due to our flexibility and our speed, we've become their partner of choice to go out and deploy local sovereign AI infrastructure facilities. And so there was UAE Stargate. There are things we're working on right now in Africa.
I just had a conversation yesterday with a customer out of Latin America that's very interested in the same thing. Part of what I think makes us a great partner in those sovereign AI conversations is that we can scale up and we can scale down. Not all countries need a 500-megawatt OpenAI deployment. Some of them might be very well served by five megawatts. Well, we can do that, and we can do it quickly. And our relationship with the top leading model providers actually works to our advantage, because then we can go deploy the infrastructure, and suddenly the AI is available, both in terms of sovereignty as well as local secure access.
So I think Oracle is extremely well poised across all the different layers of the stack for Sovereign AI and enabling all the world's citizens to get access to that technology in a secure and controlled way.
Hey, guys. Alex Zukin with Wolfe Research. Thank you for an amazing day full of truly unbelievable numbers. I have maybe just a quick three-part question, which is.
Only three parts.
A quick three-part question.
Is there a prequel?
I have a really fast nine-part question.
Yes. I'm just sorry. I'm just enjoying myself up here.
As we look at the pacing of reasoning versus training, when does that inflection point happen? In 2030, what does that workload ratio look like? Because it's important for margins. We talk about demand and supply. How do we think about the pacing of CapEx to facilitate that? And then any backstory on how you're able to kind of overjump or jump over some of these other hyperscalers to win and become the preferred vendor for some of the model companies as they scale beyond what we all thought was possible at a given timescale?
Those questions have almost nothing to do with each other. I feel like it's, would you like a scoop of ice cream on the salad with some bread? I've already forgotten. Do you remember the first part of the question?
Yeah.
I'll start with the last part. Why are people picking us? At least I remember that part of the question. I can answer it quickly. Look, the reason that customers are picking us is for the reasons that we say customers are picking us, and for the same reasons that those customers also stand up and tell people like you why they're picking us. And like I said, Peter from OpenAI was on stage yesterday explaining exactly why they pick us. It's because we are extremely fast at getting things done. Everyone has excess needs. If you can be somebody that has an answer, that has unique capacity: if the choices are two years, two years, two years, and eight months, OK, well, I like eight months better than two years. OK.
If you have unique requirements where you show up and you're like, hey, I have custom requirements around some of my networking or the way in which I want to power cap this infrastructure or I want to co-design it with you, or perhaps I want to deploy it in a location that might have lower cost. OK. No, no, no, yes. OK. So if someone's saying yes to the things that you need and they're delivering very, very rapidly, oh, and they're extremely high performance and very cost efficient, that's why people are picking us. And that's why it's not just one customer. It's all of the customers, both very, very large customers and small customers. When I showed the details of that 700-plus AI infrastructure customers, that's why they're all choosing OCI. Now, not 100% of the market. We do definitely have competition.
But as we continue to execute and as I think Peter said yesterday, this community is very small. As this information gets out there more and more, all of those people are coming to us with more and more capacity demands. Now, you're going to have to repeat the other two sections because I already forgot.
Yeah. Larry, you want to talk about training versus reasoning and how you think it changes?
Everyone's going to do reasoning, and very few people are going to do training. So it will definitely cross over. I wish. I think it's a great question. Actually, it's quite a fabulous question because I don't think we have enough data points yet to be able to figure out when it's going to cross over. I think what's going to help is the AI. Believe it or not, I think it has a lot to do with how good our data platform is. If our AI data platform is as good as we think it is, it's going to help people get to inferencing/applied AI/reasoning, all the same thing, using the models as opposed to training the models, which is, again, everybody. So we have to make it easy for everybody to do on their own private data.
Five years from now, will we still be spending more money on training than on reasoning? I don't know. Anyone? Clay, you want to take a shot at that?
Well, I'm going to have a complicated answer, and then Mike's going to have a useful answer. I think it's an interesting question. I think it's also a question that's unanswerable. So look, if you think about what training does: we take all of the information that humans are willing to share, and then we train something on it. Eventually, you run out of data. Well, how does data get created? People use reasoning to create data, and they write it down. If you actually look at where AI models are going, much of the data that they're going to be training on is actually data that they produced through their own reasoning. So if you look at what I think most training of AI models looks like in five years, it looks like AI models thinking.
Interesting answer.
And then AI models using the results of that and going, "Is this better or not?" If you look at how RLHF, reinforcement learning from human feedback, works today in these AI models, what's happening is you train a model, and then you give the results out to human beings who score them. And then you feed that back into the training data. Guess what happens as the models get really smart? You don't pay a bunch of people to score the outputs. You have models score the outputs. So I think that's what Peter yesterday was trying to say: "Hey, you think of these as different things. They are the same thing." And in the same way, we don't think of our brain as, "Well, are we using our reasoning function right now? Are we learning?" You're constantly reasoning.
As you're reasoning, you're also learning. It's going to be this iterative cycle.
Let me reframe the question in a way that is, forgive me, I think simpler to understand. When will the people creating AI models be taking in more money than they're spending?
I think that actually depends less on how much money they're taking in, because that's growing very, very rapidly. I think it's: when do we reach diminishing returns, where spending extra money no longer makes the model more valuable? If I had the answer to that, we would be running a different company.
Yeah. Even the simpler version of the question is still hard. It's the same question, and still hard too.
When we created.
Still hard to answer. I think the problem I have, and I agree with everything Clay said, is that it gets very blurry, because every time you ask an AI a question, it's learning something. It's training itself. So yeah, what is training? What is inferencing/reasoning? Let's just pick one example, OpenAI or Grok. How long are they in hypergrowth mode, where they're spending money faster than their revenue? Because that's very common in the early days; obviously, railroads spent a lot of money. When does it cross over? When do passenger tickets exceed the laying of track? I don't think we have quite enough data yet on inferencing and the speed of inference growth. Though one of the most interesting calls we ever got was: hey, do you guys have any capacity? Well, where? Anywhere? Well, how much are you looking for? All of it?
You want to buy all the capacity we're not using everywhere in the world?
Yeah?
OK. I've never heard that one before. But I hadn't heard a lot of this stuff before. This is very strange. I remember I called Safra and said, "Someone just called. What the hell is going on?" I mean, it's not like I was trying to sell anything. They called us. They just wanted to buy everything we had. So I don't think we have quite enough data. I think we'll understand it a lot better, believe it or not, a year from now. I mean, it sounds funny. But I think, Mike, it's really a philosophical multi-part question. It's a great question. How does synthetic data feed into this? Is it really retraining? Or is that inferencing?
And then the next frontier or two is: how does private data fit into all this? And do we take some of that scale and create very rapid small language models for enterprises, too? I think that's going to continue to evolve over the next, I would say, two years, and probably not even at that moment will we have clarity on it. But I think either way, we're going to be in a good position to serve those markets.
I'm going to say this is the last question for me.
Yeah.
You guys just keep going. It's fine.
If he leaves, I'm leaving.
Mike.
OK.
Yes.
We work together as a group.
OK.
You can't just walk off and leave me and Mike out here.
Watch me.
Well.