Excellent. Thank you everyone for joining us. My name is Keith Weiss. I run the U.S. Software Research effort here at Morgan Stanley. I'm really pleased to have with us from Microsoft, Scott Guthrie, Executive Vice President of Cloud and AI. I think there's a couple other things behind that. You have a wide scope of responsibility, but really a super interesting conversation ahead, given everything that's going on with Microsoft. Before we get into that, something uninteresting, our disclosure statement. For important disclosures, please see the Morgan Stanley Research Disclosure website at www.morganstanley.com/researchdisclosures. If you have any questions, please reach out to your Morgan Stanley sales representative. Excellent. Thank you so much for joining us.
My pleasure.
I think it's a fascinating time to be talking to you given everything that's going on within the Microsoft ecosystem. I think we should just start with what everyone's super excited about. From an investor standpoint, especially for people who have been paying close attention, the pace of innovation and what we're seeing in terms of AI and generative AI capabilities coming into the solution portfolio has really impressed people, and impressed them to the upside of expectations. Can you talk to us about how this came together? This is something that didn't just come to the forefront over the last month or two.
This is something you guys have been putting together for years. Can you talk to us about how you've built out the AI capability within Microsoft, and how we're now seeing it expressed across the product divisions?
Yeah. I mean, I think it's, it's an exciting time. I mean, I think, you know, we sort of call it the Age of AI that we're entering, and it's probably gonna be the most significant technology transformation in any of our lifetimes. You know, we've all experienced lots of big ones, but I do think, you know, it's gonna be over the next couple years, so that's not a statement around the next quarter or two. I do think this is, you know, very, very profound and really gonna change how business works, how society works going forward. It's, it's kind of been amazing on the technology side. I mean, this has been a bet that we've made, you know, going back many years now.
Yeah.
A deep partnership with the OpenAI team, and I know Sam's gonna be here at the event I think later this week.
Yes.
You know, it's been a great partnership where, you know, we kind of made some mutual bets on building kinda what we call the AI supercomputer, which is kind of a service inside Azure that we provide, which is really optimized around these very, very large language model trainings. You know, we kind of did jointly, you know, a whole bunch of architecture work kind of designing, you know, how they were gonna build the models, how we were gonna build the infrastructure, and really built something pretty special that allows these large language models to be trained very fast and iteratively. You know, kudos to the OpenAI team.
They really pioneered, you know, a tremendous amount of new ways of thinking about building these models, and the combination I think has really been magic over the last six months. You know, I think the road ahead's gonna be pretty exciting as we start to move from training these models and providing these models to really embedding this now into every single app and experience.
Right.
Even at Microsoft, you've seen, even this week, you know, yesterday we announced our Dynamics 365 Copilot.
Right.
And our Power Platform Copilot. We shipped our GitHub Copilot last year, and you're gonna see us kind of infuse this AI deeply, you know, throughout all of our applications, and it's, I think gonna be great for customers.
Right.
And really the next foundation of computing.
Right. If we think about it structurally within Microsoft, it's not just the OpenAI partnership. You guys have a lot of your own AI research that you do in-house. You acquired some interesting technology with Nuance and their DAX platform. From what I understand, there's a centralized AI core functionality, and then it's up to the product teams to figure out how to expose that through their own solutions. Is that the correct way to think about it?
A little bit, yeah. I mean, there's a core kind of what we call AI platform that we're building.
Right.
It's the same platform we offer to our external customers and partners. The nice thing is, you know, what, you know, Office 365 or Nuance or Dynamics or GitHub are using is the same platform infrastructure and the same capabilities that, you know, any external partner or customer can leverage as well, and we kind of believe that first party and third party symmetry is important. So there's a lot that we share.
Right.
You know, part of the opportunity with these large language models is the ability to kind of have them know a lot of stuff about a lot of things and be able to be used in lots of different domains. Then what we've built, you know, with our Azure OpenAI service as an example, is the ability for organizations or our internal teams to provide fine-tunings on the model specific for a use case to make it even better. You know, there's that promise of being able to leverage the large language models, which are trained on the public web.
Right.
And then that ability for, say, a Morgan Stanley or another customer to be able to take proprietary data and tune it even further and know that that model is only gonna be used by you, not by us, it's not gonna benefit anyone else, and you're gonna control access, you're gonna have, you know, the business model around it, you know, is what I think enterprise customers in particular are looking for, and all of the SaaS ISVs and startups that are gonna serve those customers are gonna need.
Right.
I think, you know, I don't wanna claim it's perfect, but I think we've got something special there and the fact that we're able to use it now for our first party products, and we're able to offer it to all of our customers and partners, you know, I think speaks to opportunity in the years ahead.
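(An illustrative aside: the isolation model Scott describes, a shared base model plus customer-private fine-tunes that never feed back into the base, can be sketched in a few lines of toy Python. The classes, fields, and tenant name here are invented for illustration; this is not the Azure OpenAI API.)

```python
# Toy sketch of tenant-isolated fine-tuning: a frozen shared base model,
# plus per-customer deltas trained on proprietary data. The base weights
# are never touched, so no other tenant benefits from your data.
class BaseModel:
    def __init__(self, name):
        self.name = name
        self.weights = {"shared": 1.0}  # stands in for the frozen base weights

class TenantModel:
    def __init__(self, base, tenant_id):
        self.base = base                # read-only reference to the shared base
        self.tenant_id = tenant_id
        self.delta = {}                 # tenant-private adjustments only

    def fine_tune(self, proprietary_examples):
        # Learning writes only to this tenant's delta, never to base.weights.
        self.delta["tuned_examples"] = len(proprietary_examples)

base = BaseModel("shared-llm")
customer = TenantModel(base, "example-tenant")  # hypothetical tenant id
customer.fine_tune(["doc-a", "doc-b", "doc-c"])

assert base.weights == {"shared": 1.0}          # base model unchanged
assert customer.delta == {"tuned_examples": 3}  # learning stayed tenant-private
```

The point of the sketch is the containment boundary: "your instance learns, but not the models that are shared by others."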
Right. That's important from both sides of the equation. If you're talking about large enterprises, particularly in regulated industries, you bring up both the security and the data residency side of the equation, and also the AI, if you will, gets smarter about your particular business process. It's meant to solve the problem or answer the question specific to your business.
Yeah, I mean, AI learns with data.
Yeah.
You know, one of the things you need to think about with AI is, as you provide data to that model and it learns, who owns that model? Who owns that data? That's why, you know, the trust is so important, I think, in this AI age.
Okay.
Again, our promise is your data is your data. It's not our data. You get to monetize it, you get to control it. And the foundation models won't learn from your data.
Okay.
You know? Your instance learns, but not the models that are shared by others. I think that promise matters, and frankly so does the trust that we've earned over the years at Microsoft. You earn trust in drips, and you can lose it in buckets, so it's really important to focus on that trust.
Okay.
I think that, you know, that puts us in a good position, I think, where, you know, I think people look at us versus maybe some of the other big tech companies and say, "Yeah, I feel like I can trust Microsoft."
Yeah.
You know, I think that hopefully makes us very good stewards in the years ahead, where people look at us as the trusted partner that's gonna help them fully leverage this AI.
Right.
To its maximum potential.
Got it. One of the reasons I was so excited to talk to you at this point in time is I think there's some really kind of foundational questions that investors are asking about the underlying technology, and one of the biggest ones is kind of the, how to think about competition, how to think about what's going to make one model better than the other. Let me ask you the question. Like, how should we think about judging whether a GPT model from OpenAI is better than what Google's bringing to the equation or what Amazon's bringing to the equation? What are gonna be the parameters of that competition?
I think there's gonna be two aspects. I mean, one is gonna be on the raw technical capability of the model. You know, obviously we're gonna be very focused on making sure that the base model at the get-go is super competitive. I think, you know, a year ago only a small part of the world knew what large language models were.
Okay.
You know, I think ChatGPT, you know, some of the things we've done with Bing, GitHub Copilot, like suddenly the world's woken up to, wow, this is pretty amazing stuff.
Yeah.
You know, we're gonna continue to see large language model innovations in the years ahead. I think the other thing to keep in mind about differentiation, to your point, is also gonna be the signal from use cases, from people actually using it. You know, if you look at GitHub Copilot, which was the first really widely used large language model service in the world.
Okay.
And you look at the accuracy of the code that was generated last July versus what it is today, it's dramatically better today. It's because, you know, as people are using it, the model's getting better, the accuracy is getting better. Sometimes being first to market, plus that signal-driven improvement, can really start to differentiate these models beyond even the base capabilities that are in them.
Okay.
I think that's, you know, that's partly why you're seeing us move as quickly as we are, you know, whether it's GitHub, whether it's Microsoft 365, whether it's Dynamics, whether it's Power Platform, Nuance, et cetera, there's literally nothing in our portfolio that we're not very aggressively looking to leverage AI, partly because we also wanna get that signal going. When you have hundreds of millions of commercial customers using your products.
Okay.
That's a lot of good signal that's gonna improve them, and that, I think, is gonna further differentiate.
Right.
Our models, versus others hopefully in the market.
Got it. That's a good segue to the conversation on Bing and the new Edge browser. On one side of the equation, I think the Bing announcement and the capabilities impressed a lot of people and really was a great marketing event for Microsoft, bringing to the entire world, if you will, the fact that these capabilities exist, and they exist within Microsoft. Then there was very quickly some blowback about, well, some of the answers weren't right. Right? Sydney emerged over the weekend, and it kind of freaked some people out. It sounds like that's part of the learning process, that part of making these models better is they have to get that usage up.
Yeah. I think, you know, the day we announced the new Bing, one of the things that we were very clear on was, you know, this is gonna be an evolution, and we're gonna learn and evolve, and we won't get everything right, and we're gonna keep improving, and we're gonna do it really fast. You know, that's been the approach the team has taken on the Bing side. I think they've done a good job of reacting fast. You know, in some cases, people are doing 200 prompts to try to cause the model to say something strange. You know, credit to the team that they're reacting fast.
We take it very seriously in terms of making sure, you know, AI is responsible, it's safe. That's kind of core to our DNA, and that's partly why it's not available to everyone yet.
Okay.
It's, you know, we start with a cohort of users. We learn, we improve. You know, we're gonna make sure that we deliver this technology in a really safe, responsible way.
Got it. In terms of the monetization avenues on a go-forward basis, I think one of the competitive advantages that it seems like Microsoft has is all these avenues that you can bring it out through, that you can productize it through. Bing being one of them. Like you were saying, GitHub and the whole developer platform is another one. Teams Premium. The one that we haven't heard from yet is Office and the overall productivity suite, but I guess that's still to come. There's the March 16th event. Is it fair to say it's a similar kind of perspective?
One of the things which is interesting is that all of these innovations you've been putting out have a price point behind them. This isn't just innovation folded into the overall software moving forward; Teams Premium is a SKU. Should the expectation be that that's going to be the route forward across the entire portfolio?
I think you're gonna see, I mean, I guess the way I'd look at it would be, you know, what is the productivity win you're giving to the business, whether it's around making an employee more productive or making a specific business process more effective. You know, take the example of, say, GitHub Copilot, since that's a product that's GA today. You know, we're now seeing that developers using GitHub Copilot are 55% more productive with it.
Okay.
You know, on tasks, and that's measured by independent studies. 40% of the code they're checking in is now AI-generated and unmodified.
Okay.
You know, if I talk to a CIO or to a CTO, and sort of say, "If I can give you 55% more developers overnight."
Okay.
"Would you take that?" You know, they're all looking for talent, they're all constrained in terms of the number of engineers they can hire. They're gladly gonna pay for that. We have a, you know, a good price point for that, I think, that is sort of a no-brainer for them to pay, and it's a great service and a great business. I think, you know, there's gonna be lots of opportunities here. When you look at AI, it's gonna be additive. You know, we're sort of adding new scenarios and taking cost out, enabling organizations to move faster, enabling them to be more productive.
I think, you know, from a margin perspective or from a revenue perspective, to your price point question, I think these things are gonna be additive to our overall business.
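(The "55% more developers overnight" pitch above is simple arithmetic; a quick sketch, with the team size invented for illustration:)

```python
# If each developer completes tasks 55% faster (the GitHub Copilot study
# figure quoted above), effective capacity scales by the same factor.
def effective_headcount(developers, productivity_gain=0.55):
    return developers * (1 + productivity_gain)

team = 100  # hypothetical engineering org
print(round(effective_headcount(team)))  # 155
```

That gap between headcount and effective headcount is the value a CIO is being asked to price against the subscription fee.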
Okay.
In a lot of cases, you know, some of these use cases, you know, cost a lot of money for an organization. You know, think of Nuance with healthcare, you know, physicians and physician visits. You know, physicians only see so many patients a day. If you can help them see significantly more patients and have a better experience, that is worth a lot. You know, similarly, say, call centers and customer support. If you can deflect a case without even having to have a human answer it, you have happier customers.
Okay.
You've taken a lot of cost out of the system. I think for each of these things, there'll be different ways we monetize, in a world where we're sort of delivering massive productivity wins. You know, customers want to pay it.
Okay.
It ultimately takes their overall cost down.
Okay.
You know, I think there'll be different opportunities for us to add additional value going forward.
Got it. Outstanding. Today we're seeing a lot of this functionality be exposed through existing applications. On a go-forward basis, do you think this changes the paradigm of how people build applications, and could it potentially shift that pendulum that we see between buy versus build applications more towards the build side of the equation?
Yeah, I think in the short run, the fastest way to get some of this AI value is gonna be through finished apps. Again, like the Copilot experiences that we're launching.
Okay.
You know, because people are already trained on the apps and it just augments the apps, integrates with it, lets them move faster. I think there's a huge opportunity there. I think we're also gonna see then the next generation of apps that are gonna be built on the raw APIs and the services around it that are gonna re-envision pretty much every experience that we see. I've told this story a few times about e-commerce. You know, we kind of all take for granted, you go in a web browser to an e-commerce site, there's categories on the left, you click, there's products, you click, you read the reviews, the price point, is it in stock, you add it to your cart, you check out. You know, that was kind of codified 30 years ago, and it hasn't changed dramatically.
Okay.
You know, I think we're gonna be in a place probably two holiday seasons from now where instead of browsing, I'm probably gonna have a text box and I'm gonna say, "I wanna buy a gift for a family member. You know, here's the price point. I want it delivered by December 19th. You know, this is what they like." It's gonna find the products for me.
Okay.
You know, that's gonna be a very different user experience, it's gonna be a very different question and answer experience, I'm gonna be able to ask it questions about the product versus scroll through hundreds of comments and reviews. You know, I think every organization needs to start to be thinking about, "Okay, how do I reinvent how I do, you know, retail, wealth management."
Okay.
"Manufacturing, routing, you know, just customer support?" You know, some of it's gonna be build, where people are gonna build it themselves, and I think, you know, big brands are gonna need to have more control, and a lot of it's gonna be buy: buy components of it, and how do you compose them together. I think part of what we're trying to do with the Microsoft Cloud is do both.
Also, you know, being able to point to the fact that, "Hey, how we built GitHub Copilot, or how we're building the Teams Premium, or how we're building the Dynamics, you can use the exact same APIs that we are." You know, gives us an opportunity to also talk credibly to other software vendors and to other enterprises about how they can do the same thing we are. I think, you know, I think there's a big opportunity.
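(A sketch of the "text box instead of browsing" flow Scott describes: a structured gift request filtering a catalog, the way a model-backed shopping agent might after parsing the user's sentence. The catalog, field names, prices, and dates are all invented for illustration.)

```python
# After a model parses "gift under $50, delivered by Dec 19, they like games"
# into structured fields, the rest is an ordinary catalog query.
from datetime import date

catalog = [
    {"name": "chess set",  "price": 45,  "ships_by": date(2023, 12, 15), "tags": {"games"}},
    {"name": "headphones", "price": 120, "ships_by": date(2023, 12, 22), "tags": {"music"}},
    {"name": "puzzle",     "price": 25,  "ships_by": date(2023, 12, 10), "tags": {"games"}},
]

def find_gifts(max_price, deliver_by, interests):
    # Keep items that fit the budget, arrive in time, and match an interest.
    return [p["name"] for p in catalog
            if p["price"] <= max_price
            and p["ships_by"] <= deliver_by
            and p["tags"] & interests]

print(find_gifts(50, date(2023, 12, 19), {"games"}))  # ['chess set', 'puzzle']
```

The hard part, turning the user's free-text sentence into those structured fields, is exactly what the large language model contributes; the query itself is the easy 30-year-old e-commerce machinery.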
Okay. One of the presumptions I made in an earlier question was that the ChatGPT or the Bing announcement, more succinctly, was a great marketing event for Microsoft. Is that correct? Has that spurred more customer conversations for you guys? Maybe more broadly, where are we in terms of the customer conversation around these generative AI models, or more AI more broadly? Like how far into this opportunity do you think we are?
I think we're still early innings. I mean, I think, you know, the thing that's been great about ChatGPT and also Bing is the fact that end users can do it. You know, the number of people I've talked to who, you know, maybe haven't used all of the new products from all the tech companies, but seem to have tried those two, and said, "Hey, you know, we're using it now," or, "My children are using it for homework, which they're not supposed to."
Right.
You know, or, "We're using it in a variety of use cases." You know, I hear more and more interesting ones. I think it has actually made real what had been a very technical concept, large foundational AI models.
Okay.
You know, transformer-based learning. You know, most people didn't know what in the world that even meant 12 months ago, and yet hundreds of millions of people have heard of ChatGPT and Bing now and tried them. You know, I think that is actually making it much more real, which is giving us an opportunity to then say, "Hey, let's show you how you can use this in customer support. Let's show you."
Okay.
"How we can use it with developer productivity. Let's show you how we can use it for sales productivity." It's a good conversation starter.
Right.
You know, and again, I think people are looking for solutions that integrate with their workflows that they already have and help them kind of accelerate even more.
Got it. One of the things that investors are struggling with a little bit here is that it seems like there's just a massive opportunity ahead of Microsoft. It's something that Satya has talked about: this is what's expanding IT as a percentage of GDP from 5% to 10% over the next 10 years, in the way that he lays out that market opportunity. But in the near term, we're talking about cloud optimizations. In the near term, we're seeing Azure growth decelerate. Can you give us some perspective on what those cloud optimizations mean? What are customers doing?
Are they changing their views on how they wanna use cloud fundamentally, or is this more of a short-term tactical impact that's just about sort of getting in line with budgets?
Well, I think, you know, cloud optimization has been sort of a core part of the cloud journey for 10-plus years now.
Okay.
I don't think it's necessarily new per se. I mean, in general, you know, the typical pattern we've seen going back many years is, you know, you either migrate a workload to the cloud or you build a new workload in the cloud. You then optimize it, and then you reinvest the savings, and you rinse and repeat with the next workload or the next use case. You know, we like that optimization process. I mean, I think, you know, we have dedicated teams at Microsoft that help our customers with it because we know it earns loyalty, we know it earns confidence to move more. You know, at the end of the day, if we can help our clients and customers and partners get more out of their investments, you know, we kind of know they'll spend more with us.
Okay.
They'll invest more in digital technology, which increases the overall size of the pie. You know, in general, we like that. The types of optimizations people do: sometimes when they're first moving, they have a large test environment, and they go, "Okay, can I do more test in production and shrink that test environment a little bit?" "Can I take advantage of things like Reserved Instances or the new cost savings plans that we have in Azure, where I commit to a longer term and reduce my per-unit cost a little bit on an ongoing basis?" Then sometimes they right-size VMs or right-size databases. That's all natural.
There's only so much optimization you can do until you're done.
Right.
You know, while people are optimizing, it's not like they're gonna optimize it down to zero. At some point, you get done, and then you go onto the next workload. You know, what's happened in the last, you know, six months or nine months has been, as the macro situation's been uncertain, people have been, you know, I'd say optimizing even more.
Okay.
They are sometimes holding onto those savings a little bit longer before they, you know, reinvest it. You know, I haven't really heard from any customer a long-term change in terms of cloud. There's always more workloads, there's always more use cases, and as we've been talking about with AI, you know, if you're not constantly reinventing yourself with digital technology, you're gonna be under severe competitive pressure. You know, I don't so much worry about the long term.
Okay.
It does lead obviously to short-term questions in terms of.
Okay.
That cloud optimization journey and what exactly the impact is. Again, longer term, I'm not hearing any changes.
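(The Reserved Instance and savings-plan optimization Scott mentions above is worth making concrete. The hourly rates and discount below are made-up illustrative numbers, not Azure pricing:)

```python
# A commitment-based discount trades flexibility for a lower unit rate:
# the classic move for steady-state workloads that no longer need
# pay-as-you-go elasticity.
HOURS_PER_YEAR = 8760

def annual_cost(hourly_rate, hours=HOURS_PER_YEAR):
    return hourly_rate * hours

on_demand = annual_cost(0.10)        # hypothetical $0.10/hr pay-as-you-go
reserved = annual_cost(0.10 * 0.6)   # hypothetical 40% discount for a 1-year commit
savings_pct = 100 * (on_demand - reserved) / on_demand
print(round(savings_pct))  # 40
```

This is also why the optimization is finite, as Scott says: once the workload is right-sized and committed at the discounted rate, there is nothing left to squeeze, and spend shifts to the next workload.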
Right. And that concept of, like, optimization can only take place so long, is that what Satya's talking about when he says that he thinks this optimization activity lasts for a year, but it's unlikely to go significantly beyond that?
I think we're trying to be a little careful in terms of giving guidance on a specific quarter.
Mm-hmm.
Or annualized basis. I'm not trying to say that because, you know, I think the reality is, you know, we'll see. I think at some point when you optimize a specific workload, you can't optimize it anymore. There is a finite amount on a workload basis, and that's why, you know, again, we ultimately feel very good in terms of long term for cloud and don't see any kind of strategic shifts that our clients are making. It's more a case of, you know, it is gonna have at times a dampening effect in the shorter term. Again, the more they optimize, the more value they get, the more they generally in the long term wanna invest with us more.
Again, I look at AI and I look at other new use cases that are coming out and hear a lot of excitement around people saying, "Yeah, I gotta be ultimately doing more of this and this and this."
Okay.
I do think those reinvestment savings, you know, for me are not in doubt. You know, obviously we're all wondering exactly the when on a quarter-by-quarter basis, but, you know, again, I have confidence in the long term.
Got it. Got it. I wanna ask a question about Azure gross margins, but it's kind of a roundabout question. When you guys did the Bing announcement, Satya pretty directly went after the gross margins of the key competitor there, Google, saying that this is going to be a lower gross margin business and we're willing to spend on that. It didn't take long for investors to say, "Well, if it's lower gross margins for search, is this gonna be lower gross margins for Azure and the other cloud businesses as they roll more of this AI functionality beneath those platforms?" Is that the case?
Is it that, because this is more computationally intensive, because you have to bring in GPUs to do the training, this is going to be a compressive impact on overall cloud gross margins for Microsoft?
I think overall, I mean, the thing I would probably point to, is gonna be the fact that these are new workloads and the fact that it really opens up more top line revenue. For a lot of these use cases, take a developer. If you can make that developer 55% more productive, I gotta believe there's a lot of gross margin in there.
Okay.
Because that ultimately translates into, you know, real opportunity to transform how an organization gets productivity out of their employees. I do think we're gonna see, depending on the use case for AI, different ways that we'll monetize and different margin structures. You know, I ultimately look at it as, if we can keep growing productivity dramatically for businesses, that probably is gonna be, you know, definitely a long-term opportunity from a TAM and revenue perspective. I think it's gonna allow us to maintain some good margins as we do it.
Got it.
You know, obviously that'll continue to evolve in the quarters and years ahead.
Got it.
As people take maximum advantage of it. I think the other thing on AI, that's worth thinking about is that AI has gravity, meaning, you can't go faster than the speed of light. If you've got an application or a database and you've got an AI model, you know, the further they're apart, the slower the network path and the calls between them are.
Okay.
You know, for a lot of these use cases, take something like GitHub Copilot: it's not like we're making one AI inference; we're doing it on every keystroke. You know, the further that is apart, the slower the experience. I do think what we're also gonna see, as we look at AI opportunities with Azure specifically, is there's both the direct Azure AI model opportunity, but there's also the fact that people are gonna increasingly wanna move their apps and their databases into Azure to be close to those large models. You know, that's also gonna be an opportunity for us to not only sell more AI, but also to sell more VMs, more storage, more databases, more everything.
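(The "AI has gravity" point lends itself to a back-of-envelope latency budget. The round-trip times, inference time, and session length below are illustrative assumptions, not measured numbers:)

```python
# An inference on every keystroke means network round-trip time is paid
# on every keystroke too, so the distance between the app and the model
# dominates the user's experience.
def session_latency_ms(keystrokes, rtt_ms, inference_ms=30):
    # Total time a user spends waiting on the model across a session.
    return keystrokes * (rtt_ms + inference_ms)

typing_session = 500                                          # keystrokes
same_region = session_latency_ms(typing_session, rtt_ms=2)    # app next to the model
cross_region = session_latency_ms(typing_session, rtt_ms=80)  # app an ocean away
print(same_region, cross_region)  # 16000 55000
```

With these assumed numbers, moving the app next to the model cuts total wait time by more than two thirds, which is the pull toward co-locating apps and databases with the models.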
You know, I think we also see that as a real opportunity both with customers that we have today, but also there's a lot of customers that we don't have today. You know, take this particular zip code: it has not been our strongest, because it's very much open source-based developers.
Right.
Which has not been Azure and Microsoft's historical strength. You know, this is creating a conversation where people are like, "Look, we wanna take advantage of these large language models. Like, we wanna talk about how we could use them."
Okay.
You know, we need to first show our value with the models. We need to first show the capabilities of Azure. I do think this is gonna be a great door opener for a lot of customers that haven't really given us a hard look yet, historically, because they don't, you know, they already had a cloud. The fact that the OpenAI models run exclusively on Azure, you know, I think ultimately is gonna be a big differentiator for us.
Got it. Both through sort of pricing and volume, you can make up for any kind of higher compute intensity needed in this type of process.
We're continuing to just sort of really optimize these models.
Yeah.
You know, I think, you know, a lot of people were surprised last week when OpenAI lowered their prices by a factor of 10.
Right.
That's partly 'cause they found a way to lower the cost of inferencing.
Right.
Correspondingly.
Right.
They, you know, they said, "Hey, we can get a lot more revenue, open up a lot more opportunities when it's more effective." You know, that model that's cost optimized didn't exist 30 days ago.
Right.
We're in still early innings.
Right.
And there's, you know, there's still a lot of optimization and learnings that we're doing, and, you know, it's gonna be fairly dynamic, but I think it's gonna be exciting. Yeah, again, just looking at some of the ChatGPT use cases that people have posted about, or the Bing use cases people have posted about, you know, they're already using it in fascinating ways that none of us, I think, would've thought of a year ago.
Right.
Yeah, I think we're gonna see far more use cases over the next year.
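(The 10x price drop Scott mentions above is per-token pricing, so it directly changes which use cases pencil out. The monthly token volume below is invented; the before/after rates are stated as hypothetical round numbers consistent with a factor-of-10 cut:)

```python
# Per-token pricing: a 10x cheaper model turns a $20K/month workload
# into a $2K/month workload at the same volume.
def monthly_cost(tokens_per_month, dollars_per_million_tokens):
    return tokens_per_month // 1_000_000 * dollars_per_million_tokens

volume = 1_000_000_000                 # hypothetical 1B tokens/month
before = monthly_cost(volume, 20)      # hypothetical $20 per 1M tokens
after = monthly_cost(volume, 2)        # $2 per 1M after a 10x cut
print(before, after)  # 20000 2000
```

Equivalently, the same budget now buys ten times the volume, which is the "more revenue, more opportunities" trade Scott describes.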
Right. That optimization activity that you were talking about that is bringing down AI inference costs, I would assume that's really just getting started right now. It's a motion, it's a muscle memory that Microsoft has. I mean, you guys have been driving up cloud gross margins and Azure gross margins pretty materially over the past couple of years. One of the questions I get from investors is, does the fact that you guys don't have a GPU design of your own inhibit how far you can go on that optimization curve? Google has their own GPUs, Amazon has their own design for GPUs, but Microsoft doesn't. Is that any significant inhibitor in terms of how efficient you could get?
Well, in general, I mean, I think we're looking for how we optimize everything, you know, whether it's the silicon and GPUs, whether it's the network interconnects, whether it's data center designs, whether it's server hardware, whether it's fiber. You know, some of those things we're doing organically. You know, two of the acquisitions we've done in the last six months are, you know, hollow core fiber through a company called Lumenisity, and Fungible, which does storage and I/O optimization with DPUs. You know, these are very specific scenarios that maybe five years ago would not have made any sense for us to be investing in.
Now, at the scale that we're operating on, it makes a lot of sense to invest in, and you're gonna continue to see us innovate both organically and inorganically to kind of optimize every layer of the stack. You know, part of the reason why not just OpenAI, but if you look at a lot of the other large language model startups out there, are using Azure, is because we have some pretty differentiated hardware with our AI supercomputer. You know, that again includes silicon, hardware, network, power, data center, and a whole bunch of other design elements, and you're gonna continue to see us innovate in that. You know, we're gonna be looking for opportunities in every layer of the stack to do optimizations.
And I think the partnership we have with OpenAI and the fact that we're building all of our own apps deeply taking advantage of these models, is also giving us real signal on what optimization really is gonna matter and how do we, you know, continue to be on the bleeding edge of optimizations. You know, I do think that signal for us has helped a lot in terms of what we've built. We wouldn't have built it the way we built it without the partnership we had and without some of these early applications. You know, I think that ultimately, hopefully is gonna help, again, both at the platform layer and at the app layer, give us some good differentiation in the years ahead.
Outstanding. Unfortunately that takes us to the end of our allotted time. I could have this conversation all day long, but thank you so much, Scott, for joining us.