Welcome, everyone. It's so great to see a really packed room here, and folks on the webcast as well. Welcome to our investor and analyst Q&A at Adobe Summit 2023. This is Jonathan Vaas speaking, VP of Investor Relations. I'm gonna do some quick housekeeping. I'll try to keep it quick. We'll introduce our speakers, then we'll jump right in with Q&A. We're gonna try to leave a little bit of time at the end just for some meet and greet as we usually do. So with the timing of this, it's been four years, believe it or not, since we've had an Adobe Summit in person. Just had our Q1 earnings last week. I couldn't be more glad about the timing and the innovations that we just announced this morning.
I think there's a lot of things that are probably on your mind, so we're really glad to just have this hour to converse with all of you. In terms of the housekeeping, from earnings last week, you saw the legal disclaimers in our press release, the non-GAAP disclosures. You will hear forward-looking statements discussed in this meeting, and all of the legal disclaimers from the earnings release apply to this. You can look at the risk factors in our SEC filings for more information, and you can find the reconciliations between GAAP and non-GAAP numbers on our investor relations website and in our SEC filings and in the press release as well. With that, I'll... Oh, one more thing.
For attendees in the room that registered with the analyst meeting code, you will have received or will be receiving today an invite to participate in the Firefly beta. For folks on the webcast, if you're interested, you can go to the website and reach out, give us your information, and they'll put you in the queue to get access to participate in the beta. We're really excited for all of you to start playing around with the tool, seeing the innovation and being part of that conversation and give us your feedback. I'm just gonna introduce so folks on the webcast as well can visualize our speakers up on the stage.
They'll each share kind of a brief opening statement on what's on their mind, and then we'll jump right into Q&A. Nearest to me and moving to your left, we have Dan Durn, Executive Vice President and CFO of Adobe. Anil Chakravarthy, President of Digital Experience, sitting to his right. Shantanu Narayen, Chairman and CEO of Adobe, and then David Wadhwani, President of Digital Media. Shantanu, maybe if you just
Sure.
wanted to say a few words.
I will. Do I have to say the financial disclaimer again as well as, you know, the reconciliation between GAAP and non-GAAP? No, I'm just kidding.
I'll jump in and say it after every question.
I'll echo what Jonathan first said. Thank you for being here. I mean, I think there's nothing like being at a Summit to both hear the innovation that we're delivering as well as, as I'm sure you all do, engage with customers to hear what's top of mind for them. We're really pleased to see as many customers back here in person as we have. As you know, we also take this on the road, and we're gonna be doing this, you know, next in Europe. I think today was a really exciting day because we've been talking a lot about, you know, our vision for content and everything associated with creating that or managing that or monetizing that or delivering that.
Hopefully you saw a lot of cross-cloud innovation that really brings both of these together and enables Adobe to address customer needs like no other company I think can. Some of this, clearly as we engage with our customers, we'll talk about what our thinking is. You know, some of them are in beta, and I know David will talk about Firefly and beta, but I'm also really proud of the thoughtful way in which we both innovated on the product, but equally important, innovated on how we will engage with our customers and monetize that moving forward. On the digital experience, given this is Summit, the things that Anil will talk about, content supply chain and what we've done around that, what we're doing around product analytics.
I think we've always had, you know, the vision for being that central nervous system for the enterprise as it relates to engaging with customers. The Real-Time Customer Data Platform, that business we keep talking about how it's achieving escape velocity. With that, let me have Anil talk about all the amazing innovation that's happening in his group.
Thank you, Shantanu. Thank you all for being here. It was obviously great to see a packed house here at Summit. A few key announcements that I'll recap. We are driving this new era of what we call experience-led growth, driven by content, data, and AI, including generative AI. There are some big announcements that we are really proud of. As Shantanu mentioned, the Adobe Experience Platform and the three apps we already have on it are reaching their escape velocity and really growing extremely well: the Real-Time CDP, Journey Optimizer, and Customer Journey Analytics. We were really thrilled to announce today a fourth app, which is our Product Analytics, because digital marketing and digital products are really converging.
Being able to get that integrated view of what users are doing with different digital products and being able to optimize that for the customer experience is critical. Only we can do that because we have both of those datasets in an integrated manner in the Adobe Experience Platform. That was a big announcement for us. The second big announcement was really around the content supply chain, the end-to-end view of how you can produce content, very effective personalized content, to drive campaigns. Content demands are increasing extremely quickly, and no company wants to invest manual resources, whether it's in-house teams or agencies, et cetera, to keep up with that. They need an automated solution.
It's perfect for us to bring our Creative Cloud as well as our Experience Cloud products together and take advantage of all of the new innovations around Firefly and generative AI. That will make content production much more effective.
Much more efficient. Those were two of the big announcements that we wanted to highlight. In addition to that, we also have a number of generative AI initiatives within the Digital Experience Cloud, where I believe we're at the forefront. For example, whether it's marketing copy generation, new audience generation, campaign generation, all of these things are possible with generative AI. We're going to take the best of what's available in the industry and integrate that with our apps. Very thrilled to be making all these announcements at Summit.
Maybe David?
Yeah.
Are you already asking a question?
I have a four-part question, you know I have.
From the Digital Media perspective, first of all, just excited to be here and starting to see Shantanu and Anil talk about how these clouds are coming together. But if you take a step back, we're coming off a lot of momentum in the business, right? We saw, you know, incredible demand for both Creative Cloud and Document Cloud applications and solutions. We have a set of new businesses, like Substance and Frame.io and others, starting to really take shape. On the back of that, we have, you know, the momentum we're seeing with Adobe Express.
I think today's announcement with Firefly is really an accelerant to all of those businesses, largely because of the way we've built this technology and the way we're introducing it to market. First, the content and the quality of the content is incredibly good. Second, it's really built in a way that is safe for commercial use, which we think takes it from something that has been a point of interest for a lot of people and individuals to something that can be leveraged by organizations. Three, it's built from the ground up for embeddability into not just the creative products and the document products, but also the DX products, as you started to see on stage.
Four, we're doing it in a way that is really pulling, you know, the interest and excitement of our creative community into every decision that we're making. I think if we navigate these waters in that way, we put ourselves in an incredibly advantaged position to do great things. Then just quickly closing on the points of intersection that we see with DX. We spend a lot of time in the creative business talking to all of you about the creator economy, solopreneurs, individual creatives, and the way we sell into enterprises and mid-market. That continues to be a core focus for us.
As we start integrating, you know, everything from Adobe Express to Firefly and the series of generative models that we're going to be releasing in the weeks and months ahead, more deeply into the digital experience process, we become much more a part of the automation that's happening in these organizations. Very excited about everything we're doing.
Strong start to the year. You all heard it on the earnings call last week. When you think about the top line, $4.66 billion, $410 million from a net new ARR standpoint, both significantly above the midpoints of our guide. Both of our businesses, our DX business and our DME business, up 14% year-over-year. RPO up 13%. Document Cloud ending ARR is up 22% year-over-year. Top-to-bottom strength across the portfolio of businesses. At the core of that is the innovation engine that you saw on display earlier today. The company's firing on all cylinders; it's great to see that engine of innovation. In addition to the innovation, I think what makes this company special is we're uniquely positioned against the opportunity in this environment.
Every single company wants to drive their top line while simultaneously delivering productivity gains. Nobody's better positioned in that area than Adobe. You see it on display. We're not only driving strong profitable growth for our customers, we're delivering it for our investors. In addition to the top-line momentum we just talked about, significant achievement from an operating margin standpoint, $0.12 a share above the midpoint of the guide. Top to bottom, the company is performing well. The engine of innovation is alive and well, pushing across a diversified portfolio of businesses, driving top and bottom line, both for our customers and for investors.
Great. Thanks for that. Jessica over here is our mic runner. Jess, come on over this way. Let's take our first question from, let's go right here in front of me, in front of the podium.
Thank you, Jonathan. Thank you, team. Just wanted to ask a question about the intersection of the new product announcement with Product Analytics and the next generation of design collaboration. When I think about that, those tend to be more IT department-focused. I have two questions here. One, on the strategy: is this the Trojan horse for Adobe to move deeper into the IT department? The second question is, if that is the strategy, does it require investments in a new sales force to be able to attack that opportunity?
I can start, then certainly people can jump in. I mean, if you look at the strength of the Experience Cloud business, there are multiple centers within an enterprise that buy the Experience Cloud business. I mean, we have fabulous relationships with the chief marketing officer. We have our CIO in attendance here as well. So I think tackling the IT folks has always actually been a skill. I don't view it as a new motion or a separate motion in terms of what we have to do. That's really what's been driving the strength of our Digital Experience business. I think what's even more exciting about this, at least from my perspective, is, you know, all of our products.
If you look at the products right now, if you look at what the sentiment is, it's the viral adoption of these products within enterprises that's really driving the growth of all of these. When you think of Adobe Express, being able to have a site license, much like we've done with Acrobat, leverages all of that. I think the products that you've seen just appeal to users, and they'll just pick them up and start using them, both as an individual as well as an enterprise. You know, they've done a really nice job in these products of understanding whether you're an enterprise user, and therefore what content management system is at the back end, or whether you're an individual user and how you're publishing it. I don't see this as an extra investment.
In fact, I see it as leveraging the strength that we already have in the IT community. One last thing I'll say there is, I mean, if you look at the success of the Real-Time CDP in particular, that is as core to, you know, what's running the operations of the business as anything. There, every time, you have the business owners driving the need for it. In retail, maybe it's the person who owns the website. At the end of the day, the implementation of that is happening a lot through IT and through the partner ecosystem that we have.
We're gonna have a transcript of this, so if folks would introduce yourselves. Brian Schwartz from Oppenheimer, you didn't introduce yourself. If folks would do that, we'll go to Karl next.
Hi, Karl Keirstead with UBS. Thank you for the fire hose of press releases this morning. I actually could have used a ChatGPT bot to summarize them all; it was good reading. My question is maybe for Anil and Dan on DX. I think everybody understands that given your Q1 performance and Q2 guide, you need a second-half acceleration in DX to hit your full-year guide. I'd like to ask you, what needs to go right to make that happen? If there is a product or two that's going to lead that second-half acceleration, what is it? Thank you.
Yeah. We definitely had a good start to the year, as you saw in Q1. At Summit so far, you've seen the fire hose of announcements of new product innovation. The attendance has been great; it's packed. So far, it's been going really well. We believe that this reflects the pipeline that we have going into the second half of the year. I think as companies across all industries continue to look at this particular area of customer experience management as one of their critical priorities, that is what is going to continue to drive the opportunity set for us.
Yeah. Just to build a little bit on what Anil said. In an environment like this, deals are getting a higher level of scrutiny. They're getting more C-level scrutiny. We welcome that level of scrutiny because of the horsepower that you saw on display this morning, simultaneously delivering revenue and productivity gains for the enterprise. We're seeing those transformational deals win out over narrow, single-product-focused solutions. A narrow solution doesn't generate the completeness of solution for our customers, and our customers are rotating towards that completeness. That's driving the pipeline, and that's on a global basis. Again, the diversified global footprint gives us look-through visibility in any given quarter. We see what the opportunity is in that quarter, and then we focus intensely to deliver that opportunity in a given quarter.
Last thing I'd say, we touched on it. Acrobat apps, great business, showing a lot of momentum. That's gonna add; at the scale it's at now, we expect that momentum to continue throughout the back half of the year. I think it positions us well to deliver against those objectives.
Let's go to Kash and then, Brad right next to him.
Thank you so much. Sorry, Jay, I had to go before you, but it was my design.
You know, today, Jay is reaching a very special milestone. He's gonna complete 40 years of being a sell-side analyst on Wall Street.
Wow.
Congratulations, Jay. I don't know how you do it, but we'll have to do a fireside chat with Jay one of these days and chat with you guys on how we've dealt with Jay for 40 years. Anyway, Kash Rangan of Goldman Sachs, congratulations on Summit. I'm curious, given the talk of the town is generative AI. Microsoft has taken a few opportunities to explain how it impacts the industry and their business. We didn't hear much. You did entice us with this Firefly thing, but take a step back. How truly significant is this? You've had Sensei for a decade or so. How does this really change the landscape? Or is it something that the industry is jumping into, not exactly sure, but you don't want to miss out on it, and then we're gonna be disappointed?
How do you view this? What is your level of conviction that this is something that is more sustained and long-term, and durable? Thank you so much. Congrats, Jay.
Congrats, Jay. I'll start, Kash, and then maybe David, you can certainly add to that. I mean, you know, Kash, I think while the attention is on generative AI, it's not a new journey for us in terms of saying, if we want to unleash creativity for all, how do we make our products more accessible, productive, fun, and easy to use and make them available on all surfaces, make them available for all media types, enable the onboarding with Adobe Stock. I think what is not new is the vision that we've always had of saying, how do we get more and more people?
We do believe everybody has a creative idea, and if you have billions of people who have that creative idea, translating that, you know, whether you're a business, whether you're a small and medium business, or whether you're an individual, has always been one of our goals. I think when you look at AI and what AI has been able to do, I mean, we talk about Neural Filters in Photoshop, and Content-Aware Fill. I don't know how many people have been here and talked about Content-Aware Fill. You know, that just makes this magical way of allowing our creative products to be, you know, phenomenally accessible to people. The AI part, and again, it was a matter of great pride. I mean, we won an Oscar this year for 3D.
We won an Oscar for technical achievement for imaging, for video. I mean, the depth of our understanding of this is really so profound that I think we've always believed in how computing can help with, you know, automation, accessibility. I think the focus on generative AI for the creators, first, we believe the assertion that it's not going to replace humans, it's going to augment human ingenuity. You know, templates sort of did that. I mean, people are terrified of the blank slate, and if you can start off with something that enables you to get the process going, it's actually going to attract more people to the platform. In terms of enabling more people on the platform, we're very excited about it.
In terms of adding more value for a Creative Cloud subscriber, we think, you know, that's also exciting. I think what's new and maybe why there's more excitement around it is there's this confluence, tipping point, inflection point, or whatever you want to call it, of all these technologies that have come together, right? The ability to take text and encode that text and create a mathematical model associated with that. The ability to have all this data and, you know, how do you create data? We have been, and I'm sure David will touch on this, the curation of that data and the training and inference of the data, I think we have some of our researchers here also in the room, and, you know, I've spent days with them trying to understand that, and you marvel at it.
I think what's really new is what NVIDIA is doing today with GPUs and, you know, their training chips and their inference chips and being able to do that, the fact that we can store this in the cloud and access it. I think what's new, Kash, is that all of these are coming together in a really incredible way. We've always been thoughtful about the fact that we have to do two things. I think a lot of the noise that you're actually hearing is from companies who don't have a model, who don't really have, you know, the core data. All they've done is they've built a workflow.
They built some sort of workflow, and they're like, "Hey, we're now in the generative AI space." The fundamental models, whether it be a text model or a visual model, there are three or four companies on the planet that actually have the capability and the science to do that. Adobe has this unique opportunity in that we can both do the model, which is Firefly. I mean, the hard part about Firefly is making sure that it works for images, that it works for vectors, that it works for 3D, that it works for video. We have that, and we have the footprint, the tools, that enable people to say, "Hey," now suddenly it accelerates people's ability to do what they want to do. I think that's the unique thing.
Yes, the buzz is there and the excitement is there because of, you know, ChatGPT and the other ones, and, you know, we've integrated that as appropriate. I think this fundamental notion of how do we get technology to enable more people to do what they do? At the end of the day, is that gonna be more people coming in or less? Our assertion is there are gonna be more people, and they're different business models, whether it's, you know, what we might do in stock to allow generative content, what we might do in the apps to allow, you know, people who are subscribing to get access to this, what we might do as APIs.
An enterprise might say, "Hey, I'm not gonna use your, you know, non-curated data." If you have the world's largest content supply, you're like, "How do I use your model to train my data so that only my model is superior in what I'm doing at whatever large content company that you have?" I think we've been extremely thoughtful about all of that stuff. We don't focus on the press release. We're focusing on the innovation, despite the fact that we have a lot coming.
Yeah. I think that was very complete. The only things I would add are, first, we believe very much that, you know, generative AI is the starting point of a workflow, right? When you create an asset, whether it's an image or a video or even an ingredient, a creative ingredient like a Photoshop brush, generative AI lets us do that in a more precise way than we've been able to do in the past, and it avoids the blank screen problem. You can take it into Express and start to adjust it the way you want and create something more precise, or take it into Photoshop, or workflow it into a DX workflow for automated marketing content generation.
We think that generative AI is the foundation that is going to really inflect this business, not just, you know, what people can do with it. As we look at the business, we look at it in terms of three things, three ways we've always talked about. First is top of funnel, right? How do we bring more people into Adobe? Express has already started us down that road and is doing a great job. When we start to be able to tell people, "Come to adobe.com/firefly," and all you have to do is type in a prompt of what you want and you start that process, it increases creative confidence in those who don't have it, and it brings people into the funnel.
Top of funnel is massively benefited by what I think we can do with generative technology. The second thing is we've said we already have very good retention rates, but what are the things that are gonna drive those retention rates even higher? It's the ability to be more productive in the product. Generative AI integrated into our flagship creative applications, I think, has the potential to drive more engagement than we already have, which is already high. The third piece is the ability to also offer new upgrade opportunities, upgrade SKUs, to take people who are using our products and let them sort of upgrade into more capabilities.
Every step along the way, we have built Firefly in a way that's embeddable with the community in mind, with usability and value in mind, and with the business model in mind.
Okay, let's go to Brad, and then, we'll come to Mark after that.
Great. Thank you so much. Brad Zelnick with Deutsche Bank. Thanks for making time for us. This is obviously a customer-focused event, so really appreciate you hosting the financial community. Great to see the innovation on display, particularly as it spans across both creative and DX, which we've been writing about; we think it's really special and unlocks a lot of potential. My question is really around partnerships. You announced a relationship with NVIDIA and expanded upon the relationship with Microsoft. As we think about all this opportunity with AI and the partnerships to come, there are certain things naturally that Adobe is going to look to partners for. You're not going to become a semiconductor company. At least, I don't think. Anything that's core
We hired Dan, that was step one.
Maybe there is a message in that. But there are also things that naturally, you know, 100% belong to Adobe. When we think about the gray area, the things that are in the middle, what is the guiding light for you as you think about where it makes sense to own versus perhaps compete or cooperate? Maybe if I could sneak in one for you, Dan. As we think about the time horizon of the investments required to drive Firefly and all these exciting things, AI is a huge proposition. How should we think about, you know, the guiding principles around investment time horizon and when we should see the payback? Thank you.
Do you want to start with DX?
Happy to. In the world of DX, as we talked about, when it comes to creative content, where we have a secret sauce, where we have special assets like Firefly, it obviously makes sense for us to work very closely and, from the get-go, integrate Firefly with Adobe Experience Manager Assets and with other tools like Adobe Express. When it comes to things like large language models, for example, we don't necessarily have, you know, any special experience in that area. One of the things that we have done, therefore, is work with Azure OpenAI, work with LangChain, work with some of the other ones to do that.
The criteria that we think of is when we have any kind of proprietary data set, if we have proprietary experience or we need something special for our apps, that's where we would build it. Otherwise, it makes sense to partner.
Yeah. From our perspective, you know, the things that we know we want to and need to own are creative media content generation. That's why with Firefly, we started with image generation. We're also doing text effects. You'll start to see literally in the next few weeks and months, new capabilities lighting up all the time. The progress and the pace of innovation there, I think is gonna hopefully surprise all of you with just how fast and quick we're moving there. We've got, you know, hundreds of people that have been working on all these creative media types for, you know, well over a decade, and this is really the inflection point that lets them get a lot of this research directly into market.
That is an area we feel we really need to own. Specific to the announcements with NVIDIA, for example, who had their conference today, our partnership with NVIDIA is different than what most people are announcing in terms of their partnerships, because we're building and leveraging our own model along with the chip and cloud infrastructures that they announced today. We have complete control over what we do, including the ability to build it in a way that can be atomized and integrated into our tools and workflows. Other areas, like large language models, when Anil talks about those, he's talking about ChatGPT and Bard and those kinds of things. Those are areas where we think we can leverage partnerships to drive more conversational experiences and interfaces on top of our products.
By the way, as we do that, I think there's also growing confidence that not only will it make our products more approachable and easier to use, but it also starts to fade away the hard lines between the products, in effect allowing people to work across DX and DME in a much more, I think, smooth way than in the past. That's how we think about it.
Maybe two other quick things, Brad. It's not new, if you think about what Adobe has done, right? I mean, the chip one is straightforward, which is like, you know, you pick up the phone, talk to Jensen, and say, "Hey, we'd love to figure out what we can do on training and inference," and, you know, we're completely aligned around how we can make that faster, cheaper, better for everybody and sort of generate the ecosystem. If you think about Adobe's evolution, I mean, when we have to say where do we leverage the operating system versus where do we have to build something ourselves, whether it was the Mac and Windows or Android and iOS, I would say it's similar to what you're talking about here now; there's always the question of what is Adobe's secret sauce, as David and Anil said.
The visual models and understanding that part. I think we're also in the early days, when we partner, of how enterprises are going to buy this. They're going to try and understand what data went into it, what's the curation, what's the training, what's the ethics. That's the other reason why I think we have to be really thoughtful about who we partner with. I mean, you're going to see a whole bunch of other companies; they're open source. You know, the question is, what data went into it? You know, clearly there's a fair amount of litigation also going on. I think that's the other reason why we have to be really thoughtful about partners. Most important to us is our innovation agenda; we're going to invest in it ourselves, is the way we think about it.
From an economic standpoint, as David said, and we've talked about, we've been at this for over a decade. We've brought this magic to life in the market with the, you know, rule of 60 we've been operating under for the last six years. Very strong profitability sits underneath that revenue growth and projection. That's the journey to date. We've been very efficient, but we've got some of the world's foremost experts in these technologies, hundreds of PhDs sitting in our research teams, developing these capabilities on a prospective basis. I'll give some of the broad contours; we'll say more when we come out of beta on how we're going to engage with customers and go to market.
Just from a broad contour standpoint, we talked about, you know, the partnership with NVIDIA, co-optimization of software and hardware to deliver breakthrough capabilities from a performance standpoint. It's something the technology industry has been doing for decades. When there is that real use case out there, the underlying technology and the unit cost of that technology fall precipitously. We're in the early stages, and we already see those trend lines bending. As I look forward, there's gonna be a precipitous fall in the underlying cost of that technology. I think the most important thing is how companies can bring this technology to life in the context of workflows to be real productivity enhancers for customers. I think we're in a unique position there. We'll have more to say once we come out of beta and talk about how we engage with customers.
Thank you. Mark Murphy with JP Morgan. Shantanu, I'm glad you're physically safe after being the first line of defense when the protesters came on stage. It was a little scary watching that go down. I wanted to ask you a question on the actual generative models, the AI, the generative image models that you have. Fundamentally, how does yours differ? If we look around and we think about DALL·E 2
Mm-hmm
and Imagen
Mm-hmm
and Midjourney, and Stable Diffusion.
Mm-hmm.
Can you compare and contrast that part of it? I think we have the understanding of the workflow...
Mm-hmm
the security, the administration, the integration. Critically important, and I think we get that part. I'm just trying to understand, because I think what we've seen from Firefly thus far is we saw the tent image and then kind of a desert landscape being placed behind it. Could you compare and contrast that a little bit? I think what we're wondering is, will Firefly actually be able to... Does it create something from scratch? We're eagerly anticipating kind of the demo opening up to us. Can it kind of imagine and create and render something brand new, or is it kind of reaching a little more into stock photo repositories and kind of adjusting the lighting and the placement?
Sure. I'm happy to start. Maybe, you know, if there's sufficient interest, Jonathan, at some point, we should probably do a more detailed sort of debrief on the AI and, you know, what happens with the AI. I mean, first, the way models work is, as all of you know, you're training on data, in this particular case, images and text. You know, we have this thing called PAIT. You're training all of these models. The first question that goes into this is really a lot about what is that training set of data, and how are you gonna be able to get that training set of data? There are a lot of questions associated with that. I think we've been very transparent about the data that we've used. We've used Adobe Stock data.
A lot of the other companies are actually using data that potentially, you know, they're scraping the internet. They're, you know, accessing data that they may or may not have rights and license to. The first hard problem that we had to tackle was, you know, how do you curate this data? Again, I'll geek out a little bit, which I apologize for, but just to give you a sense for what that means. When you get this data, and you have even, let's say, you've tagged an image and saying it's a family with, you know, it's a family on a vacation. I was actually gonna say a family with dogs, and then I said vac- but it's a family on vacation.
The AI that even goes into understanding how you crop that image to train the model is unbelievably sophisticated. The first thing that our researchers have done, when you look at this vast amount of data, is figure out how you train it to make the model efficient. I guess I'm speaking to why the model is way more efficient and why there's rocket science involved in that. That's pre-processing; there's a lot of pre-processing before training where we have to look at the data. The second thing that we've done, we think is important because people are gonna say, and our creative community is gonna say, "I want do-not-track, so I don't wanna be part of that training data." We created that entire infrastructure to allow people to do that.
Once an image is generated, the way the term actually that's used in AI is hallucination. I mean, you're training this model, and then, you know, what happens is when you're trying to get an image out of it based on a text, it's hallucinating. Even there, then there's what's called post-processing that we do an incredible job of. I'll give you two examples of that. When we know that the text has to do with a face or an image, you know, we have this incredible face technology that we've had for decades in terms of recognizing faces and, you know, taking advantage of that, or multiple resolutions. You know, how do you change the resolution of those images? A lot of the science that goes into that is, you know, how much training data do you have?
How much are you sampling? How are you doing the autoregressive models, as they're called, in terms of doing that? That entire experience of going through that is absolutely critical for us because then we understand, you know, how the products are built. I mean, we can go on about what that differentiation is. First, the data set is really important. The feedback loop. I'll give you one more demo, which I don't think we've shown. Have we shown the Illustrator demo?
Yeah.
There's a demo in Illustrator, you know, I guess.
Preflight.
Huh?
We preflighted it.
It's preflighting it. You can go to one of these other things, you can say, "Generate me a picture of a bird," and it'll generate you a picture of the bird, and you download it, right? It's probably, you know, downloaded as an image or an SVG. If you wanna use that in Illustrator to create, you know, some artwork associated with that, you have to vectorize that. Trying to vectorize an SVG image, sure, it'll vectorize it. It'll give you 40,000 spots. Try and edit that. It's impossible if it was done somewhere else.
If you do it in Illustrator, and you do it with our technology, you'd be like, "I know this is a vector." The demo shows you that it creates, generates this image with our model, and then it says, "Okay, I want to touch the wing," and it knows the entire wing is one vector, and you can change it conversationally from green to blue. It's the way in which the model works with the applications that work with the data that I think is our... how am I doing so far, Dave?
You're doing great.
Yeah. ...is really the secret sauce. It's both the learning, and yes, there will be other models that exist. Certainly, you know, there's the issue of the more data that a model has, the better it is. The hard problem is doing it the way Adobe did, which is doing it in a commercially safe way. Some of you have asked, you know, why are you late? First, it's still the early innings in this entire space, and, you know, when Adobe comes in, it's going to change the game. Second, this aspect of how we've done it, I think, is really the secret sauce of what we've done.
Yeah. The only thing I would add is, you know, just very specifically, yes, the models are fully hallucinating the output. It's not remixing elements of stock. Just to be very clear.
Yes. Oh
it is a full generative model. You know, whether it's the vector example, whether it's, you know, the ability to generate an image and a mask, a perfect mask associated with it, whether it's the ability to generate, you know, gradients that can get applied so that you can sort of change
Layers in Photoshop. Yes.
layers, repaint and all that stuff. This is why it's important for us to own the model, and this is what fundamentally differentiates the model in addition to the training content. You know, this is, I think, our magic sauce, and as Shantanu says, we are gonna change the game here and take it from something that is, you know, an amazing technology demo and use for individuals into something that can be truly used as a foundational part of the workflow for enterprises.
We're really excited for all of you to play around with it, but I just wanna make sure folks get it. This is the real deal. You go in. My first one I played around with, my prompt was an ant playing the piano. Then you can choose the styles, and there's a range of styles. You all get to play around with it. I chose a shallow depth of field, amazing content. Then I chose a polar bear. Polar bears playing poker. That was cool, too. In a realistic style.
You can select the image, you can change the image, you can change the text. I mean, it's intuitive.
The aspect ratio.
Aspect ratio, exactly.
I think it has a completely differentiated user interface that gives so much control. It's really, really cool. You guys are gonna be very impressed with what you see. I've been biasing this side of the room, so just if I could get hands up on this other side of the room to make sure you're not left out. Jay, I think we'll finally give you a chance to respond to Kash Rangan, then we'll go to Mark right next to Jay.
Thank you. I feel much better now. Shantanu, much of what you announced today is the culmination or continuation of what you've spoken of successively at Adobe Summits over the last number of years, including, for example, content orchestration. You've talked about content supply chain for a number of years. The new AEM reminds me of the early days of DPS in terms of content dissemination.
Mm-hmm.
The question, though, is how do you see these things as being a catalyst for inducing something else you've talked about, which is a whole new class of application and intelligence services, not purely products per se. Related to that, also, for example, consulting services, which seem to be part and parcel of your internal investments and what you need to do to go to market in this new era. A clarification on Firefly. When it was unveiled at Summit two years ago, it was explicitly positioned as an API framework, with some very nice graphs to show exactly that. Since then, it seems to have morphed into this AI system, and the question therefore is, how did you do that? Or was this the original conception two years ago, to get to this point with Firefly?
Wow. Multiple questions in there, Jay.
One for each decade.
First, I think in terms of imagining the possibilities, I mean, what's just this unbelievable, you know, privilege at Adobe is that you sort of plant these flags, and the way the product teams envision how things are going, you know, it may take a circuitous route, but what they come up with is always significantly better than what we as leaders sometimes talk about in terms of what's happening. I think when you look at the AEM announcements, which was your sort of first question, and intelligence services, you know, that's why we try to say we focused a lot on content and the creation of the content, without marrying the content and data. We recognize the number one problem that I hear from large customers worldwide is we're creating all this content. We have no idea where the content is.
We have no idea what the efficacy of that content is. We have no idea, you know, where it's being used. That content and data, it's always been one of our holy grails of how do we conquer art and science. The other holy grail that we hear about, and it was true when we were coming up for our own launch, is this whole ability to democratize who can create a website or a mobile application or a campaign. You know, making this available both through AI, making it available through easier tools, making it available through workflow, and making it available, you know, through what we are doing by embedding it in our products, that's really always been one of the goals, which is, in the creative community, how you can get more.
In an enterprise, how can anybody, how can any product manager go ahead and create their website and create their mobile application and engage with the customers? That's gonna be the most authentic conversation. The API conversation that you're relating to, that's where I think, you know, we always said not all great ideas exist only at Adobe, and how we can make that happen for, you know, all our partners. All the agency partners in particular, and maybe you can touch on that.
Yeah.
They want access to this because they want to embed this in their own proprietary systems, and they're like, "Hey, we know we can do it. Adobe, you're the best technology company." I think unveiling exactly what's happening with APIs, I wish I could be here and tell you these are the 10 kinds of applications, but I think people will amaze us with their ingenuity. We've thought architecturally a lot about how do you embed it in our products, how do you make it accessible, how do you make it available through an API? The fact that Firefly and AEM have this integration could not be more just in time in terms of
Yeah.
you know, when this stuff is coming together, but it just speaks to architecturally how we've been doing these building blocks. I don't know if that helps, but, you know, we've always been like, how do you think about the North Star of what this could be? Allowing multiple people to do it, having it available as an API to developers, enabling an ecosystem, integrating our clouds in a way, at this confluence of content and data. I think that's what's driven us, and it's great to see how many people are resonating with it on the customer side.
Yeah. Maybe just to add on to the API question. Clearly, you know, we mentioned in the Adobe Experience Platform side, we now have over 450 partner integrations. Those are all API-driven. There it's really taken off because in the world of data and data-driven services, it's a natural thing to do, and both technology partners and system integrators have really latched onto that. I think on the content side, it was a little slower because people are used to end user tools, desktop tools, et cetera, for content. The publishing of the content was, as Shantanu was saying, was not a democratized process.
That combination of that changing, where you have a lot more of the content management being driven by API so that you can use the content anywhere, on a website, in a mobile app, et cetera, really opens up where the content can be used. The fact now that you want to democratize it to all of these users, so that the end user, the marketer, can use it directly, has driven huge demand for the API technology.
Thank you very much. Mark Moerdler, Bernstein Research. I'm not gonna try to follow Jay with 12 questions, but maybe, if you don't mind, I'll ask two. Amongst the numerous announcements today were Express for Enterprise and the integration of Firefly into Express. Can you give us an update on where you're drawing the lines between Express and Creative Cloud? Or should we just consider them long-term as blending together into a single whole? I'll get that one, and then I'll ask the follow-up, if you don't mind.
Yeah. Our vision for Adobe Express has always been about creativity for all. It's something that has been, you know, core to the company for a long time. We were very thoughtful about the name Adobe Express because what we see Adobe Express going to over time is actually creativity is the new foundation for productivity, right? There was this large separation between creative applications and productivity applications. If you look, you don't have to look very far. If you have kids in school or if you look at the next generation, they're not just writing papers in school. They're creating, you know, fully interactive experiences to talk about whatever they're looking for. You don't have to look beyond what a solopreneur or creator economy individual wants to do in terms of creating content.
What we saw today was actually that that is playing out in enterprises as well. We actually look at Express as part of Creative Cloud, but really as its own product set. The design center is really the intersection between Creative Cloud, Document Cloud, and Digital Experience. It's the way for all of those individuals, whether you're a marketer, whether you're a knowledge worker, whether you're a creative, to have a capability with speed and ease to create content. Then when you start putting it together, and they'll talk about this, when you start putting it together with the DX product set, you can really fundamentally change the content supply chain and have everyone in the organization participating in the marketing workflows.
Okay. Following on that exactly. On the Experience Cloud, a lot of the focus is not just on adding GPT capabilities. A lot of this seems to be significantly expanding the analytical capabilities driven by AI. Can you give more color on how you're thinking of that going forward? Does analytics become the true secret sauce, or is it the integration of the content into experience, or is it both of those that come together to create the differentiation of the product?
Yeah, I think it's all of the above. What we see is generative AI, just as Shantanu was saying, is the evolution of AI. We think the generative AI will become a foundational element. Just like content and data, AI will be a foundational element of the next generation of Experience Cloud, which means that anything to do with customer experience, planning, execution, analysis, all of that will get reimagined. When we think of what marketers do today and how they do it, both will change. What I mean by that is when you think of what marketers do in terms of how they think about what a customer journey should be or what audience they are looking for, what campaign they want to run, all of those today are essentially constructed from scratch.
With generative AI, again, just like you would do with content or anything else, you start with a fragment. You don't start with a blank slate anymore, to what Shantanu was saying. All of these things, you will get something to start with. How they do it is the conversational interfaces. Today you might have tools where you're clicking through things, you're assembling them, you might be using API capabilities and getting it through a program, but you can just as easily start with a conversational interface and just say, "Create a journey for me for retention of active users of this product," and that'll start the whole process. Both what people do and how they do it will go through a fundamental change.
Maybe tying these two together, Mark, finally, in terms of SKUs. Let's get down to SKUs, right? I mean, we will have an Express which is certainly available for any consumer who wants to come. As David said, top of the funnel, somebody has an idea, you know, they do a search engine, they want to, you know, remove a background from an image, they want to create a flyer, they want to create a logo. That's a massive funnel on the consumer side. On the small and medium business side, they want to run their business on it. I think for Express for Enterprise specifically, what are the use cases for what people want Express for in an enterprise, and why would that be a, you know, hopefully a site license, the way everybody has Acrobat?
It's in the use cases that both of them mentioned, which is it's in the use case of I'm trying to update the website, and I want Express to be able to do it, and I want to be able to use the generative AI capabilities to maybe come up with ideas. I want to be able to use Express in an enterprise to create an email campaign. You know, it's going to provide the agility. It's going to provide me different ways to do that. I want to use Express in the enterprise even for embedding a picture in a PowerPoint. So the Express in enterprise SKU has very, very clear workflows. I think the embedding and integration of that with AEM or campaign or conversational marketing, as Anil said, you know, are some of the more active use cases.
To your last question, you can create all of that stuff, but what if you don't know the analytics of, okay, how many people looked at that campaign, or when you created five images and you tried five, which of them really worked, which one should I use and which shouldn't I use? Hopefully that ties together what David and Anil said into how Express for Enterprise has this unique capability.
Okay. We're gonna try to squeak in three more. Right behind Mark here, and then Saket, and then Kirk to take the last word.
Thank you. Brent Bracelin with Piper Sandler. Two questions, if I could, real quick. Number one, I want to go back to gen AI and the ambitions here. Firefly sounds like a family of products. You talked a little bit about the value of internally building that model. How big of a family do you want to build? Is this gonna be a small family related to images? Is your ambition to build a really big family, maybe multimodal ambitions? Walk us through, like, the vision there on the size of the family.
Yep.
I think if you could touch on monetization again.
Okay.
If I think about the gen AI models out there, there's obviously a cost to inference and training. Love to double-click on monetization as well. Thanks.
Yeah. Think of family dinner at Thanksgiving. No, there's a very long pipeline of innovation. What I will tell you is just, you know, it is such a pleasure going into the office these days because there's so much energy and excitement. You know, Shantanu, Anil, and I were in India, visiting our offices there two weeks ago, and the amount of generative AI innovation on top of the Firefly platform was amazing to see. We come back to San Jose and San Francisco, and there are so many interesting ideas that are coming out. What I would say on this is that Firefly is, as we've talked about, a family of models.
The first two models are about image generation and text effects. We've already said you'll start to see more models, including, you know, the little sneak that Shantanu gave you around vectors, in the course of the next few weeks. We expect to continue to expand that to include video, to include 3D, to include design, and sort of aggregated content types as well. There's a long list of those things to do. The second thing is, where do those actually show up, right? As I mentioned, we can, and already do, with Firefly. As we come out of beta, we can and will start having people come to adobe.com and see a text prompt.
You type in what you want, and we'll get you started. You know, basically we've opened up the available audience to anyone that wants to create or access an image. Then we'll take that image, and then we'll give them the ability to move into Photoshop or move into Express or move into Acrobat, whatever type of image they're creating. We have the user journey capabilities that we can drive folks to. That opens up a really broad optionality from a business model perspective, which, as I mentioned, is really all about top-of-funnel expansion, which we just talked about, embedding into our existing products to drive increased retention, even though retention has continued to be very, very strong for us.
Three, the ability to look at additional SKUs or packs that we upsell people to. I think, you know, broad aspirations around this.
Think of it as: where we have applications, that's where the models will be, in the application. If you're like, "Wow, are you going to go into a space where you don't have an application yet?" That's probably unlikely, you know, in the short run.
Right.
And going back to Brent's question again on monetization a little bit.
Yeah.
That was the second question.
Yeah, monetization that, yeah.
[audio distortion]
Okay.
Thanks for holding this. Saket Kalia, Barclays. Really great to be back in person. Anil, maybe a question for you. You know, I think Adobe Experience Platform, you said, was roughly a $500 million kind of book of business, and you guys correct me there if I'm wrong. Growing rapidly, escape velocity kind of tone, right? Maybe the first question is, how much of an uplift is this giving for those customers that are using what I'll call prior-generation Adobe DX tools? How much of an uplift is AEP on top of that, right? That's the first question. Then secondly, as AEP goes from $500 million to $1 billion to $2 billion, right, as it scales, is there a material margin differential that we should keep in mind as AEP becomes a bigger part of the DX business? Does that make sense?
Yeah? Sure.
Yeah, I think first, yeah, we are approaching half a billion dollars very quickly. We'll talk about that in the next call, I'm sure. You know, in terms of the lift that customers get, this is a new set of capabilities that we're providing with Adobe Experience Platform. Previously, whether with two applications that Adobe had or some of the architecture that they had, they would have had to do the integration themselves. With the unified profile, with the real-time activation that we're building, with the trust and governance and data privacy that we have built into the platform, this is a new set of capabilities. What it's doing is customers are realizing, just like Susan talked about with Prudential, I can take...
She gave the example of: we have 10 different platforms that we have been using; we can decommission those by really adopting Adobe Experience Platform and the native app and extending that into our architecture. We see that over and over again. In terms of your second question on margins, as we are scaling, we're already starting to see the improvement in margins. I mean, Adobe had the foresight to invest in the Adobe Experience Platform five years ago, when a lot of these underlying technologies were still pretty immature. Now we're seeing a lot more maturity in the underlying technologies, like the databases, for example, to hold profiles and so on. We expect to see continuing improvement in the gross margin.
If there's a question at the back of people's minds also on margin, not specifically yours, Saket, it's, "Hey, all this generative AI, how much has Adobe invested?" I want you folks to know we've been investing in this; you haven't necessarily seen that. I have to give Anil and David and Dan kudos for, you know, while we're doing this, we're always prioritizing where else we get efficiency. You know, we've generated this model and we continue to deliver great margins. We will continue to be focused on that. If that's also a question at the back of people's minds because they see what's happening elsewhere, that's a reason to partner more aggressively with NVIDIA as well and say, "Hey, let's improve the training.
Let's improve the inference." I wanted to cover that margin on COGS, on generative AI as well, in case that was on, you know, people's minds.
Great. We'll take a last question and then any closing thoughts after that. We have gone a little bit long, so we've maybe eaten a little bit into the time for meet and greet, but it's good discussion, so I'm making them stay up on the stage here.
All right. Well, thanks for sneaking me in. Kirk Materne with Evercore. A question on the DX side of the business. You know, for a long time in digital marketing, it's been a bit of a point-vendor market. You've had a lot of best of breed out there. You know, whether it's the confluence of your portfolio coming together or maybe the economic backdrop, on the last call you did call out that you're starting to take share. Do you see this as a really sustainable trend right now? Have we finally hit the point of buyers wanting to have more of a portfolio or a platform approach to this technology? Any color around that would be great.
Absolutely. I can start. Shantanu, please. Yeah, historically, you're right. The chief marketing officer maybe was not at the top of the priority list for the IT organization and so on. They went in and they took whatever point technology was available for specific problems, so they could get going and do something with it. To your question earlier, that has completely changed. I mean, this investment in customer experience, in transforming the customer experience, is a C-level priority, in the top two or three priorities for the company, and therefore it is getting a lot of attention. As it gets a lot of attention, to Dan's earlier comment, people want to know, "Hey, this is a 10-year investment that I'm making.
I want to really work with a company that is my partner for the next 10 years," and that is clearly benefiting us. The underlying technology has also grown more complex. When you think of the kinds of announcements we made now, it's data intensive, compute intensive, needs to integrate really well with content, has to be democratized, has to take advantage of these new technologies like generative AI. That's hard for single-product companies to do. I think the technology landscape as well lends itself to people like us continuing to take share.
Closing thoughts?
Well, again, thank you for coming. Again, I think the depth of questions... I mean, we keep hearing about, hey, innovation on creative. We didn't cover Document Cloud. I know that was not one, but as, you know, Dan alluded to, I mean, it was great to see the 22% increase when you think about constant currency ARR, what we've done with Acrobat on the web. I didn't want, you know, that other powerhouse in the portfolio not to, you know, at least get its fair share of exposure here. We're excited. I mean, we're excited about, as soon as we finish this, going and talking to customers and anticipating what's on their mind. I think to your question as well earlier, I mean, it's events like this that are compelling events.
There's no question because, you know, they hear from their peers, they hear from others, and, you know, it allows companies like Adobe to not just have a forcing function or a catalyst for the innovation delivery, which drives our product people, but then to engage with customers. We appreciate your taking the time to be here with us today.
Yeah.
With that, I'll give it to you.
Great. Thanks, Shantanu. Thanks, everyone, for joining. Operator, this concludes the event; we'll end the webcast now. Thank you.
Okay.