I think I'll start with the same awkward silence and pause as we did this morning. I'm kidding. Good morning, good afternoon, and good evening. A warm welcome and a big thank you to everyone joining us for this year's TechEd. Along with our online audience, it's great to see so many of you here in person today. This, as we talked about in the morning, is my fourth TechEd. It's the first one I'm attending in person, and I'm already amazed by the incredible energy and the community spirit that I feel in the room and on the show floor. A big thank you. Also, please join me in giving a big thank you to the teams that move mountains to make this event possible with content that's fit for the times we're in. Let's give them a big round of applause, please.
Now, given this opening keynote is the closing session for our first day here, I hope you've already made good use of your time today to explore the show floor or our event platform if you're joining us virtually. I would love to know, though, with a quick show of hands, how many of you visited at least two demo booths today? You guys have a lot of catch-up to do tomorrow in the back there. How many of you have been to the Joule Lounge? You guys have to go see the robots and the little dog that does the backflip. If you haven't done that, please make sure you put that on your schedule for tomorrow. I hope at least all of you have explored the community clubhouse, which is very engaging and very active.
What makes TechEd unique is that, as we discussed, it's an event created by our community for our community: our community of developers, users, customers, partners, and SAP employees. This is a big event for us. At last count, we have not 2,500 but over 2,800 customers and partners in attendance here in Berlin, and nearly 20,000 joining us virtually. With TechEd on Tour, we're bringing this in-person experience to cities around the world, from Bangalore to Sydney, all the way to Louisville, Kentucky, in the United States. Speaking of Louisville, we actually have ASUG Tech Connect happening right now for our North American community in the U.S. Despite conventional wisdom and common sense, we decided to kick off both events together, connected wirelessly at the same time. Let's see if that works.
Geoff, can you hear us?
We are perfectly in sync and on time this morning from Louisville, Kentucky. How are you doing?
Excellent. I'm doing well. Let's give our ASUG community in Louisville a round of applause, and our event team as well for making sure this worked perfectly, even though we had a recording and some jokes ready to go if it didn't.
Yeah, absolutely. I mean, when we talked about this a year ago, Muhammad, we didn't think this was possible. This is the amazing technology that we have. We brought together the North American community for a third year. This year is one of our most important Tech Connects ever as technology continues to accelerate, which you are so well aware of. We are here at a time where there's a broad range of business challenges. AI continues to innovate faster and faster. We live in a world where there's more global uncertainty than ever. We also need to focus on professional transformation as a precursor to business transformation, learning, connecting, and growing so that we can get the most from our SAP products. It's more important than ever, which you well know.
Muhammad, with TechEd wrapping up its first day, I'm curious, what are you hearing?
What we're hearing, Geoff, is that people are tired of keynotes and keynotes that hype up AI. I see some smiles. We also hear.
I hope this audience agrees with you, Muhammad.
We also hear that people are tired of keynotes claiming we're past the hype of AI as well. The reality is, AI is indeed every part of the conversation. There's talk about AI changing roles and, of course, the associated anxiety around it. There's also a lot of optimism and excitement about the potential of AI. The question is, how can we put AI to work to make the lives of developers easier? That's what we're focusing on here at TechEd. Does that resonate with you and your audience, Geoff?
Muhammad, 100%. It 100% resonates. As a matter of fact, Walter Sun, Yaad Oren, and Stefan Steinle joined us for our keynote here this morning. Their message was absolutely clear. We, as a community, need to really focus on skill development and learning in order to ensure that all of this future technology gets deployed smoothly and delivers business value as promised. We learned that new technologies are emerging that will impact business over the next two-plus years in areas like quantum, robotics, UX, all of those things. Our community tells us that they need to get more familiar with what's coming next and how to stay ahead of the game. We also hear loud and clear that SAP customers are spending too much time maintaining existing legacy systems.
They want to know, how do they free up time to harness all this great innovation that you're going to speak about so that we ultimately can innovate faster? Muhammad, overall, lots going on. I know that you have some great news for everyone here today and to help them free up time and innovate. I'm going to hand it off to you and have you kind of talk through all the amazing innovation you've been thinking about along with your team.
Thank you, Geoff. By the way, great background. It looks very sustainable and green in the back there. Anyhow, Geoff, thank you again for being on, though. I know it's not a goodbye from us just yet, since I know you and the ASUG community will stay with us for a little bit longer to hear more about the announcements we're making in this keynote. Now, the question of how AI will impact development and developers is not a theoretical question anymore. Every week, we hear two conflicting opinions. One that says that there's a developer shortage and the other that says AI will replace developers altogether. Now, both of these can't be right. The truth is, developers aren't going away. They are getting supercharged. They're becoming the architects of smart, connected businesses. The real question isn't if we need developers.
It's how fast can we empower them to thrive and lead in this AI-native era? It comes down to two simple things. First, every enterprise is becoming a data company. Second, every user experience from the front line to the boardroom is becoming AI-driven. These shifts lay the foundation for what comes next, the era of agentic AI, where AI moves from just being a tool to becoming your trusted teammate. As developers, you don't just code anymore. You also design intelligent workflows and supervise AI agents to shape real business outcomes. We know teams are already stretched thin, and backlogs are never-ending. We're not asking you to do more with less. We're giving you smarter tools so you can do more, more quickly. Our strategy at SAP is very simple.
Best-in-class applications across the breadth of the business processes—finance, spend, supply chain, HCM, customer experience—that then produces a harmonized, managed, and governed data layer, which in turn powers world-class AI. This seamless app, data, and AI flywheel is what we believe creates unique and exponential value for organizations, all on SAP Business Technology Platform, which also acts as your AI-native foundation to build, extend, and manage your entire enterprise landscape. We are enhancing BTP to be your first and preferred AI teammate to help you turn ideas into innovation and ultimately value. While BTP is the engine, our SAP Business Transformation Suite is the navigator. With SAP Signavio, WalkMe, and SAP LeanIX, we ensure technology translates into real business impact. Over the next hour, Michael Ameling and Philipp Herzig will present our newest innovations and show how they can help you work smarter and achieve more.
Without stealing too much of their thunder, I also wanted to outline a few of the broader themes you will hear today. First, we're continuing to make SAP more open, open to the developer tools your teams already use and love. With our new local SAP Build servers, you will be able to use any agentic technology of your choice, whether that's Cline, Windsurf, Cursor, Claude Code, OpenAI Codex, or any others. To make this even easier, we're also releasing a VS Code extension for SAP Build. I was hoping for that reaction. Thank you for not letting me prompt you for it. I know Michael's going to be very proud. We're also taking the same spirit of openness to the data front as well. Last month at SAP Connect, we launched SAP Business Data Cloud Connect for Databricks and Google Cloud.
Today, we're excited to add Snowflake to the list as well. BDC Connect enables secure zero-copy sharing, so you can bring your SAP data products directly to your existing data platforms. Not just that. I'm also excited to announce SAP Snowflake, bringing the full data and AI capabilities of Snowflake as a solution extension to SAP Business Data Cloud natively. You guys can clap on this one, too. We're really proud. This partnership is exactly what our joint customers have been asking for since we launched SAP Business Data Cloud. Second, we are building the most context-rich agentic platform.
Based on the foundation that Business Data Cloud provides, we're working on shipping the most complete set of out-of-the-box agents across the breadth of our applications: agents that are not only ready to use but also ready to extend, so you can adapt them to your unique needs without losing the out-of-the-box value, because we know no two customers are exactly the same. On top of that, with Agent Builder and Joule Studio as part of SAP Build, you can build custom Joule agents with deep process and data context using low-code and pro-code tools. While we would love for you to exclusively use Joule for your agentic needs, we know that sometimes you will need to operate in a multi-agent environment. For that, we're also standardizing on AI agent interoperability with the agent-to-agent protocol, allowing your agents to collaborate securely across the ecosystem.
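To make the interoperability idea concrete, here is a minimal Python sketch of what an agent-to-agent request envelope can look like. The JSON-RPC framing matches the general shape of the A2A protocol, but the method name and payload fields used here are simplified assumptions for illustration, not the normative spec.

```python
import json
import uuid

def build_a2a_request(text: str) -> dict:
    """Build a simplified agent-to-agent (A2A) request envelope.

    NOTE: illustrative only. The envelope follows JSON-RPC 2.0, but the
    "message/send" method and the payload field names are assumptions,
    not the exact A2A specification.
    """
    return {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),           # unique request id
        "method": "message/send",          # assumed method name
        "params": {
            "message": {
                "role": "user",
                "parts": [{"type": "text", "text": text}],
            }
        },
    }

request = build_a2a_request("Forecast Q4 pipeline")
print(json.dumps(request, indent=2))
```

In a real multi-agent landscape, an envelope like this would be POSTed to another agent's endpoint; the point of a shared protocol is that any compliant agent can parse and answer it.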
Finally, we're really excited to announce the first foundation model built specifically for structured business data: SAP-RPT-1, or Rapid-1 for short. Our relational pre-trained transformer delivers enterprise-grade accuracy and scale, outperforming both large language models and AutoML for tabular data, which is critical for building high-value agentic cases on business applications. This allows you to solve classical prediction problems like sales order completion, as an example. With Rapid-1, you can start a line item on a sales order, and the foundation model can then auto-complete it for you. This one's worth applause as well. Combining this with large language models now enables a set of use cases that have remained elusive from a business application perspective. Now, to summarize it all, we're embracing an open ecosystem, supercharging agents with deep process and data context, and giving you tools to amplify the power of AI and agents.
Remember, as developers, you're not on the sidelines of this AI revolution. You are the revolution. We are here to supercharge you for what's next. On that note, I hope you have an amazing few days here at TechEd and to our friends in Louisville. Thank you for tuning in. I wish you a great ASUG Tech Connect as well. Now, to show you some of these announcements in action, as well as to share an even broader list of innovations, please join me in giving a very warm welcome to our President of SAP Business Technology Platform, Michael Ameling. Michael.
Thank you, Muhammad. Hi, everyone. Amazing to be here at my favorite event. Let's start with data. For decades, data has been key to driving innovation. With AI, data is more important than ever. Data that is well-organized, data that is semantically harmonized and rich in context, data that turns into intelligence, and data that turns into impact. The reality is, data is often siloed and lacks required context. As a result, AI cannot live up to its full potential and deliver the outcomes businesses need. There is a solution. With SAP HANA Cloud and SAP Business Data Cloud, we provide the best foundation to address these challenges. Let's start with SAP HANA Cloud. Today, it powers tens of thousands of enterprises worldwide, forming the backbone of the world economy.
It all started in 2009 with a powerful idea: bringing transactional and analytical workloads together in one system based on columnar in-memory technology. This made real-time analytics possible on operational data and massively simplified the SAP data model. We did not stop there. Next came the spatial engine, the property graph engine, the document store, and the vector engine. Last year, we introduced the SAP Knowledge Graph engine. Without SAP HANA Cloud, you would have one database for each and every data representation, which leads to disaggregated data silos that limit your AI potential. With HANA Cloud, we have all these powerful engines in one integrated multi-model database. In short, SAP HANA Cloud is the database AI was looking for. It has the best performance.
For example, the Knowledge Graph engine is now three times faster than the previous generation and ahead of every competing engine out there in the market. This enables use cases which would not have been possible before. A customer, Martinez & Valdivieso, is an agriculture supply company. Their field-based salespeople struggled to get the information they needed to set prices. To address this, the company developed an AI-based solution using the SAP HANA Cloud vector engine. The outcome? 35% automation of sales order entries and a 40% reduction in order typing times. Now, let's look at what's coming next for SAP HANA Cloud. We are using agents to make working with SAP HANA Cloud easier than ever before. Today, using HANA's advanced features requires deep expertise. I think you all agree. Agents change that. Take, for example, the discovery agent.
It helps you instantly find the right data objects in your data landscape. What used to take expert interviews and digging now happens in seconds. Next, we are expanding the Knowledge Graph capabilities in SAP HANA Cloud. You will be able to simply let HANA Cloud scan all your metadata and build a Knowledge Graph in real time, creating a semantic understanding of relational structures across thousands of objects. This is not a one-time conversion. It stays in sync as your data model evolves. We will also allow AI models and Joule agents to use database objects like tables, views, graphs, spatial functions, and more as tools, with model context protocol support for SAP HANA Cloud. This enables agents to seamlessly interact with SAP HANA's advanced capabilities in an enterprise-ready fashion. AI agents need more than that.
They need to remember past conversations and decisions as context. Therefore, we are enabling agentic memory in HANA Cloud. With long-term memory, AI agents can persist context across long-running sessions. This enables agents to become continuously smarter through learning and remembering. With that, SAP HANA Cloud is the database for SAP's AI-native software architecture. It is also the heart of the second key element in our data fabric. You might recall, earlier this year, we announced SAP Business Data Cloud. It offers the most powerful foundation for connecting all your data with full business context. It also enables building next-generation applications and reliable AI deployment. The response to BDC has been extremely positive. In that short time, customers are already seeing more value from their AI projects, building, of course, on a trusted data foundation and a unified semantic layer. A key component of Business Data Cloud is intelligent applications.
Intelligent applications are adaptive and AI-powered, and they learn from your data. They automate and orchestrate work across both analytical and transactional workflows, enabling customers to make decisions and take action within BDC. SAP delivers intelligent applications across multiple LOBs, from finance to procurement and from supply chain to HR. Under the hood, intelligent applications are powered by data products. We have already delivered hundreds of data products across every line of business. Today, we are excited to announce broader coverage and availability of SAP data products from our business applications. This includes S/4HANA, SuccessFactors, sustainability, and customer experience, and of course, much more. On top of that, bidirectional sharing between Business Data Cloud and SAP HANA Cloud lets you leverage your data across transactional and analytical workflows.
SAP BDC creates a business data fabric, making it easier to discover, share, govern, and model your data. It includes SAP Databricks as a first-party data service, bringing the power of Databricks directly into Business Data Cloud. It also gives SAP Business Warehouse on-premise customers a flexible path to the cloud. It all comes together on a fully managed and unified architecture. Our commitment is to give customers full flexibility and choice. That is why we are excited to announce a new partnership with Snowflake that brings Snowflake's fully managed data and AI capabilities to BDC. As a reminder, last month we already launched SAP Business Data Cloud Connect with Databricks and Google Cloud. Now, we are adding Snowflake.
This enables you to integrate SAP and non-SAP data products seamlessly between BDC and Snowflake so that you can deploy intelligent applications faster and share across your preferred data marketplace. Let's hear more from Christian Kleinerman, Snowflake's Chief Product Officer.
Hello, tech nerds. For decades, SAP has powered some of the world's most mission-critical business processes, generating data that is vital to every organization. The opportunity now is to unlock the data's full potential by combining it with all your other enterprise data. That's why a new strategic partnership with SAP is a true game changer. This partnership will bring you the full power of the Snowflake AI Data Cloud as an SAP solution extension for SAP Business Data Cloud. By coming together, you gain the flexibility to choose the right compute and storage for every data and AI workload while maintaining governance, interoperability, and semantics. With SAP Business Data Cloud and SAP Snowflake, you can unify all your enterprise data with zero copy, preserving its rich business context and creating a business data fabric to power intelligent apps and AI at scale.
I'm also pleased to share that your existing Snowflake environments will also interoperate seamlessly with SAP Business Data Cloud via zero-copy. What's really exciting is putting SAP Joule and Snowflake Cortex to work on a harmonized data foundation. Together, SAP and Snowflake are shaping the future of how enterprises use data and AI. Thank you and enjoy the conference.
Thank you, Christian. Now, for data teams who want to go further, we are introducing Data Product Studio. With this, you can visually model new data products. You can blend SAP and non-SAP data into reusable and shareable data products. You can version, validate, and govern those products through a shared domain-based catalog. Ultimately, this gives data teams the control to manage lifecycle changes, blend LOB data products, and even accelerate business warehouse modernization. That is our data foundation with HANA Cloud and Business Data Cloud. Our new Data Product Studio makes modeling SAP and non-SAP data easier than ever before. New agentic capabilities make HANA Cloud even more relevant for AI. With our strong partnerships, we deliver the strongest data foundation in the industry. Now, let's move up the stack and see how AI benefits from this strong foundation.
For this, welcome SAP CTO, Philipp Herzig.
Good afternoon, everyone. Thank you, Michael. As he already mentioned, AI, as you know, is nothing without well-organized data. On top of the data foundation sits, of course, AI Foundation, which allows you not only to use the latest frontier AI technologies out there in the market, but also to extend SAP's out-of-the-box AI capabilities and, of course, build your own experiences deeply contextualized in your business processes and data. Now, AI Foundation is home to many AI models that we all love and use every day. Of course, we're continuously expanding them and introducing the latest models, such as GPT-5 Pro or Claude 4.5 Haiku.
We are super thrilled that now Perplexity is also generally available to all of you in AI Foundation, so you can correlate all your business data with external data from the public internet with ease. Even with all these powerful models in AI Foundation already today, businesses are truly missing something. What we really want to do in business is to make predictions to drive better decisions and to drive better outcomes. Predictions like whether there are delays in delivery or payments, or you want to identify upsell opportunities, or you want to see what the chances are that maybe a customer might churn. The problem for such use cases is large language models are not really made for this. Most of our business data lives in tables, not in text. All these data products in BDC are no different.
In order to enable such use cases, we have basically had to go back, time and again, to classical, good old machine learning. You can use something like XGBoost and many other tools to train what we call narrow AI models. But they're specifically trained for each task. Therefore, you really have to train a hell of a lot of models. If your boss comes to you and asks you to solve 10 predictive tasks in your company across 10 different entities, like company codes, for example, you would easily have to train 100 models. An incredible amount of work. You know your boss, he will ask for return on investment before you have even trained two of them.
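The combinatorics Philipp describes can be sketched in a few lines: one narrow model per (task, entity) pair. The task and entity names, and the stub training function, are made up for illustration; in practice each stub would be a full XGBoost-style training run.

```python
from itertools import product

# Sketch of the "narrow model" explosion: one classical model trained
# per (predictive task, entity) pair. Names are made up for illustration.
tasks = [f"task_{i}" for i in range(10)]            # e.g. churn, delivery delay, ...
entities = [f"company_code_{i}" for i in range(10)]  # e.g. 10 company codes

def train_narrow_model(task: str, entity: str) -> str:
    # Stand-in for an XGBoost (or similar) training run scoped to one
    # task on one entity's data slice.
    return f"model[{task}|{entity}]"

# 10 tasks x 10 entities => 100 separate models to train, deploy, monitor,
# and retrain whenever the data drifts.
models = {te: train_narrow_model(*te) for te in product(tasks, entities)}
print(len(models))
```

A single pre-trained relational model collapses this grid: the same model serves every (task, entity) cell, given only a small amount of context data at inference time.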
What we really want to do is get rid of all these hundred models and introduce just one giant model, a single model that only requires a small amount of data to learn from. A model that's relational, so it understands business data, that is pre-trained on tens of thousands of GPU hours, so you don't have to do this. Of course, it's transformer-based. Relational, pre-trained transformer: we're super excited to announce SAP Rapid-1. SAP Rapid-1 represents a new category of foundation models. Language models, as you know, predict the next token. Image models predict the next pixel. Video models are phenomenal at predicting the next frame. Relational models, such as Rapid-1, predict your business future. We believe Rapid-1 is the most capable predictive foundation model out there today. Let's look at the stats.
Compared to individually trained narrow models, Rapid-1 gives you up to two times better prediction quality. Compared to state-of-the-art language models, Rapid-1 increases your prediction quality by up to 3.5 times. At the same time, it's ultra-efficient. It's 50 times faster and requires 100,000 times (this is not a typo) fewer GPU FLOPs. It's about 50,000 times more energy efficient than large language models on comparable tasks, measured on an NVIDIA H100. A prediction with Rapid-1 is like taking a small step, while a prediction with a large language model is like running a marathon and not reaching the goal. What's more, Rapid-1 has been built so that your effort to build phenomenal business predictions drops from weeks down to days. Our goal with future model generations is to get it down to minutes.
Rapid-1 is only possible because of years of hard research work, with these two papers as the key published papers, some of which we have worked on together with Stanford University. SAP Rapid-1 comes in three amazing model versions. Rapid-1 Small is optimized for super-fast predictions. Rapid-1 Large is optimized for the highest accuracy but requires a bit more compute. I am personally so excited that we are also releasing an open-source version of this, SAP Rapid-1 OSS, for everyone to use and learn from. The catch with the open-source version is that you need a GPU. You go to Hugging Face, you download it, and boom, you put it on a GPU. If you do not have a GPU, because GPUs are still sometimes scarce, you can go to our playground. We put an amazing playground out there on the internet at rpt.cloud.sap.
We put a bunch of data samples out there. You can upload your own CSV files. There is even API-based access. You get a token and just start with it. You can try it out right now on your phone, do your first predictions. Just use it. It is amazing. Now, this is a big leap forward in how businesses can combine semantically rich data from BDC with frontier AI predictions from SAP for decision intelligence like never before. Now, let us look at how all of the pieces we just discussed actually come together. Please give it up for Sabrina.
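As a sketch of the API-based access mentioned above, here is how one might prepare a tabular prediction request in Python. The payload fields (`data_csv`, `target`) and the overall request shape are assumptions for illustration only; the playground at rpt.cloud.sap documents the actual contract and token handling.

```python
import csv
import io
import json

def build_prediction_payload(rows: list, target_column: str) -> dict:
    """Prepare a tabular prediction request for an RPT-style API.

    NOTE: the field names and payload shape here are assumptions for
    illustration -- check the playground's API documentation for the
    real contract before sending anything.
    """
    header = list(rows[0].keys())
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=header)
    writer.writeheader()
    writer.writerows(rows)
    return {
        "data_csv": buf.getvalue(),  # the table to run predictions on
        "target": target_column,     # column whose missing values to fill in
    }

rows = [
    {"customer": "Acme", "amount": "1200", "paid_on_time": "yes"},
    {"customer": "Beta", "amount": "8700", "paid_on_time": ""},  # cell to predict
]
payload = build_prediction_payload(rows, "paid_on_time")
print(json.dumps(payload)[:80])
```

The payload would then be POSTed with the bearer token from the playground; the empty target cells come back filled with the model's predictions.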
All right. Thanks, Philipp. Here I am, a sales manager in SAP Business Suite. I have concerns about the sales pipeline coming up towards the end of the year. Let me open Joule, SAP's AI copilot, and ask, how's my sales pipeline for 2025? Joule returns a link to the Revenue Pipeline intelligent application. As I open the link, I can immediately see a large number of open sales inquiries coming up towards the end of the year. Now I need help to quickly decide and triage which inquiries my sales team should focus on. To prioritize these sales inquiries, I'm going to need support from my data analyst and database developer. Let me open a task in Joule for the data analyst to create a prediction so we can see which sales inquiries are most likely to close.
My data analyst now receives this task in the SAP Business Data Cloud Cockpit to create a sales inquiry forecast. Now I'm acting as the data analyst, and I need to connect data from multiple sources to build a semantically rich data fabric that's ready to be used with the latest AI technology. I start in the Business Data Cloud Cockpit and browse all the SAP-managed data products coming from multiple applications across the SAP ecosystem. I can now see the S/4HANA sales data package, open it, and browse all the data products available. I'm really happy to see that the package has already been installed and the data products are ready to use. Now I'm going to launch SAP BW/4HANA and open the BW data product generator.
I'm browsing through the available InfoProviders and can see historical sales data on invoice payments and sales returns, which I can now convert into a data product. I know we have both SAP and non-SAP data across our organization that I will need for this task. I must ensure proper semantics are preserved, given how easily sales data can have critical fields and dimensions that are similarly named. I also need help locating the most appropriate data products to support my task. I open Joule and ask for recommendations of data products and data sets that can support a sales conversion forecast. Joule now recommends a sentiment analysis data product created in SAP Databricks, a service history data product from Snowflake, and a historical sales data product from BW/4HANA.
I'm really happy with the rich data products recommended and ask Joule to help create a new derived data product from these. Thankfully, Joule directly provides me with a link to the Data Product Studio. As a next step, I move to the Data Product Studio and see data products from S/4HANA, BW, Snowflake, and ECC. I want to bring them all together into a single data product that's governed and ready for consumption. Let me highlight the relationships within the data and create the necessary joins to create an AI-ready derived data product. Once all of this is done, I need help from a database developer to use the semantically rich data product and create a prediction on which open sales inquiries have the highest likelihood of closing.
I now share the data product with SAP HANA Cloud using zero-copy and can begin my next task. Once the data product is shared, we can install it into SAP HANA Cloud via virtual integration through the provided Delta shares. Now that the data product is installed, we can analyze the data in HANA Cloud. Having a look at the data, we can see various fields that may have an impact on the closure of sales inquiries. Via the powerful integration between SAP HANA Cloud and SAP's AI Foundation, we can benefit from the power of SAP Rapid-1 to forecast the likelihood of sales inquiries to close. For Thomas, the database developer, there is really no need to leave his domain and tool of choice, as he can now easily create a forecasting scenario.
To do so, he uses Joule again to generate a table function and adjusts it for his purpose. The generated table function makes use of the built-in database procedure. To pass the sales values for open inquiries, together with historical data, on to SAP Rapid-1, we're using the function Foundation Model Predict. Note that we do not need to select the relevant parameters for inferring the likelihood to close. SAP Rapid-1 is able to detect the relevant signals and derive the predictions. Also, there is really no need to run any pre-training with SAP Rapid-1. The foundation model is a versatile tool, helping us solve a large variety of business problems based on its breadth of domain knowledge. All right, here we go. We can now successfully predict the likelihood of sales inquiries to close based on the existing circumstances and conditions.
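As a rough sketch of the pattern from the demo, the generated artifact might resemble the SQLScript below, composed here as a Python string. The procedure name `FOUNDATION_MODEL_PREDICT`, the parameter style, and all table and column names are assumptions based on the stage narration, not verified SAP HANA Cloud syntax.

```python
# Sketch of the demo's pattern: a table function that hands open sales
# inquiries plus historical data to a built-in prediction procedure.
# All identifiers below are illustrative assumptions, not verified syntax.
def closure_forecast_sql(history_table: str, open_table: str) -> str:
    return f"""
CREATE FUNCTION predict_inquiry_closure()
RETURNS TABLE (inquiry_id NVARCHAR(10), close_likelihood DOUBLE) AS
BEGIN
  RETURN SELECT inquiry_id, close_likelihood
         FROM FOUNDATION_MODEL_PREDICT(
           TRAINING => (SELECT * FROM {history_table}),   -- closed inquiries
           SCORING  => (SELECT * FROM {open_table}));     -- inquiries to score
END;
""".strip()

print(closure_forecast_sql("SALES_HISTORY", "OPEN_INQUIRIES"))
```

The key idea carries over regardless of exact syntax: the developer stays in SQL, and the foundation model infers the relevant signals from the historical rows without feature selection or per-task training.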
How do we now bring this back to the end user? Of course, we want to make it accessible via Joule. Thanks to our custom database objects, the Knowledge Graph, and the SAP HANA Cloud agents, the prediction function is immediately accessible to Joule and other agents. Let's give it a try. Via Joule and SAP HANA Cloud Central, we can directly access the prediction function through natural language. Now, returning to my role as a sales manager, I'm back in the dashboard and receive a notification that a sales inquiry conversion forecast is now available. As I open Joule, I can immediately see the top five sales inquiries likely to close. Now I know exactly where my sales team should focus.
From data integration to AI-powered predictions, SAP Business Data Cloud, SAP HANA Cloud, and Joule enable you to move faster, make smarter decisions, and lead your business into the next era of intelligent decision-making. With that, back to you, Philipp.
Thank you so much, Sabrina. Give it up. Great job. It is so beautiful, right, how all these things are coming together, this flywheel of data and AI and even more data. Now, with all of these great model updates in AI Foundation, we still have a big challenge because it becomes increasingly painful to benefit from the advances in AI that are really happening quickly. You want to switch these models, but it is very hard. Now, why do you want to switch a model in the first place? Because models, they get cheaper, they get better, of course. Sometimes, as your business grows, they are not available in the respective region. What if your AI scenario may want to use multiple models at the same time? Switching is not easy because your prompts, your agents, whatever, are completely married, completely coupled to the underlying models.
Your carefully tuned prompts that work on one model and give you phenomenal outcomes do not work on another model. So we would rather avoid the whole exercise, leaving a lot of money and better outcomes on the table. Now, throughout 2025, we have really turned AI Foundation into an all-you-need toolset for what we call benchmark engineering. This contains, first, a new eval service to test and benchmark all your AI use cases across models. Prompt Optimizer picks up these evals and either optimizes your existing prompts or automatically, with AI, converts them across various models. Prompt Registry allows you to maintain multiple prompts for the same use case in your application. Model Fallback selects the best prompt considering cost and quality, regional availability, and more.
Feedback Service actually closes that loop: once you deploy your use cases into production, you collect end-user feedback and with that create even better evals. As part of this loop, we are super excited that Prompt Optimizer, in collaboration with Not Diamond, is now generally available. Yeah, that's a great announcement. You can get prompt optimizers from all sorts of vendors, but Prompt Optimizer from SAP is very special because it introduces this one magic button. It takes all your prompts, whichever model you preferred in the past, whether it's the Gemini models, the OpenAI models, or the Anthropic models, it doesn't matter. You click that button and you automatically get optimized prompts for any other model that's out there, which you can then run and select at runtime as your use case evolves.
We are very grateful for some of the early adopters that we worked with: Henkel, Wipro, and Deloitte. We have seen improvements of up to 30 percentage points on the same prompts, on the same model, through Prompt Optimizer. Creating evals has meanwhile emerged as the highest-value activity in AI, one that turns today's vibe-checking practice into a real engineering practice. AI Foundation is the place where you can do this. Finally, we're also super excited that AI Foundation is now generally available on SAP Cloud Infrastructure. With that, we are introducing AI for Europe, the very first of its kind. Today, AI for Europe serves your frontier AI from Mistral, Cohere, SAP, and more, end-to-end out of Europe through SAP data centers.
AI for Europe is not limited to SAP. If you followed the news earlier today, we are partnering with the best. We are also super excited that strong infrastructure champions like Deutsche Telekom are joining these efforts early next year. We will expand this across many data centers, but what will stay the same is BTP and AI Foundation across all of them. This way, we ensure AI happens entirely on European terms, with the highest standards of sovereignty, local data residency, data privacy, and, of course, regulatory compliance. It is not just about where your AI operates, but also how we manage it responsibly. We are excited that the entire platform, including Joule, is now ISO 42001 certified. This is AI Foundation: everything you need in one deeply integrated business technology platform.
Thanks to our constant infrastructure optimizations, we are lowering all of our prices by up to 20%, and we're introducing a 30-day free trial for everyone to get started now. With our data and AI foundation, we have seen that we are creating insights like never before. Now it's time to turn these insights into business action. For this, we need something that automatically reasons over all this data and then acts to change the course of business accordingly. Joule agents are the way to do this. As you know, we have already shipped 20 agents across lines of business, with 40 by the end of this year. Take our accounts receivable agent: instead of you manually digging through data on overdue receivables, the agent automatically analyzes all open items, identifies risks, and prioritizes what to focus on.
The impact: four times faster reconciliations, a time span from 20 minutes down to five, and, of course, lower days sales outstanding. Joule agents, like us humans, can leverage all of the more than 2,100 pre-delivered Joule skills, in addition to our more than 300 embedded AI scenarios across product lines. If you deployed all of these capabilities across SAP's portfolio, that's up to EUR 440 million in top-line or bottom-line value delivered to a company every year. Many customers are already benefiting from this offering. For example, Pandora, who is now interacting with Joule across lines of business in a conversational way, enhancing their productivity across the enterprise. Or Nestlé, who already gets phenomenal value out of leveraging our AI agent ecosystem. Now, with more and more of these agents and skills coming in every day, every month, we need a way to manage this.
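What the accounts receivable agent does with open items can be pictured as a simple risk-based prioritization. A toy sketch (the risk formula, amounts, and dates are invented; the real agent's analysis is far richer):

```python
# Toy illustration of prioritizing overdue receivables: score each open item
# by how overdue it is and how much money is at stake. The scoring rule and
# the data are invented; they only illustrate the idea of the agent's triage.

from datetime import date

def prioritize_open_items(items, today):
    """Rank open items so the riskiest (most overdue money) come first."""
    def risk(item):
        overdue_days = max(0, (today - item["due"]).days)
        return overdue_days * item["amount"]
    return sorted(items, key=risk, reverse=True)

# Fictitious open items:
items = [
    {"invoice": "INV-1", "amount": 10_000, "due": date(2025, 10, 1)},
    {"invoice": "INV-2", "amount": 50_000, "due": date(2025, 11, 1)},
    {"invoice": "INV-3", "amount":  2_000, "due": date(2025,  6, 1)},
]

ranked = prioritize_open_items(items, today=date(2025, 11, 11))
```

Note how the large, recently overdue invoice outranks the small one that has been overdue for months: both recency and exposure matter.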
That's where AI assistants come in: role-based AI teammates available right through Joule, like a financial assistant that brings together all the financial agents, for cash collection, treasury, and more. We provide AI assistants for every role to give you an agentic experience unlike anyone else's. Our ready-to-use agents are great, but of course we know that every company has unique requirements. Every company wants a version optimized for the specifics of its business processes. The big question is how you tap into a larger ecosystem of agents, because it's not only SAP that builds these agents. How do you bring this together? What we really need is to extend, build, and share all the agents in the enterprise. Let's start with extend.
Extensibility means you can take any of the pre-built agents that SAP ships to you and add custom fields, custom tools, and custom reasoning logic, while retaining all the deeply grounded integration capabilities that SAP provides. To show it to you in action, here's Karishma.
Let's pick up the top five sales inquiries to be closed from the previous demo. I can see here that Altenova put in a large sales inquiry that I want to follow up on. Looking at the details, this could be an interesting business opportunity for us. However, our customer is based in a country with a lot of currency swings. Processing it could be a risk. This is where AI agents come in. They help us act in dynamic situations like this. Let's proceed by calling the accounts receivable agent to check on the customer. The agent will fetch and list all overdues and dispute cases to get a better understanding of the customer's payment history. This is a good overview, but wouldn't it be better if we could extend the agent to include additional checks? Let's switch over to Joule Studio and SAP Build to extend the agent.
Using Agent Builder, we can now extend ready-to-use Joule agents. We will create a new project. From the menu, we select AI agent and Joule skill as our project type, give it a name, review the settings, and click Create. Now we can add an agent extension to our empty project. From the library of ready-to-use agents, we select the accounts receivable agent and add it to our project. Now we can add new tools. You can think of these as little helpers that make our agent even more powerful. For example, we will add a document for the order handling policy, along with instructions for processing it. This grounds the agent in business context and ensures reliable outcomes and informed decision-making. We can also add pre-built reasoning skills to our agent.
For example, we have a custom Joule capability in place to fetch currency exchange rates. We can also give it additional instructions to feed live data to the agent for calculating the exchange rate risk. Now it is time to test our new extension. Let's switch over to test mode and check our accounts receivable agent, extended with superpowers. I will prompt again for a check on Altenova and reference the latest sales inquiry. The new tools are executed, and here we can see in the summary that the order policy is now being considered. The foreign currency risk is displayed in the table as well. On the right is the timeline, and this is where the real power comes in for all of you: a unified, transparent view of the entire execution flow. This is the transparency you all asked for.
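The currency exchange rate tool added to the agent can be imagined as a small function like the following, a hypothetical sketch with invented rates and an invented risk threshold:

```python
# Hypothetical currency-risk "tool" of the kind added to the agent in the demo:
# given a series of recent exchange rates, flag risk if the average
# day-to-day swing is large. The rates and the threshold are invented.

def currency_risk(rates, threshold=0.02):
    """Return the average relative daily change and whether it exceeds the threshold."""
    changes = [abs(b - a) / a for a, b in zip(rates, rates[1:])]
    avg_change = sum(changes) / len(changes)
    return {"avg_daily_change": round(avg_change, 4), "risky": avg_change > threshold}

# Ten days of a volatile, fictitious exchange rate:
rates = [4.10, 4.25, 4.05, 4.40, 4.20, 4.55, 4.30, 4.65, 4.35, 4.70]
assessment = currency_risk(rates)
```

A real tool would of course pull live rates from a market-data service; the agent only needs the resulting risk flag to reason with.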
With that, back to you, Philipp.
Thank you, Karishma. For now, stay there, because we just got started. Extending such agents is great, but sometimes it's not enough. What if you want to build something completely new, while still using all these mechanisms, all the other Joule agents, all the other Joule skills, and, of course, the integration with your data from BDC? There are two other ways to build deeply integrated custom agents. The first is low-code agents, which you can again build with Agent Builder in Joule Studio. You do so using natural language and drag-and-drop capabilities, all while leveraging the powerful capabilities we described earlier, like HANA Cloud and BDC, under the hood. That's a very powerful way to build business-relevant agents at high speed. However, that's just low-code.
I know that many of you, including myself, really view programming as a liberal art and want ultimate control and flexibility when developing. Therefore, we are also introducing pro-code agents. Pro-code agents give you the freedom to build agents in any programming language of your choice. Use any logic that you want, any memory, any development environment. Of course, you can use any of the existing agent frameworks that are out there: the CrewAIs, the LangGraphs, the Microsoft Agent Frameworks, AgentScope, and more, as you like. To make sure that these pro-code agents can be seamlessly integrated back into SAP, we are super excited to be expanding the SAP Cloud SDK for AI to support this agentic development, giving you the best of both worlds: full flexibility, but also deep integration with all our underlying capabilities. Now let's look at it in action.
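Stripped of any particular framework, a pro-code agent is at heart a reason-act loop over tools. A framework-agnostic sketch in plain Python, with a hard-coded stub standing in for the LLM "brain" (the tool names and the stub policy are invented for illustration):

```python
# Minimal, framework-agnostic agent loop: the "brain" (decide) picks a tool,
# the loop executes it and feeds the observation back, until the task is done.
# decide() is a hard-coded stub here; in a real pro-code agent, an LLM
# (e.g. reached via the SAP Cloud SDK for AI) plays this role.

def decide(task, observations):
    # Stub policy: look up data first, then summarize, then stop.
    if not observations:
        return ("lookup", task)
    if len(observations) == 1:
        return ("summarize", observations[0])
    return ("done", observations[-1])

# Fictitious tools the agent may call:
TOOLS = {
    "lookup": lambda query: f"data for {query}",
    "summarize": lambda data: f"summary of {data}",
}

def run_agent(task, max_steps=5):
    observations = []
    for _ in range(max_steps):
        action, argument = decide(task, observations)
        if action == "done":
            return argument
        observations.append(TOOLS[action](argument))
    return observations[-1]

result = run_agent("Altenova risk")
```

Frameworks like LangGraph or CrewAI essentially give you robust, production-grade versions of this loop, plus state management and observability.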
Karishma, what do you think?
Why don't we create a new agent to check on our customer Altenova automatically? For this, we'll use a powerful new feature in Joule Studio's Agent Builder that we call vibe coding. It lets us create custom agents in Joule Studio just by describing what we want in natural language. We'll give it a few instructions to research a new prospect based on all kinds of financial data. The suggestions include a meaningful name, tools, and integrations, in this case fetching credit evaluations from Cloud ERP Finance and combining them with credit ratings from external agencies. Agent Builder also suggests meaningful data products from Business Data Cloud to be added. I simply confirm the outline provided, and the project is set up. On the right-hand side, we can see the generated instructions, which I can refine as needed. Agent Builder also suggested a system trigger.
The agent will then run automatically whenever the selected event is triggered in S/4HANA. I will choose the event trigger, and it will be added to our project. It even suggests a threshold so that only large inquiries get checked. Now I would like to add a background check based on public information. Good idea. We can use Perplexity, integrated with AI Foundation, to query public web sources and news articles for a more holistic view. Agent Builder also guides us through the setup steps to create the web search tool. Our prospect research agent is now complete, and we are ready to run it. Whenever a new sales inquiry is posted, the agent will perform the checks we configured automatically.
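The system trigger with a threshold boils down to a simple predicate over incoming events. A hypothetical sketch (the event shape, event-type name, and threshold are invented):

```python
# Sketch of the demo's "system trigger with threshold": only sales-inquiry
# events above a value threshold cause the agent to run.
# The event structure and names here are invented for illustration.

def should_trigger(event, event_type="SalesInquiryCreated", min_value=100_000):
    """Fire only for the configured event type and for large enough values."""
    return event["type"] == event_type and event["value"] >= min_value

# Fictitious event stream:
events = [
    {"type": "SalesInquiryCreated", "value": 250_000},  # large -> check
    {"type": "SalesInquiryCreated", "value": 5_000},    # small -> skip
    {"type": "SalesOrderCreated",   "value": 900_000},  # wrong event -> skip
]

to_check = [e for e in events if should_trigger(e)]
```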
I think that's on me. It seems like the agent performed the new check right away.
Yes, the integration from SAP Build Work Zone and SAP Mobile Start automatically delivered the results right to your phone.
Great. It looks like the sales inquiry from Altenova is promising, but there is a financial risk. Can we predict the impact and optimize the offer to reduce it, using Rapid-1? Mattis, you're standing here. What are you doing here? Can you do this and show it in the pro-code agent, maybe?
Philipp, I've got you covered. I'm here to show you how to build this pro-code agent and integrate it with Joule. I love that I can choose the tools that work best for me. For this project, I'm using CAP and LangGraph. I've already prepared the tools the agent will use. It will use an SAP HANA Cloud MCP server to access the data product Sabrina created earlier. The agent can directly ground its predictions in values from SAP Rapid-1. Tools gave the agent legs and arms, but what we are still missing is its brain. Adding it could not be simpler. With the SAP Cloud SDK for AI, which we extended for agent development, you can freely choose between all models available in AI Foundation. Today really feels like a solid four-dot-five day to me. With this done, we are ready for deployment.
Going forward, all agents exposed via an A2A server can be added to Joule. Here, we are just missing the finishing touch: the A2A agent card. Now we are ready for deployment. Deployment processes are so thrilling to watch, said no one ever. That is why I prepared this agent and deployed it already backstage. Let's ask the agent to optimize the latest sales inquiry from customer Altenova. It's running. Perfect. There we go. It's giving us some really promising recommendations on how to optimize. Let's ask the agent to create the sales quotation for it. Perfect. Running, and done. Amazing. Let's check a little more in detail. One of my favorite agent development tools is Langfuse. It allows us to check on all the tool calls and inspect the inner monologue.
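The "agent card" is, in the A2A protocol, a small JSON document that describes an agent so other agents can discover it and learn what it can do. A simplified, hypothetical card (the field names loosely follow the A2A idea; all values are invented):

```python
import json

# Simplified, hypothetical A2A-style agent card: machine-readable metadata
# other agents use to discover this agent and its skills.
# Field names are illustrative only; consult the A2A spec for the real schema.
agent_card = {
    "name": "sales-inquiry-optimizer",
    "description": "Predicts conversion impact and optimizes sales offers.",
    "url": "https://example.com/agents/sales-inquiry-optimizer",
    "skills": [
        {"id": "optimize-offer", "description": "Optimize a sales inquiry offer."},
        {"id": "create-quotation", "description": "Create a sales quotation."},
    ],
}

# Serialized, this is what a discovery endpoint would serve:
card_json = json.dumps(agent_card, indent=2)
parsed = json.loads(card_json)
```

Because the card is plain, structured metadata, a gateway like Joule's Agent Gateway can index it and route tasks to the agent without any bespoke integration code.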
For example, here we can see that the optimizations are grounded in the values fetched earlier. Looks good to me. Philipp, I think we've got the agent you asked for.
Yeah, absolutely. Great job, Mattis. Thank you so much. Amazing demo, Karishma. All right. These are very, very powerful capabilities to build AI experiences in the rich context of your apps and data, and they are all available to you. Some of it is available today, and everything will be available by the end of this year. I'm so excited to see all the amazing agents that you will build with this. Again, we are not stopping here. You need a smart way to take all these agents and integrate them into a larger ecosystem, because true agentic collaboration requires, like in a real organization, breaking down silos and working across frameworks, vendors, and apps. Therefore, we are thrilled to announce that Joule agents are fully compatible with the agent-to-agent protocol, exposed through Joule's Agent Gateway.
A2A really is the new API in the age of AI: it exposes rich semantics describing what an agent can do, so it can be discovered by other agents to then actually get work done collaboratively. A2A, of course, works both ways. Via the Agent Gateway in Joule, you can expose all the agents you have deployed in the SAP world for discovery by the outside. Likewise, you can consume any third-party agents from vendors that support A2A, such as Google, Microsoft, and ServiceNow, from within Joule, using the extensibility methods we have just shown you. SAP is working closely with all our partners on the standardization of this emerging protocol for full agent interoperability. We empower this through a powerful new technique in Joule, which we call agentic orchestration.
Agentic orchestration takes a complex task, whether from a human or another agent, breaks it down into individual steps, then works with the right agents and tools, reflecting in a loop until the task has been solved. It is already part of SAP's standard agents, and we are making it available to you early next year. Now that we know how to find and talk to any AI agent out there, there's still the question: how do you manage the lifecycle of these agents? There is a simple way to keep track of all of them in the context of your enterprise landscape and business processes. It is called SAP LeanIX Agent Hub, which is now generally available to you. SAP LeanIX Agent Hub lets you govern and control the full lifecycle of your deployed agents, from early discovery to risk management and deployment.
In addition, we help you measure and improve your agents' success and impact along your processes. I am very happy to share the availability of agent mining in SAP Signavio. Agent mining in SAP Signavio lets you trace every decision and every action your agents have taken, benchmarks them against KPIs, and flags deviations, bottlenecks, and more. This gives you full control to identify exactly where Joule agents improve your business, or where you need to adapt and strategically steer your workloads and efforts to improve even more. Agent mining is available to you today. That is what we have in store for Joule agents. Now we are taking the elevator again to the next level with Michael.
Data check, AI check. Now let's move up the stack and see how we can gain even more value.
We will show you how you can build powerful, scalable, compliant, and intelligent apps on BTP. Actually, let's start one step earlier, because you don't just sit down and build an app, right? You usually start with a problem to solve. In many cases, you want to optimize a process within your organization. SAP Signavio is our all-in-one business process management solution. It lets you dive deep and understand every detail of your process data, and it turns insights into clear recommendations. With those, you can jump straight into SAP Build. With SAP Build, you can create custom AI agents like Philipp just showed, automate workflows, and create intelligent apps tailored to your needs. Our promise is simple: build with intent.
You describe the outcome, and SAP Build uses AI agents to generate code, logic, and UIs for you, all with seamless access to your applications and data, while you stay in the flow, whether you prefer visual composition or hands-on coding. How are we doing this? Joule for Developers enables a vibe-coding experience, making intent-based development simple and intuitive. But it is more than that. With just a few clicks, you can move from initial intent to high-quality, reliable code. It provides you the context of all SAP tools: CAP, SAPUI5, mobile, Fiori, and more. Now we are taking it one step further. I know that many of you already use agentic tools. We already provided extension packages to work with VS Code. How many of you use Windsurf, Cline, or Cursor? I see a few hands. Great.
Because you can now use all of them directly within SAP Build. You choose a tool, and we meet you where you are. To make this happen, we have exposed everything from CAP to UI5 via MCP. For easier onboarding, we will deliver a BTP extension pack on the OpenVSX registry next year. Your editor plugs straight into it and gains context-aware assistance with SAP's tools. Code completion, refactoring, test generation, and agent building on BTP get smarter and faster. It also means you can connect your own model of choice while staying in line with security and compliance. There is more. Some of you may have heard of n8n for building agents. n8n has become quite popular and has more than 1.6 million active users a month. We are working with n8n to integrate it with SAP Build.
Custom agents developed in Joule Studio shall be able to invoke agents created with n8n, and vice versa. n8n users shall also be able to access models hosted in Generative AI Hub. What about ABAP? Of course, we bring the same seamless user experience to ABAP. We start with extension packages in VS Code. I know there's a smile on a lot of faces. It's a full-stack experience for ABAP in a lightweight editor, similar to the CAP capabilities. In VS Code, you can now use graphical data modeling, the service catalog with APIs and events, and code generation. Developers can view and interact with CDS models and services without switching tools. Developing ABAP Cloud in VS Code will be available next year. We are also integrating ABAP's AI capabilities and laying the foundation for future agentic ABAP Cloud development within SAP Build.
For example, with AI-based code generation and explanation. We have good news for those of you using earlier releases of SAP Cloud ERP Private: we also want you to benefit from the latest ABAP AI capabilities. That is why we are decoupling them from the S/4HANA system and making them accessible side-by-side. All releases from 2021 onwards will be supported. You want more? I am excited to announce that we are publishing the fine-tuned ABAP LLMs with SAP ABAP One on AI Foundation in Q4 this year. SAP ABAP One has been trained on SAP's ABAP code and is specialized for ABAP AI use cases written in modern ABAP. Finally, customers and partners can develop their own developer-productivity AI use cases. Now I am sure you want to see SAP Build in action.
Let's build a custom intelligent app that uses the SAP data products from BDC and HANA Cloud, connects to agents, integrates Joule from the get-go, and turns an intent into a real project. For this, let's welcome Fredericka on stage. Fredericka.
Let me introduce myself. I am a credit controller at Best Run. My job is to improve our processes. That is why I'm in SAP Signavio right now, because I need the insights. I already see that credit risk analysis is taking too long. Let's drill into this. There it is. It seems credit holds are slowing down our order-to-cash process significantly, and most of these are customers that already have sales contracts. Let's fix this. I'm going to open up a conversation with Joule, saying that we need to speed this up. Joule comes back to me saying, "Sure, we can create an S/4HANA extension with the agent we just built to perform automatic credit analysis." Sounds perfect to me. I say we follow this recommendation and generate it in SAP Build.
This takes us directly to the SAP Build lobby. Today we are previewing a future new capability in SAP Build's tools for developers. We are using the vibe-coding concept to build intelligent applications through what we call intent-based development. I'm just going to speak my intent to Joule. Joule, I need to find a simple way to manage these value contracts. Go. Perfect. Just look at this. Joule is creating a detailed end-to-end specification based on SAP Build's CAP, Fiori, automation, and HANA, all in the context of our business. It is all defined and aligned with our best practices and clean core principles. What more could I wish for? Let's create this application. On the left, you can see Joule now reasoning step by step, planning, building, even fixing its own mistakes along the way.
It is using the SAP Build MCP agent to generate a full-stack application. It is doing extensive reasoning and content generation here, as you see. After a few minutes, on the right, you'll see the completed solution with a summary of everything Joule has created. That is exactly what we mean by intent-based development: Joule took this idea, this insight, and turned it into a completely architected solution. Without leaving SAP Build, right here, we're going to preview the app. Perfect. We see that it has been generated using SAP Build's UI5 framework, which is great because one of its many strengths is strict accessibility guidelines. Yes, we're vibe coding, but we're vibe coding enterprise applications here. Looking at this, I have an idea. I want high-value contracts to stand out.
Let's take this intent to Joule, asking it to add an indicator, and perfect. Joule is now updating the UI, describing the changes, and we can even refresh the app live. Looks good. Love the icons, Joule. Let's look at the code view for this one; I do want to see behind the scenes. Perfect. We see the UI5 annotation perfectly added. If I had to write this by hand, I'd be looking up the syntax and hoping that it works. SAP Build is redefining developer productivity. It's creating intelligent applications, not just faster, but smarter. Let's take a look back at the process and how this has been created. Wonderful.
Every time a contract is created or its value is increased, it runs perfectly, and it's using the agent that Karishma built earlier. Joule can even explain the entire process to me in step-by-step detail. Wonderful. Let's switch gears a bit and see how we would edit it in an agentic tool. We are now in VS Code. I have my app there: CAP service, UI5, all in the right place. I also have our new SAP Build MCP agents for CAP CDS development right here, along with our UI5 MCP agent and, of course, our Fiori agent. Today we will be using Cline, which is one of the possible setups Michael mentioned earlier. Let's vibe code, shall we?
I'm going to take my intent to Cline, asking it to create an extension that includes the report from Karishma's agent. Cline gets to work, collaborating with our MCP agent. It modifies the CAP service, adjusts the Fiori annotations, verifies the syntax, checks for best practices, and even tests it all autonomously. In real time, you can see the edits, the diffs, and the updates. This is a true developer-AI partnership in action. We're done already: implemented and verified. Perfect. Every extension we wanted is now implemented. Good job. With a few clicks, I'm now deploying the app to BTP, and these are all of the recently processed contracts. To see the solution in action, I'm just going to change this value to EUR 500,000. Perfect. It's saved.
Behind the scenes, our agent is now triggered, and it's performing the new creditworthiness analysis. We see that the check completed, the status is updated, and the new research report from the agent has been stored for our review. This means that every sales order against this contract can now be processed efficiently, without any of the delays from the credit holds we saw earlier. The best part: all of this happened naturally, through conversation, because with intent-based development, you take your developer productivity to new levels. Let's let some time pass and see the impact of the extension we just built. A few months later, we are in Signavio Process Intelligence to finally see the outcome. Every node in the process is green, and the numbers really do speak for themselves.
This is the power of intelligent applications: from insight and intent to action, from Signavio to Joule to Build, all in one flow. That is how I, a credit controller at Best Run, turned a process bottleneck into a competitive advantage using generative AI and intent-based development. With that, over to you, Michael.
Thank you, Fredericka. This is what modern AI app development looks like. We are using the same power for our own application developers, for example with Ariba, business networks, and supply chain orchestration, all running on BTP at scale: fully integrated, AI-enabled, and at enterprise grade. We share all the essential best practices, from setup to scale, in our updated BTP Developer's Guide. The latest version is available now. We are also making BTP easier to adopt by simplifying commercials for our ecosystem. We now have one unified SAP Build commercial model, giving you access to the full toolset, including Joule Studio and Joule for Developers. The entitlements come with S/4HANA Cloud ERP Private and the new business suite packages. You can license through subscription BTPA and now also through CPEA.
On top of that, we are even offering free cloud test, demo, and development licenses to our partners. Now, building solutions is just one step. True impact comes when we connect them seamlessly across systems. Process integration can be quite complex, and managing it across a growing landscape is even more challenging. The solution? SAP Integration Suite. Take, for example, our customer KONA. They consolidated legacy tools into a single API-led platform, SAP Integration Suite, which reduced their runtime cost by 50%. Now let's see integration in the agentic world. Remember our earlier scenario? We wanted to gain creditworthiness insights based on customer service history. By leveraging third-party solutions like ServiceNow, we can access additional data on payment risks and legal escalations. Using SAP Integration Suite, we expose ServiceNow as a secured MCP server.
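The idea of a "secured MCP server" for a third-party system, that is, tools exposed behind a policy gate, can be sketched minimally like this (the tool names, roles, and policy rules are all invented for illustration):

```python
# Conceptual sketch of exposing third-party tools behind a policy gate,
# in the spirit of the "secured MCP server" described above.
# Policy rules, roles, and tool names are invented; a real MCP server
# follows the Model Context Protocol specification.

POLICY = {
    "get_payment_risk":      {"roles": {"agent", "auditor"}},
    "get_legal_escalations": {"roles": {"auditor"}},
}

# Stub tools standing in for calls into the third-party system:
TOOLS = {
    "get_payment_risk":      lambda customer: {"customer": customer, "risk": "medium"},
    "get_legal_escalations": lambda customer: [],
}

def call_tool(name, argument, role):
    """Execute a tool only if the caller's role is allowed by policy."""
    rule = POLICY.get(name)
    if rule is None or role not in rule["roles"]:
        raise PermissionError(f"role {role!r} may not call {name!r}")
    return TOOLS[name](argument)

risk = call_tool("get_payment_risk", "Altenova", role="agent")
```

The gate is what lets an administrator decide which data a prospect-research agent may see, independent of what the underlying system could technically return.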
You can apply policies to safeguard and govern the information for the prospect research agent. Finally, you can easily build federated MCP servers, connect them to tools, and analyze the tool calls that are discoverable through the developer hub in Joule Studio. As you can see, we are supercharging our integration foundation with powerful new capabilities. You can create MCP servers for third-party integration, use anomaly detection to heal your processes, tap into conversational AI analytics with Joule, and auto-migrate integrations to speed up development. Now, let's hear from one of our customers who relied on our solutions on their transformation journey. Please welcome on stage Axel Wietfield, Chief Procurement Officer of Uniper. Axel, tell us more about Uniper.
First of all, thanks for having me. For those who are not that familiar with the energy industry, Uniper plays a key role in the European energy system. We employ about 7,500 people from more than 100 nations in 40 countries, and we have a lot of technical and commercial expertise. In addition, we have embarked on a journey for a more sustainable future, which means we promise to the capital market to transform our portfolio and be carbon neutral by 2040. How are we doing that? Obviously, with many suppliers we have got, and some of them are strategic suppliers, and SAP is, of course, one of our strategic and trusted suppliers. It is great to be here at TechEd discussing technology and business.
Yeah, thanks for sharing. Actually, I mean, Uniper has gone through quite a transformation journey, especially in procurement. How did that journey actually get started?
Yeah, we realized that procurement colleagues spent too much time on purchase request details and had only limited time for the important stuff, like strategic analysis and value-adding negotiations. For example, our purchase requisition process was predominantly manual, which meant purchasers had to perform supplier and contract checks and search catalogs to find the right items before a purchase requisition could be processed. The solution was that we built a so-called perfect purchase requisition app, using SAP Build and SAP's GenAI tools. The benefit is obvious. First of all, the buyer gets it right the first time. We cut requisition processing time by 80%, and we reduced errors by 40% to 50%. Quite successful. We now have better data quality and also trust in the data we are using.
Yes, thanks for sharing. This truly sounds amazing, and thanks for using Build on that journey. Now, after all this transformation, what does it actually mean for the people behind it, especially the buyers and the teams using the new tools?
Yeah, for them it's a shift. Let's take the human win first. 95% of our operational purchasers are using AI-enabled tools in their daily work. 85% of searches are now accurate on the first try, which is a huge gain. We need less time to correct purchase requests that weren't right the first time. That also means more time for negotiations, more time for analyzing suppliers, and more time to find savings with our trusted suppliers. All in all, it means a transformation of our culture from transactional to strategic, which requires a different, additional skill set. Most of our purchasers are pretty excited about that.
Yeah, that's indeed a good shift. I think it's great to see how AI and data transform your business. Now, the app that you mentioned, does it also integrate with other solutions?
Yeah, absolutely. Integration is key. Procurement is part of the entire Uniper story, and it's never just one system; it also touches finance, operations, HR. SAP connects these dots and these processes. We are now at a stage where we have an automatic data flow from requisition to payment, and we avoid duplicate entries. This unified connectivity is essential for using AI and for delivering its full potential.
That indeed sounds powerful. Please share: what's next for Uniper?
The vision is to go beyond this stage and have an assistant that holds a dialogue with our internal customers, the requesters. This assistant should understand what the internal customer wants to buy, fill in the purchase request correctly the first time, or direct them to the right platform or system. We started by improving and automating procurement; eventually, the idea is to change the requester experience and make life easier for everyone. Of course, SAP Joule and Joule agents should play a role in that. We are already using them for real-time answers about process status. In the future, we expect fully automated routines and automated tasks, and that additional experience on the procurement and requester side is super cool.
For us, the journey we have embarked on means higher efficiency, higher reliability, and an improvement in both the quality and the design of the work, more toward the strategic discussions we have internally and with our suppliers.
Yes, thanks. I especially love seeing Joule and AI everywhere.
I can imagine.
This is indeed a great example. Thanks for sharing the Uniper story. Give Axel a round of applause. Thank you.
Thanks for having me. Thank you.
With our promise to build by intent, integrate, and innovate, we introduced Vibe Coding and Joule, showed openness, and met developers where they are. We published SAP ABAP One, introduced MCP servers for third-party integration and Integration Suite, and much more. Now, let's shift our focus to user experience, because we are working on a very powerful vision here. Let's take a look.
Good morning, Jenny. You have a customer meeting at 10:00 A.M. There has been a shipment with broken parts.
Prepare the meeting.
I'm on it. Hey, Jenny. This is the customer complaint about the broken parts. Would you like to investigate it?
Yes, please.
The return and refund process is now handed over to an agent. This problem is piling up. Costs are rising. Would you like to loop in quality management? Hey, John. Jenny has flagged recurring issues with packaging. Could you take a look?
Thanks, Joule. I'll take over. Add shipping instructions for carriers, as shown in this example photo.
Sure. Just take the photo, and I'll upload it to the board and notify the carriers.
To be clear, for us developers, this means you will design for outcomes, not for static workflows. This is how we redefine how people interact with AI. There is more. For that, let me hand over to Philipp again.
All righty. Ready for the last part? OK. Now, without a doubt, Joule started the biggest UI revolution we have seen at SAP in decades: a beautiful, intuitive, human-centered design. Joule also turns natural language into the API, an API that can be used not only by humans but by machines, too. Through A2A, we are now bringing Joule agents into the physical world: robots that understand the what, when, and how of interacting based on live business context and policy. In the last few months, we've created an entire ecosystem of partners to make this a reality end to end. We are building this together with our customers, and I'm thrilled to discuss this huge transformation with one of them. Please join me on stage, Torsten Müller, CIO and COO of Sartorius. Thanks for being here.
It's an amazing partnership. We do a lot in physical AI. Before we get there, tell the audience a bit more about where you stand in your transformation and your move to the cloud, just to contextualize it for everyone here.
Yeah, thanks, Philipp. It all starts with the foundation; that's what we focused on over the last one and a half years to solve these things. One and a half years ago, we were still on our ECC system on-premise. We had little digitization of our end-to-end processes and not very digitized warehouses. We suffered on KPIs, mainly delivery reliability, delivery capability, lead time, and also inventory management. At that moment, we decided on a very bold move: we put IT, data, and processes together, and later we also added the entire operations, the responsibility for the production sites and so on. Now, one and a half years later, our ECC system is migrated. It's in the cloud, on RISE, fully on S/4. We have implemented Joule.
In parallel, we have implemented the extended warehouse management system in the south of France, close to Marseille. Now, we want to go further and bring robots, cobots, humanoids, and everything else into the picture to drive more digitization.
Yeah, and congratulations. I mean, it's an amazing move, right? That is really the foundation for everything else. Now, let's go into this exciting topic. How do you envision the collaboration between Neura, some of the robotics vendors, Sartorius, and SAP to really take this to the next level?
Yeah, our collaboration with Neura and SAP is really a great move, because we see a fully fledged integration into EWM, linked to Joule. You can imagine that even in a fully automated warehouse, there are still manual processes. To close that gap, especially at the physical interface between production, our clean rooms, and the warehouses, we have to facilitate and supply materials. That's why we are running a proof of concept with this Neura prototype. It's awesome to see how the robot recognizes shapes, colors, and materials and puts items into boxes. I believe that in a few months, we will be able to put this into practice. In addition, because that's not enough, Philipp, we have to ask more of these robots: they need to be able to work closely with real people.
We also have to deal with flexible materials, so you need robots that operate with two hands. That is something we are looking for. Hopefully, we can do that together with SAP.
Absolutely. Thanks for sharing that vision. It's very innovative, and it's a great partnership; our teams love collaborating every day. Thanks for joining us today and sharing your vision.
Thank you very much.
We are looking forward to a bright future. Thank you.
Thank you. Bye.
All right. That is truly exciting to see, as is the progress, even if it is still early days. We really work closely with customers to make it a reality. Now, I would like to show it to you in action, if my demo comes up here. Here it is, much smaller now. Here, I am a warehouse manager. You know the usual drill: you see all your warehouse tasks, and you usually work through them in a very manual fashion. Now, I want to do this in a very unusual way. I go ahead here in warehouse management, where you see all these various tasks, and I open Joule. I ask Joule to take care of this urgent customer delivery and do the respective picking. Now, Joule thinks through the usual workflow.
It decides to delegate this task to a humanoid robot here in the experience center at TechEd. Joule determines it was a picking action, and before delegating to the robot, it collects the relevant actions from EWM and sends them over to the robot, which we should now see here. The robot goes ahead and notices that a branding label is missing on the box, and Joule helps the robot resolve that according to the process. The Neura robot inspects the item in real time and, based on an ad hoc task from Joule, puts the defective item into a quality bin. As a next step, it picks up a good item.
The process can essentially be completed, and the robot continues the task, picking the high-quality item. Then we see in warehouse management that the task is completed and the order is in progress. If I go back to EWM and click the Go button again, you see that the task is completed. This is phenomenal: a business process executed by a robot. Now, let's see what's going to happen. All right, here it comes. Here's one of Unitree's G1 robots with a direct delivery from the warehouse. It's so cool, right? Hey, Unitree G1, smile a little. You've got a few photos here for social media later. All right. Come closer. Come here. No, no, no. Here, come to me. Yeah, here I am. All right.
Oh, let's see what we got here. Very cool. Let me take that box. There's a bunch of merch in there, including a Joule-branded Rubik's Cube. You'd like one? Hey, there you are. Here we go. All right. Thank you so much, G1. Great job. Actually, I forgot the microphone, because I was supposed to ask him: how are you doing?
I enjoy it way better than my last gig, waiting for firmware updates in a dark lab.
I can imagine that was really awful. Hey, before you go, can we take a real selfie together with all the cool crowd here? You turn around, we wave a little bit, and we have a cool photo together. Can you do this for me? OK, I want you to cheer a little bit, because this will go viral, right? This is TechEd. This is TechEd 2025. All righty, you get the tips later. You did an amazing job. Back to your warehouse to do some more picking tasks for us, OK? All right. Cool, right? OK, I have to put this down for a second. This is how we are really imagining the future: humans and robots working in harmony through Joule, through AI.
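The delegation flow shown in the demo, where Joule collects a picking task from EWM, hands it to a robot as a task message, and reflects completion back in warehouse management, can be sketched roughly as follows. This is a toy illustration only: `WarehouseTask`, `RobotStub`, and `delegate` are invented names, and a real A2A exchange would go through Joule's agent infrastructure rather than a direct call.

```python
# Hypothetical sketch of delegating a warehouse task to a robot agent.
from dataclasses import dataclass

@dataclass
class WarehouseTask:
    task_id: str
    action: str          # e.g. "PICK"
    item: str
    status: str = "OPEN"

class RobotStub:
    """Stands in for a humanoid robot receiving an A2A-style task message."""
    def execute(self, message: dict) -> dict:
        # A real robot would plan, move, and inspect; here we just acknowledge.
        return {"task_id": message["task_id"], "result": "DONE"}

def delegate(task: WarehouseTask, robot: RobotStub) -> WarehouseTask:
    # Package the EWM task details into a message and send them to the robot.
    message = {"task_id": task.task_id, "action": task.action, "item": task.item}
    reply = robot.execute(message)
    if reply["result"] == "DONE":
        task.status = "COMPLETED"  # reflected back in warehouse management
    return task

task = WarehouseTask("T-001", "PICK", "urgent customer delivery")
done = delegate(task, RobotStub())
print(done.status)  # COMPLETED
```

The point of the pattern is that the robot never needs to know the business process; it only receives a concrete, well-scoped task, while the status change flows back into the warehouse system of record.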
Now that we have talked so much about AI as one of the new compute paradigms, there is another compute paradigm on the horizon that is still hard to grasp. This year marks the 100-year anniversary of quantum mechanics, and this year's Nobel Prize in Physics went to the pioneers of quantum computing. I could not be more excited to share with you today that SAP is taking huge leaps in that space as well. Let me be clear: SAP is not building a quantum computer. That wouldn't make any sense. What we do is what we do best: invent algorithms and business processes, now also in quantum computing. Today's businesses face huge logistical challenges across domains: routing, packing, scheduling, and more.
Such challenges are hugely combinatorial, with quadrillions of possible combinations and outcomes. In fact, the number is so huge that even today's most powerful supercomputers cannot solve them efficiently with the best outcomes. The good news is that quantum computers provide a way forward, because they explore many possibilities at once and steer toward a better outcome. To deliver the best outcomes, we teamed up with quantum hardware leaders such as IBM, supporting us on the hardware side. Now, let's ask my friend Jay Gambetta, who leads quantum research at IBM and is an IBM Fellow, how we intend to bring this forward for our joint customers at scale. Jay, welcome to Berlin.
Hello, Philipp, and hello to everyone at SAP TechEd today. I'm Jay Gambetta, Director of IBM Research. At IBM Research, we're a worldwide team building the future of computing. I'm speaking to you today from our headquarters in New York, where much of the research and development happens in our mission to bring useful quantum computing to the world. As you can see, I'm in front of one of our quantum computers. At the heart of programming a quantum computer is a quantum circuit: we manipulate qubits through superposition and entanglement to explore spaces classical machines cannot simulate. Right now, we have more than 15 quantum computers across the globe running at utility scale, the largest fleet of its kind in the world. You might be thinking, what does utility scale mean?
Utility scale is the point we're at now in quantum computing, and it's an exciting chapter. It means that we can start to use quantum computers, such as those we've deployed in our quantum data centers and the one right behind me, as useful scientific research tools. Much of our roadmap is focused on scaling quantum error correction, a key ingredient in mitigating the effect of noise on qubits and unlocking quantum's full potential. But building the best quantum computers isn't the whole story. Applying this technology to real-world problems is where our collaboration with SAP becomes so critical. Your team has already worked with ours to dissect a classic enterprise logistics challenge. Today, we're pinpointing where in complex optimization processes a quantum algorithm could be injected to find a better, more efficient solution than classical methods alone.
This is the unique value that SAP can deliver, the insights into enterprise workflows, identifying the computational bottlenecks and co-designing the quantum algorithms to break through them. Our shared vision is a hybrid cloud architecture where quantum and classical computers work in concert. Imagine offloading an intractable optimization problem to a quantum computer via a seamless API call. We are jointly engineering towards this future of a tightly integrated environment with zero downtime and continuous delivery where quantum resources are available on demand, just like any other cloud service. Together with SAP, we're also discovering new applications and forging the integrated industry-grade infrastructure that will bring quantum computing out of the lab and into the heart of your business. It's an honor to be a partner with SAP on this journey.
Thank you, Jay. We just heard how quantum computing can help solve logistical problems, so let's take a look at an early example in action. What you see on the screen is a truck-loading scenario, a supply chain scenario where we load a truck in an optimal way. I want to plan a truck for SAP's own logistics center and optimize how the truck is loaded according to the route and the goods to be delivered. We kick things off in our usual business process. I'd like to show you what is happening under the hood, so let me take you to this dashboard, which shows the engine of our mathematical optimization. All of this usually happens behind the scenes in a little app that runs on BTP.
This is where our team manages the day-to-day workloads for logistics and planning applications. What we did here is build new functionality into this engine that automatically generates the corresponding quantum circuit from the business problem. On the next screen, you can see a snippet of this quantum circuit; this is how you actually program such an algorithm and how the computation works. Now, when we take this algorithm and push it to the quantum computer, the generated quantum circuit runs on that machine, and we immediately get the result back: the optimization job is finished. This is how we imagine interweaving quantum computing, as an additional compute paradigm, into business process software.
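The hybrid pattern described here, a classical business app that can offload an optimization to a quantum service behind an API call, can be sketched as below. Everything in this sketch is hypothetical: `quantum_backend` stands in for a real quantum service, and the classical fallback is a plain brute-force solver for a tiny truck-loading (knapsack-style) problem, which also shows why the search space explodes for realistic sizes.

```python
# Hypothetical sketch of hybrid classical/quantum offloading for truck loading.
from itertools import combinations

def best_load(weights, values, capacity):
    """Classical fallback: brute force over all item subsets.
    There are 2**n subsets, which is exactly why large instances
    become intractable for classical solvers."""
    best_value, best_subset = 0, ()
    items = range(len(weights))
    for r in range(len(weights) + 1):
        for subset in combinations(items, r):
            if sum(weights[i] for i in subset) <= capacity:
                value = sum(values[i] for i in subset)
                if value > best_value:
                    best_value, best_subset = value, subset
    return best_value, best_subset

def solve(weights, values, capacity, quantum_backend=None):
    # "Flipping the switch": offload when a quantum backend is configured,
    # otherwise stay classical. The .optimize call is purely illustrative.
    if quantum_backend is not None:
        return quantum_backend.optimize(weights, values, capacity)
    return best_load(weights, values, capacity)

value, chosen = solve(weights=[4, 3, 2, 5], values=[10, 7, 3, 12], capacity=7)
print(value, chosen)  # 17 (0, 1)
```

The business application only sees `solve`; whether the answer came from a classical engine or a quantum circuit generated from the same problem description is an infrastructure decision, which matches the "on when you need it, off when you do not" idea.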
You no longer need a PhD in quantum mechanics. We are making it as simple as flipping a switch: on when you need it, off when you do not. Our mission is to bring quantum computing to the business processes where it adds value. We believe quantum will join classical and AI compute in your stack. We are embedding it into the processes and apps you already use, so it just shows up in the workflows of your enterprise. AI will predict demand, lead times, and risks; quantum optimizes decisions given those predictions; and with the cloud, we scale it all. With that, SAP has you covered. Your business is covered now and in the future, in quantum leaps. Thanks to all of you joining TechEd on site here in Berlin, as well as online.
We hope you enjoy the more than 240 sessions over the next three days, where you can learn more about everything we announced today. Stay curious and take care. Thank you.