London Stock Exchange Group plc (LON:LSEG)
London · Delayed Price · Currency is GBP · Price in GBX
9,624.00
+74.00 (0.77%)
May 1, 2026, 4:48 PM GMT

Investor Update

Nov 10, 2025

David Schwimmer
CEO, LSEG

Good afternoon, everyone. Thank you for joining us. Great to have you here, all in person, and also thank you to those of you who are joining us online. It is great to see such a big turnout, and we are really excited to show you a selection of the many innovations we have developed for our customers at today's Innovation Forum. We have made huge progress over the last five years, and our goal for today is to show you some of the innovation, the transformation, the disruption that we are driving through all of LSEG. Let me briefly take you through the plan for the afternoon. MAP and I will recap the group's strategy and some of the powerful drivers of our business, as well as the execution and transformation that we have delivered to date.

We will hand over to Irfan, our CIO, and Emily, our Head of AI, who will talk in more detail about our engineering transformation and our AI strategy. Next up will be Ron and Gianluca. They will update you on the D&A strategy, progress with Microsoft, the product roadmap, and most importantly, monetization. They will then tee up the D&A product demos, which will all be here in the theater. We will then break you into five groups and rotate through presentations and demos of a number of other great products, and we will cover the logistics of that later. Finally, we will be back in here for Q&A with all of the presenters, and we will finish off with some drinks. First, let's recap what LSEG is and why these businesses are so valuable together from a strategic, commercial, and financial perspective.

All of our businesses have strong competitive positions, typically top three in the markets they serve, and often number one. Our services perform non-discretionary functions for our clients, i.e., not nice-to-have. Over the years, through investment and M&A, we have aligned the group to multiple structural growth drivers. We work with our customers very differently from how our competitors typically do. We have a partnership model based on an open ecosystem. We build products not just for our customers, but with our customers. We often become their strategic partners, and with their core businesses deeply reliant on our services and products, a high level of trust is critical to our customer relationships. Strong businesses aligned with structural growth tailwinds, with deep customer partnerships. This all translates into a really strong economic model with all-weather growth and very strong cash generation.

Our markets continue to offer very attractive growth prospects, from mid-single digits up to double digits for FTSE Russell and risk intelligence. We have an outstanding portfolio of assets within these markets. Businesses like Realtime Data, SwapClear, and TradeWeb are undisputed scaled leaders in their fields, with long track records of investment, innovation, and growth. WorldCheck is the global leader in the high-growth sector of screening and compliance. FTSE Russell, Workspace, and our non-Realtime Data all have strong top three positions in their markets, and we are investing in all three to build new services for our customers and to grow share. What I like about our positioning is that we are a top player in each of our businesses, but our growth is not constrained by a high market share. Our markets offer growth, and we have room to take share as well.

We have several world-class businesses across our portfolio. They are each great trophy assets on their own, but they become even more valuable as part of an integrated LSEG. We are increasingly linking these products and services closer together for our customers' benefit. This is most evident in our data flywheel. The data we generate from our own markets infrastructure feeds into our D&A business. That data helps customers make better-informed decisions as they trade more, creating yet more data through their trading and risk management activity. Second, Workspace is increasingly becoming the fully integrated workflow through which customers can access many of our services, not only for all D&A data, but now also for FTSE Russell tools, FX trading, LCH data, and in the near future, TradeWeb.

I've spoken before about how we have integrated our FX platforms throughout the group, with Workspace, TradeWeb, and our clearing business, all underpinned by industry-leading FX data and analytics. This creates an end-to-end proposition. We have the same comprehensive offering in swaps, drawing on our SwapClear and TradeWeb franchises. As you know, we're powering a number of FTSE Russell fixed income indices with TradeWeb data. This creates another flywheel effect. The more volume traded on TradeWeb, the better the pricing in the FTSE indices. The more usage of the indices increases the importance of the TradeWeb pricing as the industry standard. These are some of the product benefits, but there are commercial benefits too. We have become an important strategic partner to many of our customers, and our long-term contracts are reflecting that. I'll cover these enterprise deals in more detail in a moment.

We have aligned LSEG with a number of very strong and long-term industry trends. The growing demand for data in decision-making is not new, but AI is driving that to new heights. Not just any data; data that is trusted to be accurate and specialized for our customers' use cases. That data is at a premium, and that is our forte. Electronification and digitization also continue at pace, and through TradeWeb and our digital markets infrastructure, we are at the forefront of that trend. Whether through FTSE Russell, risk intelligence, D&A, or our post-trade businesses, both cleared and uncleared, we support customers as they navigate ever-changing regulation. Our diversification is yet another strength. Unlike many other companies we are compared to, we are not disproportionately exposed to a single asset class, geography, customer type, or product.

We serve customers across the sell side, advisory, buy side, corporate, and academia, and across a broad spread of asset classes. We're also open in our distribution and always have been. This is something you will hear much more about today. We are just as comfortable serving customers directly with our own front end or working in partnership to provide our content through other channels. The final point on why LSEG is so differentiated from a strategic standpoint: it's the unmatched breadth of our offering across the whole trade lifecycle and through the whole data value chain. This gives us a unique position from which to serve our customers as strategic partners, not just as data vendors. Now I'll hand over to MAP to talk about how this all translates into our economic model.

Michel-Alain Proch
CFO, LSEG

Thanks, David, and good afternoon, everyone. I think of our economic model as the best of both worlds. Nearly three-quarters of our revenue is from recurring subscription services. In many cases, these services are relied on by our customers and embedded in their processes. They are critical and high value. The other 25% comes from transactional revenue. However, most of this, particularly TradeWeb and post-trade, has structural growth drivers behind it. It means that it is, of course, cyclical to some degree, but much less so than other exchange-type businesses. You can see that from the 14% compound growth achieved over the last four years. This has translated overall into a very stable top-line performance, whatever the weather. You can see here that whether interest rates are up or down, GDP is stronger or weaker, equity markets are rising or falling, we can deliver mid-to-high single-digit organic growth.

As David mentioned just now, we are not heavily exposed to any single asset class or sector, which means that we have natural offsets throughout the business. Let's take a look at how the model has delivered over the long term. Of course, I certainly can't take credit for all of this, but I'm confident that we will continue the trend. Earnings per share has compounded at 15% over the last 20 years, and dividends per share at 18%. To give you some context, this is the best compound dividend growth and the second-best earnings growth among the top 20 FTSE 100 companies. We have guided to over $2.4 billion of cash generation this year, 60% higher than three years ago. This cash generation has enabled us to fund further M&A, like the post-trade deal we announced just a couple of weeks ago, and to return cash to shareholders.

By February next year, we will have returned $5 billion via buybacks in three and a half years. It is roughly 10% of our market capitalization. For the next section, we are going to focus on our delivery. Back to you, David.

David Schwimmer
CEO, LSEG

Thank you, MAP. Let me take you back to the Refinitiv transaction. This is old ground for many of you, I am sure, but many others of you are newer to our story. Five years ago, LSEG was a regional, mainly equities-focused, mainly transactional business. Although the long-term track record that MAP just showed you was outstanding, LSEG was subscale, overly exposed to Europe, and had limited data capabilities. Refinitiv was a global business with a high proportion of subscription revenue and very strong competitive positions, but with a number of assets that required significant investment. Growth had been anemic for many years, with steady market share losses in a healthily growing industry. We knew it was a fixer-upper, and that was reflected in the multiple we paid, around 11 times EV to EBITDA. The size of the prize was significant.

We saw the strategic value in the combination, the creation of a unique group which unites the full trade lifecycle with the full data value chain, each enhancing the other, as I laid out a few moments ago. If it hadn't been for this and the successful integration which followed, we would not feel so confident about the continued growth in front of us. For the last five years, we have been on an ambitious journey to transform the combined business. The first three years or so focused on integration. More recently, we have pivoted to transformation of our people, our platform, and our product. MAP will take you through that integration, and I will pick up on the transformation.

Michel-Alain Proch
CFO, LSEG

Thanks, David. It's fair to say that expectations were low off the back of the Refinitiv deal, mainly because of the market's perception of the quality of the business. But LSEG has, in fact, delivered in every regard on this transaction. First, growth. We set growth guidance of 5%-7% for the first three years, and investors were skeptical that we could even reach the bottom of that range, given the decades of underinvestment at Refinitiv. But in fact, as you can see, growth has exceeded 6% in each of the last four years. Second, Eikon. Many of our investors were unhappy users of the platform and couldn't see a future in it. But with the investment in Workspace, a more resilient back end, and improved account management, we have taken a business from many years of revenue declines to four years of growth.

Third, could we really achieve the synergy targets, given the size of the acquisition and the task required? Here we have performed very strongly. At announcement, we had initially targeted revenue synergies of $225 million. By the end of 2024, we were at a run rate of $292 million. Similarly, on cost synergies, we exited 2024 running at $562 million against an original target of $350 million. In total, as you know, we've spent $1.4 billion on achieving these synergies, as we expected and as was reflected in the purchase price. Now, point four, margin. The Refinitiv businesses had a lower margin than the industry benchmark due to legacy systems and operational complexity. At first, we improved margins through the integration cost synergies net of gross reinvestment. This brought a net 90 basis points of margin improvement between 2020 and 2023. Since 2024, we've shifted from integration to transformation.

In doing so, we have greatly improved the group's operating leverage through the implementation of holistic and disciplined cost control and investment allocation. As a result, our reported margin jumped by 220 basis points in the last two years, of which 180 basis points are underlying and 40 are FX. This positions us very well to reach our underlying target of 250 basis points for the period 2024 to 2026. Remember, we have a further 100 basis points on top of that from the recent post-trade transaction. Finally, our leverage. We took on $13 billion of additional debt, and our leverage immediately after the Refinitiv deal was 3.3 times net debt to EBITDA.

Through strong cash generation, reduced capital intensity, a couple of disposals, and disciplined capital allocation, we reduced our leverage to below two times net debt to EBITDA within 12 months, well ahead of the 24-30 months we had committed to. We expect to be at around 1.9 times at the end of this year, which is right in the middle of our guided range. I know there is a lot of detail here, but it is important to remind you of the journey we have been on through these two phases: first integration, from 2021 to 2023, and then transformation, from 2024 onwards. On that note, David, do you want to talk more deeply about the transformation?

David Schwimmer
CEO, LSEG

Thank you, MAP. Some of you have asked me over the past few years about changes to our leadership and what has been driving that. It's true, the team has changed a lot, but to be clear, this is a feature, not a bug in the system. To drive this kind of transformation in many areas, we needed different leadership. We now have a very strong team that has the right capabilities to execute on the next leg of the journey. We have the benefit of strong continuity in markets under Dan Maguire, who has led LCH with such a clear long-term vision and partnership mindset. I see the same continuity, vision, and partnership in our risk and legal and compliance functions under Balbir Bakhshi and Catherine Johnson.

In engineering, operations, finance, people, and corporate affairs, our leaders are driving significant transformation towards a more capable, agile, and efficient organization with a high pace of change supported by deep collaboration. Across our subscription businesses, we are seeing the benefits of bringing in industry and product experts like John Biaggini coming from S&P Global to co-lead D&A. This is also true below the ExCo level with the likes of Todd Hartman joining from FactSet to run data and feeds, David Wilson and Fiona Bassett coming to us as seasoned industry leaders, and Emily Prince becoming Head of AI for the group. Many of these leaders have made significant changes across their own teams. Looking at what we call our group leaders, the top 90 or so direct reports of my direct reports, over a third have joined in the last three years.

They bring new capabilities and enterprise leadership to balance the continuity of the wider population. Looking specifically at engineering, at least 10 of our most senior executives have joined in the last 18 months. Change has by no means been limited to LSEG's leadership. We have transformed the way that we work across the organization. Irfan will shortly talk you through the engineering transformation as we build a high-quality, deeply technical workforce where core capabilities are insourced, enabling our product ambitions. Pascal, Irfan, and Matt are leading our shift to a product-led operating model. Ron stood here two years ago talking through the transformation that we have driven through sales and account management, through training, incentives, and specialization. Matt and Pascal have made significant progress in developing lean and scalable enabling functions across the group.

The second P of our transformation is platform, in particular, building modern and scalable infrastructure. I'm not going to go through these in detail, but I have listed here some of the more significant programs that we have been delivering across LSEG. Any one of these on their own would have been a major undertaking. The work goes on. Enterprise resource planning and billing will take another couple of years, as will our migration of data and applications to Azure. Looking at platform through another lens, we have created a strategic platform which has transformed how we engage with our customers. We have gone from being two sizable but not always top-tier providers to a single critical partner across data and markets infrastructure. With our biggest customers, we work in partnership on their strategic roadmaps and how our products can help them deliver.

Commercially, this comes to life via LSEG Data Access Agreements, or LDAs. You have heard us talk about these a lot over the last couple of years. The breadth and depth of our data and workflow offering allow us to grow our share of wallet while reducing total cost of ownership for our customers. It also gives economic certainty over the long term for both parties. The results have been very good. Customers are showing increasing satisfaction with LSEG, and account growth comfortably outperforms original terms as we introduce additional products and services. As you can see from the chart on the right, if we complete negotiations on all new LDAs currently under discussion, they will represent around 17% of D&A ASV as we exit this year.

Turning to our product transformation, here I've outlined some of our larger scale investments, the areas that we have dedicated the most capital to as we enhance services to customers, starting with Workspace. That has been a double transformation. Not only have we migrated our customers off Eikon and onto Workspace, we have also built a modern, customizable, and modular platform on which we are adding enhancements at the rate of two per day. This platform enables us to roll out new innovations as they become available. That ties into the Microsoft partnership. Almost three years ago, we entered into a long-term agreement to work together on product and cloud. Ron and Gianluca, as well as Matt Koerner from Microsoft, will cover this in more detail later. I am really happy with the progress that we are making for our customers.

Just a couple of weeks ago, we announced a new partnership for post-trade solutions with 11 leading banks. This is the culmination of several years of strategic planning and gives us a great platform for further growth in partnership with our major customers. There is the ongoing growth and innovation at TradeWeb. With consistent execution and new product development and deep understanding of customer needs, Billy, Sarah, and the TradeWeb team have continued to drive exceptional growth, enhanced where it makes sense by acquisitions. Most recently, ICD has given us access to a whole new asset class and customer group. Our investment in new product has by no means been limited to these bigger builds. As you can see from this slide, we have been innovating across the board.

Every division has launched significant new product in the last 12 months, and we plan to continue in the same vein. Maybe just a few things to call out. We have fully replatformed our trade routing network, Autex, in Azure, with Autex now connecting 1,600 brokers and asset managers via the cloud. As a result, it's faster, has much greater capacity, and is even more resilient. We have executed the first transaction on our digital markets infrastructure, which is positioned to become an important new capability for trading and settlement. In risk intelligence, we have launched WorldCheck on demand with all of our critical data and insight now updated in real time. What's next? Where do we go from here?

LSEG has changed beyond all recognition in the last 20, 10, and even five years, and it will continue to do so as technologies evolve, regulation changes, and customers encounter new problems to solve. What we are building and what we will show you today is a company that is disrupting itself for customers. Through the presentations, demos, and case studies this afternoon, we will show you how we are transforming through technology, executing a bold AI strategy, advancing our leading data and analytics franchise, and accelerating innovation across all of LSEG. We are focused on delivery, and you will see that today as we bring our commitments on the left here to life in our products. You'll see our data in numerous different environments where different customers work, LSEG everywhere. You'll see the richness of functionality in Workspace and how we are enhancing it with AI and collaboration.

You'll see solutions for real customer pain points that only LSEG can deliver, solutions that drive capital efficiency, manage risk, and reduce cost. All of these will be through the lens of making our demos as real as possible. These are not glossy marketing productions or vaporware, but real products with real use cases stepped through at a pace where you can follow the workflow. First, you will hear from Irfan and Emily, who will demonstrate the progress that we are making on delivering all of this through our engineering transformation and AI strategy. Over to you.

Irfan Hussain
CIO, LSEG

Thank you, David. Is it on? Hello everyone. Emily and I will start with brief introductions, and then we'll talk about engineering transformation at LSEG and our AI strategy. I joined LSEG as CIO in January last year. Prior to that, I was at Goldman Sachs for 28 years, where I worked across most of their businesses, from FICC to equities, asset management, wealth, and consumer. Working on everything from exotic derivatives, real-time trading, and big data analytics to multi-asset portfolio construction and 24/7 credit card transactions, I had the opportunity to learn and lead various engineering domains across finance.

Emily Prince
Head of AI, LSEG

I'm Emily Prince, Head of AI at LSEG. I've been at LSEG for nine years, most recently as Head of Analytics. I joined LSEG from BlackRock, and prior to that, I'd spent nine years in various quantitative analytics roles, including structuring, portfolio modeling, and research, across Barclays, Lehman Brothers, UniCredit, and RBS. I'm also a member of the Bank of England's AI Consortium.

Irfan Hussain
CIO, LSEG

As I mentioned, I'll first walk you through our engineering strategy and how it's helping us transform the way we build products. Emily and I will cover AI. We took a first-principles approach to our engineering strategy, focusing on the foundational problems we need to solve in order to accelerate product development and manage our costs and risks better. We asked ourselves, what are the key ingredients to building a world-class product organization which allows us to continuously capitalize on the latest advancements in technology? How do we build a durable and efficient factory to create new products faster, cheaper, and with appropriate controls? We have three pillars of this strategy, and they're in line with what David just talked about: having exceptional talent, common platforms, and product discipline. Now, these pillars may sound very obvious and basic, but they are not.

They are foundational and some of the hardest aspects of building a world-class organization. You may also notice that these pillars are technology agnostic. Regardless of whether it's AI, quantum, digital assets, cloud, or whatever the latest and greatest is, these are the key building blocks. As we master these, we can play both offense and defense with any technology by accelerating our product development. The talent or people piece is essential. LSEG ultimately serves its customers and builds its products by shipping software. We are a fintech firm, and you cannot build amazing products without the best engineers. That's pillar number one. Talent alone is not sufficient. You will take the best engineers and turn them into mediocre performers if you don't give them the tools and platforms to be efficient. That's the second pillar.

If you only do the first two, all you get is speed, meaning you will be able to ship software quickly. Speed alone is not enough. We do not want to just move faster. We also want to move in the right direction and build the right products for our customers. In other words, what we are really after is speed and direction, which tech firms typically refer to as velocity. This is where our third pillar, product discipline, comes in. It sets the direction and is helping LSEG become a product-led firm. Rather than getting into the weeds of every single pillar, let me give you some concrete examples and metrics to bring these to life: where we were, where we are, and where we are going. Last January, we had 17,000 engineers in the firm, and only 40% of them were employees.

The rest were contractors. In general, firms don't get the best engineering talent when they go the contractor route, and you can't build the best product with outsourced staff. Fast forward to today: we currently have about 14,000 engineers, with 58% of them being internal engineers. That's an 18-percentage-point increase. Our goal is to get to 80% by the end of 2027. We didn't just shift these numbers blindly. We shifted them with a clear goal of raising the bar on excellence. We introduced new engineering principles to guide all of our actions. We significantly improved our hiring standards and implemented an independent bar-raising protocol to ensure we're consistently hiring the best people. Prior to this year, if you were an amazing engineer, you had to become a manager to progress your career. And as you know, not all engineers want to manage people.

We did not want to take our top quartile engineers and convert them into bottom quartile managers. Now we have individual contributor tracks where you can grow to have the most senior title in the firm without managing a single soul. We announced the first batch of our distinguished engineers late last year to recognize the best of our technical talent. What does it all mean? What is the upshot? Why am I talking about it? In the end, it is about productivity. Our productivity is up 11%, while our headcount is down 18%. In other words, 14,000 engineers are producing 11% more output than what 17,000 did in January last year. Our hypothesis that fewer higher-caliber people will produce more output is proving to be true. Regarding the second pillar, our engineers used to have a lot of friction when they built products.

We had eight different source code repositories, limited automated code pipelines, and no common credentials, artifacts, or logging systems. Fast forward to today, 96% of our code is now in a single source code repository with common platforms. Our engineers are actively using AI to build products, and they're seeing up to a 34% increase in productivity. We're not just using AI to do code completion. We're using it to write new apps from scratch, perform cloud migration, upgrade legacy systems, and automate tests. We have also deployed common cloud platforms to operate across all three of the major cloud providers, allowing us to automate software development and, more importantly, automatically enforce cyber and other control policies. What's the punchline for this pillar? We're seeing up to a 25% increase in release velocity, while our incidents or outages are down by 55%. Why is that important?

It's important because there's a risk that more software changes can mean more instability. These are very important metrics that we track. We want to, of course, move fast to serve our customers, but the same customers and our regulators demand the highest level of resiliency and quality, and that's a key part of our product offering. On pillar three, I know David, MAP, and Peregrine have talked to you about our journey to become product-led. This involves significant cultural, people, and process changes. We're going product by product, team by team, and ensuring that we have the right people and the right processes in place to improve our offerings. This means having a dedicated team of product managers, engineers, and ops people to own the totality of customer experience, regardless of how many teams are involved in delivering the ultimate product.

We're driving our decisions with data and ensuring that we're upgrading and attracting the best talent. To recap, these three are the foundational pillars of our strategy, allowing us to have the velocity and quality needed to build the right products for our customers while managing our costs and risks better. They are laying the foundations for us to leverage AI and other technologies so that we can serve our customers better. Without these pillars, it would have been much slower and more expensive for us to incorporate AI in our products. Speaking of AI, I will now hand over to Emily to kick us off on our AI strategy.

Emily Prince
Head of AI, LSEG

Thanks, Irfan. Now, whatever you think about artificial intelligence, whether you're an evangelist or a skeptic, what is remarkable is the way it's allowing us to consider new approaches to solving old problems. At LSEG, with our global reach and vast, diverse data sets, together with decades of experience in data and analytics, we see AI as a powerful opportunity. We've synthesized LSEG's AI strategy into three pillars: trusted data, transformative products, and intelligent enterprise. Let's start with that first pillar, trusted data. You've known LSEG as a trusted provider of content across financial services for a long time. We've reinforced that commitment, and now we've made our data AI-ready. Data is the basis of AI, and to achieve trust in AI, you must first have trusted data.

While the first pillar focuses on the importance of our core trusted content, our second pillar, transformative products, is focused on applying AI to the products we build for our customers. We are in the age of product enablement, and with a single question in a customer's preferred language, we can not only discover new insights but orchestrate entire new workflows. Just as data is the basis of AI, knowledge is the basis of transformative products. LSEG is using its depth of market expertise to reimagine how financial services professionals work with speed, simplicity, and conviction, which you'll see in some of the demos later today. Finally, our third pillar, intelligent enterprise. Achieving success in AI starts with our people. It increasingly shapes the velocity with which we can build products, evaluate risks, respond to customer questions with consistency, and transform unstructured, disparate data into structured insights.

Let's spend more time on trusted data. The depth, breadth, and diversity of LSEG's data are hard for the human brain to comprehend, but for a model, it's a game changer. Why is it that models care so much about data, and especially about the 33+ PB that LSEG has? Models, of which there are now thousands, are generally trained on publicly sourced data. For models to differentiate, they need differentiated and deep data. When presented with trusted data through the likes of LSEG's MCP server, models can identify relationships in the data that generate new insights for end users. Combining LSEG's extraordinary breadth, history, and subject matter expertise in areas such as evaluative pricing with powerful AI models allows LSEG's customers to benefit from unparalleled insights.

Now, on this slide, which you heard David discuss as part of our recent results, we point to the level of differentiation we have in LSEG's data. While I won't step through every number, two I do want to draw your attention to are 90% and 45%. 90% of our data and feeds revenue is based on proprietary data, which the LLMs cannot access for public training. 45% is the proportion of our data and feeds revenue that is real-time. Built on a global private network, this is a private content set, not available to AI models, and it is highly desired by our customers for use in AI products such as agents. LSEG's trusted data is differentiated and highly valuable in the context of AI. We have an extraordinary mix of proprietary, non-replicable, historical data brought together with LSEG-defined standards that are followed by customers across the globe.

Now, every day, our trusted data is underpinning decisions across the financial services ecosystem, from traders in the largest banks to quants building signals and risk analysts responding to changing market conditions. To achieve this data standard, LSEG's content undergoes rigorous curation until it reaches the quality we are happy with. Our process starts with sourcing and has done for decades. It includes over 40,000 contributors and, of course, our own proprietary data generation. Then there is our data quality, which involves deep iterative cleansing until it meets our standards. Coming now to normalizing, the step that means our customers can use the breadth of our data out of the box. This step, together with the application of mastering, is a hugely important one and requires a deep level of expertise. Later today, Adam and Tim will go into this in further detail and also share some demos.
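The pipeline Emily describes, sourcing, quality cleansing, then normalizing, can be pictured as a chain of stages, each moving raw records closer to an out-of-the-box schema. A minimal illustrative sketch in Python; the field names, rules, and records are all invented for illustration, not LSEG's actual process:

```python
# Hypothetical sketch of a source-to-insight data pipeline:
# each stage moves raw records closer to a normalized, common form.

def source(records):
    """Ingest raw records from contributors and proprietary feeds."""
    return [r for r in records if r]  # drop empty payloads

def cleanse(records):
    """Iteratively remove records that fail basic quality checks."""
    return [r for r in records if r.get("price") is not None and r["price"] > 0]

def normalize(records):
    """Map source-specific fields onto one common schema."""
    return [
        {"symbol": r.get("sym") or r.get("ticker"), "price": float(r["price"])}
        for r in records
    ]

raw = [
    {"sym": "VOD.L", "price": 312.5},
    {"ticker": "AZN.L", "price": 11250},
    {"ticker": "BAD", "price": None},   # fails the quality check
    {},                                  # dropped at sourcing
]
clean = normalize(cleanse(source(raw)))
print(clean)
```

In practice each stage is far richer, but the shape is the point: every record passes through the same ordered checks before it is distributed.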

Now, this brings us to concordance and tagging. This is an enrichment step, which broadens the usability of LSEG's data and represents a very important part of what ensures LSEG's data is AI-ready. Finally, distribution. This is not as simple as depositing data in a client environment. LSEG is ensuring consistent delivery of data where and how our customers need it. Through Databricks' Delta Sharing, Workspace, APIs, Microsoft, or Google BigQuery, LSEG is everywhere, and we are meeting our customers in their preferred infrastructure. LSEG delivers the highest standard and trusted data from source to insight through unrivaled quality, concordance, and intelligent distribution. Now, having spent some time on the importance of trusted data, I would like to spend a few minutes on what makes our data AI-ready. Building from the quality we enabled as part of trusted data, we are layering this with control, including data rights management and accessibility.

Our focus on accessibility, with semantic enrichment via MCP, or Model Context Protocol, ensures that models do not just consume vast amounts of data but truly understand it. We are leveraging consistent taxonomies, ontologies, and entity concordance to unify disparate data sets and preserve context, enabling models to reason over meaningful relationships rather than unstructured noise. The introduction of MCP has ushered in the ability for LSEG's data to be safely presented alongside LLMs. We are extending the reach of our unique proprietary data while preserving the underlying licensing and controls. This positions LSEG as the preferred partner and is enabling us to plug into agentic environments such as Microsoft's Copilot Studio, in turn enabling the creation of trusted agents. With each such partner connection, we are opening new client use cases and opportunities, and you will hear more about these opportunities shortly from Ron and Gianluca.
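Entity concordance, resolving the many identifier schemes that can refer to the same company into one canonical entity record, is one of the enrichment steps mentioned above. A toy sketch of the idea, with the ISIN, LEI, and entity IDs below being placeholders invented for illustration, not real codes:

```python
# Hypothetical entity-concordance table: several identifier schemes
# (ticker, ISIN, LEI) all resolve to one canonical entity record.
CONCORDANCE = {
    ("ticker", "LSEG.L"):               "ENT-001",
    ("isin",   "GB0000000001"):          "ENT-001",  # placeholder ISIN
    ("lei",    "0000000000000000LSEG"):  "ENT-001",  # placeholder LEI
}

ENTITIES = {
    "ENT-001": {"name": "London Stock Exchange Group plc", "country": "GB"},
}

def resolve(scheme, value):
    """Map any supported identifier to its canonical entity, or None."""
    entity_id = CONCORDANCE.get((scheme, value))
    return ENTITIES.get(entity_id) if entity_id else None

print(resolve("isin", "GB0000000001"))
```

A model reasoning over data unified this way sees one entity with preserved context, not three unrelated strings.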

Let's now watch a short video to bring this to life.

QBank is a fictional global bank with a very real challenge. Markets are moving, and central banks are hinting at policy shifts. Quant researcher Miriam, who specializes in risk management, needs to know what a rate change will mean for her portfolio's P&L. To support Miriam and her colleagues, QBank is investing heavily in AI and has implemented an AI chatbot that engages with their agent. Miriam is able to select the sources that she trusts: LSEG, her own bank, and a third-party provider of hedging models. With these boundaries set, she knows her answers will come only from governed sources, ensuring accuracy, repeatability, and compliance. Today, she asks what happens to her Australian dollar exposures if rates rise by 50 basis points, and how should the bank hedge. To answer this specific question, the agent doesn't invent numbers.

It accesses Miriam's selected sources through connectors that use an open standard called Model Context Protocol, or MCP. MCP lets the agent work seamlessly with data from any approved provider, with no custom engineering required once it is set up and configured. Here is what makes the difference. At LSEG, AI-ready does not just mean data that is available through these connectors. It also encompasses rich metadata, transparent rights, proven quality, and unmatched global coverage. Miriam already has entitlements, so her agent delivers actionable insights instantly. Next, Miriam needs the live cost of hedging. Her agent draws in intraday quotes and forward points, giving her a clear view of execution costs. She goes further, asking the agent to set up an alert for when rates move and to prepare risk exposures and hedge updates for review. Miriam stays in control. Trusted insights are always at her fingertips.

With LSEG's data available through these MCP connectors, companies like QBank, who have developed their own AI agent, can deliver an AI chatbot to answer a thousand questions. LSEG is ready, and our data is AI-ready.

We are meeting our customers where they are, from our flagship Workspace integrated experience powered by AI to the enablement of our customers' proprietary solutions and through our strategic AI partners. This broad-based distribution is underpinned by our multi-cloud distribution, AI-ready content, APIs, agents, and feeds. Regardless of how our customers prefer to consume and use LSEG's products, we ensure it is underpinned by trusted content. At LSEG, we provide trusted data to our customers to enable their trusted use of AI. Over the past few months, we've announced a series of strategic AI partnerships, from specialist partners complementing our Workspace business, such as Rogal, to scaled partnerships with the likes of Databricks, Snowflake, and Claude. Building on the success of our partnership with Microsoft, we've extended our relationship by making LSEG's trusted data available as part of Microsoft's Copilot Studio.

This is enabling customers to build custom agents in the Microsoft ecosystem with LSEG's trusted data. To hear more about this, let me hand over to Irfan and Matt Koerner from Microsoft.

Irfan Hussain
CIO, LSEG

Hello.

Matt Koerner
Corporate VP and CTO, Microsoft

Hello.

Irfan Hussain
CIO, LSEG

Okay, so Emily just covered the trusted data part of the three-pillar AI strategy that we talked about earlier. Our second pillar is transformative products. As David mentioned earlier, rather than us going through them product by product, you'll be seeing these products live in action. What we're going to do instead is talk to Matt Koerner, who's going to give us his perspective on our partnership with Microsoft and how we are co-building various products using AI, collaboration, and other technology. Before that, a quick intro and bio for Matt. Matt is Corporate Vice President and CTO in Microsoft's commercial organization. He oversees technology partnerships within the Worldwide Sales and Solution Organization, collaborating closely across global commercial and enterprise customers and partners.

As a Microsoft veteran with 24 years of experience in the company's product group, including roles spanning Windows, Azure, and Microsoft's Cloud for Industry, Matt knows our space, our customers, and he knows LSEG. Matt, welcome.

Matt Koerner
Corporate VP and CTO, Microsoft

Thanks so much for having me. It's great to be here with this informed audience.

Irfan Hussain
CIO, LSEG

Before we start, I know you just landed last night. How is your jet lag?

Matt Koerner
Corporate VP and CTO, Microsoft

I'm doing okay. I'm doing okay. I think I'll last through this conversation, but I reserve the right to go to sleep right afterwards.

Irfan Hussain
CIO, LSEG

All right, so Matt, let's start with Microsoft and especially AI. I know it's a beefy and big topic to start with, but share with us about how you're thinking about AI, how is Microsoft thinking about AI, and what is it that you and Microsoft are most excited about?

Matt Koerner
Corporate VP and CTO, Microsoft

AI is changing the way we operate at Microsoft. It's changing the way we develop our products and serve our customers in go-to-market. We see employees becoming more efficient, and as they become more efficient, they have time to exercise their creativity, and we see them becoming more productive in measurable ways. We also see people learning new skills. For example, a person who has no coding experience at all can now create applications and agents to make their jobs better, and a person who's an expert in their area now can focus on the specialized and most complex part of their job, which drives more fulfillment at work. With that flexibility of people being able to do new things, we also see some new optionality in how we structure teams and distribute work across end-to-end business processes to drive better customer outcomes.

In financial services, what we see is AI transforming the way people do risk analysis. People no longer have to wait for laborious work by a team of analysts. They can get immediate insights across many more options for actions that they take instead of a narrow set of options that they analyze. As Emily said, they can also, with AI, analyze a lot more data than they could have made sense of manually. We see not only better decision-making, but more demand for the kind of differentiated data that LSEG provides. We also now see the emergence of autonomous agents that can take on tasks that were historically only accomplished by people. The first tranche of that has focused on internal scenarios where employees are interacting with HR or IT.

We now see this happening externally, customer-facing, where in sales functions and in customer support, we see AI agents driving results. We see it happening across transformation of business processes. Perhaps most exciting for me personally is bending the curve on innovation, where we see autonomous AI agents now participating in product development, writing code, and taking on more jobs that developers have historically had to do themselves. That is really quite exciting. I think a theme that you'll hear through this conversation and, of course, through the rest of the afternoon is that through our partnership, we're bringing together enterprise workflows and technology with financial services-specific workflows and technology. By bringing those things together, we squeeze a lot of friction out of the system. We reduce cost for customers, we reduce time to value for customers, and we give them better and simpler product experiences.

It's a tremendous opportunity with AI between us.

Irfan Hussain
CIO, LSEG

In terms of the autonomous agent, I know you and I were talking about it this morning. Are you seeing people writing a lot of read-only agents, or are they also making decisions and updating and actually changing the systems?

Matt Koerner
Corporate VP and CTO, Microsoft

Certainly, there's much lower risk when people are just reading things. You can have any employee develop agents that can read things. As soon as you start to write and kind of change the world and drive transactions, you have to be a little bit cautious. Many times, both with our customers and internally, we bring in professional developers and have more oversight on those scenarios. Certainly, governance to manage the risk that comes with writing is going to be important.

Irfan Hussain
CIO, LSEG

Makes sense. Now, turning to our partnership, which is obviously a strategic partnership between our firms. Microsoft is a shareholder in LSEG, and Scott Guthrie joined our board in 2023. Give me your perspective on the importance of this partnership to Microsoft.

Matt Koerner
Corporate VP and CTO, Microsoft

This partnership is a game changer for us. We have a horizontal platform capability that we bring to our customers. With LSEG's differentiated data and vertical solutions and know-how, we see a lot of doors opening in the market that previously were not available to us. Our value proposition is more relevant to our customers, and we have simpler and better product propositions. As Microsoft, it's always been our premise that we need to very carefully and thoughtfully and intentionally serve financial services. There is a very large target market for us. It is also the segment that provides growth and stability to the global economy. It is important to us. Our observation would be that some of the tools, workflows, data distribution methods in financial services have not really kept pace with some of the changes that have happened in the rest of enterprise technology.

There is a big opportunity for us to bring value and change to customers that will really help in their business. Looking at LSEG, LSEG has this differentiated data. This data drives insights and actions for so many different market participants: the buy side, the sell side, asset management, insurance, banking, even corporate finance. We felt the partnership with LSEG would bring us closer and with more relevance to all of those different audiences, which we think of as being very valuable. The other thing that you could say about LSEG is I think there are multiple centuries of having established trust, which is tremendous. Microsoft also values deeply trust with our customers. The open philosophy of LSEG is really helpful because we have customers who want to do all kinds of different things. Having an open ecosystem enables their scenarios to work.

Finally, I would say that the partnership mentality and the role that partnerships have played in LSEG across different businesses that you have is very clear. We see this partnership between us as being no different from that. From a customer perspective, what customers see is a more thoughtfully integrated, out-of-the-box solution that just works. They have less cost, they have less time, shorter time to value, and simpler products. We deeply appreciate all the product feedback that you give us. You have helped make many of our products better. I could give an example with Microsoft Fabric. Microsoft Fabric is our data and analytics platform, which we make available to our customers to store all of their enterprise data and then run analysis in the enterprise. To date, we have had Fabric generally available for just about two years.

In that time, we've acquired 28,000 paying customers. We grew 60% last year. That makes Fabric the fastest-growing data platform in the industry. It is used by 80% of the Fortune 500. Now, in Fabric, we have this great horizontal platform that's used by many customers, and they have this broad data estate. The work we've done with LSEG is to bring LSEG data as a first-class capability into Fabric so customers can discover the data, explore it, and then access that data and analyze it not only on its own, but joined with their proprietary data or other commercial data that they've acquired. In order to do that, we had to do a lot of platform improvement to Fabric to meet all the requirements that LSEG had for global data distribution. LSEG has played a key role in making Fabric a better product for every customer.

Not only that, we have customers who have lots of different ways that they do business, even inside of their own organization, department by department. There is native integration with Azure, Databricks, and Snowflake. LSEG data that shows up in Fabric can be consumed through Fabric workloads or through those partner solutions that also consume that same data set, which makes it very valuable. We have similar stories around Copilot Studio, as Emily said, for people to create agents. We have that story in Teams and Microsoft 365. We have it in Azure. Across all of these, what you see is the merging, again, of financial services-specific workflows with enterprise-specific workflows and platforms to be a more relevant, low-friction, low-cost solution for customers. All of this is about product truth and solution truth, the statements we would make to customers about what they can achieve.

There's also a go-to-market side. Microsoft has had great relationships with the CIO organization for many years at most of our customers. We do not often have deep relationships with the financial services line of business leaders. LSEG has those relationships and has that deep domain knowledge. When we go to customers together and tell our joint story, we can have a much more relevant, cohesive conversation that unifies their tech and line of business conversation so they can much more quickly get to a plan jointly with us on how they want to proceed. The most exciting thing is when this partnership started, AI was not on our radar as an important thing for us to focus on. All of these things that we've talked about were things that we set out to do at the beginning of the partnership before the AI inflection point.

It's the foundational investments that we've made in these first two years of the partnership that now put us in pole position with AI, and we can very quickly adapt these things to bring differentiated AI value through the work that we've already done. It is a very exciting time. I think we have an innovative future that we can drive.

Irfan Hussain
CIO, LSEG

I'm glad you mentioned the word innovation. Before we get to that question, I remember that early on, when I first joined, AI was a thing, but not that big of a thing. You're absolutely right. Now it is becoming a thing, and some of the work we've done, especially on the data pipelining, sets us up really, really well. In terms of innovation, we work very closely together. You and I talk at least once a week, if not daily, especially given the many products we're about to ship. From your perspective, from Microsoft's perspective, what does innovation mean to you, and how does it work? We're not just working together ourselves, but working with our partners and our customers. How do you think about innovation in that context?

Matt Koerner
Corporate VP and CTO, Microsoft

We've had a lot of conversations with our customers who have told us how meaningful this partnership is to them. They've expressed their interest in helping to influence the partnership and shape the products that we build. That's great for two reasons. First, it helps us build the right product because we get a lot of customer feedback to inform it. Second, it sort of prepares this initial tranche of early adopters who can deploy that product more quickly and get value out of it because they have confidence, having shaped the product, that it'll be the right one for them. That direct customer engagement is really important. As I said, in Fabric, the case of Fabric that I described before, we had many, many customer conversations that shaped how that product would go.

I think another one to talk about a little bit more is Copilot Studio. We might unpack that one a bit because there's a lot of buzz and many keywords. Maybe I can just go through it step by step and explain how that works and why customer input is important there. I think it was three weeks ago LSEG announced that there'd be a Copilot Studio-based connector for LSEG's MCP server to make LSEG AI-ready data available in our Copilot Studio. That's a mouthful. What is MCP? MCP is Model Context Protocol. This is a way that you can make a tool available to an LLM that it can call to do something. It might push a transaction to a system, or it might query a system for information.

In the case of LSEG data, it's querying the LSEG data set to get information back for use in an AI workflow. In addition to making that tool available, MCP lets you describe what the tool is and how to use it. Here is what the tool is. Here are the parameters you can pass to it. Here is what the results look like. Here are some examples of calling the tool and getting the results. All of that text lets the LLM reason about what that tool can do and how to build it into a workflow. When LSEG wraps their data with MCP, it makes it easy for LLMs to interact with that data.
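Matt's description matches the tool-listing shape of the Model Context Protocol: a tool carries a name, a description, and a JSON Schema for its parameters, all of which the LLM can read and reason over. A rough Python sketch of such a descriptor for a hypothetical price-lookup tool; the tool name and field values are invented for illustration, and the MCP specification defines the exact schema:

```python
# Rough sketch of an MCP-style tool descriptor (fields follow the
# Model Context Protocol tool-listing shape; values are hypothetical).
tool = {
    "name": "get_quote",
    "description": "Return the latest quote for an instrument identifier.",
    "inputSchema": {                      # JSON Schema for the parameters
        "type": "object",
        "properties": {
            "ric": {"type": "string",
                    "description": "Instrument code, e.g. 'VOD.L'"},
        },
        "required": ["ric"],
    },
}

def validate_call(tool, arguments):
    """Minimal check that a model's tool call supplies the required params."""
    required = tool["inputSchema"].get("required", [])
    return all(name in arguments for name in required)

print(validate_call(tool, {"ric": "VOD.L"}))  # a well-formed call
print(validate_call(tool, {}))                # missing required parameter
```

The descriptive text in the schema is what lets the model decide when to call the tool and how to fill in its parameters.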

That takes trusted, definitive, up-to-date, and accurate financial data and makes it available to any AI workflow, which is really important because it helps you ground that AI, which reduces hallucination and makes those results more trusted and reliable. Now LSEG has this capability with MCP. MCP is a standard which can fit into many different AI systems. Microsoft Copilot Studio is one such AI system. Copilot Studio is a low-code and no-code agent development platform. Anyone, whether they're a developer or a novice, can write down what they want an agent to do in plain English or the language of their choice and have that agent created. They can then use Copilot Studio to publish that agent to various different channels. You can put it into Microsoft Teams. You can put it into Microsoft 365 Copilot.

You can put it into your website or your own application. You could even stick it on the end of a phone number so you could do IVR and have somebody talk to the agent. That agent can interact with other systems through connectors. It might be querying Microsoft Dynamics for CRM or ERP data. It might be querying Salesforce. It might be querying SAP, or it might be querying LSEG data. The feedback we had from customers was, "Hey, we'd like to consume LSEG data, but we want it to be like a super simple zero configuration task for a completely novice user. We also want it to be enterprise grade." When we say enterprise grade, we mean it should work consistently, and it should be governable.

In addition to allowing people to create agents, Copilot Studio allows an IT department to inventory and govern those agents so they can see all the agents that exist in their environment, and they can set permissions. For example, every employee ought to be able to create an agent for their own use. If they want to share it, maybe they can only share it with 10 other internal employees. If they want to share it with more than that, they have to go through a security and compliance and engineering review so we can make sure that the right thing is happening with that agent before we publish it out to the world inside of the organization.

This customer feedback on wanting enterprise grade has resulted in LSEG being the very first partner of ours to release an enterprise-grade, zero-configuration MCP connector for Copilot Studio. Because LSEG is first, it means that LSEG is bumping into some product gaps and some sharp edges and other things in Copilot Studio that we have not ironed out yet. I can say over the past six weeks, we have had a very tight loop between LSEG and Microsoft product folks ironing out those bugs, getting those bug fixes pushed to production, and paving the path for every subsequent customer who is going to come use that connector in Copilot Studio to get their job done inside of their own organization. That feedback has been super valuable for us. I think that is kind of an explanation of how customer feedback drives the stuff on Copilot Studio.

Could I talk about DMI shortly?

Irfan Hussain
CIO, LSEG

Sure, of course. I mean, look, DMI, I think David mentioned earlier, is a product on which we just had our first transaction a month or two ago. I remember talking to you when I first joined. I know you were considered an expert in blockchain and digital assets. Yeah, please talk about DMI.

Matt Koerner
Corporate VP and CTO, Microsoft

Sure. OK. DMI is this sort of modern cloud-based infrastructure for lifecycle management of digital assets from cradle to grave. We started talking about DMI, and we said, "Hey, we can build this thing." We worked together to build this thing on Azure. Once we built it, LSEG started to talk to the world about it. LSEG got this flood of customer interest from customers who either wanted to onboard assets or transact on the platform. It's great to have that signal. The challenge is when you have a new product like this, it's very hard to go from zero to one. You have to balance many different considerations.

That product feedback from those customers, the expression of interest and their fine-grained feedback on what they wanted to do, combined with LSEG's market knowledge and relationships with those customers, meant LSEG could orchestrate a very intentional path to go from zero to one and then from one to scale. That is a lot of trade-offs. You have to manage time to market. You have to manage which jurisdictions you are in and what regulatory requirements they have. You have to decide which asset classes you want to support, what workflows you want to support, and which customers you want to onboard so that you maximize liquidity and flow on the platform to make it relevant and to get scale.

It has been great to see LSEG chart the course for this product where we had a technical point of view on how that product would work, but LSEG knows how to take it to scale with customers. That is another place where I think customer feedback has really driven very intentional product development and product management and a product mindset. I would say this is where LSEG's data, LSEG's domain knowledge, LSEG's vertical solutions, plus our platforms, these are all examples of where we are breaking new ground for the industry. We are doing it hand in glove with customers.

Irfan Hussain
CIO, LSEG

By the way, on your point around the Copilot, we do not mind being guinea pigs. When I first heard that we were the first enterprise-grade MCP connection on the entire Microsoft platform with Copilot and Copilot Studio, it was good to hear because we are learning at the same time. For us to learn much ahead of anybody else makes it better for us as well. We do not mind co-creating. One of the innovations that our teams have been working on and showcasing today is Open Directory. This has been an incredible partnership between our teams that has been going on for a bit. It solves a clear customer problem. It is secure. It is compliant. It is cross-firm communication augmented by LSEG's Workspace and LSEG's workflows. Tell us more about your thoughts on Teams and on Workspace, because now we have brought both of these products together.

It was not a snap of the finger, off you go, and we have this product live. You invested a lot in it. From a Microsoft perspective, how do you view Open Directory?

Matt Koerner
Corporate VP and CTO, Microsoft

Let's start with Teams. Teams is an enterprise collaboration platform. People can do chat. They can do video calls, audio calls, collaboratively edit documents, work in a shared canvas. There are a lot of things that Teams can do. When we last reported on Teams usage, it was in our fiscal year 2024, so the number's a little bit dated. But at that time, we reported 320 million monthly active users on Teams. Teams, for those users, is a part of their daily routine. They log on. They interact with it. And it's part of the air they breathe. It's a system that they live in. Similarly, it's integrated into the IT environment in the organization with identity, security, networking, data policies. All of those things are in place with Teams.

What many people do not know is that Teams has a feature called Federation, where two different organizations can have their users chat with each other. We have this between LSEG and Microsoft. I can type Irfan's name in the address bar of my Teams, and his profile shows up. I click it, and I can send him a message. Emily and I were doing this this morning with a couple of links we were sharing. We can chat back and forth. That works on desktop, web, mobile. True story, about five or six weeks ago, I was in a 12-acre corn maze with my wife and four children. We were lost in the corn maze.

At that moment, Irfan pinged me saying, "We need to talk about an issue we found in Open Directory." I was like, "I'm lost in a corn maze. Can I reach you after we find our way out?" He said, "Sure, sure, sure. Federate your family, and then we can talk later." Indeed, we did talk later. You get alerts, and you get that chat. It is just like it is inside of your own organization. Microsoft is pretty free and easy with Teams Federation because we love chatting with our customers that way. Many financial institutions do not turn on cross-organization federation because there is risk associated with having your employees talk outside of the organization. We do not see high penetration of federation inside of financial services.

When LSEG came and said, "Hey, we'd like to do Open Directory," we were quite excited. We said, "This sounds great. Let's go." They said, "Wait a minute. We think there are some things that you need to do in order to make Teams better so that it'll be ready for these customers." We have spent two years working on a shared backlog of things that we needed to do in the Teams platform to make it ready for this use case. For example, two recent features. This fall, we enabled something called trust indicators. Next to every person and conversation, we mark, "Is it internal, or is it external?" That way, a person inside of an organization knows whether they're having a conversation with an external person, and they can gauge what they say. I see Nej nodding. That was very important.

We got your feedback. I think the second one, which just became generally available a week before last, is granular controls. An IT admin can now say, "This set of users is authorized to chat externally." Maybe it's front office people who have a business need to do that. There are back office people who are not authorized to chat externally because they have no business need to do it. You can turn on federation for just those users who should need it. That capability was also very important. These are examples of platform things that we had to do.

LSEG said, "Look, what we'll do is instead of if you have a new member who wants to join a network, with the way Teams Federation works out of the box, that organization would have to go talk to every single one of the other organizations in the network to do KYC and vetting, and then technical onboarding to get the federation turned on on both sides." LSEG said, "Look, we'll be the centralized clearing house. We have a KYC business that's a leading business. We'll do the KYC on behalf of the network, and we can facilitate and orchestrate that technical onboarding." We said, "That's great. Let us build a solution to do that," which we call automated domain management.

With automated domain management, there's a way for an enterprise to come and say, "I want to be part of this." LSEG does that vetting. That configuration is taken, and there's a little piece that runs inside of each organization that they deploy when they onboard, which picks up that trusted centrally distributed configuration from LSEG, validates it, and deploys it locally so that every member organization picks up that new federation and turns it on right away. I think I can announce that this past weekend, LSEG became the very first tenant deployed in production with Open Directory. In the next five or six weeks, we're going to go take it to other customers together, which is super exciting. Look, the whole point at some level of financial services, at least capital markets, is to facilitate transactions across counterparties.

When that stuff does not happen in Teams, we're not living up to our mission to empower every person and business around the world to achieve more. We really want all of the sorts of activities that happen in a business to be able to happen on Teams. For us, bringing this kind of collaboration into Teams through Open Directory is really strategic and exciting. We're delighted to see this thing happen. As you said, this was not an overnight job. This is two years of platform work and solution work. Part of the thesis of our partnership is we can tackle hard problems and see them through. It is great to have this data point show that.

Irfan Hussain
CIO, LSEG

Awesome. It's my last question. I know we're probably a little over time here, but we covered a lot today. We talked about DMI. We talked about MCP. We talked about Open Directory. We talked about agentic platforms. I'm not sure if people are keeping track of it all. We talked about a lot of products that we're working on together. What excites you? What's next? What do you think will be most exciting about the next few years of us working together?

Matt Koerner
Corporate VP and CTO, Microsoft

I like to think about the footwork that we do to get in position and then the execution we can do once we're in position. The footwork has been a whole bunch of this foundational work. We've got a team that operates as a joint team. That's no small thing; it took years of work to build that team. I think we have top-to-top alignment that's very clear. I spent a significant portion of last week with many members of LSEG's executive committee in Redmond. Now I'm here today. I have untold frequent flyer status. I know the people at the hotels around St. Paul's. They know me. That's very exciting. We've got all those pieces in place. You look at the technology foundation: we have a regulatory-compliant footprint of LSEG on Azure.

We have LSEG data in Fabric. We have the Copilot Studio integration with the LSEG MCP server and AI-ready data. We have DMI. We have Open Directory, and Teams integration with M365. All of these things are now in place. You can imagine some very exciting scenarios. It does not take a big leap now to describe one. I'll just hypothesize one. Two years ago, it would have been inconceivable to talk about this scenario. Now it seems obvious and achievable. The scenario is this. Let's say you have a conversation going with a counterparty in Teams, someone in a different firm, and that connection is facilitated by Open Directory. You have an investment thesis. You go into Workspace, look at some economic indicators, and produce a chart. You then take the link for that chart.

You share it in Teams. It comes through as a first-class thing on the other side. They see the chart embedded in Teams. They can click it and jump in their own Workspace deployment, deep-linked straight to the same context that the originators had. That person can do more deep analysis and validation of the data in the chart. They chat about the investment. You decide, "Hey, this investment is valuable. I want to go pursue the next thing." In M365 Copilot, you initiate an agent that you built in Copilot Studio that uses LSEG's Copilot Studio connectivity. The agent goes and retrieves the transcript of the chat in Teams and extracts the investment intent from that chat autonomously. It takes that and constructs a scenario.

It delivers that scenario to LSEG's modeling as a service running in Azure with a risk model that runs over that possible investment thesis using LSEG data coming from Fabric to do a bunch of pricing and risk and volatility analysis, whatever the people who know this stuff know how to do. I do not know how to do that stuff. It does that stuff, and it comes back with data. That data, in turn, goes to a deep reasoning model running in Microsoft's AI Foundry. The deep reasoning model looks over that result of that analysis and puts together a proposed trade with an explanation of why that trade makes sense and maybe what hedges you might want to do and whatever other things. Again, these people do that. I do not know what it is.

It comes back, and it's presented in M365 Copilot along with an accounting of the recent emails, chats, files, and meetings in your enterprise that you have access to, to make sure you're not missing any context on the securities in the list. You look at the trade. Maybe you edit it. Maybe you approve it. You send it off to Workspace for execution. This all can happen in minutes, maybe seconds, if the analysis doesn't need to take that long. It doesn't require you to depend on a team of analysts. It's entirely compliant and audited in your enterprise. It's consistent and repeatable, so every single person in the firm can get the same result if they ask the same question. This is the new AI standard that we're building to.
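The multi-step scenario just described — extract an investment intent from a chat, run it through a risk model, have a reasoning step draft a proposed trade — can be sketched as a simple pipeline. Every function here is a hypothetical stand-in for the real services named in the text (Copilot Studio agents, LSEG modeling-as-a-service, AI Foundry reasoning models); the logic is illustrative only.

```python
def extract_intent(transcript: list[str]) -> dict:
    # Stand-in for an agent reading the Teams chat and pulling out the thesis.
    # Crude heuristic: treat short all-caps words as candidate tickers.
    tickers = [w.strip(".,") for line in transcript for w in line.split()
               if w.isupper() and 2 <= len(w) <= 5]
    return {"tickers": sorted(set(tickers)), "action": "evaluate"}

def run_risk_model(intent: dict) -> dict:
    # Stand-in for pricing/volatility/VaR analysis over market data in the cloud.
    return {t: {"volatility": 0.2, "var_95": -0.03} for t in intent["tickers"]}

def draft_trade(intent: dict, risk: dict) -> dict:
    # Stand-in for a deep-reasoning model proposing a trade with a rationale.
    within_tolerance = [t for t, metrics in risk.items() if metrics["var_95"] > -0.05]
    return {"buy": within_tolerance,
            "rationale": "within risk tolerance (illustrative rule)"}

chat = ["Thinking LSEG and MSFT look attractive after earnings."]
intent = extract_intent(chat)
proposal = draft_trade(intent, run_risk_model(intent))
```

The point of the sketch is the shape of the workflow, not the models: each stage consumes the previous stage's structured output, which is what makes the end-to-end run repeatable and auditable.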

This is going to be the direction that we go in the partnership. It is very exciting to contemplate. Again, it is merging together financial services-specific workflows, enterprise platforms in a way where Microsoft and LSEG bring complementary strengths to the partnership. We are very excited for the value we will deliver for customers.

Irfan Hussain
CIO, LSEG

We did not prepare for the scenario you just mentioned. For the audience: as I hear what Matt just said, you will see many demos today that bring a chunk of that workflow to life. It is exciting to hear what you just described. Thank you so much for your time. Really appreciate it. Thank you.

Matt Koerner
Corporate VP and CTO, Microsoft

Thank you for the conversation.

Emily Prince
Head of AI, LSEG

Matt, Irfan, thank you so much. I think one of the things that really resonates for me is how we're bringing the full force of LSEG and Microsoft to co-develop for our customers. Now let's come back to the third pillar of LSEG's AI strategy and focus on intelligent enterprise. Across LSEG, we're deploying AI to innovate faster for our clients, boost productivity, and to transform our data and content operations. At the top of this bubble image, as you heard from Irfan earlier, you see some of the examples of the AI transformation happening in our engineering organization. This AI-enabled transformation is also permeating through other parts of our organization. In our sales team, we're actively using AI to identify prospects and support client management.

In our customer support teams, there's already 87% adoption of LSEG's proprietary QAS, or question and answer service, designed to support our customer consultants with consistent and timely responses. Indeed, we're already seeing up to a 40% reduction in the overall time to resolve customer queries, with 50% being resolved in under an hour. In content operations, we're similarly seeing the benefits of our AI deployment. We're using the latest techniques in AI, combined with our deep data expertise, to optimize delivery of the highest-quality content to our customers. We're already seeing nine-times-faster content extraction while simultaneously improving accuracy. The benefits are not only about speed; we're also seeing gains in efficiency. We have realized a 51% employee reduction in central sourcing and a 66% reduction in cloud costs as part of data scraping activities.

While we are going faster and with greater efficiency, we are not compromising on quality. We are getting even stronger. Data quality issues reported by customers are down 52% on content volumes that have risen by 45% since the beginning of 2022. Content extraction success rate has increased to 98%. The breadth and depth of our content just keeps expanding. To say the volume of what LSEG is providing has grown substantially is an understatement. As an example, our exchange-traded fund holdings data has increased by 400%. AI is already creating new opportunities for LSEG and our customers. With our trusted data transformed into products, our intelligent enterprise, and our solid infrastructure and strategic partnerships, LSEG is uniquely positioned at the forefront of this change. We're now going to take a short break. Our next session will start at 1:58 P.M. with Ron and Gianluca.

Thank you.

Gianluca Biagini
Co-Head of Data and Analytics, LSEG

Ready? Hello everyone, and welcome back. I'm Gianluca Biagini, Co-Head of Data and Analytics alongside Ron. I joined LSEG three months ago, and I'm really excited by the tremendous opportunity we have at LSEG to transform how the industry operates. Previously, I was Head of Data, Evaluation, and Risk Analytics at S&P Global.

Ron Lefferts
Co-Head of Data and Analytics, LSEG

For those of you I haven't met, I'm Ron Lefferts. I've been with the group for four years and previously led our sales and account management function, a role I will be handing over to Chris Coleman when he joins LSEG in January. Prior to joining LSEG, I was a global technology leader at Protiviti, and I've also held senior leadership roles with IBM.

Now, let me start with an overview of our current positioning, including how we are partnering and innovating, before handing over to Gianluca to detail our plans to accelerate growth within the division. We will also give an update on our partnership with Microsoft. We are a leader in a GBP 35 billion global market for financial markets data and analytics. Think about the systems in your own institution: order execution, risk management, market surveillance, portfolio management, fund valuation, performance monitoring, and many others. They all rely on huge quantities of timely and accurate data to power them. There is a very good chance those systems are running on our data. Three quarters of our largest customers use 20 or more of our products, typically to help them with highly regulated, business-critical activities.

Our solutions are deeply embedded in the global financial ecosystem, with real strategic partnerships grounded in expertise and trust that have been built up over decades. The market in which we operate is growing on multiple fronts. Customer expectations are rising all of the time. Of course, they want real-time data, but some want ultra-low latency feeds, and they want that data to cover a much broader range of asset classes, and for that data to be consistent across multiple platforms. Increasingly, they want to run AI on that data too. As you heard Irfan and Emily talk about, AI models are incredibly data-hungry. This is all driving demand for financial data and analytics, turning trusted and accurate data into actionable insights. Our customers span the world's largest banks, asset management firms, corporates, wealth advisors, and central banks.

For banks and other sell-side firms, the largest proportion of the DNA revenues, we provide fully integrated workflow solutions. Our flagship platform, Workspace, provides a modern, customizable user interface for integrated execution and order management workflows. Investment and wealth managers rely on our unparalleled tick history data and analytics. Here, we are the leading global provider of real-time data with unmatched scale, depth, and breadth. Our AI-powered analytics platform helps create market-validated models and tools to discover insights faster, while corporates rely on our business-critical data and workflow tools from treasury management to company fundamentals and comprehensive news coverage. It is the combination of those capabilities, anchored in the full breadth and depth and quality of our data, that allows us to provide innovative, distinct solutions. The demand for and consumption of data is accelerating, and we are facilitating that growing wave.

The chart on the left-hand side shows the amount of data or messages coming through our real-time data feed. During Liberation Day, up to 20 million data points a second were processed. This information on completed trades, price moves, or indications of liquidity is critical market data that participants need. While Liberation Day was an extreme event, there has been a four-times increase in real-time data over our network in the past 10 years, a trend that we expect to continue. With our direct connection to nearly 600 exchanges and venues, and our ongoing investment in technology and capacity, we are strengthening our market leadership. On the right, you can see the demand for our tick history data, covering 100 million instruments over almost 30 years. It is a powerful data set that is highly valued by our customers, with roughly 4 million customer data inquiries a month in early 2024.

It is also massive, tens of petabytes in size, which some customers found a struggle to manage when they had to receive it as a file. Last year, we made that data available in the cloud for the first time. You can see the impact that it has had on consumption of that data. Now it is 6.5 million customer requests a month and climbing. That speaks to the broader truth in the industry. Although data is abundant, not all data is equal. Markets do not just run on what is publicly available; they run on what is trusted. Customer demand for data that is accurate, comprehensive, verified, and auditable is significant. That is where LSEG sets the standard. If you make it easier for customers to access and consume this data through new cloud distribution channels or AI partnerships, you are also likely to sell much more of it.

We are only near the beginning of that journey. The same approach of meeting customer demand through relentless innovation is at work on our Workspace platform too, where we have driven a lot of change over the last four years. In June, we successfully retired Eikon in one of the largest financial services workflow migrations in history, moving more than 350,000 users onto Workspace and establishing a common platform for innovation and growth. We also continue to enhance functionality week in and week out. With something like 500 updates a year, Workspace today is far more powerful than it was even a couple of years ago. As you will see shortly, we are accelerating this evolution further in the coming months and years. The increased power of Workspace is evident in how our customers engage with the platform. They are not using it for just one task.

In fact, all key customer communities—traders, bankers, investment managers—are regularly using 10 or so different Workspace applications. Their engagement with each of these functions is increasing too, with the average trading customer now using desktop applications 60% more than they were a year ago. This shows the success of the Eikon to Workspace migration. The new platform is easy to use and intuitive, and it drives much greater engagement from customers as a result.

Our disciplined focus on executing our strategy is translating into a faster-growing and more resilient business. Customers are keeping our products for longer, with retention up 200 basis points since 2021. We are winning more business, with a 700 basis point step-up in win rates over that same period. Revenues are growing as a result, and we have a clear plan to further accelerate these growth rates in 2026 and beyond.

Now, let me hand over to Gianluca to talk through our strategy for future growth.

Gianluca Biagini
Co-Head of Data and Analytics, LSEG

Thanks, Ron. Our ambition is simple: to be the leading provider of trusted data and actionable insights for customers. To do this, we have four strategic pillars. One, expanding our data leadership. Two, transforming customer workflows. Three, maximizing channel reach. Four, building an efficient and scalable platform for growth. As Ron has outlined, we are coming from a position of strength in a combination of trusted data, technology, and talent. Let me spend a few minutes breaking down the priorities under each of these four pillars. For decades, our data has been the foundation for critical financial decisions, operations, and capital flows worldwide. Proprietary, licensed content with unmatched depth and breadth, spanning asset classes, institutions, and geographies. We are not standing still.

We continue to enhance and strengthen our content offering to maintain our lead. It has been so exciting for me personally to join such a powerful franchise and to help drive the next phase of growth through data leadership. In news, we have expanded our leading news content through a partnership with Dow Jones, adding to thousands of other news sources, including exclusive access to Reuters News. We have also built on our partnership with Reuters with the launch of Reuters Super Summaries, AI-driven earnings intelligence that delivers concise earnings insights at speed. We are focused on making our news machine-readable. This means investors can combine our news output with sources like Tick History to drive valuable insight into what really moves share prices from second to second. Todd and Tim will give a great example of this in a few minutes.

In private markets, an important growth area, we have added leading data sets in Preqin and Dun & Bradstreet. Last week, we announced that LSEG will license Nasdaq eVestment private markets data sets. Together, these data sets provide an end-to-end, curated view of private markets in a way that others cannot. This is a fantastic example of the strength of our distribution platform and the flexibility of our strategy. As you know, private markets data is quite fragmented, with no single information services company owning comprehensive coverage. Through organic investment and partnership, we have built that coverage ourselves. This extends to the FTSE Russell partnership with StepStone as well. Finally, in the third box, we continue to build our unique assets, expanding real-time and tick history and embedding Tradeweb's enhanced fixed income data in our services.

Under pillar two, we are enhancing seamless end-to-end workflows with Workspace. As David often says, it is not AI or a desktop; it is AI in the desktop. We have made countless announcements over the last couple of years and have some really big developments over the next six months. Nash will be bringing just some of them to life for you in a few minutes. As Ron highlighted earlier, our customers use Workspace for a wide range of applications. It is fully embedded in the workflows. How can we make that even stronger? First, through AI integration to search, summarize, and analyze, all built on our trusted and accurate data, which customers can rely on for critical processes and decision-making. Second, through collaboration, whether through Open Directory or our trading functionality.

Third, through our applications that are dedicated to specific user types, whether in commodities, investment banking, or wealth. Of course, this will all combine financial services workflows with enterprise workflows end-to-end as we fully integrate with Microsoft Teams and Microsoft 365. We have always had a multi-channel approach to data distribution, as Emily outlined earlier: through our own UI, Workspace; through direct feeds; and via third-party distribution. As demand for data grows and AI use becomes widespread, new channels are opening up all the time. Our customers are working with data in a number of new environments. Our LSEG everywhere approach is focused on delivering AI-ready data to where our customers are working. We are using MCP to enable discoverability of LSEG content in LLMs through a scalable distribution ecosystem, appropriately governed and licensed.

We have developed multi-cloud content distribution through AWS, Google, Azure, and Snowflake, offering customers choice. You will see several of these platforms and use cases demonstrated in a moment. I will also cover the monetization and growth opportunities. Finally, we are investing to transform our data infrastructure to deliver more agile, resilient, and scalable platforms. The migration of data and applications to Microsoft Azure is an enabler for more consistent data onboarding and faster product delivery, providing a more unified customer experience across data sets, as well as reducing infrastructure costs. We are making good progress here. On the real-time network, as Ron mentioned, we have just embarked on a five-year investment plan to deliver a step change in capacity and intelligence.

Let's see how all of this translates into what matters, how we will capture the value from the huge growth in demand for data and accelerate our growth. Let's start with the traditional levers: retention, displacement, and value realization. Our products are getting better and better. As a result, we are confident we will continue to improve retention and steadily displace competitors over time. Remember, even with amazing products, the rate of displacement can feel slow because changes can be disruptive for customers. We believe that this can be a steady long-term tailwind. Next, price realization. Ron showed that our real-time traffic is up four times over the last 10 or so years. Demand for our services is growing at a huge pace. We think that that can be reflected more in what our customers pay for our services over time.

On desktop, we have said before that for our high-end Workspace users, there is around a 30% price gap to our major competitor. As we improve functionality and build networks with products like Open Directory, that gives us the opportunity to close that gap over time. Customers will see the value. It is early days, but we see scope for new revenue streams. If we are driving revenue growth for distribution partners through their compute or subscriptions, then there is an opportunity to share in the upside that we are generating for them. Finally, increased usage and users. As we modernize our infrastructure, we are introducing more and more telemetry into our stack. This will allow us to move to a more hybrid subscription and usage model, giving customers control and visibility over their spend and capturing the value of usage growth.

As for new users, the spread of AI and new applications is democratizing data like never before. Every industry vertical, every professional services firm can make commercial use of financial data. The opportunity to reach adjacent markets is opening up like never before. Alongside all of these levers, we have our long-term enterprise agreements, or LDAs. These are selective, strategic partnerships. As you saw from David earlier, we expect these to represent around 70% of ASV as we exit this year. This cements valuable long-term partnerships with some of the world's leading institutions, building product roadmaps together and giving good visibility to both parties. The breadth and depth of our data make it hard for others to fully replicate. As you can see, we are very excited about the breadth of positive commercial outcomes our strategy will give us.

Ron, back to you to update on the progress with the Microsoft partnership.

Ron Lefferts
Co-Head of Data and Analytics, LSEG

Thanks, Gianluca. Our partnership with Microsoft is a key aspect of our overall strategy. It was great to hear from Matt Koerner earlier about how important it is to Microsoft as well. In fact, Gianluca, David, and I, as well as a few other colleagues, were in Seattle last week for a few days to have a detailed partnership catch-up with a number of Microsoft leaders. We've made good progress over the last three years, with the partnership moving from product ideation to product build and increasingly into product delivery. However, inevitably, with partnerships on this scale, this process has not always been as fast as we would have liked.

In some areas, we needed to build the foundations of a platform before we could scale data migration, which is very important to get right, even if it's not glamorous. AI was barely a thing when we started out and is now fundamental. Customers were initially nervous about new product adoption, particularly around protecting their own data. Compliance proved quite a barrier to onboarding for some time. Our early-stage product launches have shown the importance of engaging with customers throughout the design process, informing a broad range of design aspects from product onboarding to iteration and co-innovation, as well as the importance of applying a community lens to product rollout to drive adoption. Let me remind you of what has already been delivered. Each of the products on this slide is either live or will be in the coming weeks.

For example, and applying the community lens I just mentioned, we're rolling out Open Directory to FX and commodities users where our workflow tools are already deeply embedded. By connecting these users and giving them tools to surface, share, and collaborate on content, we will further deepen and extend these communities. Crucially, Open Directory will be using Microsoft's automated domain management, or ADM. You heard Matt and Irfan talk about this earlier. ADM supports the onboarding of external customers into Open Directory for secure and compliant intercompany workflows. It's a key differentiator for LSEG in enabling secure, federated collaboration across financial institutions. We're also making our AI-ready financial data available both in Copilot and Copilot Studio, enabling customers with LSEG licenses to build their own agents, working with our data, embedding our solutions across the finance industry and beyond.

Through the launch of the Analytics API and its extension to Visual Studio Code, we have nearly doubled the rate of growth over the last 18 months. We have fully replatformed our trade routing solution for 1,600 investment managers and banks, creating a first-of-its-kind cloud solution that is faster, more scalable, and more resilient. This platform is currently handling trading of roughly 4 billion securities a day. Staying on the theme of trading, we delivered the first transactions on our new digital markets infrastructure in Q3, deploying distributed ledger technology for scalability and efficiency across the full asset lifecycle of a trade, from issuance, tokenization, and distribution to post-trade asset settlement and servicing. Looking ahead to next year, we will accelerate our pace of delivery. We will expand our data leadership.

In particular, we will grow our private market data feed substantially, combining our own proprietary sources with leading data sets from Preqin, Nasdaq, and Dun & Bradstreet, and delivering that through an intelligent combined feed covering private credit, equity, infrastructure, and real estate. Gianluca called this out earlier. We will transform customer workflows with the full launch of Workspace AI, including the scale-up of Workspace Teams, Open Directory, and the integration with the Microsoft 365 suite, bringing huge benefits to our customers. We will maximize our channel reach, making all our DNA data feeds AI-ready and building AI-enabled analytics, intelligence, and automation. We will continue our work building an efficient and scalable platform, both through the ongoing migration to Azure and opening up our own Analytics API for customers to distribute and monetize their own models via our infrastructure. Now, there's a lot to digest here.

We will keep investing in content and accelerating innovation, creating new partnerships and deepening existing ones. As Gianluca said, our ambition is simple: to be the leading provider of trusted data and actionable insights for customers. We are excited by the numerous opportunities we see ahead and have a clear strategic focus for delivery. Before I hand over to Todd Hartmann, a quick word on what you will see over the next hour. Here is the model of content and distribution which underpins LSEG everywhere, which Emily covered earlier. If a customer wants to produce a detailed company report or a piece of fixed-income analytics, they can do so in any environment: through their own UI, through our UI, Workspace, via our direct feeds, or through other consumption layers provided by our partners. We will showcase all of these options. Thank you.

I'll hand the floor to Todd to kick off the data and feeds and data analytics demos. Thank you.

Todd Hartmann
Head of Data and Feeds, LSEG

Thank you, Ron and Gianluca. Hello, everyone. Welcome to our session on data and feeds. My name is Todd Hartmann, and I lead this business. I'm joined by my colleague, Tim Anderson, who heads up our tick history and quantitative analytics business. I've been with LSEG now for a few months. Prior to that, I was with FactSet for about 19 years, where I helped to build their data and feeds business. We thought it would be helpful for me to start by providing an overview of our data and the value it provides to our customers. I'm going to cover the challenges our customers face, the value of our data, its breadth and depth, and how our customers are deploying AI on our data.

I'll then hand it over to Tim to show you an example of how we solve a specific customer need. Our customers face a number of challenges when using AI on financial data. Before AI can reason, it needs order. The problem is our clients are managing a lot of data from many different sources. Each data set speaks a different language with no single identifier to align them. Even the most advanced AI cannot see the full picture in a reasonable amount of time. It is essentially like navigating a new city without a map. Let me now explain our approach. On the left-hand side of this slide, you can see that we give our clients access to one of the world's most comprehensive libraries of financial and market data, including contributions from over 40,000 customers covering decades of history.

We provide this data in a connected and consistent way. On the right-hand side of the slide, as Emily covered earlier, all of that data is curated and mastered. At the heart of our approach are two industry standard IDs, which link and align data sets across asset classes and systems. It is this ability to structure and connect data that truly sets us apart and sits at the center of everything that you'll see here today. I mentioned a moment ago our industry standards. LSEG has a unique framework in place. On the left, our normalization allows all data to speak the same language, as I covered on the previous slide. That means there's a consistent structure across every data set and asset class, which reduces noise and minimizes AI hallucinations. This means faster and more accurate AI responses.

Any model knows exactly where to find the data point because we provide a map of our data structure. Next is the Reuters instrument code. The RIC provides unique point-in-time identifiers for over 80 million listed instruments, connecting across standards like CUSIP and ISIN. Finally, our PERM ID links entities, people, companies, instruments, and sectors, forming the connective tissue between our data. For example, in the column on the right, you can see LSEG's PERM ID is linked to CEO David Schwimmer and to the banking and investment services sector, listed on the exchange with RIC LSEG. Together, these elements create a data environment ready for AI. For data and feeds specifically today, we'll show you an example of how our clients can use our data with AI in the cloud, transforming how investment workflows operate.
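The "connective tissue" idea — a permanent entity identifier joining otherwise unrelated data sets, with an instrument code linking listings — can be sketched in a few lines. The records, IDs, and names below are illustrative stand-ins, not real PERM IDs or LSEG data.

```python
# Three independent data sets, all keyed by a shared PermID-style identifier.
entities = {
    "1-0000000001": {"name": "Example Exchange Group plc",
                     "sector": "Banking & Investment Services"},
}
instruments = [
    {"ric": "EXCH.L", "perm_id": "1-0000000001", "isin": "GB00EXAMPLE0"},
]
news = [
    {"headline": "Example Exchange Group posts strong quarter",
     "perm_id": "1-0000000001"},
]

def link_by_perm_id(perm_id: str) -> dict:
    """Assemble entity, listings, and news into one profile via the shared key."""
    return {
        "entity": entities[perm_id],
        "listings": [i["ric"] for i in instruments if i["perm_id"] == perm_id],
        "news": [n["headline"] for n in news if n["perm_id"] == perm_id],
    }

profile = link_by_perm_id("1-0000000001")
```

Because every data set carries the same identifier, a model (or a plain join) can traverse from company to listing to news without fuzzy name matching, which is the property the speaker credits for reducing noise and hallucination.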

We see AI as an extraordinary opportunity to accelerate data and feeds growth. Our AI strategy makes our data sets AI-ready and available wherever and however clients need them. I'm now going to pass this over to Tim, who will walk you through an example. Some may find this a bit technical, but we thought it was important to show the complexity of what's involved. Over to you, Tim.

Tim Anderson
Head of Tick History Cloud Services, LSEG

Thank you, Todd. Hello. I'm Tim Anderson, the head of tick history and quantitative analytics, formerly from Trading Technologies at Deutsche Bank and JP Morgan. What sort of things can our customers do with our data, particularly enhanced by multiple agents, as many customers are working towards? Here's a use case that we'll be showing today. This is not just a chatbot.

It's an AI-generated, back-tested analyst report that can show recent data or look for signals in the past. The AI insights can be sent to any endpoint a customer wants. It can be tailored to be as complex as the customer requires. This example shows what's now possible for LSEG everywhere: a report built entirely from LSEG data, with the option for customers to add their own. It correlates news events, sentiment and ESG scoring, trading volume, and performance metrics, all connected automatically. This is insight at machine speed, producing a tailored report focused on investable securities across any exchange, complete with a reasoned buy, hold, or sell rating based on a customer's criteria, and in-depth summaries across multiple data indicators. This intelligence begins with our data foundation. For the example shown, at the core of that foundation are four key products.

First, as David mentioned, Tick History, one of LSEG's flagship data products. It provides a complete, timestamped record of every trade and quote on major global exchanges over 30 years. It contains petabytes of data at a billion captures per second. Next, Quantitative Analytics, our data and analytics environment for quants, portfolio managers, and data scientists who need large-scale, high-quality data for modeling and research. It integrates over 60 data sets covering pricing, fundamentals, economics, ESG, and sentiment, all AI-ready. Third, Machine Readable News, our AI-ready news feed that transforms Reuters journalism into structured, timestamped data, turning headlines into real-time signals. Finally, MarketPsych, behavioral and sentiment analytics, quantifying how people feel and talk about the markets in real time. This is data that's quantitative, sentiment-driven, and contextual, a living record of the world's markets. Now we bring all this to life with AI agents.

Across LSEG's data sets, we now have agents working in parallel, automatically. They query, they calculate, and they merge data in concert. From left to right on the screen, you'll see these agents at work, each a special analyst scanning the data, collecting, calculating, and creating insights live. One agent pulls trading volumes and calculates VWAP. Another analyzes market sentiment from news and social media. Another assesses ESG performance. Each is laser-focused, and they all come together to produce a unified result like worker bees in a hive. If you want to update just one part of the analysis, say ESG, you simply modify that agent, and it updates instantly. All of this runs seamlessly across Google BigQuery or in the customer's own cloud. It's AI-ready at scale. We can even add regulatory agents that automate reporting and validation so every output is auditable and compliant.
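The agent pattern described here — narrow specialists whose outputs a coordinator merges into one result — can be sketched in a few lines. The "agents" below are plain functions and the tick data and sentiment rules are invented; a real system would run LLM-backed agents against BigQuery, not in-memory lists:

```python
# Minimal sketch of the coordinator/agent pattern: each "agent"
# computes one slice of analysis over shared data, and the results
# are merged into a single unified report. All data is invented.

ticks = [  # (price, volume) pairs standing in for a tick stream
    (101.0, 500), (101.5, 300), (100.8, 200),
]
headlines = ["profit beat", "guidance raised", "probe launched"]

def volume_agent(data):
    """Pull trading volumes and calculate VWAP."""
    total_vol = sum(v for _, v in data)
    vwap = sum(p * v for p, v in data) / total_vol
    return {"volume": total_vol, "vwap": round(vwap, 2)}

def sentiment_agent(data):
    """Score headlines with a toy positive-word lexicon."""
    positive = {"beat", "raised"}
    score = sum(1 if positive & set(h.split()) else -1 for h in data)
    return {"sentiment": score}

# Coordinator: run each specialist and merge into one result.
report = {}
for agent, data in [(volume_agent, ticks), (sentiment_agent, headlines)]:
    report.update(agent(data))
```

Updating one dimension of the analysis — say, sentiment — means swapping out that one function; the coordinator and the other agents are untouched, which is the modularity the speaker is describing.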

In short, you've got a network of intelligent agents doing the heavy lifting, freeing analysts to focus on decisions, not data preparation. Now, how do we handle one of the toughest challenges, transferring large amounts of data to customers? The critical component to making this work is cloud delivery. Most data solutions stop at access. Ours goes all the way to delivery. You saw a chart earlier that Ron presented showing how tick history consumption accelerated significantly when we moved to the cloud. This platform runs natively in Google BigQuery; think of it like a data jigsaw. We snap our data directly into the customer cloud. No downloads, no storage management, just secure delivery, which significantly reduces our customers' total cost of ownership: the customer can query LSEG data on demand and their own data at the same time.

LSEG securely shares the data into the customer's environment, where AI agents execute securely inside the projects. Everything stays connected, governed, and auditable. Data moves without ever leaving its secure home. That is the combination of AI-ready content, agents, and the cloud. Now let's see how the agent engine example brings it all together.

Welcome. In the prompt box of the Gemini Enterprise agent, we've asked the system to produce a comparative performance analysis of British Telecom and Vodafone over time. What's happening now would normally take an analyst hours or even days to complete, but it can now be done in moments as the AI agents begin their research by querying the data in BigQuery. Let's look at what's happening behind the scenes. Most of our data exists in the form of numerical tables in BigQuery, but data on its own can be overwhelming. We need insights and analytics.

This is where the power of agentic AI truly comes into its own. Multiple agents, each with their own role, query the data in BigQuery or LSEG's MCP and interact with one another like worker bees. On the left, we see the data tables being queried. The white boxes on the right represent the dedicated agents, while the text beneath them shows the insight they're beginning to gather, forming the foundation of the intelligent summary they'll provide. Returning now to Google's Gemini Enterprise, we can see the results being presented. The data is connected to LSEG's RIC and PERM ID. Because the data has been normalized, the language model understands both the structure and the underlying data model as one coherent system. The way we design each agent determines the type of output we receive.

In this example, we've asked for a relatively simple report, but the customer can make the computations as complex as required using any of LSEG's data sets. In addition, the customer can combine the output with their own data. The choice is theirs. Now the system brings these insights together, allowing an agent to be designed to deliver buy, hold, or sell ratings for each security analyzed based on the customer's defined criteria. During this process, the agents perform industry-standard calculations dynamically, such as percentage price movements, VWAP, and time-based bar output, while correlating those results with point-in-time news stories and StreetEvents. The report generation agent then unifies all of this into a single intelligent report, bringing the analysis full circle.

Every customer problem we saw earlier has a direct LSEG solution. AI agents instantly collect, summarize, and connect the data.

Our data model integrates tick history, quantitative analytics, and news, all working seamlessly, and customers can add their own data. RIC and PERM ID, as Todd went over, ensure consistency across systems. With context memory, intelligence carries forward from one report to the next. It is this architecture that creates a new commercial opportunity for LSEG. What does this mean for our business? More revenue opportunity. Historically, LSEG generated revenue from selling data both directly to the end customer and via third parties like Aladdin. That will continue to be the driver of our growth. Today, we are adding new distribution channels as we accelerate LSEG everywhere. These new AI and cloud-based applications add value for existing customers and allow us to reach new customers.

In the future, we can monetize the example you've just seen, selling data with multiple functional agents, which extract context, insight, and relationships, or even purely computational agents performing high-value analysis. In summary, we'll be selling intelligence inside the data, where the reasoning itself becomes marketable. Thank you again for your time, and we look forward to continuing the conversation after the session. We'll now pass to Mikhail and Adam from the analytics team. Thank you again.

Mikhail Bezroukov
Director of Cross-Asset and Fund Analytics, LSEG

Thank you, Tim and Todd. Good afternoon, everyone, and welcome to the analytics section of our LSEG data and analytics presentation. My name is Mikhail Bezroukov from the analytics product management team, and I'm going to take you through some of our latest developments and core themes. I'll then hand over to my colleague, Adam Towne, who will walk you through some product demonstrations. First, an introduction.

LSEG analytics provides our clients with the tools, models, and information that they need to make good decisions and drive their businesses forward. Our analytics models cover hundreds of asset classes across instrument pricing and valuation, predictive analytics, and risk models. They're deeply embedded in clients' critical operations and ecosystems through our analytics API. They're used for investment research and alpha generation, for forecasting, and risk management. Our customers range from small startups to the largest financial institutions. For many years and across multiple turbulent market cycles, thousands of customers have relied on our analytics for accurate, trusted insights. LSEG analytics is focused on three core areas and benefits for our customers: coverage, scale, and efficiency. First, we deliver substantial analytics coverage. Our clients draw upon a multitude of input data sources and extensive model libraries to help solve the critical challenges that they're facing.

For many years, we've offered hundreds of models through distinct channels, including many Workspace apps, and these are now brought together through our Analytics API. This brings us to the second point, the scale. Rather than offering many disparate solutions, we ensure that clients can access our analytics outputs through a single consolidated Analytics API. The API makes our models available to clients in a cohesive manner at large scale and allows for a deep set of customer-specific customizations. It connects directly to users, to their internal platforms, or to other downstream systems. Third, we focus on improving our customers' productivity and efficiency of work. We do this by making our Analytics API much easier to access and interact with through our AI-ready initiatives and partnerships. In this, we're fully aligned with our overall LSEG everywhere strategy that we've spoken about elsewhere today.

We will speak about the integration of our AI-ready analytics with the Databricks Cloud Platform, with our MCP-powered AI agents, and we'll talk about our proprietary integration with Visual Studio Code. These initiatives save our clients critical onboarding time and high technology costs while continuing to give access to our trusted, deterministic analytics models. Our partners choose to work with us because our analytics are already structured for AI consumption, because our models have proven accuracy over many years and are already familiar from Workspace apps, and because they're supported by trusted LSEG data that my colleagues, Tim and Todd, have just spoken about. The depth, breadth, and accuracy of our analytics is unmatched, and we enable our customers to work in new ways whenever and wherever they choose, be it in the Analytics API, in an AI agent, or, of course, in a Workspace app.

It's LSEG everywhere in action. Now, let me hand over to Adam. He will bring this to life through a few examples.

Adam Towne
Director of Product, Scaled Analytics, and Funds, LSEG

Thank you, Mikhail. I'll be running through three demos in which I highlight the ways that our strategy of coverage, scale, and efficiency is delivering value to our customers and truly bringing LSEG everywhere. A credit quant needs to backtest a trading strategy as far back as the global financial crisis. Getting that coverage alone is a challenge, and onboarding data is traditionally slow and error-prone. With LSEG's AI-ready 20+ year history across millions of securities and integrations with data and AI platforms like Databricks, they can get started building and backtesting trading strategies in minutes, not days or weeks. Let's see how. Here I am in the Databricks Genie AI Assistant.

I'm looking at LSEG's historical analytics on government and corporate bonds, seamlessly shared to my account with Databricks Delta Sharing in minutes so that I don't need to spend days building pipelines to ingest the data. I'm a credit analyst, and I'd like to understand the impact of the global financial crisis on the financial sectors in the U.S. and the EU so that I can backtest my strategy. I'm going to prompt Genie AI's natural language interface and ask about option-adjusted spread, or OAS, a key indicator of credit risk during that time. What was the median OAS from 2007 to 2013 for the financial sectors in the U.S. and EU? LSEG's AI-ready content is well-structured for AI use cases and tuned for LLMs.

Because of that, Genie is going to be able to answer that question in seconds rather than the hours it might have taken the analyst to write the code previously. You can see that the answer is already returning. First, a table, and then a chart soon after. If you look at the chart, you can see that the U.S. had a large spike in OAS early in the global financial crisis. The EU saw its spike a couple of years later and for a longer stretch. From here, I can easily dig a little deeper, again with natural language, to understand the specific subsectors that drove these spread movements and can even build correlations that will support risk models. We are generating fast, trusted insights powered by LSEG's AI-ready content, well-structured for LLMs.
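Stripped of the natural-language layer, the question put to Genie is a grouped median over a date window. A stdlib-only sketch with invented spread figures (not LSEG data) shows the shape of the computation the assistant generates:

```python
# Illustrative only: median option-adjusted spread (OAS) per region
# over a window, the computation behind the natural-language question.
# Years, regions, and basis-point values below are invented.
from statistics import median

rows = [  # (year, region, oas_bps)
    (2007, "US", 150), (2008, "US", 620), (2009, "US", 480),
    (2010, "EU", 210), (2011, "EU", 540), (2012, "EU", 610),
]

def median_oas(rows, region, start, end):
    """Median OAS (bps) for one region between two years, inclusive."""
    vals = [oas for yr, rg, oas in rows if rg == region and start <= yr <= end]
    return median(vals)

us_median = median_oas(rows, "US", 2007, 2013)
eu_median = median_oas(rows, "EU", 2007, 2013)
```

In the demo the same logic runs as SQL over the shared tables in BigQuery; the value of the AI-ready structuring is that the assistant can generate that query reliably from the analyst's plain-English question.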

Time to value for the content that our customers subscribe to from LSEG has never been faster. Customers who host their models on LSEG's infrastructure can automatically build the same rich histories and share them with their customers, just like we have here. What did we just see? Instant access to LSEG analytics in Databricks and other data and AI platforms with no need to wrangle data. Natural language queries, no coding skills needed. Trusted results using LSEG's historical analytics on government and corporate bonds. This means customers can make faster, more confident decisions powered by the analytics API, AI-ready in the customer's cloud. Coverage, scale, efficiency. Now that we've seen how we're enabling customers to interrogate our data with AI, let's head to our second scenario in which we're enabling a customer to marry our data with theirs in their AI stack.

A credit analyst needs to combine their internal portfolio data with LSEG's trusted analytics in their proprietary AI application to aid them in bond pricing activities. They need to do it securely without writing code. We're seeing an increase in the use of AI across financial services, and LSEG's Model Context Protocol (MCP) server enables seamless integration of LSEG content and analytics into a customer's firm-wide AI solutions. Let me show you how. In this demo, I'm using Anthropic's Claude as my MCP client, but these capabilities work everywhere. I'm a credit analyst, and I'd like to understand my portfolio's risk. I'm using LSEG's MCP server to marry my internal data to the LSEG content for which I have a license. With a single click, I've connected to LSEG's MCP server so that I can analyze the risk of my portfolio.

You can see all the different models that this user can connect to in this dropdown. I'm going to start by asking a question about the yield of an individual security in my portfolio. Now, this is connecting to LSEG's analytics API to retrieve pre-computed results from last night. It's returning the price and yield in natural language without my writing a single line of code. Now, I want to grab the spread as well, that same risk measure we looked at in Databricks. It returns almost instantly, again without a single line of code. Finally, I'd like to run a scenario and see what would happen to the spread if the price changed. This is computing live using LSEG's analytics API to return the comparison of the spread today versus yesterday.
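The scenario step at the end — "what happens to the spread if the price changes" — can be illustrated with a toy calculation. This sketch uses a crude current-yield approximation (coupon divided by price) purely for illustration; LSEG's actual models solve yield-to-maturity and OAS properly, and the numbers here are invented:

```python
# Toy scenario analysis: how a spread measure moves when a bond's
# price changes. Uses current yield (coupon / price) as a deliberate
# simplification; real analytics solve yield-to-maturity instead.

def current_yield(coupon: float, price: float) -> float:
    """Crude yield proxy: annual coupon over clean price."""
    return coupon / price

benchmark_yield = 0.040            # assumed risk-free benchmark
coupon, price_today = 5.0, 98.0    # invented bond terms

spread_today = current_yield(coupon, price_today) - benchmark_yield
# Scenario: price drops two points, so the spread widens.
spread_scenario = current_yield(coupon, price_today - 2.0) - benchmark_yield
spread_change_bps = round((spread_scenario - spread_today) * 10_000, 1)
```

The MCP server's role in the demo is to expose the real versions of such calculations as tools, so the LLM client can invoke them from natural language rather than the user writing this code at all.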

Everything you've seen is running against LSEG's analytics API, optimized to answer LLM-powered queries and connecting to LSEG's accurate models. We can extend this to customer models as well, running on our infrastructure. With MCP and LSEG's AI-ready content, customers can run deep analyses, joining their data to LSEG's, all without writing a single line of code. That is driving consumption across our content and our APIs. What did we just see? A credit analyst subscribed to LSEG's content was able to seamlessly combine LSEG's analytics with their own internal data in their AI stack. Using LSEG's MCP server, they can retrieve results and run dynamic calculations, all without a single line of code. They can do this with any of LSEG's models or models that LSEG hosts on behalf of our customers, in whichever MCP client they are choosing to use.

We are doing this with every LSEG model across every asset class, answering millions of queries per day on our APIs and integrated directly into the customer's AI stack so they can move faster. It is the same three pillars in action: coverage, scale, efficiency. Now that we have seen how we are enabling no-code integration with LSEG content and analytics, let's go to our third and final scenario in which we show how we have made it easier than ever to write code to leverage LSEG models. A quant in the FX markets needs to routinely hedge an FX position, so they need to write reusable code that can be run automatically. They need to move quickly, but mastering the syntax to build and run models at scale is challenging and can lead to critical bugs that can have a huge impact on your company's bottom line.

That is why we built the LSEG Analytics Visual Studio Code Intelligent AI Assistant. Visual Studio Code is a preferred development environment for 74% of financial services firms. We are making it easy for customers to build on top of our models with the power of AI. Let me show you how that works. Here I am in Visual Studio Code, one of the most popular development environments in financial services. I am looking at LSEG's AI Coding Assistant extension in the marketplace. FX markets move quickly, and I need a fast, scalable way to build an application that I can use to plan my hedges. I am going to head over to LSEG's prompt template library. We have many templates optimized for the most common activities of financial services professionals. I am going to grab one of the pre-canned natural language templates for pulling in an FX forward curve.

What looks like four simple steps is actually a lot of code. With the power of LSEG's AI Assistant, I do not need to write that code myself. In seconds, and using only natural language, I will have code that can build a graph that I can use for my analysis. This would have taken hours or potentially days, with painful debugging, reading of the documentation, and calls to LSEG for technical support. I am going to save this script and then click Run. The results return in seconds, powered by real working code that I can deploy. It took me minutes, not days. That is true for LSEG's models and for the models that customers host on our infrastructure. We are driving consumption of our APIs, and we are doing it by making writing production code easier than ever. What did we just see?
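As a flavor of what the generated script computes, here is a sketch of an FX forward curve built from a spot rate plus forward points. The currency pair, tenors, and pip values are invented, and this is not the Analytics API's actual interface, which the generated code would call instead:

```python
# Illustrative only: outright FX forwards from spot + forward points.
# All market data below is invented; a real script would pull these
# values from a market data API rather than hard-coding them.

spot = 1.2500                      # hypothetical GBP/USD spot
forward_points = {                 # tenor -> points in pips (1e-4)
    "1M": 12.0, "3M": 35.0, "6M": 70.0, "1Y": 140.0,
}

# Outright forward = spot + points / 10_000
curve = {tenor: round(spot + pips / 10_000, 4)
         for tenor, pips in forward_points.items()}
```

The value proposition in the demo is that the assistant writes, runs, and debugs this kind of script from a natural-language prompt, so the quant only reviews the output.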

An FX quant is able to use LSEG's Visual Studio Code AI Coding Assistant to build their FX hedging strategy in seconds. LSEG's prompt templates enable them to rapidly write new code without worrying about syntax or the right order of operations so they can focus on building value. The quant can do this across any of LSEG's powerful, trusted models, all using natural language. They have real working code that they can deploy to production. We're expediting strategy development across every model in LSEG's arsenal, accelerating production use cases on the back of the analytics API. It's happening where our customers write code. LSEG Analytics is delivering on its three pillars. We are delivering coverage across hundreds of high-value, cross-asset analytics models and sources. We are achieving enterprise scale through our high-performance, interactive analytics API.

We are improving efficiency by delivering AI-enabled analytics to customers through the most commonly used channels, like Databricks, the Model Context Protocol, Visual Studio Code, and LSEG Workspace. In short, we are bringing LSEG everywhere, helping clients analyze faster, make more confident decisions, and innovate at scale. Now I will turn it over to Nej D’Jelal from Workflows. Thank you.

Nej D’Jelal
Head of Workspace and D&A, LSEG

Good afternoon and welcome to the LSEG Workspace session. I am Nej D’Jelal, Group Head of Workspace, LSEG's customer-facing flagship platform that serves over 350,000 users across the trade lifecycle. Now, building on what Ron and Gianluca shared earlier, our ambition is to be the leading provider of accurate, trusted data that underpins actionable insights for our customers. Workspace is where that vision becomes reality for financial workflows. Across the financial sector, professionals lose valuable time switching between systems and chasing data, inefficiencies that cost global institutions millions.

Whilst AI brings speed, value comes from confident decisions and secure collaboration in one place. This is where LSEG differentiates, bringing together actionable insights underpinned by the market's most comprehensive, trusted, and accurate data, as mentioned by Todd and Emily earlier, all of which is delivered through AI capabilities in Workspace. Secondly, integrated workflows enabled by Workspace working seamlessly with our customers' Microsoft tools. Thirdly, secure intercompany collaboration powered by Microsoft Teams and Open Directory. The result is an unparalleled package deal of trusted and accurate insights embedded where work happens, driving better decisions and collaboration across the industry. Today, you will see three demos that bring this vision to life. Firstly, Workspace AI in action, integrated with Excel, PowerPoint, Teams, and Open Directory. We will also show you trading workflows where we have integrated Teams, Workspace, and our analytics partner, TradeFeeder.

Finally, we'll showcase a proof of concept of Microsoft Copilot agents integrated with Workspace. These demos will show how our core enablers drive commercial impact, boosting license value, expanding reach, and unlocking new revenue streams. Let's introduce the demo. We start with a banker preparing a pitch for a private equity firm. Pitch books are notoriously time-consuming, hours spent chasing data across systems, switching between Excel and PowerPoint, and manual formatting. When you consider the tens of thousands of professionals who spend 15-30 hours a week on a single pitch book, the opportunity to accelerate that process, but without compromising trust or data accuracy, unlocks millions in efficiency gains alone.

Hence, our ability to bring together trusted and accurate insights, workflow integration, and collaboration as one single package into our customers' tools means they can build and share pitch books faster and with more confidence than ever before. The commercial benefits for LSEG are clear. Even deeper workflow integration drives higher license value, leading to stronger retention and price uplift. Let's dive into the demo.

Our customer opens Workspace and wants to get up to speed on a company in their coverage sector, so they ask for an update. The response is constructed using LSEG's unique, trusted data and exclusive content, including Reuters News. There's also an unmatched range of equity research and access to global deals data and corporate events. Workspace AI brings this all together in one summarized response. Even at this early stage, we have already saved hours by delivering data within a single, seamless workflow.

Now, they ask for key highlights from an earnings call. Workspace creates a summary using transcripts, news, equity research, and filings, all with clear citations integrated seamlessly into the workflow. The customer is now equipped with unique insights derived from a wide range of LSEG's exclusive sources. They need further research and analysis to build a pitch book. The user asks Workspace to create a company report. Workspace analyzes the request and knows what is required. It creates a comprehensive report using LSEG's trusted data, including financial reporting and estimates, news, equity research, deals data, comparable peers, shareholders and management, and more. The peer set is something they'll need for the pitch book. In this case, they want to add a revenue column to the table. Workspace can help with that via its AI companion. The companion understands natural language and can search for and modify content.

In this case, it correctly interprets the request, fetches the data, and updates the report. It is easy, fast, and accurate. Now that they have what they need, they move to analyze it using the Workspace Excel add-in. The LSEG Workspace add-ins enrich data discoverability, enhance visualizations, and improve reusability of analysis and reports. First-of-their-kind LSEG data types are integrated closely into core user workflows within Microsoft Excel and PowerPoint. They are an example of the seamless interoperability between Workspace and Microsoft Office. The peer set is available instantly in Excel. Thanks to the Workspace add-in, it is auditable directly using Workspace data objects. These objects elevate data from flat values within cells, adding deep metadata that allows for seamless exploration of LSEG content without leaving the Excel sheet. Now they have the relevant analysis. They want to create a chart for the pitch book.

They ask Workspace for help. Using natural language and agentic AI, they can add data points, make adjustments, and create a chart in seconds. They can also add the chart to the Workspace asset library, a library of user-created content, such as slides, charts, and tables, for seamless edits in the future. The chart is now ready for further editing and review, so it is added to the pitch book. After a few iterations, the pitch book is ready to be shared with their client. The most effective way is directly through Microsoft Teams and Open Directory. Here, they find an embedded Workspace agent that helps them surface data within the Teams environment. Because Microsoft Teams and Open Directory have simplified collaboration, they are able to use the Workspace agent to help find the right contact to start the conversation.

The pitch book is shared, and Open Directory has enabled seamless and secure intercompany collaboration, supporting a more efficient workflow.

As you have just seen, Workspace isn't just a terminal. It's the entire financial workflow. We saw trusted insights, integrated workflows, and collaboration through Microsoft Teams and Open Directory, delivering speed, confidence, and collaboration for our customers. Commercially, this means higher retention, more usage, and upsell opportunities, while embedding us deeper into the customer's environment, thereby expanding distribution and reach. Importantly, AI in Workspace and Open Directory are in beta pilots, and the Office add-in and Workspace app for Teams are both generally available for some data sets. Now, let's move on to the second demo. This time, we'll show you how Workspace helps traders make faster and smarter execution decisions. In volatile markets, execution costs can make or break a trade.

Yet traders often rely on fragmented data and manual processes that slow down their overall workflow and increase risk. Working with a partner that specializes in trade performance analytics, Workspace brings everything together in the form of natural language queries like, "What's the best way to execute this order?" and embedded analytics and execution tickets in one single workflow, and collaboration with liquidity providers via Open Directory. For traders, that means speed, accuracy, and reduced risk. For LSEG, it means even deeper workflow integration, driving license value, increasing retention, and price uplift. Let's dive into the demo.

Here we have another familiar scenario. A portfolio manager has closed a US fund and is now looking to invest in Europe. He contacts his trader for an overview on FX spot conversion. Using the Workspace agent, the trader can share analytics directly in the chat.

They can see different strategies and then choose to view it within Workspace. Tradefeedr tells them that an RFQ provides the best predicted results, and they launch an FX ticket in Workspace with the pre-selected liquidity providers from the analysis. That is not all the trader can do. They also use the Workspace agent to review post-trade cost analyses. The trader sees that one provider is charging some of the highest spreads while also having the highest reject cost. Once again, this is where Open Directory can help by facilitating intercompany collaboration. In line with best execution policies, the trader wants to contact the provider, share the observed information, and take appropriate action. After starting the conversation, they use the Workspace agent to share costs directly and arrange a meeting to discuss them. Ultimately, they agree on an outcome in line with best execution policies.

Why is this different? It's trusted analytics insights. It's industry-leading data accuracy, workflow integration across Workspace, Teams, and third-party specialists, and collaboration with liquidity providers via Microsoft Teams and Open Directory. For traders, that means speed, accuracy, and smarter decisions. For LSEG, it means higher license value, wider adoption through no-code analytics tools, and partner monetization. That's how we turn trading complexity into a seamless value-driving experience. In terms of availability, the Teams app integration with Tradefeedr and Workspace is live, with additional enhancements to come. Now, let's move on to the final demo. This time, we'll show you a proof of concept that we're working on with plans to release next year. Earlier, Tim showed how we can enable customers to create an analyst report from their own environment. Now, let's look at another approach, this time from the perspective of a Workspace user.

In this instance, we are transforming the way analysts research and make decisions through Workspace integration with Microsoft Copilot agents. Today, buy-side analysts spend days pulling filings, news, and their own internal notes, manually modeling scenarios in fragmented workflows. It is slow, error-prone, and it delays investment decisions. By integrating Workspace with Copilot's researcher, analysts can pull filings, news, and internal notes in seconds. They can build comprehensive reports using natural language prompts. Effectively, they move from research to ready in minutes, not days. For our customers, that means speed, confidence, and end-to-end insights that combine their data with LSEG's accurate, trusted data. For LSEG, it means even deeper integration into our customers' environments, driving license value, increasing retention, and expanding distribution. Let's see it in action.

This time, we start in Copilot using the Microsoft 365 researcher agent to pull information from corporate resources and data, documents, and commodities research from LSEG Workspace. The seamless integration allows the user to create a comprehensive report combining LSEG's licensed data with the portfolio manager's own data using natural language prompts. As the report is built, Copilot researcher plays an active role in guiding the process, for example, asking for clarification on the approach. This kind of intelligent prompting saves time and helps ensure that analysis is aligned. Once the scope is defined, a detailed data-driven report that provides a comprehensive overview of the sector is generated, with citations that link back to Workspace. After reviewing the initial draft from Copilot researcher, edits are made. The report is easily formatted into a polished Word document ready for sending.

Here, the user benefits from seamless interoperability between the Copilot researcher results and Workspace and can access deeper information where required. After reading, our portfolio manager pinpoints several areas where further insights from industry would be valuable. Using Microsoft Teams and Open Directory, it's easy to quickly locate subject matter experts, engage with them, and gather the necessary intelligence. This is achieved through the LSEG Workspace Microsoft Teams app, where users can also discover financial information and tools with natural language, share Workspace content, and access Open Directory.

Here again, we've shown why Workspace is different.

Through our integration with Copilot, we have transformed a research workflow that delivers trusted insights in minutes, again made possible by our three core enablers: accurate, trusted data, entitlement-aware, auditable, and combined with customer data; integrated workflows inside Microsoft tools where analysts work; and collaboration, which is enabled by preparing insight-ready analysis that can be shared using Open Directory. For analysts, this means speed, confidence, and better decisions. For LSEG, it means higher license value, broader adoption, and upsell opportunities via premium AI features. As mentioned, the integration with researcher is currently a proof of concept, and we are planning to introduce this next year. As we wrap, let me first thank you for joining this session. I'll leave you with one thought: Why does Workspace stand apart? Because Workspace is more than a terminal.

It's the entire financial workflow, a package deal of actionable insights, accurate and trusted data, integrated workflows, and secure collaboration, all in one. It's designed to transform how financial professionals work, and that transformation is already underway. Many capabilities are live today, others are advancing through beta pilots. As we look ahead, we're excited to partner with our customers to shape the future of trusted and accurate financial workflows. With that, I'll hand over to David for closing remarks. Thank you.
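[Editor's note: the "entitlement-aware, auditable" enabler described above can be illustrated with a minimal sketch. All names, datasets, and the API here are hypothetical, not LSEG's actual implementation: every data request is checked against a user's entitlements and logged for audit.]

```python
# Hypothetical sketch: entitlement-aware, auditable data access.
# Names and data are illustrative only.
import datetime

AUDIT_LOG = []  # every access attempt is recorded for later audit

# user -> datasets that user is licensed to see (hypothetical)
ENTITLEMENTS = {"analyst-7": {"commodities_research", "fundamentals"}}

def fetch(user_id: str, dataset: str) -> str:
    """Serve a dataset only if the user is entitled to it, logging every attempt."""
    allowed = dataset in ENTITLEMENTS.get(user_id, set())
    AUDIT_LOG.append({
        "user": user_id,
        "dataset": dataset,
        "allowed": allowed,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{user_id} is not entitled to {dataset}")
    return f"<contents of {dataset}>"

# An entitled request succeeds and leaves an audit trail.
report_input = fetch("analyst-7", "commodities_research")
```

The design point, under these assumptions, is that the audit record is written before the entitlement decision is enforced, so denied attempts are visible to auditors as well as granted ones.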

David Schwimmer
CEO, LSEG

Thank you, Nej. I think we will get a slide up there in just a moment. There we go. This slide sums up really well what you have seen over the last hour or so. Our DNA business is built on great data: extensive, trusted, accurate. That is the foundation of everything that we do.

Most investors have a very narrow, direct experience of our products. It is typically just the left-hand side of this chart, where our data is vertically integrated with our own UI, with Workspace. As you have seen this afternoon, that is just a small part of our reach, and our reach is expanding every week. Whether it is combining tick history and machine-readable news in Google BigQuery, or leveraging agents in Microsoft Copilot, or doing advanced fixed-income analytics via our analytics API or in Databricks, LSEG is everywhere. Now, for the rest of the afternoon, we are going to showcase some great innovations from across our other businesses.

What you are about to see is just a small selection of our product portfolio, but it will give you a sense of how close we are to our customers, embedded in their workflows, responsive to their needs, and building solutions that help them grow revenue, save costs, and manage risk. Your lanyard will give you your personalized journey for the next couple of hours as you rotate through the different rooms all on this floor, or ask any of the hosts or the IR team if you get lost. The breakout sessions will begin in about 20 minutes at 3:30 P.M. We will see you back here in the theater for Q&A at 5:15 P.M. Thank you very much.

Okay. Shall we dive into some Q&A? First of all, hold on a sec. Thank you all for returning for the Q&A.

We know we have thrown a lot of info at everyone today, and our intention was to give you a lot of information without it being overwhelming. We are also looking forward to this session in terms of taking your questions. With that, I saw that quick move on the first hand. Go ahead. I think, do we have microphones coming around? Why don't we come up here to the second row, please?

Yeah, so you have from.

Jim in the middle.

Yeah, sorry.

Ian White
Senior Analyst, Autonomous

Thanks very much. It's Ian White from Autonomous. Thanks for those presentations. Three from my side, please.

David Schwimmer
CEO, LSEG

Can we actually, sorry, keep the questions to one question per person?

Ian White
Senior Analyst, Autonomous

Okay. All right.

David Schwimmer
CEO, LSEG

We'll try to get it around.

Ian White
Senior Analyst, Autonomous

I feel like it's take three for me in this Q&A, so I'll try again. Okay, for my one question then, please. In terms of the two sort of partnership products you've discussed with Microsoft today, I'm thinking about Open Directory and Copilot Studio. Can you talk us through what is the enduring advantage gained by LSEG relative to its data vendor competitors from its role in those partnerships specifically? I'm obviously thinking of the non-exclusivity of the partnership with Microsoft. To kind of put it simply, I can understand how Microsoft benefits from LSEG identifying areas where its products could be improved and sort of fine-tuning these solutions. But does LSEG's first mover advantage in those partnerships provide an enduring long-term edge? Can you talk us through some thoughts on that, please?

David Schwimmer
CEO, LSEG

Sure. Sure. Ron, you want to touch on the sort of strategic aspects of Open Directory? And then maybe Emily, if you want to touch on Copilot Studio?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Sure. You're right, Open Directory can be federated, but there's something really important that we need to leave you with. If you walk away with one thing on this, it's the automated domain management tool, the ADM tool, which gives us the ability to scale our communities and to manage those communities; that is exclusively licensed to LSEG for financial services. Yes, others could potentially build those over time, but we have exclusive licensing rights to it in financial services. I think that's a tremendous advantage for us. We also, through our messaging platform, already have a well-established, large community. That's yet another benefit: we can leverage that set of communities, then populate our Open Directory and manage it. That really is key from our perspective.

David Schwimmer
CEO, LSEG

Emily?

Emily Prince
Head of AI, LSEG

Yeah, so on Copilot Studio, when we bring the MCP into Copilot Studio, there are a couple of pieces there. When we enable that, it's actually opening up additional use cases. Because we're working so closely with Microsoft, not just in the context of Copilot Studio but across that broader ecosystem, which we've been spending time on today, it is allowing us to build out entire customer workflows that are centered around our data. The way that was done is very meaningful. The second thing is, when we think about quality in terms of those custom agents and what's happening in Copilot Studio, you heard Matt talk earlier about the work that we're doing hand in hand with Irfan, with daily conversations.

We are thinking very deeply, when we're working with customers directly, about how they think about the opportunities with agents, which is opening up those broader opportunities for us.

Ron Lefferts
Co-head of the Data and Analytics, LSEG

I would add one more part, too, which is still emerging. We bet early on Fabric, as you heard before, and Fabric is becoming quite prolific across a number of large accounts. There are a lot of advantages in terms of co-mingling our data with customer data, and advantages in faster integration there. We expect that to be an advantage for us going forward as well.

David Schwimmer
CEO, LSEG

Thanks very much.

Emily Prince
Head of AI, LSEG

Thank you.

David Schwimmer
CEO, LSEG

Shimon, pass it down. I will, it's a little hard to see people in the back, but I will make an effort to see you if you wave from there as well. Go ahead, Emily.

On the LDAs that you mentioned, could you talk about how these partnerships with your customers are structured differently from what your agreements looked like before, particularly on the pricing side?

Sorry, differently relative to kind of a regular relationship?

Yeah, what those relationships looked like before you set up the LDAs. You mentioned 17% of the ESV are now LDAs. What's the aspiration there in two or three years' time? What percentage would that look like?

I'll answer your aspiration question, and then maybe Ron, if you want to touch on the differences. We view this as an organic development with the LDAs. We do not have a targeted level; we are not aiming for X%. We see them fitting very well with a number of the customer relationships, and Ron will talk about that in a moment. As I mentioned in my remarks earlier, we see significant outperformance in those relationships beyond the perimeter of the LDAs because of the structure, because we effectively become the default provider for those customers. Even if there is something that is outside the perimeter, we are often the natural first call. You want to touch on some of the structural benefits?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Sure. We get approached much more for that type of arrangement than we provide. We have very specific criteria around how we engage with customers in that enterprise type of arrangement. Typically speaking, what we do is, without getting into too much detail around it, we understand where their consumption patterns are from our different services that are in scope of that agreement. We lay that out in terms of a joint customer value plan where we expect that growth to continue over that period of time for the agreement. We have now a very deep canon of specific cases where value can be unlocked, meaning they can find a competitive displacement, they can use additional services to drive top line or to drive bottom line efficiencies.

We go through that with them, and we outline that case very specifically for each customer. From that, we determine what the commercial relationship is going to be. Then we enter into that arrangement. That is about as comfortable as I am talking about the details. What happens is, as David said, we become the default provider. Because our growth is, of course, built into that synergy case, they are highly motivated to execute on those projects, as are we to support them. Through that, we identify tangential opportunities. As we develop new products, which are outside of that framework, we have a higher propensity to close those deals. That has generally led to an outperformance in those accounts relative to peer firms that are not in that kind of agreement, and absolutely relative to what the arrangement was before.

I hope that helps.

Thank you.

David Schwimmer
CEO, LSEG

Anyone in the back? Out there, I'm trying in the back row. Excellent.

Melwin Mehta
Analyst, Sterling Investments

Thank you. Thank you very much, David. Melwin from Sterling Investments. I think you reminded us today what a lovely business you have created in the last seven years that you've been here, David. Fantastic work by you, and congrats to the team. My question was actually not about LSEG this evening. It is about making the exchange a platform for other great companies to list, grow, and come to the markets, in terms of the LSE rather than LSEG. Any thoughts there in terms of giving momentum, encouraging new listings, new IPOs, the smaller AIM market, et cetera? Thank you so much.

David Schwimmer
CEO, LSEG

Sure. You would have heard from Charlie Walker earlier in terms of what is going on with the private securities market. That is just one of many different things that we are doing across our equities franchise. The exchange itself has been the beneficiary of a huge amount of change over the last few years, and we have driven a bunch of that. We have worked with the government. We have worked with the FCA. AIM itself is going through a consultation right now in terms of what can be done to continue to improve on it. We have seen a significant uptick and benefits from a lot of those changes, with the IPO market reopening over the last couple of months. The pipeline looks very good.

I think about the notion of breaking down that kind of bright line between public markets and private markets, everything that we're doing on digital market infrastructure, and the continuing support for the companies that are already listed on the exchange: we're probably the best market in the world in terms of dual listings, in terms of good partnerships with exchanges in other parts of the world, whether that's Africa, the Middle East, Asia, et cetera. I actually feel very good about all the progress that's being made in that area. There are some dynamics in the broader global market, whether it is the growth of private equity that we've seen over the last 10-15 years, or the uncertainty in the U.K. market since Brexit, which was not particularly helpful, et cetera. A lot of that is behind us.

I think the pipeline, as I said, looks very good, and a lot of the changes have been very productive. Thank you. I am going to go into the third row. There we go.

Andy Lowe
Equity Analyst, Citi

Hi, thanks. It's Andy Lowe from Citi. You've leaned in hard throughout the presentations on the benefit you have in terms of your data being AI ready, with lots of talk about consistency and scrubbing the data. Could you maybe expand a little bit more on that point? With the ability of large language models to do more with unstructured data, why is that advantage that you currently have not eroded, as peers may be able to do that more easily? Thank you very much.

David Schwimmer
CEO, LSEG

Yeah. Emily, you want to?

Emily Prince
Head of AI, LSEG

Yeah, happy to. When we stepped through the slide earlier, we stepped through certain steps. It starts with sourcing across a lot of different contributors, supplemented by our own proprietary data generation. Then we get into some of those steps that you were just referencing. These are really intricate steps, and they take a lot of deep expertise. Actually, a lot of that knowledge is encoded in something we internally call data models, which describe relationships between data. If one source says a bond has a par amount of $1,000 and another says a par amount of $100, then for you as a consumer of that data, that is a very undesirable effect unless you have gone on to correct it. It can cause a lot of problems.

It's a very simple example, but we can get further and further, and the nuances in financial services are vast and deep. If I say country of risk versus country of issuer, that has completely different meaning and consequences. Now, LLMs can do some of that in terms of understanding the context, but not in the level of detail that we need to achieve the level of quality that our customers really require out of this content. When we talk about AI ready content, we're going further in terms of providing all of those semantics and detail like that that allows for very confident use in the context of LLMs. That is multi-stepped. One of the other points that I made earlier is how much iterative cleaning goes into this as well. You can't just take data out of the box.

We really have to cleanse it to achieve the level of quality we want. Then on top of that, make sure it's delivered with all of that history going back decades with a consistency. Why does that matter? If I was, I'll take an example of quant, and I wanted to build a signal, I want as much history as I possibly can and as much orthogonal information in that data to build the richest type of signal. That's why the breadth and depth of this data matters so much, but also that level of quality that goes into AI ready content.
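[Editor's note: the kinds of checks Emily describes, conflicting values for the same bond across sources and semantically distinct fields such as country of risk versus country of issuer, can be sketched minimally. All field names and rules below are hypothetical, not LSEG's actual data model.]

```python
# Hypothetical sketch of a data-model consistency check on one bond record.
def check_bond_record(record: dict) -> list[str]:
    """Return a list of consistency issues found in a single bond record."""
    issues = []
    # Cross-source agreement: the same bond must not carry conflicting par amounts.
    pars = set(record.get("par_amount_by_source", {}).values())
    if len(pars) > 1:
        issues.append(f"conflicting par amounts across sources: {sorted(pars)}")
    # Semantic distinction: country of risk and country of issuer are different
    # fields with different meaning; one cannot silently stand in for the other.
    for field in ("country_of_risk", "country_of_issuer"):
        if not record.get(field):
            issues.append(f"missing required field: {field}")
    return issues

record = {
    "isin": "XS0000000000",
    "par_amount_by_source": {"vendor_a": 1000, "vendor_b": 100},
    "country_of_risk": "GB",
    "country_of_issuer": "",
}
problems = check_bond_record(record)
```

The point of the sketch is that each rule encodes domain knowledge about how fields relate, which is what makes the cleansing "intricate" rather than a generic deduplication pass.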

Andy Lowe
Equity Analyst, Citi

Great. Thanks very much.

Emily Prince
Head of AI, LSEG

Thank you.

David Schwimmer
CEO, LSEG

I see a question over here. Yep. Go ahead.

Thanks.

You got a microphone coming to you. Right behind you. There we go.

Thanks. I mean, you gave a lot of insight into the data, and I think you're very much an early leader there. In the context of the multi-year agreements you have with partners and things like that, what I really wanted to understand is at what stage you'll be able to monetize that more aggressively. Do you think about monetizing more aggressively on a consumption basis? How does that work with usage potentially going down, not just in Workspace, but generally across practitioners? Can you ensure that that's a strong net positive for the business? How are you positioning for that?

Yeah. So let me just talk a little bit about usage and consumption-based pricing in general. Today, we already have that in parts of our business. They're relatively small today. As we roll out more and more of this technology, we will be able to do it across much broader parts of the business. For example, if data is consumed through an MCP server, that is something that is very conducive to tracking usage and implementing consumption-based pricing if we want to. A lot of what we provide today does not have that kind of metering available. It's on a contractual basis. As we go down this path, you'll see us moving, both technologically but also operationally and financially, down a path of being able to do more and more of that. We will be doing this in a thoughtful way, in a careful way, in a way that incentivizes as much data consumption as possible.

In other words, we don't want to disincentivize the data consumption with pricing that is too aggressive. This is not, you know, this is not a new problem. In other words, other industries have gone through this before. We will certainly be learning from that and making sure that we are maximizing the customer usage of the data, while at the same time optimizing our pricing to maximize our revenue intake. This is going to be a journey that we are going to be on over the next couple of years. I don't know if there's anyone who wants to add to that. Okay. Yes.
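[Editor's note: the consumption metering David describes, where requests through an API such as an MCP server are counted per customer and priced later, can be sketched as below. The class, names, and rate are hypothetical illustrations, not an actual LSEG pricing mechanism.]

```python
# Hypothetical sketch: per-customer usage metering for consumption-based pricing.
from collections import defaultdict

class UsageMeter:
    def __init__(self) -> None:
        self.requests = defaultdict(int)  # customer_id -> count of data requests

    def record(self, customer_id: str, n: int = 1) -> None:
        """Count n data requests served to this customer (e.g. via an MCP server)."""
        self.requests[customer_id] += n

    def bill(self, customer_id: str, rate_per_request: float) -> float:
        """Consumption-based price: usage multiplied by a per-request rate."""
        return self.requests[customer_id] * rate_per_request

meter = UsageMeter()
for _ in range(3):
    meter.record("client-42")  # three data requests pass through the server
invoice = meter.bill("client-42", rate_per_request=0.05)
```

The contrast with a flat contractual license is that the invoice here is a pure function of observed usage, which is why metering has to exist in the serving path before consumption pricing is even possible.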

Hi. You guys spoke about integrating TradeWeb further into Workspace. Can you talk about why now, what enabled that, and what opportunity you see from doing that?

Sure. I mean, fundamentally, it's not that complicated. As you all know, we have been getting closer and closer to TradeWeb, doing more and more in a closely integrated manner. The big issue for us in terms of having TradeWeb come through our front end was the Eikon migration to Workspace. It didn't make sense to spend time on that until Workspace was fully migrated, established, embedded. It's now something that's, I would say, very close to the top of the list. As I mentioned in my remarks earlier, I expect to see that access to TradeWeb through Workspace in 2026. I don't want to give a specific date at this point, but I'm pretty comfortable with the first half. Does that help address that?

The opportunity that you see from it?

I think the opportunity is consistent with the broader opportunity of Workspace as the front end to so many different parts of our portfolio. TradeWeb in particular, from a TradeWeb perspective, there's a lot that goes on in the market that is considered voice trading. It's actually people chatting on Bloomberg and then executing. If they can move people off of that, and that includes taking advantage of Workspace, taking advantage of Workspace Open Directory, that's a big shift in terms of the ecosystem. From their perspective, I think that's very interesting, very attractive. I'll let them speak about that specifically. From our perspective, it is about making us that much more competitive in a critical asset class. You know, we're seeing the strength that we have had historically in FX play through across the lifecycle and kind of the end-to-end offerings that we have.

To maintain or to build that kind of strength in fixed income is a very attractive proposition from our perspective. Right there, yes. I'm trying to mix it up in terms of the questions. Hopefully, we'll get to everybody.

Shivangi Varshney
Equity Analyst, BNP

You mentioned how you have.

David Schwimmer
CEO, LSEG

Can you just quick intro of yourself?

Shivangi Varshney
Equity Analyst, BNP

Oh, hi. I'm Shivangi. I'm from BNP. My question is regarding Workspace. You have these different avenues via which you're distributing the data. What incentivizes a customer to opt for Workspace over the other avenues that are there?

David Schwimmer
CEO, LSEG

To offer the customer to.

Shivangi Varshney
Equity Analyst, BNP

To opt for Workspace over the other avenues.

David Schwimmer
CEO, LSEG

What's the competitive attractiveness of Workspace?

Shivangi Varshney
Equity Analyst, BNP

Yes.

David Schwimmer
CEO, LSEG

Ron, would you like to take?

We think it's great.

Ron Lefferts
Co-head of the Data and Analytics, LSEG

I mean, it's tailored to different communities. It has integrated workflows. It's now interoperable with Microsoft. We've got the Open Directory coming. We've got all of our unparalleled data that is accessed through it. So we feel pretty strongly; we have a very compelling case in terms of how we compete in the market. Was your question specifically around the competitiveness of Workspace, or was it around an alternative to, say, like a.

Shivangi Varshney
Equity Analyst, BNP

Sorry if I wasn't very clear. Like, within, you also have MCPs via which you can provide your data and your customers can access it. So amongst all those avenues of distribution of data, what upside would Workspace have that customers would be inclined to install that instead of just having their own models?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Oh, okay. Okay. As opposed to just accessing through some other.

UI.

Shivangi Varshney
Equity Analyst, BNP

Yeah.

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Right. We want to meet customers wherever they want to be. Some customers want to have their own bespoke solutions. Many of our customers have their own user interfaces, and they just want to leverage our data as part of their own curated workflow. We have always been open to and fully support that as well. If customers, or partners, just want to take our feeds, then in some cases we co-sell with them, and we will own the customer relationship. One of our biggest relationships there, for example, is Aladdin with BlackRock, where their platform is completely powered by our data front to back. When they bring on a new Aladdin client, we contract with that client directly and maintain that relationship. We have those types of arrangements.

With, you know, AI or any other type of interface, it follows that pattern from our perspective. If customers want to use that, that's their choice. We feel strongly that integrated workflows across everything that you've heard today, and especially the more complicated and regulated workflows in a trading environment, require all these other capabilities that are non-trivial to build. Especially in those cases, we feel Workspace is a very compelling option for customers. If they feel they have a different case and would like to use another route, we're happy to support that as well.

Shivangi Varshney
Equity Analyst, BNP

Thanks.

David Schwimmer
CEO, LSEG

Two rows back from there. Two rows back. There you go. Can't. That's. You just had your hand up. I can't see who it is from here, but person right, two people in front of you. No, that row. I'm calling on people now. You have to have a question. If you don't have a question, that's okay. Who else? There you go, right there.

Ben Krause
Analyst, Wellington

Hey, Ben Krause at Wellington. As you think about training your own tools, and it's relevant for WorldCheck, but I think it's relevant across the business, using customer data to make your tools better and stickier, like, how has that discussion with customers evolved? Do you feel like that is a competitive advantage across LSEG business?

David Schwimmer
CEO, LSEG

You want to take that one?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Sorry, your question was, do we train our model using customer data?

Ben Krause
Analyst, Wellington

Using customer data and, like, are there any parts of the business where it sort of creates like a network effect, whether it's WorldCheck or other?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Yeah, I'll start, if I can just chime in as well. First of all, right now, we're not building any models ourselves, right? We don't think of ourselves as in the model game. We're not building any frontier model. We're not a lab. The way that we think about this is that our data is our secret sauce. We do work with the LLMs. We do work with our scale partners. What we want to be able to do is use those LLMs, merge them with our data, and allow our customers to use their data and our data, merge them, and build solutions. That's the path we are on in terms of building our products.

David Schwimmer
CEO, LSEG

I think, to address maybe part of the question you're asking: we do have the ability, for example, with our Risk Intelligence data, because we have such a strong position in the marketplace, to learn from the usage without retaining anything that is customer identifying. We can learn from the usage. For example, if there is a particular person who regularly comes up in adverse media, but it's always wrong for some particular case, we can learn that. We can learn it in one customer's case, and we can apply it in another customer's case, without any sharing of information across those. That kind of thing we absolutely can do. But.

Gianluca Biagini
Co-head of the Data and Analytics, LSEG

We actually, in those cases, are using that to make sure that our answers get better over time. It is not training the model; I was specifically trying to answer your training-model question. We make our process better every day based on the answers we give to our customers.

David Schwimmer
CEO, LSEG

Yeah. Okay. Over here.

Ben Bathurst
Equity Research Analyst, RBC

Hi, it's Ben Bathurst from RBC. As a company, I think you spoke two years ago about a $50 billion unvended opportunity. I wondered how much closer are you to monetizing that opportunity today, and how have your thoughts changed about how to address that, if at all, with respect to developments around agentic AI?

David Schwimmer
CEO, LSEG

Yeah. I'm happy to take that one. Anyone should feel free to jump in. You're referring to what we had talked about a couple of years ago as the opportunity in managed data services, basically, or managed data as a service.

That's right.

We still think that opportunity is out there. We still think it's a very attractive opportunity. We have basically prioritized what we're doing in terms of all of the work to embed AI in our functionality, all the agentic workflow that we're doing right now. It does not mean it's gone away, but this has just been a function of prioritization. Right, second row here. Right in the middle.

Russell Quelch
Analyst, Rothschild

Thanks. Russell Quelch from Rothschild. We sat here a few years ago now asking the same questions, so I'm going to ask you the same question as I asked two years ago, actually. The data and analytics business was growing at 5% then. It's growing at 5% now. We've had a lot of ambition on the product side. I think it's well recognized that the data is good. The technology is getting better. On the slide today, you've said, you know, the segment's growing at 5%, or mid-single digits. You've changed that to a language-based target. What is the.

David Schwimmer
CEO, LSEG

Just to be clear on that, there was no intention to have any kind of signaling or change in terms of that page. We have not made any change in our guidance or anything along those lines today. I just want to be super clear on that.

Russell Quelch
Analyst, Rothschild

That's my question done. I guess the question is.

David Schwimmer
CEO, LSEG

Come up with one quickly.

Russell Quelch
Analyst, Rothschild

Come on. Get on with it. What's the ambition? I mean, is the ambition to grow at the segment rate? Is the ambition to grow above the segment rate? I'd love to hear you match your ambition on the product side with some ambitions on the financial targets.

David Schwimmer
CEO, LSEG

Sure. Again, we're not giving any guidance today, and there's been no change in terms of any of the guidance. There is no shortage of ambition. You have seen us, and MAP went through this in our opening remarks, in terms of the consistent uptick across the different parts of the business, including in the subscription-related businesses. We see the opportunity here not just to grow at the segment rates, but to grow at segment rates and take share and find new customers and address new TAMs. Without putting any numbers out there, you know, we look at the highest-performing segments of each of our competitors.

If they're growing faster than us, we try to figure out why and try to address what we need to do to match or beat that level. That is how we think about it. I can't give you a time frame. I can't give you particular guidance. You see how we are investing in our product. You see the kind of change that we're driving. Hopefully, you can see from today the progress that we're making and the ambition that we do have.

Russell Quelch
Analyst, Rothschild

Thank you.

David Schwimmer
CEO, LSEG

Can we go two rows behind Russell? Yeah, perfect. Yep.

Shashwat Verma
Analyst, Lansdowne

Hi. Shashwat from Lansdowne. Just speaking of ambition, on the Workspace side, do we think of Claude and sort of other LLM solutions as competitors? Obviously, I noticed the partnership last week, but just narrowly from a Workspace perspective, are they now competitors? How is the product and technology gap in terms of matching the LLM features that they offer? How's that going, I guess?

David Schwimmer
CEO, LSEG

Ron, you want to take that?

Ron Lefferts
Co-head of the Data and Analytics, LSEG

Yeah. We do not view Claude as competitive at this point in terms of the core workflows that we discussed, especially the highly regulated and orchestrated ones that are linked within trading workflows. It's not even in the same category from our perspective. From the perspective of a UI that can present financial information, that may apply to some customers; that makes sense. I would say, you know, once the accuracy is addressed, if it's within someone's tolerance, then it's an option for a limited set of activities, is how we would view it. From our perspective, having those kinds of technologies out there, and seeing how we can potentially leverage them to help our own search within Workspace, we feel that's an advantage. That's something we're also going to take advantage of.

We know that our search currently is obviously highly accurate and works. As you heard Nej present, we're building out our Workspace AI, and we'll roll that out when we are highly confident that it is highly accurate. We believe that that will be a great alternative for someone, especially when they want to have more integrated workflows. If a customer wants to go a different route, we will support that as well through our data.

David Schwimmer
CEO, LSEG

Can we get third row microphone here, please? Up here. Right here.

Sorry, I couldn't see.

Thanks. Can I just pick up on that point a little bit? We saw, I think, seven or eight examples today of prompt-based search and very interesting analysis. Is it good enough yet? Could you maybe qualify that answer with the kind of discussions that you're having with your key customers, mainly the banks, and mainly around anything that's proximate to a trading engine? Is it good enough yet to be monetized effectively?

Is that an accuracy question?

Yes.

Yes. Okay. Emily, do you want to take that?

Emily Prince
Head of AI, LSEG

Yeah, happy to. Let's structure it in two parts. When we think about accuracy, there's a couple of pieces to this. One, it's the accuracy in the underlying content. One of the things we said earlier is that across those three distinct channels, with Workspace, the proprietary experience, with customer proprietary implementations, and partners, we uphold that quality in respect to that underlying data. That holds. Now, I want to divide this into two parts. One, when we're exposing data through the likes of MCP or otherwise, into Databricks or otherwise, we're still providing that level of quality in the underlying content. When we look at Workspace as an LSEG-led experience, what we're doing is we're taking an extra step.

Not only are we making available that 33 PB of data, which is quite a lot, we're also doing what Ron was mentioning in terms of that orchestration, AI-powered where it makes sense. There are deterministic workflows, like with FX, where it does not always make sense. Where it does make sense, we're creating that interoperability. The third piece is that we go an extra mile with Workspace in terms of ensuring the level of quality in respect of someone's prompt, which I think is the basis of your question. When we get a prompt in Workspace, we now go further in making sure that its context is understood in the context of the data we're serving up. We really are going an extra step on accuracy in respect of Workspace.

David Schwimmer
CEO, LSEG

Go ahead. Third row, right in the middle.

Enrico Bolzoni
Equity Research Analyst, J.P. Morgan

Thanks. It's Enrico Bolzoni from J.P. Morgan. We're seeing some reports with eye-watering figures in terms of the CapEx that is expected to go into AI projects across the world. At your recent results, you stood by your CapEx guidance for the next few years. I just wanted to understand what gives you confidence that you won't unexpectedly need to increase your CapEx just to keep pace with the industry. If that's not the case, is there a risk that the players that are indeed increasing their CapEx dramatically will try to pass on some of those additional costs to other market participants that benefit from it without spending it themselves?

David Schwimmer
CEO, LSEG

MAP, you want to take that?

Michel-Alain Proch
CFO, LSEG

Sure. First of all, in order to square off the numbers, we are not in the business of buying chips for billions of dollars. As Irfan was saying, we are not building an LLM. There are other people who are doing this very well, and we're using them. On the CapEx side, two things. The first is that we're spending double our competitors: if you look at the FMIs or at the data providers, with CapEx at around 10% of revenue, we are at double. That's number one. Number two is that we went from 15 to 12 to 10, and we're going to high single digits next year.

Actually, next year it will not be a decrease of CapEx in terms of millions of pounds; it will be pretty much stable. The important thing is that we've been investing to cover the technical debt we inherited from Refinitiv. Year after year, that investment, within the roughly GBP 900 million of CapEx, is becoming smaller and smaller. I can't give you a figure because it's not a public figure. What I can tell you is that it decreased 25% in 2024, and it will decrease even more, 26%, in 2025. That is going to open up space for us to invest into growth and into AI or other things while keeping our CapEx flat.

David Schwimmer
CEO, LSEG

Enrico, was there a second part of your question in terms of what others might be spending on? I just want to make sure.

Enrico Bolzoni
Equity Research Analyst, J.P. Morgan

The question was related. I appreciate you're not in the business of doing chips. It was, I guess, a bit more of a philosophical, big-picture type of question, which is that clearly there are some players that are in those sorts of businesses, and they're spending huge amounts of money on it. I was wondering whether there is a risk that, to some extent, this cost will cascade through the rest of the industry, also across those players that are...

David Schwimmer
CEO, LSEG

I think, yeah, I got it. Sorry. I think it's actually just the opposite. What I mean by that is you have massive competition. If there are three legs of the AI ecosystem stool, they are compute, which is data centers and chips; the models; and the data. You're seeing huge capital going into the models. You're seeing huge capital going into the data centers and the chips. There's enormous capital going in there, and that is also enormous competition. There will end up being more commoditization in those two legs of the stool. In the data, you can't just throw money at it to create more data. You can create synthetic data, but the quality of synthetic data is based on the quality of the underlying data. Fundamentally, there's a defined universe of data.

We have the best content set, the best data estate. We see that as being protected while those two other legs of the stool will, over time, become more commoditized.

Enrico Bolzoni
Equity Research Analyst, J.P. Morgan

Thank you.

David Schwimmer
CEO, LSEG

Yeah. Right here. There is a mic coming from behind you. There you go.

Nadim Rizk
Analyst, PineStone

Yeah. Hi, David. This is Nadim Rizk from PineStone. The question is more longer-term. When I look at all these amazing AI tools and products that you showed us today, I can't help but wonder about long-term employment in the financial industry, or, if you want to call it, the number of seats, and think that the number of seats will eventually shrink because you're going to be able to do so much more with so much less. If that's the case, how do you think this would get reflected in your business?

David Schwimmer
CEO, LSEG

I'm happy to take a shot at that. Anyone should feel free to jump in. We have had this dynamic for a number of years. In fact, in this theater, we gave our low single-digit guidance for the workflows business. Before AI was such a prominent topic, there was a question as to, you know, electronification. Are we going to see more and more data growth, but relatively limited kind of human participation? That may be exacerbated by AI. I think from our perspective, we are very well positioned to serve our customers where they want to be served. If that means a smaller number of humans doing a lot more, great. We're really well positioned for that. If that means more consumption of our data through our data and feeds, great. We're well positioned for that.

One other thing I just want to put out there, OK, this market is changing dramatically. There is an enormous amount of disruption going on in this market. It is not beyond the pale to think about the fact that some of our new customers may be agents. OK, as there is a profusion of agents, think of each agent needing access to data. That could be a very different model from humans, agents, data, and feeds. You have already heard a number of companies thinking about their HR function overseeing their people and their agents. You have heard about people putting agents on LinkedIn to be hired. Again, lots changing. I do not want to tell you, hey, you know, this is what we are charging per agent. I think it might be overly simplistic to think, hey, people are going away.

It's going to be all AI, and that's going to reduce the economics.

Bertie Thomson
Analyst, Brown Advisory

Thanks. It's Bertie Thomson from Brown Advisory. Thanks to everyone for today. I enjoyed the conversation with Matt from Microsoft. If we think back to 2023, David, you were quite confident that we'd see a material contribution to revenue from the partnership in 2025, which sounds like it might have been pushed out a bit. Do we still expect a material contribution from the partnership? If so, when should we expect to see it?

David Schwimmer
CEO, LSEG

Absolutely is the answer. It is also, let's say, a steady upward trajectory as opposed to a spike up. We've had this conversation with a number of you, and I think it was touched on during the course of today's conversation. Part of our business is that there tends to be gradual adoption. I think I've said on a few occasions, we could introduce the most amazing product in the world tomorrow, and it would still take a few quarters for us to see a fairly gradual uptick. I think that's just the nature of this industry: risk-averse, big customers, adoption periods that can take a while. We feel very confident about the upside in terms of the revenue generation.

One thing that we have talked about: you've seen our analytics business growth double over this past year. That is one of the three businesses within data and analytics. It is the one area where you do see meaningful in-year sales or in-year adoption, as opposed to the other areas, which tend to be longer-term contractual or subscription revenue that picks up over time, where we're often displacing other competitors. Hubert?

Hubert Lam
Analyst, Bank of America

Thanks. It's Hubert Lam from Bank of America. A question on your data. I know you pride yourself on the breadth and depth of your data, the proprietary, the nature of it, as well as the petabytes of data that you have. Can you talk about how concentrated is the use of the data? Like, how broad-based are the users using across all your data sets? Is there like some sort of 80/20 rule where 80% of the people only use 20% of the data you have? Or is it more broad-based than that?

David Schwimmer
CEO, LSEG

I don't know the answer. Anyone want to put a hand up on that one?

Matt Koerner
Corporate VP and CTO, Microsoft

It is contextual by community, I think, is the answer. In investment management, for example, we do not see big demand for our real-time data. In the quant business, they clearly want a long history. It really depends on which community and which use case they are focused on. We do see diversity within that. I do not know, Gianluca, if you would like to add anything to that.

Gianluca Biagini
Co-head of the Data and Analytics, LSEG

I also think that it depends on the firm type. There are some firms which historically have it more in their DNA to take massive amounts of raw data and analyze the data well, from a quant perspective and so on, because that can create a competitive advantage. There is also a trend of clients who want to see more actionable insights and information, perhaps with us doing some of the work at source. It really depends on the client type.

Hubert Lam
Analyst, Bank of America

It's not 80/20.

Gianluca Biagini
Co-head of the Data and Analytics, LSEG

It's widely distributed.

David Schwimmer
CEO, LSEG

OK. Anyone? Hand it right to your right there. Yeah.

Mike Werner
Analyst, UBS

Thank you. Mike Werner here from UBS. Just a question. You were talking about maybe agents, right, as a potential customer base. Do you see, or is there demand from the client base today to potentially unbundle some of your pricing, for example, within workflows? I mean, you're adding all these incremental enhancements and opportunities. Do you see a world where you just price that data, not through a data feed, but through, say, an MCP server or something like that? Is that something that you're seeing from clients today?

David Schwimmer
CEO, LSEG

Not meaningfully at this point. It gets back a little bit to the question we had over here about consumption-based pricing. We really like the subscription model. We think that's a great model. Even when we are fully capable of having usage-based pricing, we're not going down the path of saying, hey, if you want this little piece of data, over to you. You can pay a small amount for that. We want to maintain the subscription model, and then within that have the usage-based bands and the consumption pricing within that. At this point, we're seeing, and I'll combine those last two questions, we're seeing this shift to sort of model consumption, enabling our customers to access a much more significant amount of our data as opposed to wanting a little bit here or there. I don't know if anyone wants to add anything.

Emily Prince
Head of AI, LSEG

I'll just emphasize that last point. Traditionally, when people implemented data, they would typically focus on one set of data. That's not the way that models work. What models actually do is encourage you to look left and right. You don't need to look up a catalog to say, I want that additional set. You don't need to speak to your technology team to then integrate it. It is there. It's ready. It's accessible. The models know where to look. That, together with the fact that we've got a lot of relationships between these data sets, means that when you ask your first question, it's actually going to bring back a really relevant set of data. Where you've got providers like LSEG providing that breadth of content, it becomes extremely powerful in providing very competent responses.

David Schwimmer
CEO, LSEG

I have just checked with the boss here, Peregrine. We are at time. We are going to adjourn from here to have some drinks. Of course, we'll all be there, and we're happy to continue the conversation and answer your questions there. Again, thank you very much for your time today. Thank you for joining us, and thanks to those who joined us online. We really appreciate the interest, and we hope you have found it to be an interesting and worthwhile day. Thanks a lot.
