Cirata plc (AIM:CRTA)

Earnings Call: H2 2020

May 5, 2021

Okay. Welcome, everyone. Welcome to the WANdisco full year 2020 results presentation. On the call today, we have David Richards, CEO and Co-Founder, and Erik, our CFO. What we'll do is run through the presentation with David and Erik. For those of you on the call, you can ask a question through the Q&A box in the Zoom window at the bottom of your screens. Alternatively, raise your hand. We'll take priority from those who type written questions first, and then we can move to those who raise their hand, and we'll try to unmute you and get you on. So with that, I'm handing over to David to take you through the presentation. Thanks, Gaurav. As Gaurav mentioned, this is David Richards. I'm the Co-Founder and CEO of WANdisco. I'm actually in our Sheffield office today, and I'm the only person here, in this new world that we live in. Hopefully, this is going to run very smoothly. So, these are our full FY 2020 results. In terms of agenda, I'm going to go over the strategic and financial highlights and the opportunity that we see ahead of us with Microsoft Azure, and go through some recent case studies. Then Erik is going to cover the FY 2020 financial results before handing back to me for the outlook and summary and, as Gaurav mentioned, Q&A. Feel free to ask questions in the Q&A box. In terms of our primary strategic goal, as many of our shareholders will know, it is associated with fortifying the Microsoft partnership, with the go-to-market launch and metered billing in Q4, appearing as a deeply embedded native Azure application. That is key to our strategy moving forward: to transition from an enterprise software company to a cloud service company. As part of this, we saw joint blogs from Microsoft; we are part of the Data Lake Migration Program and a featured app in the portal, and they made it very clear that this is their preferred migration solution. We're jointly targeting migration of over 100 petabytes of data in FY 2021.
And you'll be pleased to see later in this presentation that I'll be going over some of the progress we've made over the past 12 months in regard to that goal. We also launched LiveData Migrator on Amazon Web Services. On the AWS platform, in September 2020, we launched this with GoDaddy and achieved migration competency status, which means that we are also a preferred app on the AWS Marketplace. We've got a funded joint marketing program currently happening with AWS, and we're very pleased with the progress we're making there. And we're the chosen migration solution for EMR, which is the Elastic MapReduce, or Hadoop, technology built into the AWS cloud. We're targeting migration of over 30 petabytes of data in FY 2021. Behind all of this, we're also expanding SI (system integrator) relationships, including Mphasis. We also see a growing opportunity in artificial intelligence and machine learning with Databricks and Snowflake, and those familiar with the U.S. markets will know the performance of those companies has been extraordinary over the past couple of years. So, financial headlines. We've obviously got a robust balance sheet to accelerate the conversion of pipeline into revenue. Revenue for 2020 was down on the prior year as we revised expectations associated with the launch of the metered billing applications I mentioned earlier. The statutory loss from operations was up from £28,000,000 to £34,000,000, a reflection, of course, of the lower revenue in the period and increased cash overheads as we went to market with our new products. We do have a strong balance sheet to accelerate the conversion of this cloud opportunity: cash reserves of over $55,000,000 at the 30th of April and virtually no debt, following the $42,500,000 fundraise we did post period end, just a couple of months ago. And, to reiterate, there is continued Board confidence in the outlook, restating our target to deliver at least $35,000,000 in revenue in financial year 2021.
So in terms of cloud platforms, system integrators, and data and AI ISVs, our unique capabilities have been recognized across the whole ecosystem. In terms of cloud platforms: on Azure, as I mentioned, we have deep integration, one of the only third-party applications to be delivered as a native service on Azure; on AWS, we're the first migration vendor to receive competency status, with a funded joint marketing program. And we're also going to start talking about Google Cloud Platform. We are seeing demand for this, we now have resources to support that platform, and that's in progress. In terms of data and AI ISVs, we've already talked in the past about Databricks, and we have a LiveData platform plug-in, which is going to be available in the second half of the year; that's approaching preview very soon. And Snowflake: we announced a strategic partnership with Snowflake recently. We're looking very closely at joint pipeline with them at the moment, and the plug-in is currently being developed for our applications. We are seeing demand for Snowflake, and we do have existing customers who also need integration with Snowflake and the data warehouse that Snowflake provides. In terms of system integrators, I've already mentioned Infosys. That's a key global SI relationship, and they were Microsoft's SI of the Year last year. And then in terms of data and AI specialists, companies like Neudesic, Motifworks and Avanade already have public applications powered by our technology available in the Azure Marketplace. So we talk about being positioned for growth and the four Ps: product, partnerships, promotion and priorities. Product: in order to do this, we had to launch a self-serve, automated data migration tool. That's coming from a background of being a traditional enterprise software vendor with POCs and hand-held, white-glove service, so we had to reverse that and become a turnkey solution.
We launched that in Q4, and that was absolutely a critical stepping stone in transitioning from enterprise software to the native cloud business that we keep talking about. In terms of partnerships: the largest cloud platforms, AWS and Azure, the analytics-focused SIs, and the largest cloud analytics services (Snowflake, Databricks, Azure Synapse, Azure HDInsight, which is the Hadoop implementation, AWS Glue and the EMR that I talked about earlier) all recognize the need to move data at scale from on-premises data lakes into the cloud, and only WANdisco's technology is able to do this. We'll see a customer reference who moved over 13 petabytes; those of you who were present at the Dallas event a week or so ago would have heard the VP from that company talk about this publicly. From a promotion perspective, we have backing from leading cloud vendors and SIs, giving us instant market presence, talking about this being the preferred solution for data migration. And why are they doing this? Well, every dollar spent with WANdisco represents a 10 to 20 times financial gain for those vendors. The likes of Databricks and Snowflake need data to power their applications, and, as I mentioned earlier, the way in which they get that data at scale is using WANdisco. Priorities: our primary focus is to accelerate the conversion of 130 petabytes of data from on-premises Hadoop systems into the cloud; that's critical for us to meet our objectives this year. So the Azure opportunity is really just beginning. When we talk about being the only solution able to migrate at petabyte scale: at one of the largest companies, actually the largest telecommunications company in the United States, the VP of their Cloud Application Infrastructure was asked the question, why did you pick this particular piece of technology to move this 13 petabytes of data? And this is a transcript of what they said. So, they looked at a lot of different technologies.
What they needed was a tool that could move data reliably at scale. They have 13 petabytes of data in this data lake. There are a lot of companies that can deal with terabytes of data, the small data sets, but they needed a solution that could do this at volume and in less than a year. And as they say here: "I have to credit WANdisco, because we didn't believe that this could be possible at this kind of scale. And I think we're already more than 50% of the way through moving that data set, in about half the time that we imagined it could be done." So this is a brilliant example of a company that had a massive strategic objective to move mission-critical, business-critical data from on premises to the cloud without any business interruption. And given the size of this company and the importance of this data, I think this is a really good customer example. This is actually a look through the window when we talk about IT modernization and how cloud is going to change everything. You can see here just the impact that the cloud is having on these companies. In the table on the right-hand side, you can see the former technologies that they had: Hadoop, Teradata, Vertica, the traditional data warehouses. And you can see the technologies that are now being adopted. For raw, unstructured data, the data is going to go primarily from Hadoop to Databricks running in the cloud; for self-service deep insights, cloud-based technology; and the data warehouse, of course, is Snowflake. You can see that that really is the software stack moving forward. And the really nice thing about this customer is that they're looking beyond just the initial migration into the cloud. At WANdisco, we talk about the three phases of enterprise cloud adoption: the initial move into the cloud, then keeping the data in sync so they can gradually move application infrastructure from on premises to the cloud.
And then finally, our goal, of course, is to become the broker of data between clouds. And you can see here, very early in that cycle, customers already looking to move data between Azure and AWS for Databricks and Snowflake. That is already happening today. We're obviously delighted about this at WANdisco, because we're already designed into that architecture as they move from on premises to cloud and then between clouds. We think it's unlikely that any particular company will adopt a single-cloud strategy, as they may have done in the past with relational databases and so on. So multi-cloud is certainly going to be a thing, and they're going to arbitrarily decide to run compute applications, which is going to be one of their biggest fixed costs, between different clouds. So this arbitrage is definitely going to happen, and we like the fact that WANdisco is the sole broker of that data between those clouds. So, some feedback from Azure; again, some important proof points. Turnkey solution: that means you can begin the migration in minutes. Preferred solution: they very rarely say that, if ever, for Hadoop-to-Azure data lake migrations to their cloud. And really, the key point of the relationship with Microsoft is that the LiveData platform for Azure appears on the same monthly consumption bill as all of the customer's other Azure purchases. So there's no friction in the relationship here with the customer; the customer is just using their existing purchasing relationship with the cloud vendor. And they may have large commit-to-consumes. In the case of some customers, a big pharmaceutical company has a $1,000,000,000 commit to consume, and a large telecommunications company has a $2,000,000,000 commit to consume, and they can use that commit to consume against the procurement of WANdisco technology. And you're going to see us talk about consumption, which a lot of these successful companies are doing.
I'll talk about that later in the presentation. So, in terms of the Hadoop migration program, which they launched, you can see here another proof point from Priya, who's the VP of Data and AI at Microsoft, talking about modernizing Hadoop workloads, as you saw on one of the prior slides. This all sounds good, but surely migrating the data is going to be impossible? That's not true, according to Microsoft: with WANdisco's LiveData platform for Azure, you can migrate production data from on-premises big data platforms to Azure Data Lake Storage with no application downtime or risk. And that really is the key: WANdisco can do this at very, very low risk and latency. Again, some supporting quotes. Very nice support here from Merv Adrian, who is the lead analyst in data and analytics at Gartner: LiveData's ability to move petabyte-scale volumes of data, without interrupting production and without risk of losing data mid-flight, is something that no other vendor does. I think they call that the silver bullet. And Johnson Controls, another recent public customer example: "We've been migrating data to the cloud, to Azure, using the LiveData product from WANdisco." We also talk about the importance of the last mile. It's not just about moving data from on premises to the cloud; it's also about turning on the lights of these other services. We talked about some of them earlier: HDInsight, Databricks, Synapse and Snowflake are all real examples of, "Yes, I'm going to move my data to the cloud, but what's the driving force? Really, I'm moving my data to the cloud to achieve artificial intelligence and machine learning." And that's really why those companies are taking quite large risks to move that data to the cloud. So this last mile of transformation is very important. And again, to reiterate: each dollar of WANdisco translates to $10 to $20 of ACR, and this is attractive to the whole ecosystem.
From an Azure seller's perspective, why do they care about WANdisco? Well, it accelerates Azure consumed revenue for them, and customers can use it against their MACC program, which is the commit-to-consume program. It also enables ACR by solving the data gravity problem: wherever data lives is where the applications will live, and we can do this at a predictable time and cost. We can tell somebody exactly how much it's going to cost and exactly when the migration is going to be finished, like the example I gave earlier. From a global SI perspective, it's the fastest time to value. SIs really make money when they're building artificial intelligence and machine learning applications, not lifting and shifting data from on premises to the cloud. For the other platform services, again, it's the fastest time to value at a predictable time and cost, and that's why Databricks and Snowflake care about WANdisco. I'm going to hand over to Erik now, who's going to talk about the financials. Great. Thanks, David. This all has to be viewed in the context of what we set out to achieve in 2020, and that was to get market-ready with Microsoft, which we achieved. We emphasized going to market, and that's had some effect on recognized revenue, which was off to $10,500,000 from $16,200,000 in the prior year. Cash overheads rose to $36,900,000 from $31,700,000, based upon the investments we've made in engineering and go-to-market resources, and that manifests itself in a greater EBITDA loss. We had a good cash balance prior to the fundraise of $21,000,000 and, after the fundraise, $55,300,000 at the 30th of April, so just a few days ago. Next slide, please. On the revenue side, just to unpack that a little bit: our big data revenue actually grew 18%. Some of the larger ALM deals (application lifecycle management, the earlier products we had) in 2019 didn't repeat in 2020, and that's somewhat by design; our future is big data and moving data to the cloud. So that was a pleasing result.
Cost of sales is off a little bit based on lower revenues. We're still a high-gross-margin product company, in the 90-some-odd percent range, and that's linked to the leverage of how we go to market. Cash overheads, as we discussed, were about $37,000,000 versus about $32,000,000 in the prior year. For this year, we're looking at about $44,000,000 in cash overheads, with all of the increase going into engineering and go-to-market resources and a headcount of between 200 and 220 people. Going to the balance sheet and summary cash flow: a good balance sheet, based on the fundraise we did during the year and the fundraise we did post period end, of $55,300,000. So we are well capitalized to attack the mission that we have, and that's to go to market with Microsoft and AWS. And then turning to the balance sheet: a strong balance sheet, no bank debt as of April 30, very little on it as of the end of the year and a good, strong cash position at present of $55,300,000. So, pretty much results as anticipated, and as we discussed earlier. Next. We've just completed a successful fundraise to strengthen the balance sheet, and what are we going to go off and do with that? We're going to continue to invest in the go-to-market with Azure and AWS, and further on in the year GCP; build specialist teams to go after the machine learning and artificial intelligence players such as Databricks and Snowflake that we have relationships with; and continue to widen the relationships with system integrators who are building migration practices around our technology. We've begun to hire sales specialist teams for Azure and AWS. That's going to extend into building the teams to go after the machine learning and AI folks, and to do a few tweaks to engineering to make that a more seamless experience.
We're also going to accelerate work with the third big cloud provider in the US, Google Cloud, and increase field enablement for SIs as they begin to add resources to build out that migration practice. And one of the things we're considering, and we believe having a strong balance sheet is a prerequisite to doing this, is a potential dual listing on a U.S. exchange to better position the company with our commercial partners. The bulk of our commercial partners are based in the U.S., and it gives a vote of confidence in the company to see it listed there. But that will be done at the time when we believe market conditions and the company are positioned properly. And with that, I'll give it back to David. Thanks, Erik. So, outlook and summary. We've talked about the enormous TAM associated with data movement from on premises to cloud; for Data Migrator specifically, that TAM is $50,000,000,000 to $75,000,000,000. Where does that come from? Well, when we did the work on TAM with Microsoft, they estimated 200 to 300 exabytes of on-premises analytical data that will have to move to the cloud. I don't think there's any question that that is also growing somewhat exponentially as we speak, as that data remains on premises and has not yet moved to the cloud. And obviously, we've got partners heavily incentivized to move this, with that 10 to 20x revenue multiple that they see by moving that data via WANdisco to the cloud. Our target specifically this year is over 100 petabytes of data with Azure and over 30 petabytes of data with AWS. That takes WANdisco's FY 2021 target to $35,000,000, which is a $350,000,000 to $700,000,000 opportunity for our partners. And obviously, that excludes what I showed you earlier with cross-cloud, or cloud-to-cloud, data movement; brokering data between clouds is on top of that from an artificial intelligence and analytics perspective.
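The arithmetic behind the figures just quoted can be sketched in a few lines. The per-gigabyte rate at the end is my own back-of-envelope inference from the stated TAM and data volumes, not a company-quoted price:

```python
# Back-of-envelope check of the figures quoted in the call.
# The implied $/GB rate below is an inference, not a stated price.

def partner_opportunity(wandisco_revenue, multiple_low=10, multiple_high=20):
    """Partners see a 10-20x gain for every dollar spent with WANdisco."""
    return wandisco_revenue * multiple_low, wandisco_revenue * multiple_high

# FY 2021 target: $35m of WANdisco revenue.
low, high = partner_opportunity(35_000_000)
assert (low, high) == (350_000_000, 700_000_000)  # the $350m-$700m quoted

# TAM: 200-300 exabytes of on-premises analytical data, valued at $50-75bn.
# Both ends of the range imply roughly the same rate per gigabyte:
GB_PER_EXABYTE = 1_000_000_000
rate_low = 50e9 / (200 * GB_PER_EXABYTE)   # ~$0.25 per GB
rate_high = 75e9 / (300 * GB_PER_EXABYTE)  # ~$0.25 per GB
```

The consistency of the two implied rates suggests the TAM range was derived directly from the exabyte estimate.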
So, pivoting to a consumption-based model. Those of you who follow Azure, Snowflake, Databricks, AWS and so on will be very familiar with this, but consumption really is true SaaS. This is where you look at the way in which, and the degree to which, the product is actually being used. I'll come to the differences between a traditional subscription model and consumption in a second, but all services in the cloud fabric are using a consumption-based approach, and I'll explain why. In this world, product and pricing are key. This is not just a change of business model; it actually affects the entire company. When we talk about that turnkey experience that customers have to have, a traditional enterprise company really struggles, because you need a turnkey experience, you need full cloud integration, and you need billing integration as you move to this consumption model. And a culture focused on driving lifetime value is very important. So our sales compensation model has to move to consumption, and we are targeting those consumption goals: 100 petabytes of data in Azure, for example, is the consumption goal for this year, and revenue then falls out of that, obviously. We are heavily incentivizing net new logos, because that 100 petabytes of data represents around 50 customers. We are incentivizing our sales organization heavily to achieve a lot more than 50 net new deals, net new logos, as we call it, this year. And then there's the product's evolution, which I've already talked about: we have to be driven by data consumption, which is a far more agile model. And from a customer lifecycle management perspective, we need data on how customers are actually using the product, which drives interaction with customer success. So it's not just sales; it's customer success. We have to ensure that companies are actually using the product. So what is the change?
I have this conversation a lot with people. Most of us believe that the thing that changed with cloud was just this elastic infrastructure that gives us tremendous scale. What people possibly don't realize is that the thing that has really changed is the business model that underlies it, and that was started by AWS. So what used to happen? Let's think about it. The enterprise software company would turn up, they would do a two- or three-month POC, and we used to talk about those 90-day sales cycles. Now, why would they have to do an elongated POC? Because the risk of whether that product worked or not sat with the customer, because the customer would sign up for an upfront two-, three-, four- or five-year deal and pay upfront for the product. Well, actually, only 50% of those were perceived to be successful, and 30% went on to be used in production. In a consumption model, the risk is with the vendor, because if the customer doesn't like it, they just click it off. They churn off and they stop consumption. So this is much better for customers, and we are seeing that customers really only want to license software on a consumption basis moving forward. Consequently, if you've got a turnkey product that can be turned on in a few minutes, then those POC cycles are very short, and customers continue to use it, because if it works they're using it in a production context. In the example I used earlier, the 13 petabytes, they started to use the product for a few terabytes of data and just left it turned on to move the rest of that data. On a subscription basis, those POCs have to be long, because the customer has to be certain that it's going to work, or have the perception that it's going to work. How does the contract work? Well, in a consumption model, it's metered. It's true SaaS.
What often happens after a period of time is that there is an agreed commit to consume. But the discounting on a commit to consume is nothing like the discounting that you would see from a typical software company on an elongated, upfront subscription agreement. In a subscription, of course, it's a flat rate. It's an upfront agreement, and you often see discounting in the 50% range for those larger deals. We do not see that in consumption, ever. From a software perspective, it's got to be self-service. It must provide immediate value, versus heavy services and a consultative sale. In terms of revenue predictability, what you actually get is hugely predictable revenue through cohort analysis. You get a large number of companies using your product, and you can see that with AWS and with Azure; and obviously, the market loves Snowflake because of this. In subscription, of course, it's lumpy: you do big deals on an unpredictable basis. And what's the sales process? Well, our guys have to drive POCs. They have to drive people to start using the product, versus a subscription model, where it's more selective. So that really is the definition of how native cloud services license and go to market with their product, and we're no different; that's exactly what we're doing. The benefits of this: it gives us increasingly reliable and predictable revenue streams. Our customers are able to increase consumption at will, and you often see far greater consumption than you would ever believe, certainly, historically, with companies that start off with consumption and then move to a commit to consume later. We're selling our service the way customers want to buy, aligned to the models of industry leaders such as Databricks and Snowflake, and, as I mentioned, with a significantly lower level of discounting versus subscription. So how are we doing?
Well, we've seen a doubling of pipeline, and you can see why we're talking about pipeline, given that we're driving POCs and continued use of the product. If you look at those key inflection points: private preview with Azure was launched in February, and we saw about 140 to 150 deals in pipeline; by the limited public preview, where people began to see the product, that was approaching 200. And when public preview was launched, we saw exponential growth, and that trajectory has continued, I'm pleased to say: well over 200, approaching 300, qualified opportunities in pipeline. And you can see how quickly this can move as the turnkey experience kicks in, the product turns on, and people start using and consuming. So, in summary, we've covered the four Ps. We're positioned to convert 130-plus petabytes in FY 2021. We've got the products: we launched a self-serve, automated data migration tool in Q4, and that was a critical stepping stone. We're deeply embedded into the AWS and Azure clouds; the analytics-focused SIs, and Snowflake, Databricks, Synapse, Azure HDInsight, AWS Glue and EMR, all recognize the value of this technology. From a promotion perspective, Azure is talking about this being their preferred migration solution. And from a prioritization perspective, we're laser-focused on moving that 130 petabytes. As we look out into 2021, with the shift to consumption, where revenue is recognized over the time the product is used rather than upfront, people will be used to revenue scaling over the year. So, to map pipeline progress to scaling revenue, we're going to provide three KPIs that we think are very important. I talked about the number of new logos, the number of new customer wins; and the notional MRR, monthly recurring revenue, which is metered revenue plus an estimate of subscription revenue expressed as MRR. So we're going to normalize that data for you, which will help you build models around the company.
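A hypothetical sketch of how the numeric KPIs described here (notional MRR, and the year-on-year retention rate mentioned with it) might be computed. The function names, field layout and figures are my own invention, not the company's definitions:

```python
# Hypothetical illustration of the disclosed KPIs. Figures are invented.

def notional_mrr(metered_monthly, annual_subscription_values):
    """Metered revenue for the month, plus annual subscription value
    normalised to a monthly figure (divided by 12)."""
    return metered_monthly + sum(annual_subscription_values) / 12

def retention_rate(active_now, active_year_ago):
    """Share of last year's customers still using the product today."""
    return len(active_now & active_year_ago) / len(active_year_ago)

# $250k metered this month, plus two annual subscriptions worth $1.8m/yr.
mrr = notional_mrr(250_000, [600_000, 1_200_000])
assert mrr == 400_000

# Three of last year's four customers are still active.
rate = retention_rate({"a", "b", "c", "d"}, {"a", "b", "c", "e"})
assert rate == 0.75
```

Normalising both metered and subscription revenue into one monthly figure is what lets outside modellers compare periods as the revenue mix shifts toward consumption.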
And the retention rate: the percentage of customers using the product versus those using the product a year earlier. We would expect to see companies continue to use the product, and we've certainly seen evidence of that. So, based on migrating 100 petabytes of data on Azure, circa 50 customers, and 30 petabytes on AWS, 15 to 20 customers, in 2021, we believe that revenues in 2021 will be at least $35,000,000. And with that, I'll hand back to Gaurav for Q&A. Okay. So we'll start the Q&A session and give you a minute or so to either raise your hand or type a question in the chat box. Obviously, if you're on the phone, then it would be a raising of the hand; and if you're on the Zoom call, then you can type in or you can raise your hand. We've got a couple of raised hands. The first one is George; I'm about to unmute you. Hi there, and thank you very much for taking my call. A few quickies from me, if you wouldn't mind. Large customers clearly love you; that's the message. But when you think about TAM, is there any issue that your sweet spot really appeals to these largest people, your 10-plus-petabyte kind of customers, and that although you look at a very large addressable market, your opportunity set is actually much narrower within it? And then secondly, if you wouldn't mind: I really like what you're saying about the migration from subscription to consumption. The downside, though, is that it does impact cash, as you move from prepaid to postpaid. Does that have any deeper implications in terms of your cash burn? Thank you. Okay. So I'll start off, and then maybe Erik can finish. In terms of customer size, I think the size of the initial customers who took on the product was a legacy of us being a traditional enterprise software company, just like most of the companies that you follow, George, and I'm sure other people do as well.
We are still very appealing to very large companies, but there's an awful lot of data in companies of all different shapes and sizes. With the new service that we launched in Q4, you will see a lot more, and we are already seeing a lot more, of those 300 or so opportunities in pipeline being of all sorts of different shapes and sizes. So you're going to see a transformation as we move to consumption and a cloud service. And then in terms of cash consumption: what you will see happen in a consumption-based model is that the POC time frame is considerably shorter, because the POC really involves using the product, and consumption begins much earlier in the cycle than you would see in a traditional enterprise software business. So we don't think there is going to be too much of a lag in revenue. Obviously, it gets consumed over time, but certainly we're confident on that. Erik, do you want to add anything else? Yeah. It does shift cash flows a bit into the future, and that's the typical effect one would see, but as David mentioned, it does accelerate time to revenue. These things pay, you know, on a 30-day cycle on metered revenue. So once we get a reasonable cohort with that, and one of the reasons for raising as much money as we did is to buffer that until it really gets going, we think that's a shorter-term phenomenon as we build the cohort of renewals going forward. So I think it dampens out in the wash after about the first year, honestly. I think we're in good shape with that. Thank you. Thanks, George. There are some questions on the chat, which I'm going to amalgamate, a couple of them that both talk about the same thing. Any idea, I guess for Erik, around the H1/H2 weighting to get to the $35,000,000? Yeah, probably a 30/70 split, because of the way that the pipeline builds and converts for this year.
So it'll be back-end loaded for the year, a typical enterprise cycle, and with subscriptions ramping, that's just a natural mathematical effect. Okay. I have another live question, from Andreas. I'm going to unmute you, Andreas, if you want to ask a question. Yes. Hello, guys. Can you hear me now? Yes, Andreas. Perfect. Okay. So, two questions from me. You're now more than four months into the year already. You talked about around 50 customers with Microsoft and 15 with AWS. Can you tell us how you feel about that? You are a third of the way into the year; how do you feel about reaching that, at least so far? Yes, we have a lot of companies who are in the process of evaluating the product and turning it on. In order for some of them to use it, for example in deals we have in South Africa and in Korea, and actually all over the place, I think in Canada, for example, the service had to be turned on in their region of the Azure cloud. So I'm not saying that there's a delay, because it's all now turned on, but a large number of companies are currently in the process of turning on and using the product, which will then convert into consumption, and we'll be guiding on that consumption in due course. In terms of the key metrics that we look at in feeling comfortable with that number, it's basic pipeline analysis. We have a lot of opportunities brought to us from Microsoft, AWS, Databricks and Snowflake as well, and we have more than enough in pipeline in order to achieve our goals this year. Okay. But on the slide that you've shown, where opportunities keep going up and up and up: are you disappointed with the conversion of that opportunity, or do you feel that it is in line? No, we're in line. We, for example, ran a training session; five banks, I think, turned up to that training session, all in the process of turning on the product. And this was just an open training session.
So yes, we're very happy with progress.

Okay. Then the second question: AWS and Azure each have around, I think, 2,000 salespeople, and Snowflake and Infosys and all of those have a lot of salespeople. When we last talked, about half a year ago, you talked about being like an IP licensing, royalty-type model, but it seems like you have reversed a little bit. Now you talk about hiring salespeople yourself and building that up. Can you maybe elaborate on that change?

No, no, no, there's no change to the model. We call them specialist sales because we don't want to hire a bunch of biz dev people. What we need is to manage 2,000, actually, it's more than 2,000, there are 2,000 specialists at Microsoft, but there are over 20,000 sellers. So we are in the process of hiring very senior, well-seasoned salespeople, from those companies actually, who are very interested in joining WANdisco, which is a very good sign, in order to manage those channels. So think of them more as channel managers. They do pipeline reviews, they sit down with Azure sellers, look at opportunities and so on, and make sure that WANdisco is baked into the solutions they are currently proposing to customers.

So like you said, it's not really salespeople, more like channel managers you're hiring? Yes. We've got two sides to our sales organization. There's direct selling, and we're not going to be expanding that. And we are hiring people to sit inside the channels to ensure that our products are sold by Azure, AWS, Databricks and Snowflake sellers.

Okay, that makes sense. That's all I have for now. Okay. Thanks, Andreas.

Okay. If there are any more questions, please type them into the box or raise your hand. We currently don't have any other questions. I don't know whether this is a follow-up question from George or the same question he had before.
Can you tell us at all, through Q1, the story of the 11 early adopters? What has their consumption been? How much data has moved? What is the attach rate like?

It's still early to give you those metrics, George. We will be delivering them in the very near future; it's still a little early for us to feel confident in delivering some of those, but I can say that it does look pretty good. A number of those companies were waiting for the service to be turned on by Microsoft in their region, because there are GDPR issues, right? If you're in Canada or in Korea, you can only run services that are within those clouds, within that region, and that's a legal requirement. So a lot of those companies were waiting for that to be turned on, which happened 2 weeks ago. So we're still in the midst of untangling all of that lovely data for you.

Great. And do we know what Microsoft's success factors are? Have they given you anything to aim at?

They only care about consumption. From that data-gravity perspective, they're very, very focused on every single petabyte of data. So success for them would be, well, a lot more than 100 petabytes of new data in our financial year.

And on Snowflake, it's impressive to see them in the partnership, but we go way back with Databricks and we're still not seeing meaningful revenue there. How will it be for Snowflake, do you think?

Well, I think you're going to see virtually every single deal that we do in Azure be associated with either Azure Databricks or Snowflake or both. Those relationships are not just nice logos to have; they are absolutely essential for this company. When the Hadoop data moves to the cloud, it is going to go primarily to Databricks. The data doesn't just go to the cloud, it goes to something.
So that's the transformation that we have built with them, and you will see further announcements, I'm sure, with ourselves, Microsoft and Databricks over the coming weeks and months. Very good. Thank you. Thanks, George.

A few questions are coming through the chat box. Can you speak to the opportunities from SIs and other channels in 2021 versus what you've done with AWS and Azure? I think you partly answered that, David, but maybe broadening the topic around how SIs are helping us could be helpful.

Yes. We've got a lot of SI interest, more than I could even mention on the slides, and there are some that we can't announce. We will only do or sign deals with SIs if they bring with them a pipeline of opportunities on one of those clouds. That's very important, I think, for everybody to understand. We could do 10x the number of SI deals that I talked about here; for us, there are 2 or 3 that are very interesting, that have strong pipeline and deals that are either in the midst of consumption or about to start consumption.

And then there's a question around the attach rate of LiveData Plane, any updated thoughts? What I'd say is, and Eric, you'd probably agree, it's early days in terms of moving from LiveData Migrator to LiveData Plane. What we'd normally find is that companies will use LiveData Migrator to move the data, and they will continue to use it post-migration until the point where they actually want to create a bidirectional hybrid cloud environment. That could be up to a year while they're moving applications, for example, maybe even longer. So it's far too early to talk about attach rates from Migrator to Plane, probably something that we could report on later in the year or at the end of the year.

Yes. Another question coming in: have any current pipeline opportunities withdrawn or declined to initiate service? If so, what reasons have they given for doing so? No.
The only delays we've seen are associated with the native service being available in the Azure cloud in those regions. That's the only reason, and they're now available and those are all beginning to start. I think, for the audience's sake, we really have to understand that moving a data lake into the cloud is a complex process. One part of that process is moving the data, where we're obviously the solution for doing that. For most customers, there are considerations around what they do in their extended relationships with SIs or the cloud vendor, in terms of the applications they're trying to move or trying to replicate, etcetera. That leads to commitment to the use of WANdisco technology. But because it's a consumption-based model, that doesn't give you commitment to the exact date on which they turn the service on. And that's why the data we'll provide for the first half will be much more relevant than providing it for Q1.

As far as I can see, unless you guys can see anything, there's nothing in the Q and A box, nothing else in the chat box, and no hands raised. So David, do you want to conclude?

Yes. Thanks, everybody, and thanks for the time today. Obviously, the KPIs that we talked about today are going to be very helpful for you in the future. And thanks again for all the support. Thanks, guys. Thanks, everyone.