SAP SE (ETR:SAP)

Product Launch

Oct 20, 2020

Speaker 1

Hello, everyone, and welcome to today's webinar. Thank you so much for joining us today. My name is Zermina, and I'll be hosting this webinar. Our topic for today is Smart Features and Self-service Analytics with SAP BusinessObjects BI, and we're very pleased to have our product expert and speaker here with us today, Priti Molchandani. Before I hand it over to Priti, I'd just like to go over some housekeeping tips so that we're all comfortable navigating this space.

So firstly, we really do enjoy seeing all of your questions throughout. There might also be some time at the end of the webinar to answer some frequently asked questions. We also have some resources listed for you in the resource box on your webinar console. There, you can find extra resources to learn more about SAP Analytics Cloud, and you can access the slides for today's session as well as a link for any upcoming webinars in this series.

And lastly, I'd just like to point out our survey. We really do appreciate all of your feedback and we definitely prioritize it when we're planning future webinars. So this does help us plan topics that you all would find useful and enjoy. So please do try and take some time to fill that out. It's only 2 questions, so it shouldn't take too long.

And you can actually access the survey widget by clicking on the 3 smiley faces on your bottom toolbar. So that's all for my introduction. So now without any further ado, I'll hand it over to Priti to kick off the presentation.

Speaker 2

Hi, everyone. Zermina, can you hear me okay? Yes, I can hear you. Okay. So hello, everyone, and thanks for attending this webinar.

And thanks, Zermina, for the introduction. As Zermina said, I am Priti Molchandani. I'm part of the product management team in the augmented BI cluster of SAP Analytics Cloud, and I welcome you all to the session today, where we'll be focusing on the smart features, our predictive features, in SAP Analytics Cloud. I will be showing, first of all, slides just to set the stage, and then I will move on to the live demo in our system. And finally, we will have the question and answer.

So I have Katrina as well to answer some of the questions. Feel free to enter them in the chat, and let me know if you want anything in between. So yes, I'll start with the slides now. The first piece of information, to set the background for this session, is that we will be focusing on the augmented analytics space. But before that, I think for SAP Analytics Cloud we should recognize the fact that we are addressing all the different types of business users.

So all the user personas, ranging from IT, then to the analyst, and finally to the information worker. What you see on the left-hand side is basically the IT persona, which helps in preparing the enterprise reporting: normal visualization, historical data reporting, and analysis. The IT persona would come into the picture to prepare the data models and set up the connections in SAP Analytics Cloud. In the case of a BOE landscape, that means setting up the connection to universes, either a live connection or an offline connection. That would be the work that IT and developers do.

And the next one is self-service analytics. This is where analysts, like business analysts and data analysts, come into the picture. This is basically the audience for whom the data is already in place, prepared by IT and development, but who want to do some slice-and-dice analysis using the self-service features in SAP Analytics Cloud that you will see in the demo. They would want to prepare dashboards and visualizations for sharing with other users. They can do drill-downs, they can do linked analysis, and so on.

So this is the analyst persona we are focusing on as part of self-service analytics. Finally, there is augmented analytics. In our case, we call these the smart features. These are features that are based on machine learning and AI technology, and they provide automated predictions and insights out of the data. And this, we think, fits the information worker persona, which is any business user in our day-to-day life.

So it could be a financial controller, a manufacturing or production manager, a sales vice president, or a customer service representative. Basically, when they are looking at the data, automated insights are generated out of the data and shown to them very quickly, in a very easy-to-understand visualization. This is where the technology is helping us, and this is the focus for today's session. Now, the comparison between the SAP BusinessObjects BI suite and SAP Analytics Cloud is the following. You might have seen this slide already.

With respect to traditional business intelligence, where we have features like exploration, discovery, and dashboards, SAC is quite okay there. Crystal Reports is one component for which we probably do not have a replacement in SAC. But in addition, the complementary features are planning and augmented analytics. As I mentioned, the session will focus on augmented analytics, but planning is also a feature that is part of SAC. SAC provides the combination of the three to help you with your business decisions.

Now, when we talk about augmented analytics or augmented intelligence, we are really talking about three different pillars, three technological enhancements that have recently been used in analytics tools: natural language processing, artificial intelligence, and machine learning. What we are saying here is that the existing dashboard and reporting capabilities are augmented with these AI, ML, and NLP technologies in order to provide you automated explanations and predictive insights, or even to allow you to make data-driven decisions. This is why we talk about the term augmented intelligence.

Now, what does this mean in business layman's terms? If you are a user, either an analyst or a business user, what questions can you get answered by these technologies and tools that we are showing in this webinar? It could be a question like: what will be my profit next quarter, or what was the profit last quarter? It could be a question like: how many employees did I have, in the case of an HR scenario? Or it could be an example such as: who is the best-performing sales person in the United States region?

So these are the kinds of questions you can answer. You can also answer questions about the future; this is where our predictive features come into the picture. You could ask, for example: will I be out of stock in the next few months? Or: what will be my revenue or expense forecast for the next six months or next year, depending on your data?

And finally, we have the insights feature, which gives you the key drivers that are affecting a particular KPI in your business. So these are the types of questions you can answer. We are not limited to these, but this is just to give you a flavor, as a business user, of what you can demand from SAP Analytics Cloud. Okay. So there is one more slide before I move on to the demo.

I just wanted to translate these questions and the technologies into the features we have in SAP Analytics Cloud. We call them the smart features, and we have six of them. Depending on your need, depending on the question, one or the other feature is used. I will also come to a point later on about which ones are supported on the live Universe connection and which ones are supported on the offline connectivity. First of all, all the features work with the offline connection. They are part of SAP Analytics Cloud as such; there is no separate license needed. They are part of the default standard license, which means that if you have access to SAP Analytics Cloud, you can get started with the smart features even today.

So the first feature is Search to Insight. This is the NLQ capability: natural language query and natural language generation. This is where you ask a question in simple, plain English and get an answer as an analytics visualization, which helps you build your analysis and dashboards quickly. You can imagine using Google search, or any search you use in your day-to-day life; it's exactly similar to that. But here, you are asking the questions with respect to the data models that you have in SAC, and you get answers in the form of charts and visualizations, which you can add back to your dashboard.

Then we have Smart Discovery. This is our exploration capability. Every feature is prefixed with the name "smart," so it can be difficult to differentiate them. But if you think about discovery, you're trying to discover or explore your data.

You're trying to discover interesting patterns and relationships in your data; that's where Smart Discovery comes into the picture. This feature is based on machine learning algorithms that are in-house, proprietary to SAP. We have classification and regression machine learning scenarios supported here.

And you will see that it is really simple. Once you have your data model, you basically specify what the target of interest is, the KPI for which you want to find the patterns and relationships, and then this will generate an output discovery for you. Then we have Smart Insights. It's getting quite popular among our customers; we receive very good feedback about this feature.

Without overwhelming the user with lots of different insights coming from different charts, data, and calculations, it automatically draws your attention to the key contributors. We will see that in the demo as well. And then we have time series forecasting. As the name suggests, this produces a forecast of the future depending on the time hierarchy you have in your data.

You can ask for the forecast by quarters, months, years, and so on, and it depends on how much data there is in your dataset: the more data, the higher the number of forecast periods you will be able to generate for the future. And then we have smart grouping. This is the clustering technique.

Again, this is also based on a machine learning technique for which we have our own IP. If you're familiar with data science terminology, this algorithm is similar to the K-means clustering algorithm, which helps you group or cluster different values in your data with respect to the KPI or metric that you're analyzing. An example could be: group all the customers in the United States according to the sales they generate, or according to the number of products they buy from us. The clustering will form three, four, five, or six groups, based on the common relationships these customers have.
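The grouping idea described here, clustering customers by a single KPI such as sales, can be pictured with a minimal K-means pass. This is a toy sketch in plain Python with made-up numbers and fixed initial centroids; SAP's actual algorithm is proprietary and far more sophisticated.

```python
# Minimal 1-D K-means sketch: group customers by one KPI (sales).
# Hypothetical data and fixed initial centroids keep the run deterministic.

def kmeans_1d(values, centroids, iters=20):
    for _ in range(iters):
        # Assignment step: each value joins its nearest centroid's cluster.
        clusters = [[] for _ in centroids]
        for v in values:
            idx = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

sales = [120, 135, 150, 900, 950, 980, 400, 420]   # sales per customer
centroids, groups = kmeans_1d(sales, centroids=[100, 500, 1000])
print(centroids)  # three group centres: low, mid, and high spenders
```

With this data the customers settle into three groups of similar spenders, which is exactly the kind of "group customers by the sales they generate" result the speaker describes.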

It's generic; it's not limited to sales data. I just took a sales example because it's easy to understand, but all the features are generic. Irrespective of industry, irrespective of your process, it will work as long as you provide the data in a proper data model shape. Then, finally, we have the Smart Predict feature.

This is more for the analyst persona, I would say, as compared to the information worker persona. This is because information workers will consume the results of Smart Predict. For example, which employees will be leaving next month? That is the kind of answer I get from Smart Predict. It gives me predictions for the future and helps me base my decisions on the outcomes it generates.

But for the building process, when you actually build with Smart Predict, you have to create a predictive scenario. This is where a business or data analyst comes into the picture. They don't need the knowledge of a data scientist per se, but they do need to be able to recognize whether a predictive model is of good quality or not; they can tweak some parameters and control some variables before actually taking the output of the predictive algorithm and putting it in the hands of a business user. So that's the Smart Predict feature. So yes, I just wanted to give you the overview at this stage of the six different features before I go into the demo of these features on top of Universe data.

I'll just pause here for a minute. And Zermina, I'm just checking with you if the pace is all right and it's going okay so far, or if there are any questions.

Speaker 3

Hi there, Priti. There is one question. Somebody did ask a roadmap question about languages. I don't know if you want to keep that for the end or address it now.

Speaker 2

Okay. So I'll just go

Speaker 3

to the roadmap for available languages.

Speaker 2

Okay. For the language support, right? The language-related translation support, yes. So basically, SAC supports various different languages.

When we generate the output from Smart Discovery, Smart Insights, or Search to Insight, it is language independent. That means it considers the language that you are logged in with and shows the output in the translated language. The only limitation currently is with the Search to Insight feature: you can ask a question in English only.

We have a roadmap item for Search to Insight regarding supporting the natural language query terminology in different languages. But it also means that your data could be in a different language as well. So say the question is: what was my revenue in the last quarter? If the name of "revenue" in your data is something else in French and you use that, it won't understand. And if, instead of "what," you type something else in French, it won't understand either.

So the phrases that we use are currently only in English for Search to Insight, but there is a roadmap item planned to support that in different languages too. Apart from that, all the generated output across all the different features supports different languages. Okay. I think I'll tackle the remaining questions at the end. It's good to move on to the demo, because it will take half an hour to demo all the features.

Speaker 1

Yes, that would be a good idea.

Speaker 2

Okay. Thanks. Yes, so I'm going to share my screen. And if you can confirm that you're able to see everything, that would be a good go-ahead for me. Can you see my screen?

Yes. We can see your screen. Okay. Before I go into SAP Analytics Cloud and show you the self-service and augmented analytics, I just wanted to quickly say that we fully recognize the data preparation and the amount of information you have in the BOE stack. So we do support the connectivity to Universes and WebI documents, and this is taken care of by the smart features, at least with respect to acquired, or offline, connectivity.

But I'll quickly mention what is possible for the live connection as well. So first of all, sorry,

Speaker 1

I have here

Speaker 2

the data set. Just a second, I need to set do-not-disturb for my team. Okay. So I'm sharing the screen again. Currently, I'm sharing a system which I have remotely logged in to.

This is the place where I have BusinessObjects Explorer and the different capabilities within it. I'm showing an Excel file just for the purpose of the demo; I have my data, which focuses on a CRM use case. This is how my data looks. In the real world, it could be coming from an SAP Cloud for Customer system or an S/4HANA system; you could have various joins with BW, your own third-party data files, and so on. But here, for the purpose of the demo, I have this Excel file, a CSV file, and this is the data about CRM.

What I have here are sales for different products, regions, and customers across the world. And, in the world of BOE, I have some measures as well as some dimensions, including a date variable, to play around with the different functions that I mentioned as part of the smart portfolio. Once I have this data, what I have done, as a sample, is create a universe out of it. First of all, this is the connection for my offline scenario, so it's a simple ODBC driver for Microsoft Excel.

Then there is a simple foundation layer, and then I have a universe. Here in my universe, I can set filters, as you will all be aware, or I could set restrictions with respect to column or row filters or authorizations; that's all good. But for the purpose of the demo, I'm just keeping it plain and simple. And when I preview the data, all the columns are there. So this is basically the place where all the information lives, in the universe.

Like I mentioned, you will have different data sets. This is all respected, and there will be no change with respect to how the data is situated in BOE; there is nothing needed on top. So once we have that, we move on to SAP Analytics Cloud. So now I'm going to SAP Analytics Cloud.

Actually, maybe before that, I'll quickly show the Universe connection. So here, I'm in SAP Analytics Cloud, in the connections panel. In order to make use of data situated in different systems, such as HANA, Universes, and things like that, this is the place you come to. This is mainly for the developer and IT role.

They will come here and form the connections to the systems. There are two connection types, as you know: acquired data connections and live data connections. In this case, I have one acquired Universe connection and one live. So this is the one for live: I specify the host and HTTPS port for my BusinessObjects system and the username-and-password mechanism that I want.

And this is all that you need for the live connectivity. Then, for the acquired connection, what you additionally need is a cloud connector, which makes sure that you can securely access the data from the Universe, which is sitting behind your company firewall. So this is the place where you specify your system: again, the host and port for BOE, the location that you set in the cloud connector, and the username and password. Once I have that, I will basically create a model in SAC.

Everything in SAP Analytics Cloud is based on a model. There are different types of models, but this is equivalent to a universe or a data model that you may already be familiar with. When we create a model, we can pick the connection: we can either say it is based on a live data connection or on an acquired data connection. In this case, for example, if I choose this and select the connection that I already configured, I'm saying that I want to form a new query.

And these are basically all the universes that are exposed through that connection. This was the one that I showed with respect to the sales example. In this query builder, you are able to select the fields that you want, or maybe the filters that you want to create for your analysis, before you move on to the stories and dashboards. So this is where you create queries, or create a model based on a Universe query. Once you do that, you will have a model, and then you can start creating stories on top.

So, moving to a different system where I have the whole demo set up, with an already-created dashboard, which is very user friendly and helps me explain things to you easily. So yes, here I come into the system. The first thing I want to do is actually start from scratch by creating a story. I'm going to focus on the smart features, but I'm starting with the creation of a story based on the Universe data I have. Like I mentioned, we created a model in the system, and this model gives me a gateway to all the data that you have in the Universe.

And here, I want to select that model when I create the story. So this is the model I have, which I already created for the purpose of the demo in the system. And when I point my story to the model, I already see the Explorer. Again, you will be familiar with the BusinessObjects Explorer; this is very similar to that.

Here, measures and dimensions are automatically identified, along with special dimensions like version, category, and so on. And I can just do some drill-downs; I can play around with it and do some exploration here. For example, I have a measure called number of customer meetings, or total expected revenue. I can see it by different dimensions, and all the dimensions are shown here. For instance, I have a date dimension.

This is a hierarchical dimension: I have a year, quarter, and month hierarchy. And what I'm going to do is basically say that I want a time-based presentation of it, just for the purpose of this demo, to get a line chart quickly. So you see here that a line chart is automatically created, showing me the expected revenue, which was one of my measures. It's basically the revenue in the historical periods, which I'm going to analyze in this dashboard.

At the end of the story building, I'm basically preparing a sales analysis that I can share with my sales vice president and the sales team, and I want to see how the smart features help me with different types of insights automatically. I am currently at the stage where I have explored the data, the historical data and some measure and dimension combinations, and now I'm going to utilize the smart functionality. So let's say I'm interested in this chart, which shows me the total expected revenue by date. And here, I'm going to launch the first feature of the smart portfolio, which is forecast.

This is very easy to understand. I have the data up to November 2020 in this example, and then I want to generate the forecast. So I simply click on this three-dots icon and select Add. I have different functions from SAC: I can add a reference line, I can add a tooltip, but here I'm selecting Forecast, which is based on a predictive algorithm, and selecting the default automatic forecast.

This is based on our IP and machine learning algorithms, but you also have the choice of a different type of algorithm. If you understand a little bit of data science, you could say that triple exponential smoothing or linear regression might be a better fit; depending on your data and the type of patterns it holds, you can select that algorithm. But in most of the customer scenarios we have seen, the automatic technique works very well. So here you see that, very quickly, my forecast was generated up to July of next year, 2021.
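To give a feel for what an exponential-smoothing forecast does under the hood, here is a simplified sketch of Holt's linear-trend method, a simpler cousin of the triple exponential smoothing mentioned above. The revenue figures and smoothing parameters are made up; this is not SAC's actual algorithm.

```python
# Holt's linear-trend (double exponential) smoothing: a simplified sketch of
# the kind of model behind an automated time-series forecast. Data is invented.

def holt_forecast(series, alpha=0.5, beta=0.3, periods=3):
    # Initialise the level to the first point and the trend to the first step.
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        # Blend the new observation with the previous level-plus-trend.
        level = alpha * y + (1 - alpha) * (level + trend)
        # Smooth the trend estimate the same way.
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # Extrapolate the final level and trend into future periods.
    return [level + (h + 1) * trend for h in range(periods)]

revenue = [100, 110, 125, 135, 150, 160]  # monthly revenue, hypothetical
print(holt_forecast(revenue))  # three future months continuing the trend
```

Because the series trends upward, the forecast carries that trend forward, which is the behaviour you see in the demo when the dotted forecast line continues past the last historical month.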

So yes, it moved on for the next 7 or 8 periods, and it automatically determined the forecasting periods based on the data I had in my data set. So that was the forecasting feature. Here, I also get some hints that the forecast quality is good; it is actually 5 out of 5.

In some cases, the quality is not good. That can happen if the data is not sufficient, or if there are no interesting patterns that the algorithm can derive. It could be that all the values for your last 18 months are the same, say the expected revenue was 3,000,000 every month. In that case, the quality may not be good. You also get this indicator, which we call the confidence interval, shown here in light blue. I'll go full screen just so it is clearly visible for you.

So here you see this light blue patch, which indicates the confidence interval; it indicates how accurate the predictive forecast is with respect to the historical data. In this case, again, it's not too bad. The narrower the confidence interval, the better your forecast accuracy. That's all indicated using this dotted line and this shaded confidence interval. And from a business standpoint, I did not need any data science knowledge, and I did not have to choose which machine learning algorithm to use.
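The confidence band described here can be approximated in a back-of-the-envelope way from the spread of a model's past errors. This is a naive sketch with invented residuals and a standard 95% normal-approximation width, not the interval SAC actually computes.

```python
# Naive prediction interval: forecast +/- 1.96 x the standard deviation of the
# model's historical residuals (actual minus fitted). Numbers are invented.
import statistics

residuals = [4.0, -3.0, 2.0, -5.0, 1.0, 3.0, -2.0]  # past forecast errors
point_forecast = 170.0

half_width = 1.96 * statistics.stdev(residuals)
interval = (point_forecast - half_width, point_forecast + half_width)
print(interval)  # the narrower this band, the more trust in the forecast
```

This mirrors the speaker's rule of thumb: small past errors give a narrow band around the dotted forecast line, while noisy or insufficient history widens it.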

All I clicked was a single option, which was Add Forecast on my existing chart. That's it, basically. Now, the forecast is obviously available only for data where you have time periods, so you need a date or time variable in your data set; it has to be a time series.

It works for the line chart, a time series chart, in this case. So that's the first feature, the forecasting feature. Then I will move on to the next feature, which is Search to Insight. This is the NLQ feature, as I said. And using this light bulb, we launch Search to Insight.

So let me place this here. There is a search bar: the search bar sits at the bottom and the analysis appears at the top. So I can start asking questions; I could say, for example, "show me the number of customer meetings." As I type, if the typed text matches one of your member, dimension, or measure names, it auto-completes.

This feature, Search to Insight, is supported for Universes, but for live connections we need indexing; I'll quickly show, when we go to the model, that we have to enable indexing. What indexing does is scan the metadata, the dimension, measure, and member names, and cache it so that it can auto-complete as the user types and retrieve the results faster. That is needed because in live scenarios you can have huge amounts of data, and even in the acquired scenario you want that flexibility for auto-completion.
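The indexing just described, caching dimension, measure, and member names so typed text can be completed quickly, can be pictured as a simple prefix lookup over cached metadata. This is a toy sketch with hypothetical field names, nothing like SAC's real index.

```python
# Toy metadata index for autocomplete: scan the model metadata once, then
# answer prefix queries from a cached, lowercased name list. Names are
# hypothetical, loosely modelled on the demo's CRM data set.

metadata = ["Number of Customer Meetings", "Total Expected Revenue",
            "Country", "Customer Name", "Sector", "Date"]

# Built once and cached, so each keystroke only does a cheap scan.
index = sorted((name.lower(), name) for name in metadata)

def complete(prefix):
    p = prefix.lower()
    return [original for key, original in index if key.startswith(p)]

print(complete("c"))     # candidate names starting with "c"
print(complete("cust"))  # narrows as the user keeps typing
```

A production index would also cover member values and use a structure that scales to millions of names, but the principle, precompute and cache the searchable names, is the same reason indexing is required for live connections.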

So that is why we need indexing. In this case, I have acquired data, so I did not have to index, and it auto-completes for me. I'm saying "show me number of customer meetings," and I could add "by sector" or "by customer name," but let's first just do it for customer meetings. This is only a single measure, so it has automatically generated a KPI type of chart.

It's saying the number of customer meetings is 99,379. I can keep typing, very much like an SQL query, but this is a natural language query, even simpler than SQL. And as I type, I automatically get proposals of different dimensions I can break this down by. So I can say "by country," and in this case it generates a bar chart showing me the metric, the number of customer meetings, by country.

And I can use different phrases: for example, I could say "for 2018," which automatically creates a filter with respect to time. As you can see here, it has created the filter and provided me the output analysis for the year 2018. So yes, that's the power of NLQ: very simple to use, easy to understand, and fast, because you can do analysis quickly. It automatically generates different types of charts as they suit the data, and you can also say that you want to see it as a pie chart, for example. So let's say I like this chart; I can then copy it to my existing story.
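The "by country" and "for 2018" phrases can be imagined as a very small grammar that maps query fragments onto a measure, a group-by dimension, and a filter. This is a toy parser with an invented query shape; SAC's actual NLQ engine is far richer.

```python
# Toy natural-language-query parsing for queries shaped like
# "show me <measure> by <dimension> for <year>". Purely illustrative.
import re

def parse(query):
    spec = {"measure": None, "group_by": None, "filter_year": None}
    # The measure is whatever follows "show me", up to a "by"/"for" clause.
    m = re.search(r"show me (.+?)( by | for |$)", query)
    if m:
        spec["measure"] = m.group(1).strip()
    # "by <word>" names the dimension to group by.
    m = re.search(r" by (\w+)", query)
    if m:
        spec["group_by"] = m.group(1)
    # "for <4-digit year>" becomes a time filter.
    m = re.search(r" for (\d{4})", query)
    if m:
        spec["filter_year"] = int(m.group(1))
    return spec

print(parse("show me number of customer meetings by country for 2018"))
```

The parsed spec is exactly what the demo builds visually: a measure, a grouping dimension for the bar chart, and a year filter applied on top.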

So it is copied to page 1, which is the page I'm currently in. So I started with a new document, I was on page 1, and I got this chart that you see at the bottom of the page from Search to Insight. I did not have to create it myself as a story designer, an analyst, a developer, or IT; it was available to me just as a business user. So that's one of the features, Search to Insight.

Then I will use the other feature, which is Smart Insights. I want to first show it for a measure here. For Smart Insights and Search to Insight, you do not need edit rights; they are available even to information workers, to consumers. So yes, in this case, just as an example to show you, I have selected a numeric point chart.

I can see my total expected revenue. I can use the usual SAP Analytics Cloud functions; I can say that I want to see the scale format in millions. So I have 322,000,000 as the total expected value coming in from my data in the Universe. And now I can launch the Smart Insights feature.

So, as an editor, I can use the three dots again and select Add Smart Insights, which automatically does natural language generation for me. As you can see here, it has run an algorithm and shown me a nice insight. Again, I'll go to view mode for clarity. Here it is telling me, Smart Insights, again with a light bulb symbol, that the total so far for December is 8,340,000. It has calculated the totals across the entire data set.

And for November 2020 (in this case I have data up to November 2020), it is telling me 8,280,000, a decrease of about 15% as compared to October 2020. So what it has done is automatically detect a significant change over time. You can call it change detection, a "how has this changed" insight: it scans your data set and automatically figures out where the biggest change happened. In this case, it happened in the month of November, where the expected revenue is about 16% lower than October 2020. And if I click on View More, I get different types of insights to help me with that.
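The change detection described here, scanning the series for the largest period-over-period swing, boils down to comparing consecutive totals. This sketch uses made-up monthly totals chosen to echo the November drop in the demo.

```python
# Change detection sketch: find the period with the largest percentage change
# versus the previous period. Monthly revenue totals (in millions) are made up.

monthly = {"2020-07": 9.1, "2020-08": 9.4, "2020-09": 9.2,
           "2020-10": 9.9, "2020-11": 8.28}

months = list(monthly)

def pct_change(i):
    prev, cur = monthly[months[i - 1]], monthly[months[i]]
    return (cur - prev) / prev * 100

# Pick the month whose swing (up or down) is biggest in absolute terms.
biggest = max(range(1, len(months)), key=lambda i: abs(pct_change(i)))
print(months[biggest], round(pct_change(biggest), 1))  # the standout period
```

With these numbers, November stands out with roughly a 16% drop versus October, which is the kind of headline Smart Insights surfaces automatically instead of making the user eyeball every chart.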

So this is the insight we already talked about: how has this changed? It's change detection, and it produces a variance chart for me. It tells me how the revenue has been trending. I have different types of hierarchies; I can view it by quarter as well.

And if I like it, and I'm in edit mode, I can even add this chart. Right now I'm in view mode, as you can see here, so I'm analyzing and consuming the information; I can see it in the side panel directly. So without a designer having to add all these charts, and without overwhelming the end user with all these charts and results, we are empowering the end user to slice and dice and get insights out of the data by themselves.

That's a really powerful mechanism, and that's the feedback we are continuously getting. This is not available in tools like Tableau or Power BI; Power BI has a kind of insights feature, but ours, we feel, is more attractive, more business friendly, and answers various types of questions. The second type of question we answer, for example, is top contributors. This is again a very popular insight type in our customer scenarios, apart from forecasting and Search to Insight, as you already saw.

Here, what it has done is again run a statistical calculation to give out the top 5 contributors. It has looked into all the dimensions in the data set and asked: for a given total expected revenue measure, or KPI, what would be most interesting for you to focus on? In other words, this customer, Cartesen's GMTS (sorry for the wrong pronunciation), seems to be driving the highest revenue. Out of the 322 million, 2,030,000 alone is driven by these top 10 values, and around 303,000 by this particular customer. So this gives me the hint that these are my top 10 customers for the customer name dimension.
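The top-contributor calculation can be pictured as a group-by-and-rank over one dimension at a time, repeated across all dimensions. This sketch uses invented customer rows; SAC's statistical ranking is more involved.

```python
# Top contributors sketch: for a chosen dimension (customer), sum the KPI per
# member and rank. SAC repeats this across dimensions; the rows are invented.
from collections import defaultdict

rows = [("Acme", 120), ("Globex", 95), ("Acme", 80),
        ("Initech", 300), ("Globex", 40), ("Umbrella", 15)]

totals = defaultdict(float)
for customer, revenue in rows:
    totals[customer] += revenue

# Rank members by their share of the KPI and keep the top N.
top = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:3]
print(top)  # highest-revenue customers first
```

Running the same aggregation per dimension (customer, contact level, sales-cycle length, and so on) and comparing the shares is how a "these are your top 10 customers" insight can be surfaced automatically.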

And I should really focus on them; I should keep maintaining a good relationship with them if I were the sales VP. So that would be one indication. And there are different dimensions, like length of sales cycle and contact level; this is CRM data. Let's say I have a contact level of C-level.

We see again that the C-level contact type is working for us, at 97%. There is a manager-level contact type and an individual contributor level for salespersons, and C-level is fitting better for me. Similarly, I have a length of sales cycle for closing the sales deals, where a one-year sales cycle is actually better for me. Now I can launch Smart Insights again by clicking on the Smart Insights context menu and selecting the Smart Insights light bulb. What it does now is rerun that algorithm for the context that I chose.

So I have selected total expected revenue, now specifically for one year. This time, it has completely refreshed the results, as you can see. It has focused on the data where the length of sales cycle is one year. And here, it has shown me different types of parameters. Now some of the top contributors could be the same, but some could be different.

So for example, we did not previously see that we are particularly good with the Fortune 500 customer segment when it comes to one year. To summarize the output of Smart Insights: it gives me insights automatically, it is really fast, and it is context sensitive. As I change my data, or in live scenarios if the data changes, it automatically recognizes that change and helps me answer the question.

So it helps me with things like: what are my best contributors? And if I want to focus on a specific contributor, I can keep going just by relaunching Smart Insights. So in this sales example, I got to know that we are doing particularly well for this particular customer, the length of sales cycle should be one year, the contact level should be C level, and the customer segment is Fortune 500. If I focus on this combination, my total expected revenue will be awesome.

So that's the kind of output we have got from this functionality. We are halfway through our demo, and those were 3 of the 6 smart features; just repeating, we had time series forecasting, we had Search to Insight, and we had Smart Insights. Now I'm going to show the smart grouping feature, which is a clustering feature.

So in that case, I'm going to draw the chart again. Let's say I have total expected revenue as my measure, and this is a correlation graph, a bubble chart. Maybe I can choose the scatterplot instead.

This feature is available for bubble and scatterplot charts. I have different measures here, and let's say I wanted to understand this grouping by country. Maybe in this case the plain country breakdown is not good, so what I do is use the smart grouping feature.

Here, you can see that all the countries are plotted together: U.K., U.S., India, Russia. With all the countries clumped together, it is hard to analyze this information. And I wanted to see how total expected revenue and number of customer meetings are related per country. So I switch on the smart grouping option.

By default, it creates 2 groups, but you can increase the number of groups you want to generate. Let's say I increase it to 4. And just for the purpose of visualization, I'm changing the color. Maybe I just do 6 and go to view mode. So yes, here you can see that 6 groups have been generated.

Group 1 is a cluster that basically only has the country China, and it indicates that the total expected revenue is around 29,000,000 and the total number of customer meetings is 9,000,000 to 57,000,000. Instead of the total number of customer meetings, you can use a calculation; you might want to see the average number of customer meetings, that's absolutely fine. But this gives you an idea of how many customer meetings drive how much revenue for which country, and at what stage you can convince your customers, because smart grouping has grouped similar countries together.

So we see here in Group 4 we have the country Brazil, indicated with this pink dot. And in Group 4 I also have India. So India and Brazil have been grouped together based on the total expected revenue they drive and on the number of meetings I have to hold in order to convince my customers. So that's the smart grouping feature. It's based on a machine learning algorithm, and you can configure the number of groups.
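The grouping behavior described here, similar countries landing in the same cluster, is what a k-means-style algorithm produces. SAC's actual algorithm is not spelled out in the session, so the toy below is only a sketch of the idea; every country figure (revenue in millions, meetings in thousands) is invented:

```python
# A toy k-means over (revenue, meetings) points, one point per country.
# Naive and deliberately small; real smart grouping is a product feature,
# not this code.

def kmeans(points, k, iters=20):
    # Naive init: take the first k points as centroids (fine for a sketch).
    centroids = [points[i] for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        for j, (x, y) in enumerate(points):
            labels[j] = min(
                range(k),
                key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2,
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [points[j] for j in range(len(points)) if labels[j] == c]
            if members:
                centroids[c] = (
                    sum(p[0] for p in members) / len(members),
                    sum(p[1] for p in members) / len(members),
                )
    return labels

countries = ["China", "US", "India", "UK", "Brazil", "Russia"]
points = [(29.0, 9.5), (15.0, 6.0), (3.1, 2.0), (14.0, 5.5), (3.0, 2.1), (3.3, 1.8)]
labels = kmeans(points, k=3)
groups = dict(zip(countries, labels))
print(groups)  # India, Brazil and Russia share a group; China stands alone
```

With these invented numbers, India and Brazil end up in the same cluster for the same reason as in the demo: their revenue and meeting counts are close to each other and far from China's.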

You can see here the quality indicator of the machine learning algorithm, which is green, meaning it's a good one. In some cases, if you do not see good quality, it again means that more data points are needed for it to work. So that's smart grouping, and that's four features covered. Next I'm going to use the Smart Discovery feature. So you see here, I'm in the dashboard and story area of SAP Analytics Cloud.

And one by one, I'm making use of the smart features to give me insights, to generate forecasts, or to help me with my analysis. So now I move to the Smart Discovery feature. This is an exploration feature; it automatically builds a dashboard for you as well. So in this sales demo example, which was based on the Universe data I just shared, I actually renamed the order value measure there to total expected revenue.

What I'm interested in is this expected revenue figure, in order to see what relationships and patterns can be found by the machine learning algorithm. I have advanced options. For example, I can exclude dimensions that I don't need. So for instance, if I have a transaction ID; I think I modeled it as a measure.

So okay, no, we don't see it, sorry about that; I think that's already excluded. Anyway, I don't want to exclude any of the dimensions. I want to derive the results based on all dimensions in this case.

Oh yes, here is the transaction number. And then there are no filters. If I want, I can perform this analysis, for example, only for the United States, or I can choose any filter. I could say only do it for the Fortune 500 customer segment, and so on.

So these are the configuration options I have as an analyst and as a business user. And once I run it, it generates the output for me: 4 different pages full of insights and the nice output you would expect from the predictive algorithm. As you can see here, it tells me that Smart Discovery uses a machine learning algorithm to generate a story based on your data. It takes a few seconds because it is training on the data, materializing the output, and then automatically building the charts for you. So there you go.

You see here 4 pages. I was on Page 1 earlier, and now I have 4 additional pages coming from Smart Discovery: an overview page, a key influencers page, an unexpected values page, and finally simulation. Now, first of all, the Smart Discovery feature is not supported for any live data source yet.

We have it on the plan. So for your case, it will only work for the offline or acquired Universe connectivity, or any acquired data you upload into SAP Analytics Cloud, or any of the data sources we support as part of import connections, like BW import, Universe import, and so on. But the live connections are on the roadmap. It is a bit tricky to support live connections for Smart Discovery because it needs the machine learning library that is present in the HANA DB. In SAP Analytics Cloud, we already have this machine learning library present, and that's why we are able to generate the Smart Discovery results in the offline scenario: the data is directly present there.

But in the case of live remote scenarios, we would need that library, which today is only possible in the case of HANA, S/4 or raw HANA. So going back to the Smart Discovery results, you see here all 4 pages were generated for me, with different types of charts. First of all, a forecast is automatically generated, like I showed before. So you don't even need to generate the forecast yourself; if you're using Smart Discovery, it does it for you automatically.

I see the total expected revenue, and there is a variance which tells me the difference between actual and forecast. So actually, we were doing better historically compared to the forecast predicted for the next few months. I have these as the 2 versions. And then we can see the distribution of this revenue in terms of the number of records I have.

So here, this is indicating that 1,242 records in my data set have an expected sales revenue of around 37,000,000. So most of my customer transactions tend to fall in this category, and the total expected revenue out of that is 37,000,000. That's the output it produces, and it has done the same combination analysis with the other dimensions and measures: how is my revenue with respect to version, to date, to product type, to contact level, and so on. And you see that within Smart Discovery itself, we have utilized 2 other features of the smart portfolio: forecasting and Smart Insights.

So we are automatically generating Smart Insights for the charts where possible, and forecasting again for the charts where possible. Here I see that C level plus on premise is the top-contributor combination. Then the most interesting, or most useful, page of Smart Discovery, we find, is the key influencers page. The machine learning is saying that for expected revenue, the key metric you asked the machine to answer on, it has found these top 9 key influencers, and it seems that the customer segment has the highest influence. So let's see how the customer segment is influencing.

So it is saying that most of my sales comes from the Fortune 500 segment, followed by Enterprise and SMB. This is the top influencer, and this is how the frequency comes up. The next influencer is the number of customer meetings, and you can see that the higher the number of customer meetings, the higher the expected revenue amount. So we have different buckets or bands.

We have a range with 0 to 4 customer meetings, one between 15 and 20, and then one where we have 32 customer meetings. So for 1 customer we had 32 meetings, and we seem to drive $149,000 of revenue in total out of those. That helps me understand, or decide for the future, how I should set my different sales parameters in order to achieve the highest sales revenue. So this is an analysis of the historical data that gives you hints for your future.
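The "more meetings, more revenue" reading comes from bucketing a numeric influencer into bands and averaging the measure per band. A minimal sketch of that idea, with made-up rows; the band edges and figures are illustrative, not the demo's real data:

```python
# Group (meetings, revenue) records into meeting-count bands and average
# the revenue per band.

def band_of(meetings, edges):
    """Index of the half-open band [edges[i], edges[i+1]) containing `meetings`."""
    for i in range(len(edges) - 1):
        if edges[i] <= meetings < edges[i + 1]:
            return i
    return len(edges) - 2  # clamp anything beyond the last edge into the last band

def revenue_by_band(rows, edges):
    sums = [0.0] * (len(edges) - 1)
    counts = [0] * (len(edges) - 1)
    for meetings, revenue in rows:
        i = band_of(meetings, edges)
        sums[i] += revenue
        counts[i] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Invented records: (number of meetings, expected revenue)
rows = [(2, 20_000), (3, 25_000), (16, 80_000), (18, 95_000), (32, 149_000)]
edges = [0, 5, 20, 33]
print(revenue_by_band(rows, edges))  # average revenue per meetings band
```

If the averages rise from band to band, you get exactly the "higher meetings, higher revenue" influencer story the speaker describes.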

Then we have an unexpected values page; you can call them outliers as well. Basically, what it is saying in this case is that the actual total expected revenue we already have is, for instance, 181, but the predictive algorithm was predicting less, 66 in this case. So this is a good sign. And this is a combination of values.

It has picked out an exact record from the data set where we have this situation.
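One common way to flag such unexpected values is to compare actuals against model predictions and mark unusually large residuals. SAC's exact criterion is not documented in this session, so the sketch below is a generic stand-in, with invented numbers echoing the 181-versus-66 example:

```python
# Flag records whose residual (actual minus predicted) lies more than
# k population standard deviations from the mean residual.
from statistics import mean, pstdev

def unexpected(actuals, predictions, k=2.0):
    residuals = [a - p for a, p in zip(actuals, predictions)]
    mu, sigma = mean(residuals), pstdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r - mu) > k * sigma]

actuals     = [100.0, 102.0, 98.0, 101.0, 99.0, 103.0, 181.0]
predictions = [ 99.0, 101.0, 99.0, 100.0, 100.0, 102.0, 66.0]
print(unexpected(actuals, predictions))  # the last record is out of the ordinary
```

Records the model tracks closely are ignored; only the handful whose actuals deviate strongly from the learned pattern come back, which matches the "17 records out of the ordinary" behavior described below.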

Speaker 3

So unexpected values are nothing but out of

Speaker 2

the ordinary. It has analyzed the pattern that is usually followed in the data set, and these are some records, in this case 17, which are out of the ordinary. Then there are some charts to help you further, like a chart where you can see how actual versus expected differ. Finally, we have a simulation page. Again, it is automatically generated for you. As I mentioned, this gives you a what-if simulation capability.

This allows you to simulate a combination that you haven't tried in your business or use case. For this example, I can see all the variables that have influence: positive, negative, neutral, weakly positive, strongly positive, and so on. So let's say I have never tried the combination of country China; let's select China, and number of customer meetings 13. Maybe the customer status is prior customer. And I hit simulate.

So you see there is a 15% decline if I go for this combination. It appears that if we go for China and select the prior customer status, the expected revenue will be less. Let's see what happens if I choose the current customer status instead: it is plus 12%. So depending on what values you select, it allows you to simulate the output and shows you what to expect in the future for that combination.
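A what-if simulation of this kind can be imitated with any trained model: predict a baseline combination, predict the changed combination, and report the relative delta. The toy below uses an assumed multiplicative influence model; every coefficient and category here is invented purely for illustration:

```python
# Invented influence model: base revenue scaled by country, customer status,
# and a per-meeting uplift. Not SAC's trained model.
BASE = 100_000.0
COUNTRY = {"US": 1.10, "China": 0.90, "India": 1.00}
STATUS = {"current customer": 1.12, "prior customer": 0.85}
PER_MEETING = 0.02  # each meeting adds roughly 2% to expected revenue

def predict(country, status, meetings):
    return BASE * COUNTRY[country] * STATUS[status] * (1 + PER_MEETING * meetings)

def simulate(baseline, scenario):
    """Relative change of the scenario versus the baseline, e.g. -0.15 for -15%."""
    before, after = predict(**baseline), predict(**scenario)
    return (after - before) / before

baseline = {"country": "US", "status": "current customer", "meetings": 10}
scenario = {"country": "China", "status": "prior customer", "meetings": 13}
print(f"{simulate(baseline, scenario):+.0%}")
```

Swapping a single input, say the customer status, and re-running `simulate` is the equivalent of hitting the Simulate button again with one changed value.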

Sorry. So that's the Smart Discovery feature. I think we are getting close to time and I need to answer questions, so I'll quickly show you the Smart Predict feature, which is the final one. So this is Smart Predict.

Here, it allows the business analyst or data analyst to build and understand predictive models. You give it either a planning model or a data set; this is a predictive scenario, first of all. Then you select the type of predictive scenario: classification, if you want to predict whether an employee or customer will churn; regression, if you want to estimate, say, in how many days a delivery will arrive in future; or forecasting. Here, I've selected gross revenue from my data, and I want to create a predictive model based on the date. I've also selected an LOB.

So for every individual LOB, I want to generate a revenue forecast. Once I hit train and forecast, it creates many different predictive models for me; for every individual LOB, it has generated one predictive model. And that's the power of Smart Predict.

It's segmented forecasting at scale. If you have, let's say, 2,000 different entities, LOBs, or maybe 2,000 different customer segments, each varying from the others, and you want to use the forecasting scenario or the classification scenario, it helps you scale your predictive models and predictive results. So as you can see here, a forecast, a predictive model, is generated for the different LOBs, and it shows me performance indicators like MAPE, which can be understood by data analysts.

It shows me details on signal analysis: whether the signal is trending downwards or upwards, whether there were fluctuations, cycles like the Christmas period, and so on, and whether there were outliers. These are the details your data analysts will want to analyze, on different data sets and with different combinations, in the side panel before they can give you a model you can trust. Once you trust that model, you select to apply it, and applying it generates an output in your stories or in the planning model. So this is the forecast.
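The per-segment training loop and the MAPE indicator mentioned here can be sketched in a few lines. A deliberately naive last-value model stands in for Smart Predict's real algorithms, and the series are invented:

```python
# One (toy) model per line of business, each scored with MAPE on a
# single holdout point.

def naive_forecast(history):
    return history[-1]  # "tomorrow looks like today"

def mape(actuals, forecasts):
    """Mean absolute percentage error across paired points."""
    return sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals)

series_by_lob = {
    "Cloud":    [10.0, 11.0, 12.0, 13.0],
    "Services": [5.0, 5.0, 5.0, 4.0],
}
report = {}
for lob, series in series_by_lob.items():
    train, holdout = series[:-1], series[-1]
    forecast = naive_forecast(train)
    report[lob] = {"forecast": forecast, "mape": mape([holdout], [forecast])}
print(report)
```

The point of the sketch is the shape of the workflow: one model per segment, one quality score per model, and a report the analyst can inspect before deciding which models to apply.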

Again, this is generated for you; this is the analysis that helps you judge your predictive model's quality. You can create as many predictive models as you want with different combinations, and once you are happy, like I said, you apply it. And finally, you see the results.

So here, this is my final dashboard, combining all the smart features I showed you. Here I am showing you the results of Smart Predict. I had actual data, and then I had this copy of the actual data where I basically copied the 2019 data and used Smart Predict to generate the forecast for 2020. So here you can see the 2020 forecast, generated from the screen I was sharing with you earlier.

And it has generated it for the different LOBs. This is a kind of P&L report for the same sales example, but with different LOBs; if I want, I can use the LOB filter to move through them specifically. So that was the power of Smart Predict: it gives you more control over the predictive model and over applying the results back to your business for your business users. So yes, I'll stop here, and sorry for running over with the demo, but I hope it was useful, and I can move on to the question and answer.

Katrina, if you want to read out some important questions that are not yet answered, that would be nice.

Speaker 3

Yes, absolutely. The biggest question here, I would say, is: what features are supported with live connectivity? I'm not sure if you're planning to share a slide, or we can share something with them afterwards?

Speaker 2

Yes. So I'm currently sharing the screen with the live connectivity support matrix in this presentation. This is the complete matrix of all the different data sources supported in SAP Analytics Cloud, and specifically what is possible there for the different smart features. In short, time series forecasting, smart grouping, and Search to Insight are currently supported on live Universes and most of the live connectivity, like S/4HANA, HANA, BW, and so on.

Smart Insights we have started supporting on live HANA, but Universe is on our roadmap; next is BW, and then Universe. Smart Discovery and Smart Predict are quite far out, and that's why you see the blue indicator, which means future. And by future, we don't know the date yet.

It could be in a year or two years' time, we can't say for sure, but we are evaluating the options of providing Smart Discovery and Smart Predict for live Universes and the different live connectivity as well. And yes, the slides will be distributed, so you can take your time and see which combination works for you, right? So they will get the slides.

Speaker 3

Okay, perfect. The other popular question is about language support. I know you provided a bit of an answer earlier on, and we probably don't have time to get into the details now, but in the follow-up Q&A blog, let's make sure we include some information there, I would suggest, because we're at the top of the hour.

Speaker 2

Right. Yes, which languages are supported: there is online help, and like you said, we can include it in the FAQ page we share with them, but it is already available online. If you just search for supported languages in SAP Analytics Cloud, I think there are 70 languages supported, from Spanish to Hindi to French. So I think you should not be short of output language support there. And all the smart features respect those languages, except asking questions in Search to Insight.

There's a question: does SAP Analytics Cloud have a feature to keep the actuals and show the forecast so as to have a comparison, and ideally, can a dual axis be used for actual versus forecast? So yes, you have the table control, you have charts; you can definitely show actuals, and you can generate the forecast using the capabilities in edit mode, or if you already have a forecast as a column in your data set, that will be shown. And you can create a variance to compare.

So we have a feature called variance: you can compare actual and forecast to see how they compare with each other. And yes, for the live connection questions, the decks will be shared, like you said, Katrina. I cannot answer the conversion tool question. Thanks, Katrina, for answering many of the questions already.
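The variance comparison described here boils down to an actual-minus-forecast column, absolute and relative. A minimal sketch with invented figures:

```python
# Actual versus forecast variance, per period.
actual   = [120.0, 130.0, 125.0]
forecast = [110.0, 135.0, 125.0]

variance_abs = [a - f for a, f in zip(actual, forecast)]        # actual minus forecast
variance_pct = [(a - f) / f for a, f in zip(actual, forecast)]  # relative to forecast

print(variance_abs)  # [10.0, -5.0, 0.0]
```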

Is it possible to have more influence on the ML algorithms? Not too much, I should clarify. Through Smart Predict, you can select, for example, which type of predictive scenario you're creating, classification, regression, or forecasting, and in future maybe clustering, but you cannot control the specific algorithm that is chosen.

SAP Analytics Cloud is a business-friendly tool. Our intention is to put ML and AI technology in the hands of business users, and for that, we don't expect them to know machine learning algorithms. I can share the blogs that show the exact machine learning algorithms we use inside, but all the features are based on our own proprietary machine learning algorithms. What about integration of results from Smart Predict into existing charts within an SAC dashboard?

Would it be possible, as it is now with the time series forecasting feature? So yes, Smart Predict works on the concept of a data set. You can create a data set against a Universe connection or offline; it is not supported for a live Universe or live BW, for instance, but it is supported for live HANA. The output data set can be seen in a table: you have to create a story and link it to the data set.

But there is also a feature on the roadmap for Smart Predict output with models. Yes, a model is a different artifact, but we have it on the roadmap that Smart Predict will support models too, which means all your visualizations and charts can consume that output. As a workaround today, you create a model based on the output data set provided by Smart Predict: you create a predictive scenario, you apply it, you get the output data set, and finally you create a model on top of that data set. So it's a 2-step process currently, but in future we are looking to make it straightforward and smooth so that it works on models directly.

What are the options for connectivity to big data, data lakes, AWS, and Google? There are options available as part of SAC: you can connect to Google BigQuery, AWS, and Azure Data Lake. I think it is either possible already or on the roadmap; you can search for that. But for the predictive features, it doesn't matter.

It's the SAC connectivity that offers that. Once you have the data from these sources, you can use all of the smart features as I showed. Is it possible to get samples of migration to SAP HANA Cloud? Sorry, I did not understand the question. Maybe, Katrina, you would know about that?

Speaker 3

No, I'm not sure either. Sorry.

Speaker 2

Okay. "We are using Lumira Discovery as a self-service tool at this point. We have dashboards on an energy component. After deploying the dashboard to the BI server, the end user has to key in user credentials for the EC system." So I think there are some questions that fall outside my area of expertise.

And as a customer, you probably have means to raise tickets, or you can ask it on the SAP Community page, and there will be colleagues to answer these questions. I monitor the SAP Community page very frequently: tag your question with SAP Analytics Cloud, ask it in the forum, and we will get back to you.

Speaker 3

There is one question here that I'm not sure I heard you address: can you take the results from Smart Predict and insert them into an existing chart within a story?

Speaker 2

Yes, I did answer that. The way to do it is to have a model connected to the data set that is the output of Smart Predict, and then, yes, it will be shown in the chart. In my example, I was showing a table with the output of Smart Predict; I did not have to do anything other than having a model.

Is Crystal Reports being deprecated? I think that's one for you, actually.

Speaker 3

We do not have any plans to deprecate Crystal Reports at this time.

Speaker 1

All right. So if we're done with questions, I'll go ahead and conclude the webinar for today. Is that okay with you, Priti?

Speaker 2

Yes. You don't see any new questions, right?

Speaker 1

Yes, I think you answered most of the questions.

Speaker 2

Yes. And the slides will be shared for most of those questions, because they relate to the live connection support.

Speaker 1

Yes. The slides will be shared.

Speaker 2

Okay. Yes, sorry for staying over, but I'll just answer the last question and then we can close. The question is about smart features availability in embedded analytics. Currently there are none, but we have it on the roadmap to offer not all the smart features, but forecasting, Smart Insights, and Search to Insight in the embedded analytics that is embedded directly into SAP applications like SuccessFactors and S/4HANA. Otherwise, all of them are available in the SAP Analytics Cloud enterprise edition already.

Speaker 1

So thank you so much, everyone, for joining our webinar today. Firstly, I'd like to say a huge thank you to Priti for her time and effort in putting this webinar together and for giving us such a thorough presentation on smart features and self-service analytics. I'd also like to say a big thank you to our audience for your participation using the question box; we really did enjoy seeing all of your engagement throughout this webinar. And I'd also like to mention again our survey, which you can access by clicking on the 3 smiley faces in the bottom widget bar.

This really does help us in planning new webinars that you all would enjoy. And lastly, I want to point out again the link in our resources box to the upcoming webinars. We do have one more session coming up as part of this series next week, so I highly recommend you check that out. We will also be publishing 2 follow-up blog posts from this webinar.

One will be a summary blog and the other a Q&A blog, and both will be posted on the SAP Community as well. I'll be linking them in this webinar console too, so do keep an eye out for that. And with that, I'll close off this webinar.

So thank you so much all for joining us today, and we'll see you in the next one.

Speaker 2

Thank you, everyone. Thanks, Zermina. Thanks, Katrina.
