Cortana Analytics Suite:
Three Creative Solutions to Overcome Organisational Obstacles

A couple of months ago we blogged about the integration of Cortana and O365. Now Microsoft are doing something pretty incredible with Cortana and big data.


The Cortana Analytics Suite (CAS) is about making sense of historical data collected across your organisation. Utilising advanced information management, data storage, machine learning, and visualisations, it provides a robust, cloud-based, end-to-end platform for creating insights and predicting outcomes from large amounts of data.

[Diagram: the Cortana Analytics Suite on Microsoft Azure]

The suite features many products. In this blog post I touch on the CAS products most relevant to obstacles affecting education, finance, and not-for-profit organisations, and on how each product applies within a step of data processing.

Big data is currently a hot topic. A lot of people are excited by the idea that vast amounts of data can be used to predict the future. For large organisations, this kind of technology could be advantageous, especially when making decisions about how to move forward.

I have used technical expressions in some areas of this blog; please refer to the glossary for definitions.

Solution 1 – Education: Pin-pointing student drop-outs before they happen

Organisation: University
Data: Student results, reports, course information, historical drop-out information
Insight: Pin-point drop-out risks
Output: Alerts and statistical analysis

Obstacle summary

Universities are continually concerned with student drop-out rates. Predicting when and which students are most likely to drop out, and identifying the reasons that could motivate a drop-out, would give universities more opportunities to improve retention.

Suitable products within CAS

This table summarises the products (within CAS) which could be used together to provide an end-to-end solution for this obstacle:

In terms of big data characteristics, the solution would need to focus on accommodating ‘data volume’.

Each numbered entry represents a step in processing an organisation’s data.

1. Management – Azure Data Catalog
2. Storage – SQL Data Warehouse
3. Surfacing – Azure Machine Learning
4. Interaction – Power BI

Azure Data Catalog

“Data source discovery to get more value from existing enterprise data assets” – Microsoft

It can often take a long time to audit unstructured data sources such as documents and spreadsheets. It is also hard to keep track of new data sources created across a large university. In this scenario, the Azure Data Catalog could be applied to discover and organise data.

SQL Data Warehouse

“Elastic data warehouse-as-a-service with enterprise-class features” – Microsoft

For privacy and security reasons, universities often set up their IT infrastructure across two environments – on-premises and the cloud. SQL Data Warehouse complements this hybrid approach: structured student data can be held and queried at scale in the cloud while sensitive systems remain on-premises.

Azure Machine Learning

“Powerful cloud-based predictive analytics” – Microsoft

In order to make sense of the stored data, and predict drop-outs automatically, Azure Machine Learning can be set up to find correlating patterns in the data.
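As an illustrative sketch only (this is not the Azure Machine Learning service itself), a drop-out prediction ultimately reduces to scoring each student on features weighted by a model trained on historical data. The feature names, weights, and threshold below are hypothetical:

```python
import math

# Hypothetical weights a trained model might produce from historical
# drop-out data (illustrative values, not real model output).
WEIGHTS = {"avg_grade": -0.08, "absences": 0.15, "assignments_missed": 0.25}
BIAS = 1.5

def dropout_risk(student):
    """Logistic score in [0, 1]: higher means greater drop-out risk."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

students = [
    {"id": "S001", "avg_grade": 72, "absences": 2, "assignments_missed": 0},
    {"id": "S002", "avg_grade": 48, "absences": 14, "assignments_missed": 6},
]

# Flag students whose risk score crosses an alert threshold.
at_risk = [s["id"] for s in students if dropout_risk(s) > 0.5]
print(at_risk)  # ['S002']
```

In the real service the weights would be learnt in an Azure ML experiment rather than hand-written, and the flagged list would feed the alerts surfaced in Power BI.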

Power BI

“Power BI transforms your company’s data into rich visuals…” – Microsoft

Using Power BI to visualise predicted university drop-outs gives clarity by converting the processed data into clear charts and graphs.

Power BI allows for the crafting of fully customised dashboards and reports.
Find out more information in our recent blog post.

Solution 2 – Financial Services | Insurance: Predicting natural disasters before they happen

Organisation: Insurance company
Data: Insurance policies and accounts; geographic, chronological, meteorological, climatological, and seismological data
Insight: Predicted natural risks to customers
Output: Alerts and premium adjustments

Obstacle summary

As an insurer, it’s critical to understand every environmental and natural risk posed to customers. Present risks are important, but future risks even more so, especially when it comes to creating viable premiums. Many large pay-outs during a natural disaster can cripple an insurance company, so it would be sensible to have a solution in place that estimates where and when existing and potential customers are at risk.

Suitable products within CAS

In terms of big data characteristics, the solution would need to focus on accommodating ‘data velocity’.

1. Management – Event Hub
2. Storage – Azure Data Lake
3. Surfacing – Azure Stream Analytics
4. Interaction – Power BI, with automation within Dynamics CRM and interaction through Cortana

Event Hub

“Ingest, persist and process millions of events per second” – Microsoft

When climate data is collected in real time – such as unprocessed sensor data from a vast range of environmental monitoring devices – Event Hub is the right choice. This is particularly the case when collecting data from millions of climate events fired by global devices and applications every second.

Azure Data Lake

“Batch, real-time and interactive analytics made easy” – Microsoft

The Data Lake would be a good choice for processing and storing raw climate data collected on a large scale. Its on-demand scaling and security features provide a safe, cost-effective platform that reduces the need to buy hardware.

Azure Stream Analytics

“Real-time stream processing” – Microsoft

Stream Analytics would be required to process the real-time and historical data – managed through the Event Hub and stored in the Data Lake – into analysis-ready data.

Stream Analytics could then be configured to relate an insurer’s customer accounts to geographical climate data. Using its SQL-based syntax, a query could be developed to ‘flag’ customers and places at risk from future natural disasters.
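To make the flagging idea concrete, here is a minimal pure-Python sketch of the join-and-filter logic that a Stream Analytics query would express in its SQL-based syntax. The region names, severity scale, and threshold are all hypothetical:

```python
# Recent climate events from the stream (illustrative sample data).
climate_events = [
    {"region": "Kanto", "event": "typhoon", "severity": 8},
    {"region": "Kansai", "event": "heavy_rain", "severity": 3},
]

# Customer accounts with their geographic region (hypothetical schema).
customers = [
    {"account": "A-100", "region": "Kanto"},
    {"account": "A-101", "region": "Kansai"},
    {"account": "A-102", "region": "Kanto"},
]

SEVERITY_THRESHOLD = 5  # only flag serious events

# Join customers to climate events by region and keep severe matches,
# mirroring a SQL "SELECT ... JOIN ... WHERE severity > 5" query.
flagged = sorted(
    c["account"]
    for c in customers
    for e in climate_events
    if c["region"] == e["region"] and e["severity"] > SEVERITY_THRESHOLD
)
print(flagged)  # ['A-100', 'A-102']
```

In Stream Analytics the same logic would run continuously over the event stream, typically within a time window, rather than over a static list.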

Power BI and Dynamics CRM (not included in CAS)

Customised reports and dashboards built from the analysis-ready data could be generated so that insurers can clearly see trends and forecast natural risks. Automated ‘risk’ alerts could also be set to appear within customer contacts and accounts stored in a customer relationship management (CRM) system.

I would recommend Dynamics CRM, based on the reduced effort in connecting Microsoft products and its integration with Cortana.


“Cortana is your clever new personal assistant” – Microsoft

Being mobile as an insurer is an advantage. Speaking to Cortana on a mobile device to query the information in customer accounts makes it easy to surface ‘risk’ data on the move. Cortana can also talk back, alerting an insurer to risks affecting their customer base.

Solution 3 – Not-for-profit: Creating personalised communications to existing donors

Organisation: Charity
Data: Existing donor profiles, social media communications, press releases
Insight: Recommendations on how to personalise communications to specific donors
Output: Personalised email communications sent to existing donors

Obstacle summary

Effective communication with charitable donors can be difficult, especially when judging how to re-engage after they have made a generous donation.

It is important to keep individual communications ‘sensitive’ to a donor’s perception, and to how open they are to involvement with the charity.

Being an ‘insensitive’ charity can deter loyal donors from donating again, or create inefficient spend on communications.

Of course, for a large charity it is difficult to constantly keep in touch with all donors. Wouldn’t it be great to implement a solution which recommends how to communicate with individual donors based on their charitable activity? Now it could be possible.

Suitable products within CAS

In terms of big data characteristics, the solution would need to focus on accommodating ‘data variety’.

1. Management – Azure Data Factory
2. Storage – SQL Data Warehouse
3. Surfacing – Azure Machine Learning
4. Interaction – Automation within Dynamics CRM and Click Dimensions

Azure Data Factory

“Compose and orchestrate data services at scale” – Microsoft

When unstructured data must first be collected and transformed from sources such as social media and press releases, Data Factory is recommended to quickly turn this kind of information into structured data. The newly processed data can later be compared with pre-existing structured data, such as donor profile information held in a data store.
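As a rough sketch of the kind of transform such a pipeline might orchestrate, the snippet below turns raw, unstructured social-media posts into structured records. The post format, field names, and schema are hypothetical:

```python
import re
from datetime import datetime

# Raw, unstructured posts as they might arrive from a social feed
# (illustrative sample data, hypothetical format).
raw_posts = [
    "2015-08-01 | @charity_org Loved volunteering at the shelter! #grateful",
    "2015-08-03 | Why no update on the clean-water appeal? @charity_org",
]

def to_record(raw):
    """Turn one raw post into a structured record (hypothetical schema)."""
    date_str, text = [part.strip() for part in raw.split("|", 1)]
    return {
        "date": datetime.strptime(date_str, "%Y-%m-%d").date(),
        "mentions": re.findall(r"@(\w+)", text),   # accounts referenced
        "hashtags": re.findall(r"#(\w+)", text),   # topics referenced
        "text": text,
    }

records = [to_record(p) for p in raw_posts]
print(records[0]["hashtags"])  # ['grateful']
```

Once in this tabular shape, the records can be loaded into the data warehouse and joined against existing donor profiles.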

SQL Data Warehouse

Cost saving is crucial for charities, and SQL Data Warehouse is designed for just that: growing and reducing (scaling) a charity’s data storage is cost-efficient. It’s quick to set up, and its fast querying of relational and non-relational data gives charities choices when personalising the structure of their data.

Azure Machine Learning

Using an Azure Machine Learning API such as Text Analytics to compare sentiment within social media (incoming activity) with the key phrases in recent charity press releases (outgoing activity) could produce excellent insights into the specific communications a donor is engaging with. Comparing these insights with existing donor profile information might enable accurate recommendations on how to further communicate with that donor.

Dynamics CRM and Click Dimensions (both not included in CAS)

Lastly, the recommendations produced could be used to tag donor accounts in Dynamics CRM. Based on those recommendations, email communications could be tailored using an email marketing tool such as Click Dimensions (a CRM plugin). With customisation, the email communications could then be automatically assigned to each donor account, saving time before adding a ‘unique’ touch to each personalised email.


So far I have mentioned most products featured in CAS, and how they apply when overcoming hypothetical obstacles affecting three industry sectors: Education, Financial Services, and Not-for-Profit.

Although I have not elaborated on obstacles affecting the Professional Services and Local & Regional Government sectors, the products within the Cortana Analytics Suite can also be leveraged to improve situations in these sectors.

To find out more or to enquire how your organisation can benefit from Cortana Analytics, please contact us.



Glossary

Hadoop

Hadoop is an open-source software framework for storing data and running applications on clusters of commodity hardware. It provides massive storage for any kind of data, enormous processing power and the ability to handle virtually limitless concurrent tasks or jobs.

Commodity hardware

Computer hardware that is affordable and easy to obtain.


Cluster

Connecting two or more computers together in such a way that they behave like a single computer.

Structured Data

Tabular data in databases and spreadsheets.

Unstructured Data

Data that doesn’t fit neatly in a database. It includes text and multimedia content: e-mail messages, word-processing documents, videos, photos, audio files, presentations, webpages and many other kinds of business documents.

Parallel Processing

The simultaneous use of more than one CPU to execute a program.

Stream computing

A high-performance computer system that analyses multiple data streams from many sources live.

Relational data

Data in tables which relates to other data in other tables within a database.

Data Characteristics

Characteristics that define the nature of existing data and that don’t exist in isolation from each other. They all affect the overall cost of data processing, with ‘Volume’ typically the cheapest to accommodate and ‘Complexity’ the most expensive:

Volume – Amount of data

Variety – Different types of data

Velocity – Speed at which data is generated and processed

Complexity – The intricacy of how the data is structured
