The oft-used phrase “knowledge is power” draws an explicit parallel between information and might, and the 21st century – also known as the information age – has certainly shown this to be true. Today, staying ahead of the competition requires you to know what’s coming and to prepare an effective response, ensuring not just the survival of your enterprise but its growth.

Well, that’s where predictive analytics comes in: you analyze data to decide the best action to take, factoring in any uncertainty life could throw your way. Numerous techniques come under the umbrella of predictive analytics – data mining, classical statistics, machine learning, and deep learning – but their ultimate purpose is the same: to help you make the optimal decision when faced with uncertainty.

Predictive analytics is used mostly in the business setting, where financial figures are of paramount concern. This covers domains such as insurance pricing and fraud, stock market trends, marketing for industries like telecommunications, and operations for services like healthcare and government.

Competitive advantage driven by analytics maturity (Image Source: Delaware Consulting Firm)

Because of the many different techniques it employs, predictive analytics can be difficult to comprehend as a concept. At its core, however, it can be viewed as a collection of techniques for breaking complicated problems into simpler ones that are easier to model, and thus easier to grasp and solve.

Sometimes predictive modeling and predictive analytics are thought to be the same thing. While they are closely tied, they’re not: a predictive model is the representation of the conclusions drawn from the data, whereas predictive analytics is the broader process that produces those conclusions.

The predictive analytics ecosystem

Predictive analytics draws on several approaches to problem solving, including data mining, artificial intelligence, and machine learning.

Data mining is the process of collecting huge amounts of raw information. For a program to make the best decisions possible, and to determine what conclusions can be extrapolated from a situation, it needs as much initial information as possible. Data mining thus underpins predictive analytics, giving it the ability to make predictions from what has already happened. It is only the initial stage of analyzing raw information, though, and its output must pass through several subsequent stages to arrive at a workable decision.

Artificial intelligence employs algorithms to connect all the data points and come up with solutions to specific problems, whether it’s playing chess or graphing trade forecasts. In essence, artificial intelligence aspires to make computers reason and behave like humans. Machine learning might be the phenomenon that makes these programs grow more efficient and effective, but it is the greater structure of artificial intelligence that enables it to use the available data effectively, and to make the required predictive models and decisions.

Conceptual visualization of the relationship between AI, machine learning, and deep learning (Image Source: NVIDIA)

Machine learning uses mined data to create intelligent systems that learn in either a supervised or unsupervised environment before being exposed to new data. A subset of artificial intelligence, it is responsible for the efficient and effective retrieval of useful information from what was mined, and the weeding out of unnecessary data from the predictive analytics process. Because much of predictive analytics is heavily reliant on raw data collected on numerous subjects, the program must learn to separate useful information from noise and make the most logical connections. The machine learning algorithms used in predictive analytics make this possible and are applied in several fields including robotics, computer vision, NLP and fraud detection.
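To make the idea of supervised learning concrete, here is a minimal, self-contained sketch: a 1-nearest-neighbour classifier trained on labelled examples, then asked to label a point it has never seen. The data points and the "risk" labels are purely hypothetical illustrations, not from any real system.

```python
# A minimal sketch of supervised learning: a 1-nearest-neighbour classifier.
# "Supervised" means every training example carries a known label; the model
# predicts labels for new, unseen points by analogy to the labelled ones.

def predict_1nn(training_data, new_point):
    """Return the label of the training example closest to new_point."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(training_data, key=lambda example: squared_distance(example[0], new_point))
    return closest[1]

# Hypothetical labelled data: (feature vector, label) pairs.
training_data = [
    ((1.0, 1.0), "low-risk"),
    ((1.2, 0.8), "low-risk"),
    ((8.0, 9.0), "high-risk"),
    ((9.0, 8.5), "high-risk"),
]

# A new point near the low-risk cluster is labelled accordingly.
print(predict_1nn(training_data, (1.1, 0.9)))  # low-risk
```

Real systems would use a learned model (e.g. a trained neural network or decision tree) rather than raw nearest-neighbour lookup, but the supervised pattern – learn from labelled examples, predict on new data – is the same.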

Deep learning is a recent phenomenon that has overtaken machine learning as the hottest buzz-generating topic in AI. Deep learning is really a subset of machine learning that has been unlocked by the increase in computational power and the ability to store large amounts of data economically and efficiently. Many are aware of Google Brain’s cat experiment, in which 10 million unlabeled, random images were fed to a large-scale neural network. Fascinatingly, without ever being told what a cat was or shown a labeled example, one of the artificial neurons discovered what a cat looked like on its own.


Google Brain experiment – an artificial neural network discovered cats without supervised learning (Image Source: Google)

The success of predictive analytics is often dependent on the ability of the programs to encompass these techniques – the ability to mine a great amount of data, the machine and deep learning to extract what is necessary from that data, and the artificial intelligence to generate meaningful predictions and models from that data. Because of the synergy needed for this entire process, it must, therefore, be viewed as an amalgamation of its subsystems, rather than as a discrete, indivisible process.

Predictive analytics in our world

Predictive analytics, and the vast amounts of data it consumes, is applied across many different fields, such as:

  • Fraud prevention and insurance
  • Customer segmentation and retention
  • Resource planning

Fraud prevention and insurance deals with the vast sums of money that flow through the insurance claims process. People have to pay for their plans and the premiums that come with them, deductibles have to be assessed, and payouts need to be verified for things like medical and vehicle claims. Because so much money is constantly being shifted around, it would be impossible for human managers to look at all the data points and assess how they should price their packages for individual users, let alone for the market as a whole.

Add to that the fact that insurance systems have to be monitored around the clock because of the massive number of claims that are filed (a sizeable portion of which are fraudulent). Using predictive analytics, claims as a whole can be gauged to make an educated guess about how many fraudulent claims will slip through the cracks, allowing a company to set aside money to cover the cost of such mistakes. In addition, predictions and models can be used to help fight the year-over-year growth of fraudulent applications.
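The reserve estimate described above can be sketched in a few lines. All of the figures here – claim volume, fraud rate, detection rate, average payout – are hypothetical placeholders; a real insurer would estimate them from historical claims data.

```python
# A toy sketch of the fraud reserve described above: combine a forecast claim
# volume with a historical fraud rate and detection rate (all hypothetical
# numbers) to predict how much money to set aside for fraud that slips through.

def fraud_reserve(expected_claims, historical_fraud_rate, detection_rate, avg_payout):
    """Estimate the total payout for fraudulent claims that evade detection."""
    undetected_fraud = expected_claims * historical_fraud_rate * (1 - detection_rate)
    return undetected_fraud * avg_payout

reserve = fraud_reserve(
    expected_claims=100_000,      # claims forecast for the year
    historical_fraud_rate=0.02,   # 2% of claims are fraudulent
    detection_rate=0.75,          # 75% of fraud is caught before payout
    avg_payout=3_000.0,           # average cost of a paid-out claim
)
print(round(reserve))  # 1500000 – money set aside to cover undetected fraud
```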

Customer segmentation and retention relies heavily on collecting a huge amount of customer data and refining it until actionable insights can be gleaned about a customer. In this case, predictive analytics allows a business to hypothesize about its customers: how invested they are in the company and its offerings, what their preferences are, and how those preferences may evolve over time. This empowers the business to focus on targeted ways to increase retention, instead of relying on inefficient, blanket approaches.
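A tiny rule-based sketch of the segmentation idea follows. The two signals (days since last purchase and monthly spend) and the thresholds are hypothetical; a real system would learn segment boundaries from data (e.g. with clustering) rather than hard-code them.

```python
# A rule-based sketch of customer segmentation using two hypothetical signals:
# recency of purchase and spend. Each segment maps to a targeted retention
# action, rather than one blanket campaign for everyone.

def segment(days_since_purchase, monthly_spend):
    if days_since_purchase <= 30 and monthly_spend >= 100:
        return "loyal"      # engaged and high-value: reward and upsell
    if days_since_purchase > 90:
        return "at-risk"    # lapsing: target with a win-back campaign
    return "casual"         # everyone else: nurture gradually

# Hypothetical customers as (days_since_purchase, monthly_spend) pairs.
customers = [(10, 250.0), (120, 40.0), (45, 60.0)]
print([segment(d, s) for d, s in customers])  # ['loyal', 'at-risk', 'casual']
```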

Resource planning is critical for any business. Since a business may need to tap into one or more of its resources – employees, assets, revenue, utilities, taxes, product and service costs, and so on – at any time, those resources need to be well understood in relation to each other. Predictive analytics in this context enables a business to forecast the rising or falling need for its resources, so that it can prioritize only what it needs to function optimally. Without these predictions, the business could find itself short of vital resources when they are needed most, or overstaffed when it needs fewer hands.
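The forecasting step above can be sketched with a simple least-squares linear trend, extrapolated one period ahead. The monthly demand figures are hypothetical; real forecasts would use richer models (seasonality, external factors), but the principle – predict demand before it arrives, then plan resources against it – is the same.

```python
# A simple sketch of resource forecasting: fit a linear trend to hypothetical
# monthly demand figures and extrapolate one period ahead, so staffing or
# inventory can be planned before the demand materializes.

def forecast_next(history):
    """Least-squares linear trend, extrapolated one step past the data."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * n  # predicted value for the next period

monthly_demand = [100, 110, 120, 130]  # e.g. support tickets per month
print(forecast_next(monthly_demand))   # 140.0
```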


Predictive analytics is a business’s best friend in a world where more data is available than can be imagined. If knowledge is power, then it’s the ability to turn that knowledge into usable, relevant predictions and models that makes that power absolute – and for that reason it is well worth exploring the available technologies that can model, analyze, and predict efficiently and effectively for whatever situations may arise.
