Putting predictive analytics to work

FRAMINGHAM, Mass. — The Orlando Magic’s analytics team spent two years honing its skills on the business side.

“Eighteen to 20 months ago, we knew virtually nothing about predictive analytics,” says Anthony Perez, director of business strategy for the National Basketball Association franchise. His team had in fact been working on predictive analytics well before that, Perez adds, but its tools weren’t powerful enough to deliver the insights the team needed, and the group had to scale up its efforts. So Perez brought in new, more powerful software from SAS and began climbing the learning curve.

Today, the established practice is not only helping optimize ticket sales but is also giving the coaching staff tools to predict the best lineups for each game and to identify which potential players offer the best value for the money.

Perez’s team began by using analytic models to predict which games would oversell and which would undersell. The box office then took that information and adjusted prices to maximize attendance — and profits. “This season we had the largest ticket revenue in the history of our franchise, and we played only 34 games of the 45-game season due to the lockout,” he says.

Now those models are run and prices fine-tuned every day. Ask him about how the models are used to predict the best team match-ups and the game strategy, however, and Perez is less forthcoming. “That’s the black box nobody talks about,” he says.

Although it’s still fairly early going, other organizations are tackling the learning curve with predictive analytics, the forward-looking data mining discipline that combines algorithmic models with historical data to answer questions such as how likely a given customer will be to renew a season ticket. The models assign probabilities to each person. Armed with that data, the business can prepare to take action. Additional analysis can then be applied to predict how successful different courses of action will be.
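In code, that scoring step is simple to sketch. The fragment below is a generic illustration, not any particular vendor’s product: it assumes a Python scikit-learn workflow with hypothetical file and column names, trains a classifier on historical renewals, and assigns each current season-ticket holder a probability the business can act on.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Historical records of season-ticket holders and whether they renewed.
history = pd.read_csv("season_ticket_history.csv")  # hypothetical file
features = ["games_attended", "tenure_years", "seat_price"]
X, y = history[features], history["renewed"]  # renewed: 1 = yes, 0 = no

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score current holders: one renewal probability per person, which the
# business can then act on (e.g., target anyone below a chosen threshold).
current = pd.read_csv("current_holders.csv")  # hypothetical file
current["p_renew"] = model.predict_proba(current[features])[:, 1]
at_risk = current[current["p_renew"] < 0.5]
```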

The use of predictive analytics is common in industries such as telecommunications, financial services and retail, says Gareth Herschel, an analyst at Gartner Inc. “But overall it’s still a relatively small percentage of organizations that use it — maybe 5 per cent.”

Nonetheless, interest is high in organizations that are still focused on historical, “descriptive analytics,” and in businesses with established predictive analytics practices that are now moving outside of traditional niches such as marketing and risk management. They’re predicting website click-through rates and overall behavior, and helping HR anticipate which employees are likely to churn. Another area is help desk call routing, where models can be used to determine which agent is likely to do the best job of answering a given customer question.

“There’s more interest because there’s more data,” says Dean Abbott, president of consultancy Abbott Analytics. “The buzz is about momentum. People are saying this is something I need to do.”

But you have to walk before you can run, and with its data-heavy demands, predictive analytics isn’t something to take up lightly, or haphazardly. We asked businesses that are new to the game as well as seasoned professionals to share their experiences. Start small, they say, partner closely with the business to define the problem, continuously test and refine the model, put results in terms business decision makers can understand and, above all, make sure the business is willing and able to act based on those predictions.

Making the business case

Consumer products company Procter & Gamble Co. makes extensive use of analytics to project future trends, but it wasn’t always that way, says Guy Peri, director of business intelligence for P&G’s Global Business Services organization. “This used to be a rear-view mirror-looking company, but what happened six months ago isn’t actionable. Now we’re using advanced analytics to be more forward looking and to manage by exception,” he says, which means separating out the anomalies to identify and project genuine trends.

P&G uses predictive analytics for everything from projecting the growth of markets and market shares to predicting when manufacturing equipment will fail, and it uses visualization to help executives see which events are normal business variations and which require intervention. “We focus the business on what really matters,” Peri says.

The place to start is with a clear understanding of the business proposition, and that’s a collaborative process. “Be clear on what the question is, and what action should be taken” when the results come back, he says.

It’s also important to keep the scope focused. Mission creep can destroy your credibility in a hurry, Peri says. Early on, P&G developed a model to project future market shares for regional business leaders in a line of business he declined to identify. It was successful until the company tried to use the same model to accommodate the needs of other business leaders.

Those requests required a more granular level of detail, but his group tried to make do with the same model. “The model became unreliable, and that undermined the credibility of the original analysis,” which had been spot-on, he says.

New users need to take several steps to get started: Hire a trained analyst who knows how to develop a model and apply it to a business problem, find the right data to feed the models, win the support of both the business decision maker and an executive sponsor in the business who are committed to championing the effort — and take action on the results.

“Notice I didn’t mention tools,” Peri says. “Resist the temptation to buy a million-dollar piece of software that will solve all of your problems. There isn’t one,” he says, and you don’t need to make that investment for your first couple of projects. Instead, invest in training staff on advanced spreadsheet modeling.

“All of this can be done with Excel to get started.” Only when you are ready to scale up do you need to investigate bigger, platform-level types of tools, he says.

Keeping users close

Bryan Jones started on a shoestring budget — but that’s not why his first effort at predictive analytics was a failure. Jones, director of countermeasures and performance evaluations at the Office of Inspector General of the U.S. Postal Service, wanted to use predictive analytics to help investigators determine which health care claims were most likely to be fraudulent.

After eight months he had a working model, but the independent analytics group working on the project wasn’t fully engaged with the department that would be using the tool. As a result, the raw spreadsheet output mailed to each office was largely ignored by investigators.

Fortunately, Jones’ group had the unwavering support of the Inspector General. “You’re dead in the water if you don’t have that support from the top,” he says.

The second time around Jones hired a consultant to help with modeling and data prep, and embedded an analyst within the group that would be using the results.

Heat maps like the one used at the U.S. Postal Service make data much more actionable than raw numbers alone, implementers say.

And they made those results more ‘real’ to users. For a team investigating contract fraud, for instance, his team placed the results in a Web-based interactive heat map that showed each contract as a circle, with larger circles representing the biggest costs and red circles the highest risk for fraud.

Investigators could then click on the circles to drill down into the details of each at-risk contract, as well as related contracts. “That’s when people started to notice that we really had something that could help them,” he says.
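A static approximation of that visual encoding can be sketched in a few lines of Python (the Postal Service’s actual tool was an interactive web application, and the data file and columns here are hypothetical):

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical model output: one row per contract with a cost and a
# modeled fraud-risk score between 0 and 1.
contracts = pd.read_csv("contract_scores.csv")

fig, ax = plt.subplots()
dots = ax.scatter(
    contracts["x"], contracts["y"],        # map layout coordinates
    s=contracts["cost"] / 1e4,             # bigger circle = bigger cost
    c=contracts["fraud_risk"],             # redder = higher modeled risk
    cmap="RdYlGn_r", alpha=0.7,
)
fig.colorbar(dots, label="modeled fraud risk")
ax.set_title("Contracts by cost (size) and fraud risk (color)")
plt.show()
```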

Now departments come to him with requests, and they assign someone from their own group to work with Jones’ team. Once the output is ready, the person returns to their group and teaches others how to use it. Jones doesn’t have hard return-on-investment numbers yet but will be reporting on increased arrests, dollars recovered and other metrics.

There were also unanticipated benefits. “Now people understand that we can be proactive. That was a return we didn’t have on our list,” he says, and new investigators who don’t have 25 years of experience and connections can get up to speed quickly. “These tools can level the playing field,” Jones says.

Jones’s advice: Get close to your customer, get professional help building your first model and present the results in a compelling, easy-to-understand way. “We didn’t have the right people or expertise to begin with. We didn’t know what we didn’t know,” he says, so Jones turned to an outside data-mining expert to help with the models. “That relationship helped us understand why we failed and kept us from making the same mistakes again.”

Overcoming business skepticism

While hiring a consultant can help with some of the technical details, that’s only part of the challenge, says John Elder, principal with Elder Research, a consultancy that worked with Jones’ team. “Over 16 years we have solved over 90 per cent of the technical problems we’ve been asked to help with, but only 65 per cent of the solutions have gone on to be implemented.”

The problem, generally, is that the people the model is intended to help don’t use it. “We technical people have to do a better job making the business case for the model and showing the payoff,” Elder says.

Organized analytics: Who runs the practice?

Where you locate an analytics practice may be as critical a success factor as the team’s skills, experts say. Should you embed it within the IT organization, establish it as an independent group or embed the function within each business unit? Here are the tradeoffs.

Option 1: Run analytics as an IT operation. IT has the data expertise. Making the analytics team part of IT helps foster a common sense of purpose, and that collaborative relationship may speed the integration of analytics with other enterprise applications. But without a close working relationship with stakeholders, such groups risk producing great models that no one uses. Analytics groups may also disappear into the IT fabric, says John Elder, principal at consultancy Elder Research. “I wouldn’t embed them in IT because other priorities might take over.”

Option 2: Let each business unit operate its own analytics group. Keeping data analysts embedded in the individual businesses ensures alignment with business needs and facilitates collaboration. However, the relationship with IT, which manages the data, may be more distant. While analysts want to innovate, IT may be more concerned about system availability and performance demands. Some members of the IT group may view analytics as an annoyance, says Dean Abbott, president of consultancy Abbott Analytics Inc., “because it means more hours in their day dedicated to things they don’t care about.”

Option 3: Create a shared services group. This approach allows for standardization of a common set of models and methods, eliminating redundancy and increasing productivity. But being outside of the business team can reduce buy-in by the business, as Bryan Jones, director of the USPS Office of Inspector General’s countermeasures and performance evaluation unit, discovered. “We couldn’t get much traction because we were an outside entity,” he says, so he moved analysts into the business groups. Today, however, the organizations send auditors and investigators to his services group to help develop new models. “We’re an organizational support group again,” he says.

Procter & Gamble has a shared services group that reports to the CIO, but embeds some 300 analysts in the individual businesses to act as “trusted advisor” to the company’s presidents and general managers, says director of business intelligence Guy Peri. Using an agile development model, the group can tweak an existing model within 24 hours, deploy new ones within 30 days and scale a good one out to other business units within 90 days.

Ultimately, the right decision depends on the existing structure of the organization, says Gartner Inc. analyst Gareth Herschel. “If you’re IT-centric, put them in IT. If you have business units all doing different things, embed the groups in the business. If the businesses share customers and suppliers and have product overlap, use a shared services group.”

Convincing decision makers to use the results can be as difficult as getting them to go along with the project in the first place, because the predictions may be the exact opposite of what their business intuition tells them, says Anne Robinson, president-elect of the Institute for Operations Research and the Management Sciences (Informs), the professional society for business analytics. “As you get more involved with analytics it becomes counter-intuitive. But it’s those deviations from what you’re doing that bring the rewards, because when the results are intuitive you find that most people are already doing them.”

Several years ago Cisco Systems created “propensity to buy” models — to figure the probability that customers will buy this quarter or next, or never. The models cover every product in every sales territory. The salespeople felt they already knew what some of the people identified by the model were going to buy, so Cisco excluded those sales when calculating the return on its effort. “The first year we did it, we generated $1 billion in sales uplift,” says Theresa Kushner, Cisco’s senior director of customer and influencer intelligence.

Peri learned the hard way that 80 per cent of a predictive analytics project is cultural. “I came in naively thinking that if I had a model that does all of these great things it will just work. But you have to be aware of how people make decisions and how it will transform that process.”

P&G once developed a model designed to provide an “early warning” on how each business was going to perform. “It was actually quite accurate, but the warnings were given in such a way that people didn’t understand how to take action on them, and so we didn’t get the proactive decisions we wanted,” he says. Lesson learned: “Analytics is only valuable when you take action on the insight.”

People can also feel threatened by analytics. “There’s a concern initially that the model is designed to take over decision-making or doesn’t respect my business knowledge,” Peri says. Users need to understand that the predictive model serves as a decision support tool and how to use the output in their own decision-making processes.

Don’t waste time trying to get people to believe in the model, says Cisco’s Kushner. Instead, do a test and present the results. In this way you’re not countering their knowledge: The science is. “This is math; this is fact; this is statistics. You have an experiment to line up against what they thought they believed.”

Ultimately, however, predictive analytics is forcing a showdown between data-driven and intuition-based decision making, says Eric Siegel, president of the analytics training firm and conference organizer Prediction Impact Inc. “That’s the big ideological battle. It’s a religious debate.”

Data: Getting to good enough

On the technology side, both building the model and preparing the data can be stumbling points. Predictive analytics is an art as well as a science, and it takes time and effort to build that first model and get the data right, says Abbott. “But once you build the first one, the next one is much less expensive to model” — assuming you’re using the same data. Analysts building a new, entirely different model that uses different data might find that project just as time-consuming as the first. Nonetheless, he says, “The more experience one gains, the faster the process becomes.”

Data preparation issues can quickly derail a project, says Siegel. “The software vendors skip that point because all of the data in the demo has already been put into the correct format. They don’t get into it because it’s the biggest obstacle on the technical side of project execution — and it can’t be automated. It’s a programming job.”
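What that programming job looks like varies by project, but a toy Python sketch gives the flavor. The feeds and column names below are hypothetical stand-ins for the key normalization, deduplication and joins a real effort involves.

```python
import pandas as pd

# Two hypothetical feeds that use slightly different account keys.
tickets = pd.read_csv("ticketmaster_export.csv", parse_dates=["event_date"])
concessions = pd.read_csv("concession_sales.csv")

# Typical cleanup: normalize join keys, drop duplicate records.
for df in (tickets, concessions):
    df["account_id"] = df["account_id"].str.strip().str.upper()
tickets = tickets.drop_duplicates(subset=["account_id", "event_date"])

# Roll transactional detail up to one row per account, the grain the
# model actually trains on, and fill gaps left by the join.
spend = concessions.groupby("account_id", as_index=False)["amount"].sum()
model_table = tickets.merge(spend, on="account_id", how="left")
model_table["amount"] = model_table["amount"].fillna(0)
```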

When the Magic’s Perez got started in 2010 he grossly miscalculated the time it would take to prepare the data. “We didn’t set the right expectations. All of us were thinking that it would be easier than it was,” he says. Pulling together data from Ticketmaster, concession vendors and other business partners into a data warehouse took much longer than anticipated. “We went almost the entire season without a fully functional data warehouse. The biggest thing we learned was that this really requires patience,” he says.

“Everyone is embarrassed about the quality of their data,” says Elder, but waiting until all of the data quality issues are cleaned up is also a mistake. Usually, he says, the data that really matters is in pretty good shape. “I urge people to go ahead and make a salad and see what you can get,” he says.

Blue Health Intelligence (BHI) had no quality issues with the patient health care data coming from its 39 Blue Cross Blue Shield affiliates — but with seven years of data on some 110 million members, it had a lot of it. “Health care is way behind in analytics because of the complexity of our data,” says Swati Abbot, CEO. “People tend to run after the data and not know why they need it.” Clinical insights must come first, she says. “Then the math takes over.”

BHI developed models to predict which of its highest risk members were most likely to be hospitalized, who had an avoidable risk, who was most likely to respond to intervention and which actions were most likely to work in each case.

Initially, BHI focused on diabetes patients at highest risk for hospitalization — a patient group expected to cost the healthcare industry $500 billion by 2020. “The first round [of analytics projects] was not successful because the business stakeholders weren’t involved. It became an IT exercise,” she says, and because the reports didn’t say why a given patient was at high risk, clinical professionals ignored them.

So Abbot made sure that users understood the underlying “risk drivers” in the model, explaining not just who was at high risk but why, and what interventions would be most likely to improve the outcome.

“Provide transparency,” she advises, “and serve up the information in the right way, so it’s not disruptive to workflows.”
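One common way to deliver that kind of transparency, sketched below with hypothetical features and data rather than BHI’s actual system, is to report per-patient risk drivers: with a linear model, each feature’s coefficient multiplied by the patient’s standardized value is that feature’s contribution to the score, so a report can say why someone scored high, not just that they did.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical member-level features and hospitalization outcomes.
members = pd.read_csv("member_features.csv")
features = ["a1c_level", "er_visits_12mo", "missed_refills", "age"]
X, y = members[features], members["hospitalized"]

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

def risk_drivers(member_row: pd.DataFrame, top_n: int = 3):
    """Top features pushing one member's modeled risk upward."""
    z = scaler.transform(member_row[features])[0]
    contrib = model.coef_[0] * z          # per-feature contribution
    order = np.argsort(contrib)[::-1][:top_n]
    return [(features[i], round(float(contrib[i]), 2)) for i in order]

# e.g. risk_drivers(members.iloc[[0]]) might return
# [("er_visits_12mo", 1.4), ("missed_refills", 0.9), ("a1c_level", 0.3)]
```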

Iterate first, scale later

At Intuit every project starts small and goes through continuous cycles of improvement, says George Roumeliotis, data science team leader. “That’s our process: Iterative and driven by small scale before going big.” The financial services business started using predictive analytics to optimize its marketing and upsell efforts, and now focuses on optimizing the customer experience with its products.

Intuit developed predictive algorithms that anticipate how customers will categorize financial transactions in products such as Mint and QuickBooks, so the software can suggest a category as users enter new transactions. It also proactively offers content and advice as customers use its products, in an effort to anticipate questions before the user has to ask.
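Intuit hasn’t published its implementation, but one plausible shape for such a suggestion feature, sketched here with made-up training data, is a text classifier trained on how customers categorized past transaction descriptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Illustrative history: raw transaction descriptions and the categories
# users assigned to them.
past = [
    ("SHELL OIL 57442", "Fuel"),
    ("CHEVRON 0091224", "Fuel"),
    ("STARBUCKS #0812", "Meals"),
    ("STAPLES STORE 1123", "Office Supplies"),
]
texts, labels = zip(*past)

# Character n-grams cope with merchant codes and store numbers.
suggester = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MultinomialNB(),
)
suggester.fit(texts, labels)
print(suggester.predict(["SHELL SERVICE 7741"]))  # likely ['Fuel']
```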

“Start with a clearly articulated business outcome, formulate a hypothesis about how the process will contribute to that outcome, and then create an experiment,” he says. Through A/B testing, analysts can gain the confidence of business leaders by creating parallel business processes and demonstrating a measurable improvement in outcomes.
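A minimal version of that kind of experiment, with illustrative counts and a standard two-proportion test, might look like this:

```python
# Compare a model-driven process (treatment) against the current one
# (control) on the same outcome metric. Counts here are made up.
from statsmodels.stats.proportion import proportions_ztest

successes = [620, 540]        # treatment, control conversions
exposures = [10_000, 10_000]  # customers in each arm

z_stat, p_value = proportions_ztest(successes, exposures)
lift = successes[0] / exposures[0] - successes[1] / exposures[1]
print(f"lift = {lift:.2%}, p = {p_value:.4f}")  # small p => credible lift
```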

Just be sure to start by choosing an existing business process that can be optimized with minimal risk to the business, he advises. Customer support, retention and user experience are great places to get started.

While predictive analytics projects can require a substantial investment up front, return-on-investment studies show that they pay off, as Cisco’s experience demonstrates. Even small-scale projects can have an enormous impact on the bottom line. “Predictive analytics is about projecting forward and transforming the company,” says Peri.

The risks are high, but so are the rewards, says Informs’ Robinson. “Take it to the end,” she says. “Be successful. And act on what you learn.”
