Putting predictive analytics to work
Good business sense—but where do you begin?
January 2012
Table of contents
The heart of the matter: Predictive analytics: Businesses want to do it, but few know where to begin
An in-depth discussion: Asking the right questions up front is a keystone of success
What this means for your business: Companies that do their homework can reap multiple benefits
The heart of the matter

Predictive analytics: Businesses want to do it, but few know where to begin
Today’s companies have vast amounts of existing data on hand. And, as if this were not complicated enough, they continue adding to the pile by collecting extraordinary amounts of data. Designing an effective predictive analytics initiative—one that stems from a careful review of the business, the problem to be resolved, and the desired outcome—is a complex undertaking that requires a significant investment of time and effort up front. As we see it, before setting out to design and implement an analytics initiative, companies should step back and find answers to some key questions.

If your company is ready to leverage predictive analytics for a specific problem or decision, first ask yourselves: What will the cost be to our company if management makes a ‘wrong’ business decision? What types of business decisions are we looking to inform with predictive data? Is there a correct tool we should plan on using? The answers to these and other key questions will help you identify and successfully engineer a solution to your company’s specific analytics goals. The information garnered during this process will contribute, in large part, to a “how-to” guide that will serve as a strong foundation on which to build your company-specific predictive analytics initiative.

The bottom line: Companies that invest the time and effort up front to inform decision making and carefully tailor their analytics strategy can move forward confidently to implement a well-planned initiative—one that is designed to meet identified analytic goals, and reap multiple benefits stemming from better use of their data. When thoughtfully designed and properly implemented, predictive analytics can deliver improved operations and significant revenue growth.
An in-depth discussion

Asking the right questions up front is a keystone of success
So, you have identified a few problems you’d like to tackle with predictive analytics and you’re about to invest in this initiative. But first, there are some things you need to consider. Finding answers to the right questions early on is a critical success factor. Ask yourselves:

1) What would the cost to our company be if management were to make a ‘wrong’ decision on a specific problem?

Before embarking on a costly initiative, you will want to determine the extent to which you should run extensive analyses to determine the return or cost savings of a proposed strategy or new action. Often, the best way to answer that question is to understand the cost of a wrong decision—‘wrong’ meaning that an undesirable outcome of the strategy or event occurs. For example, let’s say a company decided to send a piece of direct mail to a particular customer, but the mailing failed to elicit the desired response from the recipient. The company made a wrong decision by sending the mail to that particular recipient. But other factors should be considered: What was the margin on that effort? What was the overall response rate for that mailing? And how does that compare to the return?

To illustrate this point: Let’s assume that the proactive manager of a hotel is looking for ways to attract new customers during the holiday season, including those of the local competition. To that end, she makes it her personal goal to do all that she can to create a quality customer experience.
In keeping with this goal, the manager spends as much time as possible in the lobby area, making herself available to resolve any issues, or simply acting as a warm and welcoming presence. She soon observes that families with small children form a large percentage of the hotel’s holiday clientele. Realizing that happy children produce happy parents—and that family groups generate significant revenue—the manager decides to make families a priority.

Since toys are often the way to a child’s heart, she decides to purchase and wrap a variety of toys, and to keep the “toy sleigh” on hand for visiting children. Whenever the manager spots a family with small children, out comes the sleigh. To their delight, each child is invited to choose a toy—even if the family was not staying at the hotel and had only eaten at the restaurant.

Imagine the value that can stem from each of these small interactions. Specifically, these families are now more likely to return to this hotel whenever they are in the area because of this positive interaction. If, for example, a family were to return and book a suite for a week, this savvy manager will have converted a few nice toys into a stay that would generate over $2,000 in room charges, meals, and parking. And even if families that made their home in the neighborhood only returned to dine in the hotel’s restaurant, the value would still exceed the investment.
Now suppose that one of the families in the above example did not return to that hotel after all. What would have been the cost of the manager’s wrong decision? The hotel would only be out the cost of a few toys—a negligible cost compared to the potential upside. In this case, does it make sense for the hotel to build a predictive model to determine which lobby guests should be offered toys in the hope that they will come back? Probably not! But when the cost of being wrong is high, we believe a predictive solution does make sense. For example, an $8 billion beverage distributor will lose sales revenue when products are out of stock on a retailer’s shelf. If out-of-stocks for the distributor are more than twice the industry average, does it make sense to invest in a predictive solution? Of course! This simply serves to underscore our belief that determining the cost of making a wrong decision early on can save a company from making low-value investments in predictive analytics.
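To make this concrete, here is a minimal sketch, in Python, of the back-of-the-envelope arithmetic for the direct-mail example above. Every figure in it (mailing cost, response rate, margin per responder) is an assumption invented for illustration rather than data from any client.

```python
# Back-of-the-envelope cost of a 'wrong' direct-mail decision.
# All figures below are assumed for illustration only.

cost_per_piece = 1.50        # printing + postage per mail piece (assumed)
pieces_mailed = 100_000      # size of the campaign (assumed)
response_rate = 0.02         # share of recipients who respond (assumed)
margin_per_response = 40.00  # gross margin earned per responder (assumed)

responders = pieces_mailed * response_rate
campaign_cost = pieces_mailed * cost_per_piece
campaign_margin = responders * margin_per_response

# Cost of each 'wrong' decision: a piece sent to someone who never responds.
wrong_decisions = pieces_mailed - responders
cost_of_wrong_decisions = wrong_decisions * cost_per_piece

print(f"Net return:              ${campaign_margin - campaign_cost:,.0f}")
print(f"Cost of wrong decisions: ${cost_of_wrong_decisions:,.0f}")
print(f"Cost per wrong decision: ${cost_per_piece:,.2f}")
```

When the cost per wrong decision is a couple of dollars against a healthy margin per responder, a predictive model may never pay for itself; when it is lost shelf sales at an $8 billion distributor, it almost certainly will.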
2) What types of decisions do we want to make?

Once a company has determined a need for an investment, it is important to categorize the type of decision to be addressed. This will inform and guide the modeling approach and related resource allocations toward specific skill sets and technologies.

Decisions can be broadly categorized as either operational or strategic. Operational decisions are those that have a specific and unambiguous ‘correct’ answer, whereas with strategic decisions an unambiguous correct answer is not always apparent. This is because strategic decisions often have a cascading effect on adjacent and related system dynamics that can sometimes be both counterintuitive and counterproductive to the original strategic goal.

We see many firms focusing their analytics efforts on operational decisions, while very few have tried to integrate analytics into their strategic decision-making process.

To illustrate this point: Let’s suppose that a healthcare firm is facing two problems—1) predicting a fraudulent insurance claim, and 2) predicting the impact of lowering reimbursement rates for patient readmissions in hospitals.
Regarding the first problem, an insurance claim is either fraudulent or it is not. The problem is specific and there is an unambiguous correct answer for each claim. Most transaction-level decisions fall into this category. But the second problem is more complicated. The firm is hypothesizing that lower rates would incent physicians and hospitals to focus on good patient education, quality follow-up outpatient care, and complete episodes of care for patients during their time in the hospital. This would presumably lower costs for the health plan. However, it could also lead to delayed discharge of patients during their first admission, or to physicians treating patients in an outpatient facility when they should be in an immediate-care setting.

Think about it: This cascading effect may actually increase the cost of care and undermine the intent of the original policy. Strategic decisions have myriad causal loops that are not readily apparent, which makes an unambiguous right answer hard to nail down.
How should companies attack each of these problems? From where we stand, operational decisions call for both established statistical techniques, such as regression and decision-tree analysis, and artificial intelligence techniques, such as neural networks and genetic algorithms. The key focus in our example is to predict, based on historical data, whether or not a claim is fraudulent.
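To illustrate the operational case, the sketch below scores claims as fraudulent or not with logistic regression, one of the established techniques named above. The features, labels, and data are synthetic stand-ins for a real historical claims extract, so treat it as the shape of a solution rather than the solution itself.

```python
# Minimal sketch: classifying claims as fraudulent or not with an
# established technique (logistic regression). The data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000
claims = pd.DataFrame({
    "claim_amount": rng.gamma(shape=2.0, scale=1_500, size=n),
    "provider_visits": rng.poisson(3, size=n),
    "days_since_service": rng.integers(0, 180, size=n),
})
# Synthetic label: fraud is more likely for large, late-filed claims.
log_odds = 0.0004 * claims["claim_amount"] + 0.01 * claims["days_since_service"] - 4
claims["is_fraud"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = claims[["claim_amount", "provider_visits", "days_since_service"]]
y = claims["is_fraud"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Each claim gets an unambiguous score: how likely is it to be fraudulent?
scores = model.predict_proba(X_test)[:, 1]
print("Holdout AUC:", round(roc_auc_score(y_test, scores), 3))
```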
For the strategic problem, however, predictive modeling approaches that are more explanatory in nature are needed, since it is critical to understand systemic relationships and feedback loops. A model should accurately capture the nature and extent of relationships between various entities in the system and facilitate testing of multiple scenarios. In the readmission policy example above, a model will help to determine the cost impact based on the various scenarios of provider adoption and behavior change—the percentage of providers and hospitals that will improve care versus those that will not adapt to the new policy. Simulation techniques—such as system dynamics, agent-based models, Monte Carlo and scenario modeling approaches—are appropriate for such problems. In fact, one of the benefits of using simulation techniques is that historical data isn’t necessary in the traditional sense. One could actually develop useful assumptions about the nature and extent of relationships by doing primary or secondary research, developing proxies, leveraging expert opinions, etc. These techniques are quite effective when there is a lack of historical data.
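The strategic case can be sketched just as compactly with a simulation. The Monte Carlo example below estimates the cost impact of the readmission policy under different levels of provider adoption; the adoption rates and cost effects are assumptions made up for illustration, not findings from any engagement.

```python
# Minimal Monte Carlo sketch of the readmission-policy question: what is the
# cost impact under different rates of provider adoption? Every number here
# is an assumption chosen for illustration.
import random

random.seed(42)
N_RUNS = 10_000
BASELINE_COST = 100.0          # cost per episode today (index units)

def simulate_episode(adoption_rate):
    """Cost of one episode of care under the new reimbursement policy."""
    if random.random() < adoption_rate:
        # Provider improves education and follow-up care: fewer readmissions.
        return BASELINE_COST * random.uniform(0.80, 0.95)
    else:
        # Provider games the policy: delayed discharge, or outpatient
        # treatment of patients who need immediate care, raises cost.
        return BASELINE_COST * random.uniform(1.00, 1.25)

for adoption_rate in (0.3, 0.5, 0.8):
    avg = sum(simulate_episode(adoption_rate) for _ in range(N_RUNS)) / N_RUNS
    change = (avg / BASELINE_COST - 1) * 100
    print(f"adoption {adoption_rate:.0%}: cost change {change:+.1f}%")
```

Note that no historical dataset is required: the assumed cost ranges could instead come from primary or secondary research, proxies, or expert opinion, exactly as described above.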
Bottom line: It is important to remember that strategic and operational decisions need different predictive modeling approaches. There are two questions to ask: First, is the decision you want to drive operational or strategic in nature? And, second, are you using the appropriate modeling approach and tools?
3) So, how do we get going? Is there a correct tool we should plan on using?

Actually, choosing the right tool is the easy part! The popular statistical techniques frequently used in business analytics, such as linear regression and logistic regression, are more than half a century old. System dynamics was developed in the 1950s, and even neural networks have been around for more than 40 years. SAS was founded in 1976, while the open-source statistical tool R was developed in 1993. So the tools, their benefits, and their limitations are already well understood. Instead of focusing on choosing the best tool, companies seeking to invest in a predictive analytics initiative should focus on these important aspects of a predictive analytics solution:

• Unambiguous definition of the business problem
• Clear analysis pathway toward desired outcomes
• Thorough understanding of various internal datasets
• Clear idea of the business value of the proposed investment, along with a sense of the potential return

However, while choosing the right tool is not a great mystery, the real challenge lies in finding an experienced user who understands the pros and cons of each tool and adapts the techniques to solve the identified problem. Companies will be well served by investing in the right analytical team—one that can actually be a source of competitive advantage. This notion is understood within the analytics-practitioner community, but the same cannot be said of business users and executives. In our experience, many senior executives still believe that ‘cutting-edge’ techniques such as neural networks should be used to solve their business problems, and that predictive analytics tools are a key differentiator when selecting analytics vendors. That’s why we believe the analytics community needs to do a better job of educating business users and senior executives—helping them understand that vendor selection should depend upon the specific business problem, and that the latest and greatest tool may not be the best answer to their business challenge.
4) Have we considered third-party data?

In predictive analytics initiatives, companies place a large focus on the entirety of their internal data—and rightly so. However, it is also important to consider that incorporating relevant third-party data into the analysis and the decision-making process can offer valuable supporting information that few companies leverage. Investments in targeted, relevant datasets can generate far greater returns than time spent developing sophisticated models that are data-poor. Some degree of creativity must be employed here to think through not only what data could possibly be relevant but also how to measure or procure this data. It’s not as easy as it would seem!
To illustrate this point: Now suppose that a wedding-gown retailer wants to pursue a geographical expansion strategy. How would the company determine where to open new stores? How should it evaluate the performance of existing stores? Should a store in the Chicago suburbs produce the same volume of business as a store in Austin, Texas? To answer these and similar questions, an organization would need a great deal of data that is not within its firewalls—specifically: How many competitors’ stores sell wedding gowns in the same area (competitive market)? How far are potential brides willing to travel to buy a wedding gown (travel consumption preferences)? And what are the income and spend profiles of people in the market (customers’ budget set)?

Data about existing store sales and the customer base are important, but they tell only part of the story and do not provide the entire context on which to base smart decisions. Other potentially useful inputs to the puzzle may include:

• Marriage registration data (from the National Center for Health Statistics)
• Socio-demographic data (from a company such as Claritas or Experian, or from the US Census Bureau)
• Business data (from Dun & Bradstreet, Hoovers or InfoUSA)
• Real-estate prices and custom survey data of potential brides

All of these variables should be woven into the retailer’s store-location analysis.
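As a rough illustration of how such third-party data might be blended with internal figures, the sketch below compares each market’s actual sales with a naive expectation built from marriage registrations and competitor counts. All of the names and numbers are hypothetical, and a real store-location model would be considerably richer.

```python
# Minimal sketch of blending internal and third-party data for the
# store-location question. Dataframes, markets, and figures are hypothetical.
import pandas as pd

stores = pd.DataFrame({           # internal: existing store performance
    "market": ["Chicago suburbs", "Austin"],
    "annual_sales": [2_400_000, 1_800_000],
})
external = pd.DataFrame({         # third-party: registrations, demographics,
    "market": ["Chicago suburbs", "Austin"],   # and competition
    "annual_marriages": [31_000, 12_000],
    "median_income": [82_000, 78_000],
    "competing_stores": [14, 5],
})

markets = stores.merge(external, on="market")

# A naive benchmark: expected sales proportional to addressable demand per
# competitor, rescaled to the chain's overall sales rate. A richer model
# would also use income, travel preferences, and real-estate cost.
markets["demand_per_competitor"] = (
    markets["annual_marriages"] / (markets["competing_stores"] + 1))
rate = markets["annual_sales"].sum() / markets["demand_per_competitor"].sum()
markets["expected_sales"] = markets["demand_per_competitor"] * rate
markets["vs_expectation"] = markets["annual_sales"] / markets["expected_sales"]

print(markets[["market", "annual_sales", "expected_sales", "vs_expectation"]])
```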
5) How can we make analytical insights available to the decision makers at the point of the decision?

In our experience, successful decision delivery is challenging, as it requires cross-organizational coordination between the analytics, business, and IT groups to ensure that information lands in the hands of the decision maker at the point of decision. The job of the analytics group is to make their analysis results relevant to the decision maker’s workflow. This could take the form of a mobile handheld device for a distributed sales force, Customer Relationship Management (CRM) system integration for call centers, or an executive dashboard integrated with the reporting system. All of these examples require close collaboration with the IT group, whose job it is to take the results of a predictive model and integrate them with the relevant front end or reporting infrastructure. There is also a need for training to ensure that end users know what to do with the information. The program management effort required to execute such a cross-organization initiative is significant and, all too often, unanticipated by the project sponsors.
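One minimal, hypothetical sketch of that hand-off between the analytics and IT groups: the analytics team publishes its model scores to a table that the CRM or dashboard already reads. The table name, the scores, and the use of SQLite are assumptions chosen for illustration; in practice this step is where IT integration, training, and program management carry most of the weight.

```python
# Minimal sketch of the analytics-to-IT hand-off: write model scores to a
# table the CRM or dashboard reads. Table, scores, and storage are assumed.
import sqlite3
import pandas as pd

def publish_scores(scored: pd.DataFrame, db_path: str = "crm.db") -> None:
    """Replace the score table that the front-end application reads."""
    with sqlite3.connect(db_path) as conn:
        scored.to_sql("customer_scores", conn, if_exists="replace", index=False)

# Scores produced upstream by the predictive model (values are made up).
scored = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "score": [0.91, 0.42, 0.77],
    "recommended_action": ["call", "email", "call"],
})
publish_scores(scored)
```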
Underlying all of this is the fact that a good program manager is critical to most complex predictive analytics projects. He or she is tasked with coordinating the various stakeholders and achieving alignment on problem definition, outcome format, technology integration, and training to drive user adoption of predictive analytics solutions. These individuals also play a critical role in the final steps of a project—measuring and assessing the efficacy of the models. We have seen predictive analytics projects derail due to lack of coordination among the various groups within the organization and/or underinvestment in program management resources.
To illustrate this point: A physician was struggling with the fact that some of her patients were not taking the medications she prescribed for them because they believed they could not afford to purchase them. As a result, before going to work, she took the time to log on to the website of a large pharmacy chain and print out a list of popular generic drugs that were covered by that chain’s low-cost prescription plan. She would then check this separate list when writing prescriptions, so she could inform her patients that they could, in fact, afford their medications after all. Why were patients and their doctors not able to access this information automatically? The chain’s covered-drug list was not integrated with the mobile medical and drug decision-support software that doctors use to verify drug dosage and interactions prior to writing prescriptions. Therefore, when it came time to write prescriptions, physicians who had not made a personal effort to inform themselves (as our physician did) had no way of knowing whether or not a specific drug was covered within the chain’s plan. Since the right information was not made available at the point of decision, patients likely paid too much for their medication or simply did not fill the prescriptions—neither of which was an optimal outcome. The situation has since improved, because the chain’s low-cost prescription drug list has been integrated into the drug decision-support software application.

Resolving delivery problems or ‘last-decision-mile’ issues such as this is critical to a successful analytics initiative.
6) How can we begin to leverage data visualization?

Good data visualization can communicate the information that drives smart decisions. Above we touched on the importance of the last-decision mile, where analysts should ensure timely and appropriate information for decision makers. That said, it is never easy to convince decision makers to deviate from their modus operandi. Specifically, convincing them of the efficacy of a solution based on predictive analytics and getting them to approve change typically takes more than an R-squared or a mean absolute percentage error.

[Figure: Relationship of client market spend to cost per call and number of inbound sales calls. The chart plots cost per call (roughly $100 to $450) against the number of inbound sales calls (roughly 700 to 1,100) for Models I and II, with spend levels of $120–$150K, $250K, and $350K marked; the marginal cost per call increases from $200 to $800, or by 400%.]

Recently, an online services client was using multiple mass-market media channels to execute their marketing strategy. The strategy generated a lot of inbound sales calls, but the client lacked a clear understanding of which media channels were contributing in what amounts to the sales activity. PwC was able to accurately model the relationship of spend by media channel to response activity. With the model, we could flex spending inputs by level and channel to predict the effect on sales activity, cost per unit of activity, and cost per closed sale. Rather than using equations and tabular data to demonstrate these findings, we used a simple but effective visualization. Specifically, we were able to show that spending an incremental $100,000 beyond the base-level media spend of $150,000 only generated an additional 125 sales opportunities. Consequently, the marginal cost of the sales opportunities rose from about $200 per opportunity to almost $800! Factoring in conversion rates and customer value, it became clear the incremental spend was not viable. The simple picture shocked the audience into rethinking their media strategy!
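The arithmetic behind that picture is simple to reproduce. The sketch below recomputes the marginal cost from the spend and opportunity figures quoted above and draws an illustrative version of the curve; the plotted call volumes and cost-per-call points are assumed for demonstration, not the client’s actual data.

```python
# Marginal cost per incremental sales opportunity, plus an illustrative
# version of the 'money-visual'. Spend and opportunity counts come from the
# example above; the plotted points are assumptions.
import matplotlib.pyplot as plt

incremental_spend = 100_000          # spend beyond the $150K base level
incremental_opportunities = 125      # extra inbound opportunities generated
marginal_cost = incremental_spend / incremental_opportunities
print(f"Marginal cost per incremental opportunity: ${marginal_cost:,.0f}")  # ~$800

# Illustrative curve only: cost per call climbs steeply once spend pushes
# call volume past the point of diminishing returns.
calls = [750, 850, 950, 1_050]          # assumed inbound call volumes
cost_per_call = [150, 200, 400, 800]    # assumed cost per call at each volume
plt.plot(calls, cost_per_call, marker="o")
plt.xlabel("Number of inbound sales calls")
plt.ylabel("Cost per call ($)")
plt.title("Media spend vs. cost per call (illustrative)")
plt.show()
```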
To understand the importance of visualization, consider this: In 1854, a visualization map by Dr. John Snow saved many lives. When a cholera outbreak struck the Soho district of London, Dr. Snow began working to uncover how the deadly disease was spreading so quickly through that area. At the time, little was known about how diseases were transmitted. After interviewing residents in the area, Snow hypothesized that a water pump was the source of the outbreak. He created a map to illustrate that cases were centered around the pump. Thanks to this visualization, he succeeded in his effort to have the authorities shut down the pump, thereby helping to prevent further spread of the disease.
We believe every predictive analytics project needs a single ‘money-visual’ that ties together the analyses and makes an unequivocal demand for change or a call to action.
7) How important to ongoing success is prototyping, piloting and scaling?

Truth be told, Thomas Edison did not invent the light bulb. Rather, he commercialized it! Edison took a working concept and developed hundreds of prototypes, rapidly tested them, and then figured out the improvements that were required to scale the invention for commercial use. This took into account both economics and manufacturing. This says it all:

“In 1879, Edison obtained an improved Sprengel vacuum pump, and it proved to be the catalyst for a breakthrough. Edison discovered that a carbon filament in an oxygen-free bulb glowed for 40 hours. Soon, by changing the shape of the filament to a horseshoe it burned for over 100 hours and later, by additional improvements, it lasted for 1500 hours.”
—Julian Trobin [http://www.juliantrobin.com/bigten/bulbexperiment.html]
PwC took a page from Edison’s book when we developed our own ‘prototype, pilot and scale’ approach to deploying analytics solutions for our clients. Rapid prototyping is essential to showcasing the value and viability of the initiative to the senior executives who are needed to drive such projects, by proving that the predictive model can work. Piloting helps to refine the prototype and plan for potential adoption pitfalls among end users. Finally, scaling effectively implements the sum of the necessary changes or solutions across the identified scope to achieve the identified value.
The following example illustrates the importance of the prototype, pilot and scale approach.

To illustrate the complexity of these projects—a real-world example: One of our healthcare clients wanted to institutionalize a data-driven culture within its sales organization. Specifically, they requested our help in identifying and focusing sales efforts on high-potential customers. However, they also faced two problems that, if unresolved, would get in the way of a successful national rollout. First, there were skeptics among the sales personnel who did not trust the model—a situation that would make it difficult for the company to enact any suggested changes. Second, change management blind spots existed in the company’s current decision-making process which—coupled with the de facto processes that govern day-to-day activities—would be difficult to incorporate into the strategic initiative, given that they are not always perceptible or documented.

We recognized that an understanding of the reality of how things get done helps to anticipate roadblocks and align all processes to achieve a more successful strategy that everyone in the company could believe in. So, we set out to identify and focus sales efforts on high-potential customers and resolve these two stumbling blocks. First, we developed a prototype of a predictive scoring model that identified high-potential customers. Then we mapped the results of that model to existing efforts, which showed that greater than 50 percent of the sales force’s time was being used ineffectively, and that there was money left on the table. We designed a pilot with the following objectives:

• Prove the validity of the predictive model
• Create ‘evangelists’ from the sales teams of the pilot regions
• Identify the big data gaps and establish a process for continually refining CRM data
• Establish and refine the key performance metrics to report to senior management
• Understand the key questions and concerns of the sales team in adopting the system
During the pilot phase, we collected a great deal of rich quantitative and qualitative data. That data not only conclusively proved the predictive model’s impact, but it also provided us with insights to incorporate into the rollout process. To cite two examples: some customer addresses were not updating in the data warehouse, and sales managers wanted to understand the factors behind the predictive customer score before they felt comfortable using it. Scaling the pilot required cross-organization coordination and strong program management to ensure that the pilot’s lessons learned were incorporated in the rollout. Positive word of mouth regarding the solution and minimal impact on day-to-day business were also priorities.

Finally, we designed the compensation rules and reporting metrics around feedback garnered during the pilot, which helped us to build a system that achieved buy-in from sales force leadership.

Our client saw a significant uplift in revenue after the first three months of rollout. By that time, the sales organization had come to appreciate the value of a data-driven approach, to the extent that they hired a team to support other sales analytics initiatives.
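A hypothetical sketch of the ‘map model results to existing effort’ step described above: compare where sales visits currently go with where the model says the potential sits. The accounts, scores, and visit counts are invented for illustration.

```python
# Minimal sketch of mapping model scores against current sales effort to
# estimate misallocated time. All accounts, scores, and visits are invented.
import pandas as pd

accounts = pd.DataFrame({
    "account_id": [1, 2, 3, 4, 5, 6],
    "model_score": [0.92, 0.85, 0.15, 0.10, 0.70, 0.05],   # predicted potential
    "visits_last_quarter": [1, 0, 6, 5, 2, 4],              # current effort
})

accounts["priority"] = accounts["model_score"] >= 0.5
effort = accounts.groupby("priority")["visits_last_quarter"].sum()

misallocated = effort.get(False, 0) / effort.sum()
print(f"Share of sales visits going to low-potential accounts: {misallocated:.0%}")
```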
Long story short: Whether you opt to use PwC’s approach or to develop your own, adopting predictive analytics-based solutions is essential for organizations seeking improvements to their bottom line and a place at the forefront of the competitive arena.

These are some of the lessons learned and pitfalls we have encountered as we have helped our clients adopt predictive analytics solutions. Predictive analytics should be on every company’s radar, and once a firm is ready to invest, this guide will help it get started on the journey.
What this means for your business

Companies that do their homework can reap multiple benefits

The sooner you start, the sooner you’ll reap the rewards

Clearly, designing a company-specific strategy and implementing a carefully planned predictive analytics initiative are complex undertakings that require a significant investment of up-front time and effort. Recognizing this, some leading companies have turned to outside advisors with a depth of experience to help them begin this journey. Whether you opt to leverage the experience of an external predictive analytics team or to tackle it on your own, it is imperative that you do the necessary homework before taking any action.

What this isn’t—and, more importantly, what it IS

Asking, and answering, the right questions will not provide you with a complete roadmap for a successful predictive analytics initiative, nor will it magically deliver a formal framework for determining where your company should focus its analytics investments. But what it will do is provide you with a basic guide to get you started and help you figure out the scope of both the problem to be resolved and the solution. It can also help you get a sense of potential returns on your investment. In short, with these answers in mind, your firm can more effectively engineer a solution to its sought-after analytics goals.

Dealing with these complexities takes a holistic approach

We believe that there are several extremely important facets of a successful analytics project—data, modeling, skills, business context, technology, and program management—all of which play key roles. All too often, though, we see companies focusing on just some of these components while neglecting the others, and ending up paying a steep price down the road. To spur organizations like yours down a holistic analytics path—making it possible to consistently repeat and recreate success and reach a true ‘information advantage’—we compiled the aforementioned questions.

Is it worth the time and effort? You bet!

Organizations that go through the recommended fact-finding and planning process up front and base their next steps on the input garnered during that phase can position themselves to enjoy multiple benefits. Well-planned and properly implemented predictive analytics initiatives lead to improved operations and bottom-line growth. The time to get started is now.
www.pwc.com/advisory
To have a deeper conversation about how this subject may affect your business, please contact:
William Abbott
312 961 4672
[email protected]
Amaresh Tripathy
312 909 3098
[email protected]
© 2012 PricewaterhouseCoopers LLP, a Delaware limited liability partnership. All rights reserved. PwC refers to the US member firm, and may sometimes refer to the PwC network. Each member firm is a separate legal entity. Please see www.pwc.com/structure for further details.
This content is for general information purposes only, and should not be used as a substitute for consultation with professional advisors. NY-12-0465