Agile Insights into Predictive Analytics
Updated · Jan 19, 2014
The promise of predictive analytics seems straightforward and powerful: semi-automated, flexible tools that let us predict customer and supply-chain trends faster and more accurately, leveraging Big Data for more cost-effective implementation of strategies.
And yet, one of the principles of Agile Marketing (in one version of its manifesto) is, “The process of customer discovery [is to be preferred] over static prediction.” What are agile tenets warning us about with regard to prediction, and how can IT heed that warning to deliver more effective strategic targeting?
Fundamentally, agile marketing is talking about three ways in which un-agile use of predictive tools can, and often does, lead the organization astray:
- Over-reaction to short-term data without taking into account historical context;
- Focus on the wrong aspects of the data that leads to poor prediction; and
- Reinforcement of “happy news” that leads the organization away from insights into the customer and supply chain, and toward further investment in a flawed strategy-development and implementation process.
Note that agile does not say to avoid predictive analytics; “over” in the manifesto means to shift your emphasis, not to deep-six predictive analytics. In fact, properly used in an agile process, predictive analytics may become more valuable to the organization.
So, with that understanding, let's examine each of these problems with existing predictive analytics in turn, and consider how agile would change predictive analytics' use.
Ready, Fire, Aim, Analytics
A couple of years ago, Sloan Management Review published an interesting experiment in which students were asked to manage inventory given data on a weekly basis, and then on a daily basis. Consistently, the students did worse when given daily data, a situation comparable to the flood of new, more up-to-date data available through Big Data.
The reason was that they reflexively reacted to each day's data without considering weekly order cycles, and as a result they “overshot,” re-ordering in both directions: too much and too little. Aside from ERP, there are plenty of other examples. Consider seeking excess server capacity in a panic because of a rapid ramp-up in usage just after a product is released: predictive analytics tells you that if the surge continues over the next three months you will not be able to handle the orders, while historical context tells you that the initial surge will probably not last the full three months.
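To make the overshoot concrete, here is a minimal sketch, with simulated demand and invented re-ordering policies (my illustration, not the Sloan experiment itself), contrasting a policy that chases each day's number with one that orders against a trailing weekly average:

```python
import random

random.seed(42)

WEEKLY_PATTERN = [40, 55, 60, 50, 80, 120, 95]  # hypothetical weekly demand cycle

def trailing_mean(series, t, window=7):
    """Average of the last `window` observations up to and including day t."""
    start = max(0, t - window + 1)
    return sum(series[start:t + 1]) / (t + 1 - start)

def simulate(weeks=4):
    # Daily demand = weekly cycle plus day-to-day noise.
    demand = [d + random.randint(-10, 10)
              for _ in range(weeks) for d in WEEKLY_PATTERN]

    naive = demand[:]                                       # re-order exactly what sold today
    smoothed = [trailing_mean(demand, t) for t in range(len(demand))]

    def avg_swing(orders):
        """Mean absolute day-to-day change in order size."""
        return sum(abs(orders[t] - orders[t - 1])
                   for t in range(1, len(orders))) / (len(orders) - 1)

    print(f"avg day-to-day order swing, naive:    {avg_swing(naive):5.1f}")
    print(f"avg day-to-day order swing, smoothed: {avg_swing(smoothed):5.1f}")

simulate()
```

The naive policy swings as hard as the noisiest day; the trailing average absorbs the weekly cycle that the daily view hides.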
Agile says that the problem here is that your use of the predictive tool encourages “tunnel vision” that excludes key data. The solution is a better analytic process that is incremental (“let's try this” instead of “this is how things will be”) and aimed at discovering what the customer or environment is like, rather than assuming that we already know everything worth knowing.
Thus, the process that uses predictive analytics will emphasize generating new hypotheses that move toward the truth, including historical context, and then testing them incrementally, instead of committing whole hog to a prediction; a sketch of such incremental testing follows. It's prediction in the service of discovery, not prediction that makes you continually overshoot.
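One hedged way to read “testing incrementally” is rolling-origin (walk-forward) evaluation: re-score the current hypothesis against each new period's actuals before extending the commitment. The sales figures and window size below are invented for illustration:

```python
def rolling_origin_errors(series, window=4):
    """Predict each next value from a trailing-window mean; record the miss."""
    errors = []
    for t in range(window, len(series)):
        prediction = sum(series[t - window:t]) / window   # the current "hypothesis"
        errors.append(abs(series[t] - prediction))        # tested against reality
    return errors

# Hypothetical weekly sales: an initial surge that fades, the case where
# a one-shot trend extrapolation would keep over-ordering.
sales = [120, 150, 170, 160, 140, 120, 110, 105, 100, 98, 97]

for week, err in enumerate(rolling_origin_errors(sales), start=5):
    print(f"week {week}: forecast error {err:5.1f}")

# A spike in these errors signals that the regime has shifted and the
# hypothesis needs revising, before (not after) the big commitment.
```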
Data Blindness
Many years ago, I, along with other analysts, was asked to predict how the market share of particular vendor products would change over the next year. Year after year, following the results of their surveys of IT buying plans for the year, other analysts predicted that the market share of the dominant vendor would fall, and I predicted it would rise. Year after year, I was right. Why?
The reason, in this case, was that in the lower parts of the organization the dominant vendor was extending its dominance, making replacement or commitment to a new product more difficult and making further buys of the dominant product more cost-effective when the actual buying decision was made. Because I had an open mind and heard from those lower parts of the organization, I could see where straightforward predictive analytics would go wrong.
Again, in this case, agile says that the problem is with the process and with uncritical use of the tool, rather than with the tool itself. Because you are focusing on prediction rather than on better long-term understanding of the customer, you fail to notice the invalidity of the assumptions underlying your prediction until it is too late. The solution, as before, is first to focus on discovering or improving your understanding of how the customer operates, and then to call in the predictive tool, not only to predict but also to test whether that understanding is valid; the sketch below shows one way. It's prediction in the service of data understanding, not prediction for prediction's sake.
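As a minimal illustration (the share figures are invented, not from the surveys above), checking prediction residuals is one way the predictive tool can test an assumption: if survey-based share forecasts miss on the same side year after year, the model is biased, not merely noisy.

```python
# Assumption under test: survey-stated buying plans translate directly
# into market share. Hypothetical figures for five successive years.
survey_predicted_share = [0.52, 0.50, 0.48, 0.47, 0.45]
actual_share           = [0.55, 0.56, 0.57, 0.58, 0.60]

residuals = [round(a - p, 2) for a, p in zip(actual_share, survey_predicted_share)]
print("residuals (actual minus predicted):", residuals)

# Residuals all on one side, and growing: the errors are systematic.
# The cue is to revise the underlying understanding (here, installed-base
# lock-in that the surveys miss), not to keep publishing the prediction.
if all(r > 0 for r in residuals):
    print("systematic under-prediction of the dominant vendor: revisit assumptions")
```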
“Happy News” Prediction
In any product or service rollout, there is strong emphasis on focusing on the positive; this will work, because it has to work. Inevitably, that slants predictive analytics toward insights that seem to allow additional sales with a little fine-tuning of the product, rather than insights that reveal short-term “milking of the customer” or a failure to consider the ultimate consumer of a business-to-business product, either of which can lead to unanticipated, major product-rollout failures. And each short-term success seems to validate this use of predictive analytics, meaning that more money flows into what is going to be a bigger disaster.
From “Crossing the Chasm” to more recent Sloan Management Review articles on how one's biggest customer may turn out to be less cost-effective to pamper than a raft of smaller ones, a rich literature shows the danger of reinforcing a flawed process, with or without predictive analytics.
Once again, agile says that the problem lies with a process that fails to be open to “bad news”; the fix is to build into the process, and into the use of the predictive tool, both a business-value case for detecting bad news and the ability to react rapidly to it. Practically, this can mean setting up specific guidelines that ensure predictive analytics spends as much time on “danger signals,” and on understanding the customer buying process and its evolution, as it spends on confirming the product's initial success.
It can also mean implementing a predictive-analytics version of “alerting” that specifically calls out danger signals in reports to management; a sketch follows. This is predictive analytics in the service of improving the process (i.e., making it more agile), not predictive analytics that uses “happy news” to reinforce an existing problematic process.
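Here is a minimal sketch of that alerting idea; the metric names, baselines, and thresholds are hypothetical placeholders, not a prescribed set:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    value: float           # this period's reading
    baseline: float        # historical or planned level
    drop_alert_pct: float  # alert if value falls this % below baseline

def danger_signals(signals):
    """Return alert lines for metrics that fell past their threshold."""
    alerts = []
    for s in signals:
        change_pct = (s.value - s.baseline) / s.baseline * 100
        if change_pct <= -s.drop_alert_pct:
            alerts.append(f"DANGER: {s.name} down {abs(change_pct):.0f}% vs. baseline")
    return alerts

report = [
    Signal("repeat-purchase rate", 0.31, 0.40, 15.0),              # the bad news
    Signal("new-customer sales ($)", 1_250_000, 1_000_000, 15.0),  # the happy news
]

# Danger signals get top billing in the report, not a footnote.
for line in danger_signals(report) or ["no danger signals this period"]:
    print(line)
```

The point of the design is placement: the alert line leads the report, so “happy news” cannot crowd it out.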
The Predictive Analytics Bottom Line
Briefly, then, three key agile insights into predictive analytics are:
- You are in danger of cost-ineffective over-reaction in response to the prediction. You should bake discovery and consideration of historical context into the process.
- You are in danger of complacent reliance on inappropriate data in making your prediction, leading to costly, slow reactions to problems. You should focus on a process that aims at repeatedly discovering which data is appropriate.
- You are in danger of an organizationally reinforced focus on “happy news” predictions, with initial apparent successes leading to greater disasters down the road. You should focus on a process that values understanding how the organization should change, and what the long-term consequences of short-term decisions might be; then use predictive analytics to test out scenarios rather than to lock in problems.
I should note that carrying out these fixes does not require that you adopt an agile process; rather, you can fine-tune the aims and uses of predictive analytics to handle the problems described above. However, anecdotal evidence reinforces my strong belief that your use of predictive analytics will be much better over the long term, tactically and strategically, if you do set out to create an agile predictive-analytics-using process.
Predictive analytics using Big Data, deployed correctly, can deliver major value-added insights; so can agile marketing. Why not combine the two?
Wayne Kernochan is the president of Infostructure Associates, an affiliate of Valley View Ventures that aims to identify ways for businesses to leverage information for innovation and competitive advantage. Wayne has been an IT industry analyst for 22 years. During that time, he has focused on analytics, databases, development tools and middleware, and ways to measure their effectiveness, such as TCO, ROI, and agility measures. He has worked for respected firms such as Yankee Group, Aberdeen Group and Illuminata, and has helped craft marketing strategies based on competitive intelligence for vendors ranging from Progress Software to IBM.