Big data, business intelligence and data analytics have dominated marketing and operational leadership mindshare for several years now, and there have been real breakthroughs in both computing and organizational outcomes using data analysis tools and strategies. In the federal government, for instance, the ability to collaborate and share data has enabled agencies to thwart planned terrorist attacks.
To my mind, however, the most critical challenge facing agencies and other organizations is finding ways to use data to illuminate and reduce complexity. Every organization has key “jobs-to-be-done” (a term of art from the practice of systematic innovation), and this is where the real return on data investment is needed. The common core of jobs-to-be-done in all modern organizations includes activities such as developing a strategy, managing risk and optimizing resource allocation.
Data complexity looms over all of these, and performing any of these jobs in today’s environment requires the ability to cut cubes out of the fog. Big data and data analytics play a key role, to be sure, but tools alone will not be enough.
A worthy approach to addressing complexity and enabling an effective attack on jobs-to-be-done is what I term “advanced modeling,” and what Scott Page (author of The Model Thinker and Complex Adaptive Systems) calls “The Many Models Approach.” This approach allows organizations to develop low-cost, persistent and surprisingly accurate environment-wide virtual worlds in which they can try out potential solutions at no risk.
A major benefit of the advanced modeling approach is that it enables solution development and optimization across the entire enterprise architecture, which is the Holy Grail for leaders faced with multiple constraints.
Art and science
There is both art and science to this approach. The science lies in understanding and applying the models; the art is in knowing which to apply when. At any rate, the basic idea is to follow this cycle:
- Model the mission environment using the appropriate techniques, which might include things such as business process analytics, Lean Six Sigma or game theory. (The full list of possibilities is quite long, but the good news is that most organizations already have skilled practitioners in the majority of them.)
- Test solutions through simulation, prototypes and pilots.
- Learn from the results.
- Deploy, scale and evaluate solutions that test out.
- Enrich the model (render it increasingly “robust,” that is, realistic) through successive approximation, leveraging feedback, emergent findings, data analysis and additional modeling techniques.
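The cycle above can be sketched in a few lines of Python: a toy environment model (`simulate`), candidate solutions tested risk-free inside that virtual world, and the winner selected for deployment. The two-site demand scenario, the capacity numbers and the function names are hypothetical, chosen purely for illustration, not drawn from any real deployment.

```python
import random

def simulate(allocation, demand_mean=10, trials=1000, seed=0):
    """Toy environment model: the fraction of demand two sites can serve.

    Hypothetical scenario: site 1 sees roughly twice the demand of
    site 0, and each site serves demand up to its allocated capacity.
    """
    rng = random.Random(seed)          # fixed seed: repeatable virtual world
    served = total = 0.0
    for _ in range(trials):
        for site, capacity in enumerate(allocation):
            demand = max(rng.gauss(demand_mean * (site + 1), 3), 0.0)
            served += min(capacity, demand)
            total += demand
    return served / total

def best_allocation(candidates):
    """Try each candidate solution in the virtual world; keep the winner."""
    return max(candidates, key=simulate)

# Three ways to split 30 units of capacity between the two sites.
candidates = [(15, 15), (10, 20), (5, 25)]
winner = best_allocation(candidates)
```

Enriching the model then means replacing `simulate` with a progressively more realistic version (real demand data, additional constraints, further modeling techniques) while the test-and-select loop stays unchanged.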
In a previous role, my colleagues and I used advanced modeling to illuminate several rather profound findings relative to federal counter-trafficking efforts at the U.S. southwest border. While I will not detail the findings here, suffice it to say we were able to pinpoint aspects of border security that needed improvement.
The findings required only a couple of megabytes of data, subjected to several modeling techniques and evaluated by domain and modeling experts. They gave the government a deeper situational understanding of the threat and operating environment and enabled it to allocate resources optimally, develop more effective capability development plans, and improve operational excellence.
As the 20th-century mathematician John Tukey said: “Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise.” Modeling enables the former, while common organizational practice tends to provide the latter.
The advanced modeling framework gives us a sufficiently robust method to evaluate all organizational investments along the same pathway, and it provides leaders with a roadmap for sustained competitive advantage.
For another perspective on the importance of data sharing, read my colleague Thomas Krall’s post, “How 9/11 crystallized the urgency of data sharing.”