‘Big Data’ gets all the attention, but ‘Small Data’ can be sexier

Analytics has the power and potential to fundamentally transform business outcomes.

The confluence of fast-falling costs for sensors and computing, a renaissance in the application of mathematical and statistical tools in business, and innovative software to harness them is generating a powerful new opportunity for competitive advantage in data-driven decision making.

Sexy examples of increased profitability, such as calculating optimal individual price offers on the fly from actual consumer buying behavior, gain widespread attention. Sorting through billions of unstructured particles of “Big Data” to drive better business outcomes, while elusive for most firms, has a tantalizing appeal.

However, what is usually overlooked in the effusive Big Data vendor spiel is that all firms have structured “small data” which, when exploded into its atomic elements, can tell a very different story about what is happening right now and what is likely to happen next.

A classic example of “small data” is the conventional management accounting report, which tells us, through a “rear-view mirror”, what has happened.

One way to think about being in business is to imagine that you are in a sandstorm with grains of sand swirling around you. In the middle of the storm, it is difficult to understand what is happening.

Extend the metaphor by thinking of each grain of sand as an element of “Big Data”, representing, say, a minute of an employee’s time, a minute of a machine’s usage or a dollar spent.

At the end of a selected time period, conventional accounting processes aggregate each “grain of sand” into “buckets of sand”, according to the firm’s code of accounts. The obvious problem is that the summary “bucket” information loses the benefit of granularity.
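To make that loss concrete, here is a minimal sketch in Python of a period-end roll-up. The account codes, dimensions and figures are all invented for illustration; the point is only that the “buckets” cannot be drilled back down.

```python
# A minimal sketch of how period-end aggregation discards granularity.
# Each record is one "grain of sand"; account codes and figures are invented.
from collections import defaultdict

grains = [
    {"account": "labour",    "customer": "C-101", "sku": "SKU-7", "dollars": 1.00},
    {"account": "labour",    "customer": "C-102", "sku": "SKU-9", "dollars": 1.00},
    {"account": "machinery", "customer": "C-101", "sku": "SKU-7", "dollars": 0.40},
]

# Roll each grain up into "buckets of sand" by account code...
buckets = defaultdict(float)
for g in grains:
    buckets[g["account"]] += g["dollars"]

print(dict(buckets))  # {'labour': 2.0, 'machinery': 0.4}

# ...the customer and SKU dimensions are now gone: per-customer or per-SKU
# profitability can no longer be recovered from the buckets alone.
```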

Even in exemplary, world-class implementations, this is woefully inadequate: conventional accounting does not look to the future.

Subtle, fast-changing trends usually remain “invisible” until enough time has elapsed for them to become recognizable, by which point the damage is already done.

If, however, those data “grains of sand” could be easily and economically measured and repositioned into different dimensions, powerful new insights could be gained.

Valuable outcomes may include much more accurate measurement of profitability by product or service SKU, by individual customer, or by distribution channel.

New ways of measuring a firm’s performance become feasible, such as viewing customers as investments with a lifetime value, comprising an acquisition cost, a stream of future returns and a risk profile.
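As a toy illustration of that idea, the sketch below values a single customer “investment” as a discounted stream of future returns net of the acquisition cost. The cash flows and the 10% discount rate are invented for the example.

```python
# A minimal sketch of a customer as an investment: lifetime value as the
# net present value of future returns minus the acquisition cost.
def lifetime_value(acquisition_cost, annual_returns, discount_rate=0.10):
    """Net present value of one customer 'investment' (figures hypothetical)."""
    pv = sum(r / (1 + discount_rate) ** t
             for t, r in enumerate(annual_returns, start=1))
    return pv - acquisition_cost

# Invented example: $500 to acquire, four years of returns.
print(f"Customer lifetime value: {lifetime_value(500, [300, 350, 400, 400]):.0f}")
```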

Applying optimization algorithms, well established in stock portfolio decision-making, to a firm’s customer “investments” can identify the initial less-than-optimal combination of customers, represented by point A in the graphic.

With a targeted change in the customer portfolio mix over time, it becomes possible to move towards an optimal position on the efficient frontier. For example, higher returns can be earned for the same risk (point B), or the same returns can be earned for less risk (point C).
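For illustration only, here is a rough sketch of that mean-variance (Markowitz-style) idea applied to customer segments. The segment returns and covariances are invented; a real firm would estimate them from its own granular data. A simple random search approximates the frontier and finds a “point B” mix with a higher expected return at no more than the current risk.

```python
# A rough sketch of mean-variance analysis applied to hypothetical customer
# segments (all figures invented for illustration).
import numpy as np

rng = np.random.default_rng(0)

# Assumed expected annual return and covariance for three customer segments.
mu = np.array([0.08, 0.12, 0.05])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.01]])

def portfolio(w):
    """Return (risk, expected return) for a mix w of customer segments."""
    return np.sqrt(w @ cov @ w), w @ mu

# The current, less-than-optimal mix ("point A").
w_a = np.array([0.6, 0.1, 0.3])
risk_a, ret_a = portfolio(w_a)

# Random search over candidate mixes to approximate the efficient frontier
# and find "point B": the highest return at no more than point A's risk.
best_risk, best_ret, best_w = risk_a, ret_a, w_a
for _ in range(100_000):
    w = rng.dirichlet(np.ones(3))  # random weights that sum to 1
    risk, ret = portfolio(w)
    if risk <= risk_a and ret > best_ret:
        best_risk, best_ret, best_w = risk, ret, w

print(f"A: risk={risk_a:.3f}, return={ret_a:.3f}")
print(f"B: risk={best_risk:.3f}, return={best_ret:.3f}, mix={np.round(best_w, 2)}")
```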

The rapid growth and enormous potential of the “Internet of Things”, with low cost embedded sensors measuring all types of activity and wirelessly communicating those results, is inextricably entwined with this idea.

Highly granular data can be collected from sensors tracking human activity, plant usage and the expenditure of each dollar. Each “grain of sand” of data will likely have different speeds and trajectories. Threshold reporting, with predetermined trigger alerts, focuses attention on what matters, clear of the fog of massive data sets.
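A minimal sketch of such threshold reporting follows; the metrics and trigger levels are hypothetical stand-ins for whatever a firm decides matters.

```python
# A minimal sketch of threshold reporting: predetermined trigger levels
# surface only the readings that matter (metrics and limits are invented).
THRESHOLDS = {
    "machine_idle_minutes": 45,
    "overtime_minutes": 120,
    "spend_dollars": 5000,
}

def check_alerts(readings):
    """Yield (metric, value, limit) for every reading breaching its threshold."""
    for metric, value in readings.items():
        limit = THRESHOLDS.get(metric)
        if limit is not None and value > limit:
            yield metric, value, limit

latest = {"machine_idle_minutes": 62, "overtime_minutes": 90, "spend_dollars": 7150}
for metric, value, limit in check_alerts(latest):
    print(f"ALERT: {metric} = {value} (trigger level {limit})")
```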

Predictive analytics, employing statistical forecasting tools, can cut through a firm’s dynamic data as it unfolds, telling us about where and how fast those elements are moving. Confidence rises about what we think may happen next.
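As one simple example of such a forecasting tool (not necessarily what any particular vendor uses), the sketch below applies simple exponential smoothing to an invented weekly cost series to produce a one-step-ahead forecast.

```python
# A minimal sketch of one common statistical forecasting tool: simple
# exponential smoothing over a hypothetical weekly cost series.
def exp_smooth_forecast(series, alpha=0.4):
    """Smooth the series and return the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level  # weight recent data more
    return level

weekly_cost = [980, 1010, 995, 1040, 1075, 1110, 1150]  # invented figures
print(f"Next week's forecast: {exp_smooth_forecast(weekly_cost):.0f}")
```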

Cloud computing is an essential partner, providing supercomputer capability at the swipe of a credit card, used only when needed to process an avalanche of data, and at rapidly declining cost.

How does one get started in applying these tools to a typical firm? First, begin with a manageable, low-risk project and prove the return on investment.

Software tools are rapidly emerging that allow firms to effectively run a “parallel” accounting system focused on granular analytics, without any changes to the existing code of accounts or reporting processes.

This lowers both the barriers to entry and the risk, and provides an easy reconciliation between the existing and new reports, building a high level of confidence in the outcomes.

It is important to understand that becoming a data-driven firm is an ongoing journey, not a one-off transition. Internal cultural adoption is vital and may take considerable time for many firms. Committed leadership from the top is the most important prerequisite.

Analytics presents a compelling competitive advantage opportunity. Start now, or be left well behind.

See the original NBR Article from May 2014 here.

Have some questions? Please let us know how we can help.