Commentary - We live in a time of change: a time of changing preferences and attitudes; a time of economic uncertainty driven by turmoil in major financial markets and increasing political tensions; a time when environmental pressures from extreme weather or natural disasters routinely disrupt the supply of key commodities. While companies can't control volatility itself, they can control its effects. This is the promise of big data.
Recent years have been markedly defined by the rate of change, which seems to have accelerated for just about everything. Data has now become “big data”, powered by an explosion of new devices that publish data at increasingly shorter intervals. The sheer scale is turning the traditional application-data model inside out; we can no longer simply push more data into continually supersized versions of existing platforms. Data now becomes the platform, surrounded by and supporting a host of real-time applications whose role is to extract and operationalize information on a massive scale. The result is a new and dynamic ecosystem that will drive innovation and fuel profitable growth.
As data density grows exponentially, so does noise, obscuring useful information. A new breed of applications with advanced pattern-recognition capabilities is required to systematically separate signal from noise: to quickly make sense of huge volumes of dynamic, real-time data and use that knowledge to make good business decisions in a fast-paced and ever-changing world. If this sounds far-fetched, be assured that it is happening right now.
As an industry, manufacturing is one of the sectors most threatened by volatility, but also one of the best suited to benefit from big data. First, let's examine the risk. Traditional manufacturing supply chains run well in static conditions, but are challenged by dynamic markets with unpredictable demand. The root cause lies in systems whose mathematics has remained fundamentally unchanged for the last half century. Popular demand forecasting techniques are still based on time-series statistical analysis of historical orders. Last year's orders are by nature disconnected from current events and have no way to account for turmoil in the Eurozone or a hurricane hitting land. Today, we have access to much richer data sources - with more than 100,000 times the data density - that better reflect the current state of the supply chain. Wouldn't you rather make business decisions based on what is happening right now?
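The gap between the two approaches can be sketched in a few lines. This is an illustrative toy only, not any vendor's actual algorithm: the weighting scheme, the `alpha` parameter, and the numbers are hypothetical, chosen to show how a forecast anchored in historical orders can miss a shift that a live demand signal catches.

```python
# Toy contrast: a classic time-series forecast built from historical
# orders vs. one that also weighs a current demand signal.
# Entirely illustrative; alpha and all figures are made up.

def moving_average_forecast(history, window=4):
    """Classic approach: average the most recent historical orders."""
    return sum(history[-window:]) / window

def demand_sensed_forecast(history, live_signal, alpha=0.6):
    """Blend the historical baseline with a live demand signal.

    alpha is a hypothetical weight on the current signal; a real
    system would tune it per product and location.
    """
    baseline = moving_average_forecast(history)
    return alpha * live_signal + (1 - alpha) * baseline

last_year_orders = [100, 105, 98, 102]  # weekly orders from history
live_pos_rate = 140                     # units/week implied by current POS data

print(moving_average_forecast(last_year_orders))                # 101.25
print(demand_sensed_forecast(last_year_orders, live_pos_rate))  # 124.5
```

The historical forecast sits near 101 units no matter what is selling today; blending in the live signal moves the plan toward the demand that is actually materializing.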
What started in 2003 with the first commercial deployment of demand sensing software has grown quickly in recent years as leading companies look to build a new generation of agile and responsive supply chains that can dynamically sense and react to change. Their formula for success includes:
- Better Math. The math to harness reams of data for forecast accuracy is already available to manufacturers; they just need to implement it. It is specially designed to process data intensities several orders of magnitude higher than current systems, and at finer resolution - using daily-data-daily to quickly sense changes, react and capture new sales opportunities. Knowing more tissues are required in the Northeast is only important while there is a flu outbreak, not a month later when everyone is feeling better. Manufacturers can improve forecast accuracy, on-shelf availability and customer service by taking advantage of better math.
- Even More Data. Feeds from retail locations, such as point-of-sale transactions and retailer inventory levels, provide an end-to-end account of the supply chain. With large multinational companies easily serving 100,000 retail locations, data intensity increases dramatically with each new echelon in the supply chain. But with the right math, the business case is compelling - why make a billion-dollar inventory decision using significantly inferior information?
- Efficient and Touchless. Both are requirements for meaningful business impact. The promise of big data comes from the systematic operationalization of data for all products and locations. With short run-time windows to process and publish masses of data, applications must be efficient: there is no sense having daily-data-daily if it takes a week to run the model. Likewise, there is too much data to review forecasts by product and location manually; without automation, companies would need armies of planners.
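One way to read the "touchless" requirement is exception-based review: forecasts publish automatically, and only the items that deviate sharply from the prior plan are queued for a planner. The sketch below is a hypothetical illustration of that pattern; the `triage` function, the 25% tolerance, and the sample figures are all assumptions, not a description of any particular product.

```python
# Hypothetical "touchless" triage: auto-publish forecasts that stay
# close to the prior plan, and flag only large deviations for review.

def triage(forecasts, prior_plan, tolerance=0.25):
    """Split item/location forecasts into auto-publish and review sets.

    tolerance is an assumed threshold: deviations above 25% of the
    prior plan are routed to a planner instead of publishing directly.
    """
    auto, review = {}, {}
    for key, forecast in forecasts.items():
        prior = prior_plan.get(key, forecast)
        deviation = abs(forecast - prior) / prior if prior else 0.0
        (review if deviation > tolerance else auto)[key] = forecast
    return auto, review

forecasts = {("tissues", "NE"): 180, ("tissues", "SW"): 95}
prior_plan = {("tissues", "NE"): 100, ("tissues", "SW"): 100}

auto, review = triage(forecasts, prior_plan)
# NE deviates 80% from plan -> review queue; SW deviates 5% -> auto-publish
```

With millions of product-location pairs, this kind of filter is what lets a handful of planners supervise a system that would otherwise need armies of them.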
This is the promise of big data in practice.
Robert F. Byrne is president and CEO of Terra Technology.