May you live in interesting times! With the first few months of 2016 bringing an unusually severe El Niño event, sharp commodity price volatility, and the Chinese economy bursting a few bubbles, players in the global commodity markets may be wondering whether the ancient Chinese curse is as apocryphal as experts claim.
As markets fluctuate and uncertainty prevails, the need for fact-based decision-making is stronger than ever. Any business for which the wholesale price of commodities is a major input cost should be familiar with market volatility by now – and the months ahead are set to demonstrate, if further demonstration were needed, that volatility is the new normal.
The sheer volume of data points available on which to base these decisions continues to grow. Of course, the commodities business has always been data-intensive. Physical trades that are hedged with financial derivatives create thousands of individual data points around suppliers, quantity, quality, storage types, transport options and the rest. Each of these represents multiple decisions to be made and often re-made before the trade is eventually closed, creating a significant amount of complexity – and large volumes of data.
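To make that data footprint concrete, here is a minimal, purely illustrative Python sketch of the kind of record a hedged physical trade might generate. The field names are assumptions made for illustration, not the schema of any particular trading system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HedgeLeg:
    """A financial derivative position offsetting part of the physical exposure."""
    instrument: str        # e.g. a futures or swap contract code
    volume: float          # hedged quantity, in the trade's unit of measure
    price: float           # traded price of the derivative

@dataclass
class PhysicalTrade:
    """One physical commodity trade and the attributes decided (and re-decided) over its life."""
    supplier: str
    commodity: str
    quantity: float        # contracted volume
    quality_spec: str      # grade, assay, moisture, etc.
    storage_type: str      # silo, tank, warehouse...
    transport_option: str  # vessel, rail, truck...
    hedges: List[HedgeLeg] = field(default_factory=list)

# Even this toy record implies many data points per trade once each attribute
# is revised, re-priced, or re-hedged before the trade is eventually closed.
```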
But the pace of change, and the amount of data that organizations in every industry must now process to inform effective decision-making, is picking up fast – and commodities is no exception. Enterprise data is growing at 40-60% a year, from under one zettabyte in 2010 to an estimated 40 zettabytes by 2020 – roughly a 50-fold increase in ten years.
Part of that growth can be attributed to the arrival of the Internet of Things (IoT). Gartner’s IoT forecast of 2014 estimated that, by 2020, more than 35 billion things would be connected to the Internet. That means more data can be generated from previously unconnected devices and non-computerized equipment.
In the field of commodities, this can include bulk-handling equipment at grain storage sites, ports, and mines. Digital ships already feature in a number of merchant fleets, while talk of digital oilfields suggests that data available from daily operation of even the most remote wells could increase exponentially. And that’s just relatively structured data: insights from sources like social media add to the burden of data management. The volume of data available and requiring manipulation is growing by orders of magnitude.
Distinguishing the signal from the noise, when the noise is so overwhelming, requires a new way of thinking about the tools that are used. To avoid drowning in the data deluge, organizations need a toolkit capable of querying, analyzing, and manipulating data to derive real insight that can serve as the basis for fact-based, real-time decision-making. Learn more in "Better Decision Making with Real-Time Data Analytics."
As a result of the big data phenomenon, we are slowly seeing a shift from simple query and reporting to more advanced analytics. Industry watchers at Gartner have provided a useful definition of what is meant by this new analytics capability. They describe advanced analytics as:
“...the analysis of all kinds of data, using sophisticated quantitative methods (for example, statistics, descriptive and predictive data mining, simulation and optimization) to produce insights that traditional approaches to business intelligence (BI) — such as query and reporting — are unlikely to discover”
This distinction from traditional analysis and BI methods is an important one. Data analysis techniques, such as regression, forecasting, optimization, and simulation, have existed for many years. But their use in daily operations was limited both by access to data and by access to technology.
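To make the distinction concrete, here is a minimal, hypothetical sketch in Python (using pandas and scikit-learn, with invented price and inventory figures) contrasting a traditional reporting query with the kind of simple predictive model Gartner groups under advanced analytics.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical history: ten days of prices and an inventory indicator.
history = pd.DataFrame({
    "day":       list(range(1, 11)),
    "inventory": [900, 880, 870, 850, 845, 820, 825, 800, 790, 770],
    "price":     [52.1, 52.8, 53.0, 53.9, 54.2, 55.1, 55.0, 55.8, 56.3, 57.0],
})

# Traditional query / reporting: summarize what has already happened.
print("Average price to date:", round(history["price"].mean(), 2))

# Advanced analytics: fit a simple predictive model and forecast forward.
model = LinearRegression().fit(history[["day", "inventory"]], history["price"])
next_day = pd.DataFrame({"day": [11], "inventory": [760]})  # assumed next observation
print("Forecast price for day 11:", round(model.predict(next_day)[0], 2))
```

The point is not the particular model – a linear regression is the simplest possible choice – but that the second step produces a forward-looking estimate that no query or report over historical data would surface on its own.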
Because it took so much effort to prepare, clean, and normalize data sets with traditional tools, limits were placed on how much data went into the data warehouse. Analysts would typically pick only the most important data sets to load, in order to keep an already burdensome and lengthy implementation process manageable. In other words, analytics were performed on incomplete data sets.
The second limitation was access to the technology itself: not all users could work with even those data sets that were selected. System design meant that analytics remained in the hands of data specialists and IT experts, rather than the business users who needed to run ad hoc queries. Not only did this approach limit who could apply analytics to their own tasks; it also constrained the types of queries that could be run.
The limitations of these kinds of systems are becoming all too apparent under the weight of the data they must now process, and the expectations of commodities businesses and their stakeholders. It is too easy to assume that, just because some analytical capability is currently available in the trading or scheduling department, a firm has the full analytical capability necessary to operate in today’s business environment. That can prove to be a dangerous belief.
Investigations into the role of analytics across businesses of all kinds have shown overwhelmingly positive results. Research from management consultancy Accenture shows that:
“Greater use of analytics is supporting firms as they cope with the inexorable acceleration in the pace of change. If you expect to surpass competitors in the race to keep up with change in the marketplace, you want analytics on your side.”
Accenture’s findings are backed by an experiment at the MIT Center for Digital Business, which showed that companies using analytics for decision-making are more successful than those that don’t. As reported in the Harvard Business Review:
“Companies in the top third of their industry in the use of data-driven decision making were, on average, five percent more productive and six percent more profitable than their competitors.”
Not surprisingly, Gartner found that facilitating analysis and decision-making is regarded as a top business priority for technology investment. In other words, working with the right analytics technology gives businesses an immediate advantage over their competitors.
Fortunately, technology advancements in distributed data storage and in-memory computing have made it possible for businesses to take advantage of advanced analytics. (Learn more in "The Technologies Behind Advanced Analytics in Commodity Management.")
With the right solutions, businesses can finally put this advanced analytics capability to work in day-to-day decision-making.
What’s more, domain-specific solutions, such as Eka's Commodity Analytics Cloud, have been built to address the particular data challenges of the commodities markets. They enable commodities companies to gain advanced analytics capabilities without investing millions of dollars and untold man-hours.
This is a significant step in the evolution of commodity management systems: from simple data capture to advanced data analytics. Because these systems can analyze information and create predictive models, firms can develop accurate, repeatable formulas to identify optimal scenarios while taking market conditions into account. Companies that do so will have a competitive advantage.
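As a purely illustrative sketch (assumed prices, costs, and option names – not a description of how any vendor's product works), the idea behind such repeatable scenario analysis can be as simple as scoring every candidate combination of options against the current forecast and picking the best, every time the inputs change.

```python
from itertools import product

# Assumed inputs: a forecast sale price and per-unit costs for each option.
forecast_price = 57.0                                          # forecast price per unit
storage_costs = {"silo": 1.2, "warehouse": 0.9}                # per-unit storage cost
transport_costs = {"rail": 2.1, "vessel": 1.6, "truck": 2.8}   # per-unit freight cost
quantity = 10_000                                              # units to be moved

def scenario_margin(storage: str, transport: str) -> float:
    """Expected margin for one storage/transport combination."""
    unit_cost = storage_costs[storage] + transport_costs[transport]
    return (forecast_price - unit_cost) * quantity

# Re-run the same repeatable evaluation whenever forecasts or costs change.
scenarios = {(s, t): scenario_margin(s, t)
             for s, t in product(storage_costs, transport_costs)}
best = max(scenarios, key=scenarios.get)
print(f"Best scenario: {best}, expected margin: {scenarios[best]:,.0f}")
```

Real systems evaluate far richer scenarios – quality, timing, counterparty, and hedge structure as well as logistics – but the principle of a repeatable, data-driven comparison is the same.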
For any firm still dependent on standard business intelligence, basic analytics, or even still-lingering spreadsheets, the big data phenomenon is overwhelming. For those with the right tools it will open up new opportunities. As data volumes grow daily, the window for deciding whether to be a data champion or a data victim is growing ever smaller.