The way Big Data is used is changing, transforming supply chain management along with it. Johanna Parsons investigates.
Big Data has formed the backbone of supply chain management for years, but the Internet of Things is taking it to a new level, with “digital twins” offering warehouse omniscience and the potential to simulate entire supply chains. The sophistication and sensitivity of such systems are enabled by the volume and granularity of data now being produced, primarily as a result of e-commerce.
Monica Truelsch, director of solution strategy at Infor Nexus, says that e-commerce has upended a lot of thinking in supply chain management. Some of the basic tenets, such as forecasting, have transformed entirely, and Truelsch puts this down to the volatility of consumer trends.
“Forecasting by and large has not been able to reliably prepare people for the volatility in the modern market place, the dramatic swings in consumer behaviour that come about because of social media, viral trends and so forth,” she says.
“The speed of change and disruption and demand or lack of demand has accelerated dramatically beyond what people were used to five or ten years ago.
“So forecasting in the old way has become much less valuable and has been replaced by a desire to more rapidly sense what is happening in the market place or in operations at the moment and to perhaps use artificial intelligence or even some less sexy but still robust data science tools and statistical methods to give a more immediate understanding of trends… and to adapt more quickly, so that you’re building agility into the business to respond to changes quickly as they happen,” says Truelsch.
She says the focus on visibility enables businesses to adapt and respond more quickly and more precisely, in order to catch up quickly or ideally get ahead of the trends. And that’s where digital twins come in.
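Truelsch’s “less sexy but still robust” statistical tools can be as simple as a smoothed baseline with a spike threshold. The sketch below is a hypothetical illustration of that kind of demand sensing, not a description of Infor Nexus’s product; the function name, parameters and sales figures are all invented for the example.

```python
# Hypothetical demand-sensing sketch: instead of a long-range forecast,
# track an exponentially weighted moving average (EWMA) of daily sales
# and flag days whose demand swings far beyond the recent baseline.
# All thresholds and numbers here are illustrative assumptions.

def sense_demand(daily_sales, alpha=0.3, spike_ratio=1.5):
    """Return (ewma_history, alerts): alerts lists the days whose sales
    exceeded spike_ratio times the smoothed baseline."""
    ewma = daily_sales[0]
    history, alerts = [ewma], []
    for day, sales in enumerate(daily_sales[1:], start=1):
        if sales > spike_ratio * ewma:
            alerts.append(day)   # sudden surge: react now, not next quarter
        ewma = alpha * sales + (1 - alpha) * ewma   # update the baseline
        history.append(ewma)
    return history, alerts

# A viral trend doubles sales on day 5:
sales = [100, 104, 98, 101, 99, 210, 205, 180]
_, alerts = sense_demand(sales)
print(alerts)   # → [5, 6]
```

The point of such a filter is speed: it reacts within a day of the swing, rather than waiting for the next forecasting cycle.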
Tom Charlton, operations director at Relex, explains that digital twins are computer simulations of a warehouse that allow retail companies to take their planning to the next level. “With this technology they have the power to run countless simulations of their logistics operations, test a variety of parameters, and assess the various outcomes to identify the optimal setup.
“This can be a game changer when one considers the extent to which that takes the risk out of applying changes to logistics operations; single changes can be made confidently in the knowledge that one has found the best route to minimising operational costs,” says Charlton.
“Simulations can be run for exceptional trading periods in order to stress-test the current setup, or to suggest further improvements… However, Relex takes the whole digital twin concept further. I was chatting with our chief executive Mikko (Kärkkäinen) and he’s particularly excited that we create digital twins of entire supply chains, not just warehouses. Just think! You can simulate and scenario test any portion of your supply chain operations from supplier handover to check-out,” says Charlton.
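Charlton’s idea of running countless simulations against varied parameters can be illustrated with a toy model. The sketch below is an assumption-laden stand-in for a real digital twin (Relex’s engine is far richer): it replays a picking day many times to find the smallest crew that clears the day’s orders at a target service level. Every figure in it is invented.

```python
import random

# Toy "digital twin" of a pick operation: not any vendor's engine, just an
# illustration of sweeping one parameter (number of pickers) across many
# simulated days to find the cheapest setup that still clears the orders.

def simulate_day(pickers, orders=400, picks_per_hour=30, shift_hours=8, rng=None):
    """One simulated day: each picker's rate varies randomly around the mean.
    Returns True if total capacity covered the day's orders."""
    rng = rng or random.Random()
    capacity = sum(
        rng.gauss(picks_per_hour, 5) * shift_hours for _ in range(pickers)
    )
    return capacity >= orders

def best_picker_count(max_pickers=6, days=1000, target=0.99, seed=42):
    """Smallest crew that clears the orders on at least `target` of days."""
    rng = random.Random(seed)
    for pickers in range(1, max_pickers + 1):
        cleared = sum(simulate_day(pickers, rng=rng) for _ in range(days))
        if cleared / days >= target:
            return pickers
    return max_pickers

print(best_picker_count())
```

This is the risk-removal Charlton describes: the “change” (hiring a third picker, say) is tested a thousand times in software before it is tried once on the floor.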
This reflects a shift whereby more operators are willing to share data to create a more robust and responsive supply chain. This will involve financial investments and, crucially, faith and goodwill. But the consensus is that such collaboration would be win-win for all involved.
Infor Nexus’s Truelsch says that such end-to-end visibility will also become increasingly valuable because of its impact on brand integrity. “A negative consumer perception of an unexpected product recall or a reputation of poor sourcing practices, child labour for instance in the supply chain, can have enormous repercussions. It’s important for these brands to protect their reputations, to monitor their global supply chains with much greater control and granularity than ever before,” says Truelsch.
“It’s about the digitalisation and the visibility upstream into the supply chain to ensure that the brand is keeping its promise to consumers about what they’re doing in terms of sustainability, reduction of carbon emissions… that all requires record keeping, sensing, tracking, correlation of data at an extraordinary level in global supply chain.”
The consensus is that, as yet, it is the bigger hitters, typically firms with revenues of a billion US dollars or more, that are investing heavily in exploring the parameters of such data. “Mid-market companies… don’t necessarily like to be out on the bleeding edge of some of these investments just because the risks of failure for early adoption can be pricey,” says Truelsch.
“The problem for many organisations… is that they don’t necessarily have the tools or systems in place to have these extensive data sources to create this virtual copy or digital twin… We see in our global practice at Infor that companies are all along the spectrum of maturity in terms of IoT consumption… what they envision they will be able to do with it… and how it will transform their business varies considerably,” says Truelsch.
Eric Rice, principal product marketing manager at Honeywell Intelligrated, gives sage advice for firms considering developing their use of data. “Begin with the end in mind – what problem are you trying to solve? As you’re starting, create a business case. Many projects stall because they don’t have a business case or a constrained scope. Don’t ‘boil the ocean’ – start small, prove your case by showing value quickly and then reinvest by iterating the process to expand. Also, along your journey, select a trusted vendor to help with execution,” says Rice.
The sheer scale of data could be overwhelming, and Charlton agrees that assessing what your business actually requires is of key importance. “You need to make sure that insights are capable of being acted upon within the bounds of the business’s logistics constraints. For example, having real-time inventory tracking in warehouses may sound very cool, but if the warehouse is only geared up to do one or two picking waves per day then it doesn’t actually benefit you,” says Charlton.
Peter Ruffley, chairman at Zizo, says that when it comes to IoT the most important question for an organisation is: what is the business attempting to achieve?
“The goal is to enable better, faster and more informed local decision making – both automated through machine learning and to provide individuals on the front line with the information they need to make instant decisions. And that means analysing data where it is created.”
This presents us with the concept of “edge computing” – essentially managing IoT data where it’s created, only transmitting the most relevant data to the centre for analysis. Ruffley reckons this approach is gaining ground. “It is the ability to analyse data at the edge – effectively on site – that opens the door to significant new opportunities.
“Just consider the value of providing each individual warehouse manager with immediate insight into the operational performance of that warehouse in real- or near real-time. Or the highly skilled petrochemical engineer working on a remote site, who can make critical data-driven decisions based on the actual events occurring. The ability to collect and analyse data at the edge fundamentally changes the way IoT can be leveraged – and provides the opportunity for IoT deployments to realise their business goals,” says Ruffley.
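The edge-computing pattern Ruffley describes, analysing data where it is created and forwarding only what matters, could look something like the minimal sketch below. The function name, sensor limits and payload shape are illustrative assumptions, not any vendor’s API.

```python
# Illustrative edge-computing filter: summarise high-frequency sensor
# readings locally, and forward only the aggregate plus the out-of-range
# readings to the central platform. Limits and values are assumptions.

def edge_summarise(readings, low=2.0, high=8.0):
    """Keep raw values only when they breach limits; otherwise ship a summary."""
    anomalies = [(i, v) for i, v in enumerate(readings) if not low <= v <= high]
    return {
        "count": len(readings),
        "mean": round(sum(readings) / len(readings), 2),
        "anomalies": anomalies,   # the only raw values sent upstream
    }

payload = edge_summarise([5.1, 5.3, 9.7, 5.0, 1.2, 5.2])
print(payload["anomalies"])   # → [(2, 9.7), (4, 1.2)]
```

Six readings collapse to one small payload; the site keeps the full stream for local decisions while the centre receives only the summary and the two exceptions.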
Collaboration eases the flow for Thatchers
Thatchers Cider has been making traditional ciders for over 100 years. In July 2018 the manufacturer joined Atheon Analytics’ cloud-based analytics platform, SKUtrak, to ease collaboration with grocery retailers such as Co-op, achieving a sales boost of some 22 per cent.
Thatchers now presses 500 tons of apples each day at peak season, supplies the UK trade and grocers, and exports to over 22 countries worldwide.
Chris Milton, off-trade & export sales director of Thatchers, says retailers and suppliers are starting to recognise the value that comes from sharing the same data source. “The Co-op is changing massively,” says Milton. “Earlier this year, they embarked on a major initiative to understand how they worked with all their suppliers and how they could better share data in a user friendly and timely manner. They currently take six core SKUs from us, but this initiative means that – irrespective of the number of SKUs you stock – you need to be able to read your performance across the whole supply chain.”
In July 2018, the Co-op partnered with Atheon Analytics and their multi-retailer SKUtrak service. The Co-op now shares all its daily “Flow-of-Goods” data with suppliers including sales value; supplier (inbound) service; and availability.
“With SKUtrak, we can track in real-time what’s happening from the point the stock leaves the factory, to the point it’s sold off the shelf to the consumer. It’s enabling us to make collaborative, meaningful business decisions with the Co-op,” says Milton.
The 2018 summer included challenges such as a shortage of CO2 across the UK and Europe. “We were missing a vital ingredient that the whole supply chain depends on, while facing demand at an all-time high due to the hottest summer in 40 years, England’s performance in the Fifa World Cup and the royal wedding,” Milton explains.
“But with the Co-op SKUtrak on board for August, it meant we were able to see daily, exactly what was in the supply chain. We were able to manage stock, based on a conversation with the Co-op supply chain managers founded on the same data, to make sure it was sent to the right place.
“As a result, stock-outs for us have been at a minimum and we’ve maximised sales with the stock that we’ve got,” he says.
By sharing its daily ‘Flow-of-Goods’ data through SKUtrak, Co-op helped Thatchers achieve a sales uplift of some 22 per cent for the summer 2018 period (May to August). Despite a few availability issues, Thatchers also maintained supplier availability of at least 98 per cent.
This feature originally appeared in the October 2019 edition of Logistics Manager.