Adopting Big Data and analytics technology in supply chains
Business Chief sits down with supply chain experts to discuss the benefits and challenges of adopting Big Data and analytics in supply chains.
“People talk a lot about data being ‘the new oil’, and cognitive supply chains are indeed making a huge impact, allowing businesses to use Big Data to drive themselves onto the next level. Using artificial intelligence and machine learning to process data makes it increasingly realistic for systems to make smart decisions without the need for human intervention,” says Fred Baumann, GVP for Industry Strategy at Blue Yonder. “When businesses are able to identify disruptions and act with immediacy and decisiveness, the effect will be transformational. Alongside the short-term problem solving, cognitive supply chains provide longer-term learned recommendations to enable businesses to stay ahead of the curve.”
Agreeing with Baumann, Grant Millard, Director and Technology Services Specialist at Vendigital, explains that traditional data analysis methods are outdated and inefficient. “More often than not, companies are operating in a data vacuum. Analysis is based on static data sets which are created, and then recreated, from the ground up. Companies are continuously manipulating the data to get the insight they are after, and then repeat this process every time insights are required. This is not only inefficient but costly, and the result is reliance on systems that fail to deliver clear and credible data-based insights. This is where Big Data and analytics can help, so that the user is no longer required to analyse data. Rather, the system is telling them what action they need to be taking.” Kirsty Braines, COO at Oliver Wight EAME, adds that, “it is a proven benefit that advanced analytics for the supply chain industry increases yield, whether through improved production or reduction of waste. Advanced analytics can play a vital role in identifying issues that can impact yield, as well as help to reduce operating costs, manage inventory and create a more personalised customer experience.”
The challenges of adopting Big Data and analytics within supply chains
“The world is becoming more complex as more business and consumer interaction channels migrate into the digital space. This complexity is evident in the amount of data these interactions create across an increasing number of channels,” says Jonathan Clarke, Manager, Statistical Modelling at LexisNexis Risk Solutions. As a result, when it comes to Big Data and analytics, there are a number of challenges that companies can face, including data manipulation, GDPR compliance, data credibility, talent and digital maturity. “Technologies such as AI, Industry 4.0, blockchain, Big Data and analytics are game changers for businesses, however it’s all advanced technology and the clue is very much in the name. A huge proportion of companies haven’t reached the maturity to completely handle data, with the technology not fully understood, let alone successfully implemented. If organisations don’t align technology with their business plans, they risk making a very expensive mistake in terms of time and money. This applies to data too. Unless organisations dedicate time beforehand to understand what information they want, what purpose it’s going to serve and how they’re going to manage it, analytics becomes an exercise in futility,” comments Braines.
“Additionally, there is little point in importing this technology into the business if the data that exists is not credible, as this could lead to incorrect predictions,” adds Millard. “It is also important that business leaders import the right expertise. Sometimes, they fail to do this and either get a data scientist who doesn’t understand the business context or an industry expert who knows nothing about data science. Getting Big Data and analytics to deliver value is a multi-disciplinary activity.” Ultimately, Millard stresses that “for organisations considering investment in Big Data and analytics to improve their supply chain management, they need to understand that there is no one-size-fits-all. If these factors are not fully considered at the outset, any investment could deliver negligible value.”
Contemplating the future of Big Data and analytics within supply chains, Baumann speaks of the potential of the technology. “The use of Big Data and analytics in supply chains is rapidly increasing, and a near-autonomous supply chain may be achievable in the future. However, for this to happen, businesses need to get to a point where they feel confident and can trust that technology can identify disruption and subsequently take action. Once this has been achieved, the effects will be incredible: just imagine the possibilities that will be provided by a self-learning, self-healing supply chain that is able to predict challenges and transform them into opportunities for growth.” Agreeing with Baumann, Peter Ruffley, CEO of Zizo, sees emerging technologies, such as the internet of things (IoT) and AI, as having the ability to generate greater efficiency within the supply chains of the future. “Edge computing is also going to provide a much easier way for businesses to quantify and understand what they are investing in when looking at collecting data, processing it and moving it. It provides the opportunity to have greater agility and real-time analytics.”
Clarke does, however, comment that, in order to speed up the adoption of these technologies, “government and regulators have a role to play to ensure that legislation is clear, to guide companies on the correct usage of this technology. The significant benefits offered by the increased use of Big Data and analytics have to be balanced with the lawful, compliant use of data.” Raj Bawa, Operations Director at JBi Digital, adds that, “while the culture has improved significantly in this area,” he too believes that impactful enforcement or policing of big companies is urgently needed to truly reap the benefits of the technology.
GfK and VMware: Innovating together on hybrid cloud
GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.
In its capacity as a strategic and technical partner, VMware has been guiding GfK along its digital transformation path for over a decade.
“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.
Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.
By building on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has gained a secure infrastructure and reliable operations, with processes, policies, people and tools working efficiently across both private and public cloud environments.
One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.
“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.
The migration to the VMware-based environment was completed simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system built on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, with capacity growing from an initial 500 VMs to a total of 4,000 VMs.
“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.
The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with astonishing agility: resources can now be shifted quickly and easily from the private to the public cloud, without modifying the nature of interaction with the environment.
The gfknewron project is a good example: the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.
One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.
“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.
“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client.
“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”