Jan 17, 2021

Accenture: Prioritise data to pave the way for AI models

Accenture
Data
ArtificialIntelligence
Kate Birch
3 min
Prioritisation over perfection is the message from Accenture to companies stuck in data paralysis as they prepare for AI technology

Prioritisation over perfection is the message from consultants Accenture to companies stuck in data paralysis as they prepare for artificial intelligence (AI) technology.

Companies can be overwhelmed by the volume, velocity and variety of their data, and struggle to reach the fourth, and arguably most important, V: value. That struggle delays the transition to AI models, claims the report from Accenture.

“My advice is to assess what needs to be done with your data across business functions, but then isolate a small area that is a priority for the organisation,” said the report’s author Jean-Luc Chatelain, Managing Director – CTO, Accenture AI (Applied Intelligence).

“This is where you can make a focused, valuable start and gain some momentum, with a view to gradually expanding or replicating the approach over time.”

Prioritisation over perfection

AI projects need data, and according to Accenture it doesn’t have to be perfect: it needs enough quality and consistency for useful patterns to emerge.

“The better the data, the better the AI,” is the message in Accenture’s report, Overcoming data paralysis when preparing for AI.

But Chatelain points out that for many companies, there’s a problem: 85% of their data is either dark (its value is unknown), redundant, obsolete or trivial.

“It’s not always easy to determine where you will find value, but to understand the landscape, the data needs to be cleaned up and integrated into your business. It’s all about making sure that the data has a structure and format that will enable you to develop it into the training data you need for your AI models,” he said.

How do you clean up data?

Restructuring the data is an onerous challenge due to format inconsistency. For example, an address could appear in a variety of forms, such as New York, New-York or NYC. AI models will treat these as three separate entities unless trained to associate them.
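The kind of clean-up described above can be sketched as a simple canonicalisation step. The variant map and function below are illustrative assumptions, not part of Accenture's report; a real pipeline would draw on a fuller reference dataset or an entity-resolution tool.

```python
# Illustrative sketch: canonicalising location variants before they
# become training data. The variant map is hypothetical.
CANONICAL = {
    "new york": "New York",
    "new-york": "New York",
    "nyc": "New York",
}

def canonicalise(value: str) -> str:
    """Map a raw location string to its canonical form, if known."""
    key = value.strip().lower()
    return CANONICAL.get(key, value)

# All three variants now resolve to a single entity.
print([canonicalise(v) for v in ["New York", "New-York", "NYC"]])
```

After this step, a model sees one entity instead of three, which is the structure-and-format consistency Chatelain describes.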

However, Chatelain doesn’t advocate simply putting data scientists on the task; it isn’t always the best solution, as they can end up feeling trapped in “the data dungeon”.

“To make the most impact from data and pursue valuable data-driven transformation, companies will want to avoid this data paralysis and uncover ways to move the AI agenda forward,” he said. 

Chatelain’s advice to businesses stuck in data paralysis is to identify 10 pain points where the business needs to improve, rank them, and focus on the top two that would deliver a substantial improvement on a key metric.

“Find an example like that in your business and zero in on it before moving to the next pain points. In this way, you secure tangible AI successes, win the confidence of stakeholders and establish methods you can replicate and scale up,” said Chatelain.
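The rank-and-focus method Chatelain describes can be sketched in a few lines. The pain points and impact scores below are hypothetical placeholders; in practice the list and scores would come from the business itself.

```python
# Hypothetical pain points with illustrative impact scores (0-10)
# against a key metric; real scores would come from the business.
pain_points = [
    ("inconsistent customer addresses", 9),
    ("duplicate product records", 8),
    ("missing order timestamps", 6),
    ("untagged support tickets", 5),
    ("stale pricing data", 4),
]

# Rank by expected impact and zero in on the top two.
ranked = sorted(pain_points, key=lambda p: p[1], reverse=True)
top_two = [name for name, _ in ranked[:2]]
print(top_two)
```

The point of the sketch is the discipline, not the code: score everything, but commit effort only to the two items at the top of the list before moving on.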

The report also highlights the risk of over-ambitious strategies, citing a case study of a global financial firm whose large-scale data disorder left it unable to make progress on its AI agenda.

“When we began to help, rather than trying to revolutionise all the organisation’s data in one go, we found the better strategy to be looking at the firm’s most critical pain points and its most valuable business units and geographies,” commented Chatelain, noting that the firm needed achievable ambitions.

“This helped us determine where data improvements and AI would have the greatest impact. Focusing on smaller-scale, but highly valuable, transformation has increased the speed of returns and will accelerate enterprise-wide transformation in the long run. Now, we have established a repeatable process which helps the firm rapidly replicate and scale digital transformation in other areas of their organisation,” he said. 


Jun 18, 2021

GfK and VMware: Innovating together on hybrid cloud

GfK
VMware
3 min
VMware has been walking GfK along its path through digital transformation to the cloud for over a decade.

GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.  

In its capacity as a strategic and technical partner, VMware has been walking GfK along its digital transformation path for over a decade. 

“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.

Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.

By expanding on the basis of VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has given itself a secure infrastructure and reliable operations by efficiently operating processes, policies, people and tools in both private and public cloud environments.

One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.

“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.

Migration to the VMware-based environment was completed simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system complete with VMware Cloud on AWS, VMware Cloud Foundation with vRealize Cloud Management and a volume rising from an initial 500 VMs to a total of 4,000 VMs.

“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.

The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with astonishing agility: resources can now be shifted quickly and easily from the private to the public cloud, without changing how teams interact with the environment.

The gfknewron project is a good example – the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.

One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.

“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.

“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client. 

“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”
