May 19, 2020

How much is your data worth? Putting a price on cyber security

Nick Pollard, Guidance Softwar...

According to a report from Symantec, 500 million identities were stolen or exposed online in 2015. And, with the recent acceptance of the new EU General Data Protection Regulation (GDPR), all organisations have a responsibility to protect their IT infrastructure to ensure their data is secure. Alongside the hefty fines that can now be imposed for improper handling of customer data, the loss of said data can easily ruin a company’s reputation.

However, not all data is created equal, and a fundamental part of effective security and crisis management is understanding the relative risk associated with the loss or theft of different types of data. Within each organisation there is typically a ‘hierarchy’ of data, which means that, should a breach occur, a proportionate response can be triggered. Calculating the relative ‘value’ of different data is key to implementing the right response: it can save valuable time in the aftermath of a breach and ensure priorities are set according to your sensitive data profile.

Aligning data value with the correct response

A recent report highlighted how cheaply cybercrime services can be bought, but it is the relative worth of sensitive data that needs to be understood. Without this, it is almost impossible to perform a meaningful risk assessment. There is no ‘one size fits all’ approach to security protection or incident response: the response to the loss of multiple customer records would be very different from the response following the loss of intellectual property, such as the blueprint for a new product.

Here we outline the key steps you can follow to assess the value of your data and implement processes to protect it adequately.

  • Take stock of all data. A thorough audit of your IT estate will ensure you have the full picture regarding sensitive data locations.  
  • Classify and identify high-risk, high-worth data. Assessing the value of data is a process that varies with the organisation’s size and sector. It considers factors such as: the regulatory impact of the data’s loss; the cost of downtime and of replacing or recovering the data; the financial impact on the organisation’s reputation; and, for public companies, the effect on share price, credit rating and regulatory burden.
  • Map and track data within the organisation. You need to understand not only where data is stored, but also how it moves across the network. What safeguards are in place to restrict this movement, both within and beyond the organisation?
  • Share the hierarchy with relevant teams. This is a cross-departmental exercise with the ultimate aim of ensuring that the IT/security teams know where the most valuable data is, and can implement the appropriate security controls.
  • Tailor the Crisis Management Plans. Once you know what the significant risks are, crisis management plans can be tailored and customised so appropriate measures are in place to cover different scenarios. Protecting sensitive data involves a chain of decisions impacting different departments across an organisation from IT to legal, PR and HR. With a well-documented and tailored plan, individuals across the organisation will know the correct processes and their responsibilities, according to different incident types.  
  • Educate staff. Everyone in the organisation has a responsibility to protect the data they handle. Understanding its value and educating staff on the commercial worth of records they’re working with can help to reinforce that it’s an asset that needs to be protected, just like physical property.
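The hierarchy the steps above describe can be sketched as a simple lookup that maps data types to risk tiers, each tier carrying a proportionate response. This is a minimal illustration only; the tier names, example data types and response actions below are assumptions for the sketch, not a prescribed classification.

```python
# Illustrative data-classification hierarchy: tier names, example data
# types and responses are hypothetical -- real values depend on your
# organisation's size, sector and regulatory profile.
RISK_TIERS = {
    "critical": {"examples": ["intellectual property", "customer PII"],
                 "response": "invoke full crisis management plan"},
    "high":     {"examples": ["financial records", "employee data"],
                 "response": "notify security and legal teams immediately"},
    "moderate": {"examples": ["internal documents"],
                 "response": "log incident and review access controls"},
    "low":      {"examples": ["public marketing material"],
                 "response": "standard incident logging"},
}

def classify(data_type: str) -> str:
    """Return the risk tier for a data type, defaulting to 'low'."""
    for tier, info in RISK_TIERS.items():
        if data_type in info["examples"]:
            return tier
    return "low"

def response_for(data_type: str) -> str:
    """Look up the proportionate response for a breached data type."""
    return RISK_TIERS[classify(data_type)]["response"]

print(response_for("customer PII"))  # critical-tier response
```

In practice a register like this would be maintained alongside the data-mapping exercise, so that when a breach is detected the incident type can be translated directly into the right crisis-management playbook.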

Understanding the worth of your assets is an important step on the road to effective security protection and response strategies. It not only means that you can implement the right safeguards around your data, but also that the response fits the magnitude of the breach.

Nick Pollard is UK General Manager, Guidance Software


Jun 18, 2021

GfK and VMware: Innovating together on hybrid cloud

VMware has been walking GfK along its path through digital transformation to the cloud for over a decade.

GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.  

In its capacity as a strategic and technical partner, VMware has been walking GfK along its digital transformation path for over a decade. 

“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.

Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.

By building on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has gained a secure infrastructure and reliable operations, efficiently aligning processes, policies, people and tools across both private and public cloud environments.

One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.

“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.

The migration to the VMware-based environment was completed simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system built on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, with capacity rising from an initial 500 VMs to a total of 4,000 VMs.

“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.

The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with remarkable agility: resources can now be shifted quickly and easily from the private to the public cloud, without changing how teams interact with the environment.

The gfknewron project is a good example – the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.

One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.

“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.

“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client. 

“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”
