Know thine enemy: think like a cyber attacker in order to keep your data secure
Data protection in today’s business environment is a daunting prospect. Data is everywhere: on laptops, with vendors, on mobile devices, and in the cloud. As a result, there are countless potential avenues of attack that must be identified and effectively protected.
From an attacker’s perspective, it’s a target-rich environment with numerous avenues open to exploitation. Whether going for a full-scale assault on a target’s infrastructure, exploiting backdoor vulnerabilities in web applications, or phishing for credentials of individual employees, the options are nearly limitless.
Who, what, why?
To effectively combat this, security professionals must determine who might want to attack them, the types of attacks those adversaries have used in the past, and which avenue might be most attractive. The concept of “thinking like an attacker” is not new; it is a simplified way of describing threat modelling, a discipline that dates back as far as 500 BC and the legendary Chinese military strategist Sun Tzu. Understanding an enemy properly can give defenders a significant upper hand. However, just as Sun Tzu warns in The Art of War, to be truly effective, defenders must not only know their enemy, but also know themselves.
It’s not enough to assume a perimeter defence will keep criminal hackers out. Effective strategies need to start with the assumption that adversaries will sometimes get inside the outer walls. Deploying a layered defence model will help to thwart those who do manage to bypass initial layers. Common weak points include mobile devices and laptops that connect to the Internet when off the corporate network, and social engineering attacks such as phishing. It is also important to remember that attackers can target internal employees or external members of a supply chain, so any effective security measures must also extend outside the walls of the business itself.
Effective multi-layered security
What does this look like in reality? Many companies discuss the ‘kill chain’ approach to layered data protection. Essentially, this means deploying defences to counter each stage of a possible cyber attack (planning, code introduction, command and control, expansion/lateral movement, target identification, exfiltration). This approach acknowledges that some attacks will inevitably breach some defensive barriers, but with numerous layers in place, defenders get multiple opportunities to ‘kill’ the attack before any data is actually lost.
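The layered logic of the kill chain can be sketched in a few lines of code. This is a minimal illustration, not a product design: the stage names follow the article, while the defensive controls paired with each stage are illustrative assumptions.

```python
# A sketch of the layered "kill chain" idea: each attack stage is paired
# with at least one defensive control, so a breach at one stage still
# leaves later opportunities to stop the attack. Controls are examples only.
KILL_CHAIN_DEFENCES = {
    "planning": ["threat intelligence", "attack-surface review"],
    "code introduction": ["email filtering", "endpoint protection"],
    "command and control": ["egress filtering", "DNS monitoring"],
    "expansion/lateral movement": ["network segmentation", "least privilege"],
    "target identification": ["data classification", "access auditing"],
    "exfiltration": ["data-centric controls", "DLP monitoring"],
}

def remaining_layers(breached_stages):
    """Return the defensive layers still standing after some stages are breached."""
    return {stage: controls
            for stage, controls in KILL_CHAIN_DEFENCES.items()
            if stage not in breached_stages}

# Even if the first two stages are breached, four layers remain to 'kill' the attack.
print(len(remaining_layers({"planning", "code introduction"})))
```

The point of the structure is simply that no single control is load-bearing: removing any one entry still leaves defences at the other stages.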
Let’s take a simple example: the case of an attacker who is targeting a company’s design information. This information may be stored in many places, but the “mother lode” will be in the data stores used by the design software (or a version control system, in the case of source code). These devices are undoubtedly in secure areas, and network defences are monitoring external traffic.
The attacker has a few options, but their first goal is to gain extended access to the systems, ideally with valid and privileged credentials. This will provide them with greater freedom to operate and explore the target at their leisure. If they are successful, we have the worst-case scenario: an attacker with authentic credentials and access to the information they wish to steal.
The options now open to the attacker for removing sensitive data are numerous, including:
· Copying files to an external device
· Copying files to a cloud server
· Emailing files to external accounts
· Taking screenshots of files, and exfiltrating the resulting image file (without a CAD or similar file extension)
· Printing the files, locally or on a remote printer
The only way to protect data in this scenario is to control what actions can be taken with the data itself. Access control doesn’t help, as the credentials are authentic. Privileged user management can help, but is typically used to control administrative rights, not what can be done with data. Instead, we need to apply user privileges directly to the data. This way, even if a criminal gets in, it is still extremely difficult for them to get anything back out. Think of it like this: if a burglar breaks into a house, it’s a lot harder for them to steal the furniture if it has all been nailed to the floor.
A data-centric approach is the key
Data-centric security allows organisations to control the actions taken with data, by any user. Even privileged users can be monitored and prevented from accessing data if their behaviour is deemed suspicious or abnormal. By looking at each requested action in context – accounting for the data, the user, and the action (for example, printing, on unapproved printers) – organisations can protect sensitive information without hampering legitimate business behaviour.
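The contextual evaluation described above can be sketched as a simple policy function. This is a hedged illustration of the idea, not any vendor’s implementation: the rule model, printer names, and sensitivity labels are all hypothetical.

```python
# A minimal sketch of data-centric policy evaluation: each requested action
# is judged in context (data sensitivity, user behaviour, action details
# such as the target printer). All names and rules are hypothetical.
APPROVED_PRINTERS = {"hq-floor2-printer"}

def evaluate_request(user, data, action, context):
    """Return 'allow' or 'deny' for an action requested on a piece of data."""
    # Block actions on sensitive data by users flagged for abnormal behaviour,
    # even if their credentials are valid (including privileged users).
    if data["sensitivity"] == "high" and user.get("behaviour_flagged"):
        return "deny"
    # Printing sensitive data is only allowed on approved printers.
    if action == "print" and data["sensitivity"] == "high":
        return "allow" if context.get("printer") in APPROVED_PRINTERS else "deny"
    # Copying sensitive data to external devices or cloud services is blocked.
    if (action == "copy" and data["sensitivity"] == "high"
            and context.get("destination") in {"usb", "cloud"}):
        return "deny"
    return "allow"

engineer = {"name": "alice", "behaviour_flagged": False}
design_file = {"name": "turbine.cad", "sensitivity": "high"}

print(evaluate_request(engineer, design_file, "print", {"printer": "home-printer"}))      # deny
print(evaluate_request(engineer, design_file, "print", {"printer": "hq-floor2-printer"}))  # allow
```

Note that the decision never asks whether the credentials are valid; it asks whether this action, on this data, in this context, is legitimate. That is what distinguishes the data-centric approach from access control alone.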
Understanding all avenues of attack is part of threat modelling; businesses need to understand who might be an adversary and their likely approaches. At the same time, they also need to know themselves and the processes they have in place to deal with any scenario that may arise. At a practical level, it doesn’t matter whether the attacker is a malicious insider or an outsider with stolen credentials; both require the same protections and security precautions to be in place. Threat modelling helps set effective policies that protect the data from malicious (or inadvertent) actions. In the words of Sun Tzu, "Know thyself, know thine enemy. A thousand battles, a thousand victories."
Erik Driehuis has 17 years of startup and sales leadership experience. He launched his sales career with Parametric Technology Corporation (PTC) and had success at MatrixOne, the leading independent product lifecycle management vendor, which was acquired by Dassault Systemes in 2006. At MatrixOne, he led the Benelux and Nordics operations with major customers such as Philips and Nokia — his region was a key driver for Europe’s growth numbers. Erik was also Vice President of Sales Europe for LogMeIn, where he set up the European headquarters, drove strategy and execution, and grew the European business exponentially. Recently, he was with enterprise mobile startup Vervio, where he was Vice President, Sales EMEA. Erik holds a degree in business economics from the Erasmus University in Rotterdam.
GfK and VMware: Innovating together on hybrid cloud
GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.
In its capacity as a strategic and technical partner, VMware has accompanied GfK along its digital transformation path for over a decade.
“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.
Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.
By building on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has gained a secure infrastructure and reliable operations, with consistent processes, policies, people and tools across both private and public cloud environments.
One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.
“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.
The migration to the VMware-based environment was carried out simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system comprising VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, with volumes rising from an initial 500 VMs to a total of 4,000 VMs.
“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.
The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with astonishing agility: resources can now be shifted quickly and easily from the private to the public cloud – without changing how teams interact with the environment.
The gfknewron project is a good example – the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.
One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.
“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.
“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client.
“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”