May 19, 2020

Where is Big Data Taking Us?


The term Big Data has been around since World War II, when it was used to describe working with huge amounts of information. But when Big Data is talked about today, it refers to datasets that are too large or too complex to process using traditional data management applications. As this mass of data continues to grow at an ever-increasing speed, businesses constantly face the challenge of handling, storing and analysing it in the most cost-effective way.

Nick Thompson, Managing Director, DCSL Software, highlights some of the trends he expects to influence the future of Big Data Management.

  1. Edge Computing
    Many companies deal with unnecessary data that has limited use and becomes irrelevant quickly. A solution to this is to move the actual data analysis closer to where the data is collected, which could be an IoT device, a piece of machinery or a sensor. With edge computing, you can reduce the amount of data that needs to pass through your networks, improving the performance of your systems – and making the analysis faster. This also means the IoT data can often be easily deleted once it is no longer needed, which saves storage space and costs.
  2. Machine Learning
    According to analyst firm Ovum, machine learning will continue to play a central part in the future of Big Data. This technology can help businesses become more agile in their use of the data they generate. We expect to see machine learning, as well as Artificial Intelligence, play a growing role in the management of Big Data going forward.
  3. Algorithms for Sale
    Some industry voices predict that the business of the future will buy key algorithms rather than software. This would allow organisations to get the exact data processing they are looking for, with the ability to customise the algorithm to fit their data needs perfectly. However, there will of course still be a need for visual interfaces and analytical applications, so we will most likely see a combined improvement in how software and adaptable algorithms work together.
  4. Continued Data Growth
    Data is growing at a speed and scale that is becoming almost impossible to imagine. Here are some figures from IDC that illustrate the sheer enormity of Big Data:
    40,000 search queries are performed per second on Google. That works out to roughly 3.5 billion searches per day and 1.2 trillion per year.
    Facebook users send roughly 25 million messages and watch 2.77 million videos per minute.
    300 hours of video are uploaded to YouTube every minute.
    By 2020, the new information generated for every human being will amount to 1.7 megabytes per second.
    By 2020, the accumulated volume of Big Data will consist of approximately 44 zettabytes – or 44 trillion gigabytes.
    By 2020, business transactions (including both B2B and B2C) via the internet will reach 450 billion per day.
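The per-second, per-day and per-year figures above are simple scalings of one another. A minimal sketch of that arithmetic (the function name is ours, for illustration only):

```python
# Scale a per-second rate to per-day and per-year totals,
# as done for the Google search figures above.

SECONDS_PER_DAY = 60 * 60 * 24   # 86,400
DAYS_PER_YEAR = 365

def scale_rate(per_second: float) -> tuple[float, float]:
    """Return (per_day, per_year) totals for a per-second rate."""
    per_day = per_second * SECONDS_PER_DAY
    return per_day, per_day * DAYS_PER_YEAR

per_day, per_year = scale_rate(40_000)  # search queries per second
print(f"{per_day:,.0f} per day")        # 3,456,000,000 per day
print(f"{per_year:.2e} per year")       # 1.26e+12, i.e. ~1.2 trillion
```

Running the numbers this way is a quick sanity check on any throughput claim before quoting it.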
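The edge-computing pattern described in point 1 can be sketched very simply: aggregate raw readings close to the device and ship only a compact summary upstream, rather than forwarding every reading over the network. The function and field names below are hypothetical, for illustration only:

```python
# Minimal edge-aggregation sketch: reduce a window of raw sensor
# readings to a small summary payload before sending it upstream.

from statistics import mean

def summarise_window(readings: list[float]) -> dict:
    """Collapse a window of raw readings into a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

raw = [21.0, 21.4, 20.9, 35.2, 21.1]  # e.g. one minute of sensor data
payload = summarise_window(raw)       # many readings -> four fields
print(payload)
```

Once the summary is sent, the raw window can be discarded on the device, which is where the storage savings mentioned above come from.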

The Bigger The Better? 
Many businesses are nowhere near being able to tap into the vast amounts of data being generated. Big Data isn’t helpful unless you can access it and make sense of it. Data software provides businesses with the access and insights they need in order to move forward and improve, but many don’t know how to use these tools effectively. Disruptors build their companies around data, and to compete in today’s business arena you need to be able not just to understand data quickly but to use it. In the future, organisations will need to shift towards what many call ‘Fast and Actionable Data’ – an approach that allows businesses to easily analyse data and draw useful, actionable information from it. It’s not about how much data you have but what you have: data sets should be relevant and used proactively, while the software behind them should be agile and progressive.


Jun 18, 2021

GfK and VMware: Innovating together on hybrid cloud

VMware has been walking GfK along its path through digital transformation to the cloud for over a decade.

GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.  

In its capacity as a strategic and technical partner, VMware has been walking GfK along its digital transformation path for over a decade. 

“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.

Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.

By building on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has given itself a secure infrastructure and reliable operations, with consistent processes, policies, people and tools across both private and public cloud environments.

One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.

“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.

The migration to the VMware-based environment was completed simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system complete with VMware Cloud on AWS, VMware Cloud Foundation with vRealize Cloud Management and a volume rising from an initial 500 VMs to a total of 4,000 VMs.

“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.

The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with astonishing agility: resources can now be shifted quickly and easily from the private to the public cloud, without changing how teams interact with the environment.

The gfknewron project is a good example – the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.

One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.

“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.

“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client. 

“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”
