May 19, 2020

How to unlock your IoT potential

IoT
capgemini
3 min

Last year, the automated pet feeding company PetNet experienced a system failure that stopped pet owners from feeding their pets remotely. The glitch, which was blamed on a third-party server service, affected about one in 10 users and triggered an outpouring of complaints on social media from worried pet owners under the hashtag “internet of stupid”. For PetNet, a company only made possible by the rise of smart devices, ubiquitous connectivity and the Internet of Things (IoT), this error could have been avoided if system failures had been accounted for and properly tested in the design phase of the service.

Start-up-led, IoT-based initiatives are quickly becoming a core building block of digital business transformation at established organisations. According to Gartner, 50 percent of businesses are planning to complete at least one IoT project by the end of the year. However, these early projects will not be without their challenges as organisations learn new skills and practices. Gartner predicts that through 2018, 80 percent of IoT projects will take twice as long as planned due to poor due diligence, skills shortages, inadequate sourcing practices and cybersecurity issues. Handled poorly, IoT technologies can cause more problems than they solve.

Until recently, businesses were mainly using the IoT to connect operational systems and assets, driving efficiency and enabling the creation of new revenue streams through greater intelligence and automation. However, the emphasis is quickly shifting to consumer-facing services as companies seek to create differentiating, data-rich experiences for their customers.

Yet the speed of consumer business means these newer IoT initiatives can become a poisoned chalice, as businesses rushing to deploy new services ahead of competitors forsake quality for speed. The result is often a low-quality IoT infrastructure that crashes constantly, with fault-finding driven by customer complaints. What follows is money and time wasted as organisations rush to fix bugs and glitches at the critical point in the chain where any service error is visible to the consumer.

In the face of pressure from line-of-business teams to deliver new IoT-reliant services, one way CIOs can maintain quality assurance is by investing in a robust testing strategy that can handle IoT workloads and catch problems before they happen. The share of IT budget devoted to testing is already expected to rise to 40 percent by 2019, and 40 percent of organisations see predictive analytics and cloud-based environments as the key to handling IoT-induced workloads. Machine-led testing, driven by automation and artificial intelligence, will be one of the key enablers helping businesses adopt a predictive approach to testing.

With 48 percent of businesses failing to manage the demands of multiple test environments, one way organisations can move beyond delivery pipeline automation is by bringing agile methodologies and DevOps into their testing strategies. Building automated tests is a complex process, but the key to success is early engagement with key stakeholders, such as business management and development teams, to ensure quality is maintained at every level of an IoT initiative's development cycle, as sketched in the example below.
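To make that early engagement concrete, failure scenarios, such as the third-party server outage that caught out PetNet, can be encoded as automated tests before a service ever reaches customers. The following is a minimal, hypothetical Python sketch; the FeederClient class, its dispatch dependency and the local-schedule fallback are invented for illustration and are not drawn from any real PetNet or Capgemini system.

# Hypothetical fault-injection test: verify that a remote feed command
# degrades gracefully when the third-party dispatch service is down.

class ThirdPartyOutage(Exception):
    """Raised when the upstream server service is unreachable."""

class FeederClient:
    """Illustrative client wrapping a third-party dispatch service."""
    def __init__(self, dispatch):
        self._dispatch = dispatch

    def feed_now(self, device_id: str) -> str:
        try:
            self._dispatch(device_id)  # call out to the third-party server
            return "dispatched"
        except ThirdPartyOutage:
            # Fall back to the device's locally stored schedule rather than
            # failing silently, and surface the degraded state to the owner.
            return "fallback-to-local-schedule"

def test_feed_survives_third_party_outage():
    # Simulate the upstream failure that left pets unfed.
    def broken_dispatch(device_id):
        raise ThirdPartyOutage("upstream server unavailable")

    client = FeederClient(dispatch=broken_dispatch)
    assert client.feed_now("feeder-001") == "fallback-to-local-schedule"

A test like this can run in a standard pytest pipeline on every build, so a failing dependency is caught in the design phase rather than on social media.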

To achieve this change, strong leadership is required from CIOs around the importance of governance and education for the rest of the business on the potential consequences of not balancing security and testing with speed to market for IoT initiatives. By breaking down business siloes and actively collaborating with other leaders such as the CMO and executive leadership team, CIOs can help their companies find success from IoT, not brand damage. By deploying a risk mitigation-based strategy, businesses can ensure the IoT has a positive impact on their business and most importantly, help to build and maintain positive relationships with their customers.

By Sathish Natarajan, Testing Leader, Group Strategic Accounts, Capgemini

Read the April 2017 issue of Business Review Europe magazine. 



Jun 18, 2021

GfK and VMware: Innovating together on hybrid cloud

GfK
VMware
3 min
VMware has accompanied GfK on its path through digital transformation to the cloud for over a decade.

GfK has been the global leader in data and analytics for more than 85 years, supplying its clients with optimised decision inputs.  

In its capacity as a strategic and technical partner, VMware has accompanied GfK along its digital transformation path for over a decade.

“We are a demanding and singularly dynamic customer, which is why a close partnership with VMware is integral to the success of everyone involved,” said Joerg Hesselink, Global Head of Infrastructure, GfK IT Services.

Four years ago, the Nuremberg-based researcher expanded its on-premises infrastructure by introducing VMware vRealize Automation. In doing so, it laid a solid foundation, resulting in a self-service hybrid-cloud environment.

By building on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, GfK has gained a secure infrastructure and reliable operations, with consistent processes, policies, people and tools across both private and public cloud environments.

One important step for GfK involved migrating from multiple cloud providers to just a single one. The team chose VMware.

“VMware is the market leader for on-premises virtualisation and hybrid-cloud solutions, so it was only logical to tackle the next project for the future together,” says Hesselink.

The migration to the VMware-based environment was completed simply and smoothly on existing hardware in April 2020. Going forward, GfK’s new hybrid cloud model will establish a harmonised core system built on VMware Cloud on AWS and VMware Cloud Foundation with vRealize Cloud Management, with capacity rising from an initial 500 VMs to a total of 4,000 VMs.

“We are modernising, protecting and scaling our applications with the world’s leading hybrid cloud solution: VMware Cloud on AWS, following VMware on Google Cloud Platform,” adds Hesselink.

The hybrid cloud-based infrastructure also empowers GfK to respond to new and future projects with astonishing agility: resources can now be shifted quickly and easily from the private to the public cloud, without changing how teams interact with the environment.

The gfknewron project is a good example – the company’s latest AI-powered product is based exclusively on public cloud technology. The consistency guaranteed by VMware Cloud on AWS eases the burden on both regular staff and the IT team. Better still, since the teams are already familiar with the VMware environment, the learning curve for upskilling is short.

One very important factor for GfK was that VMware Cloud on AWS constituted an investment in future-proof technology that will stay relevant.

“The new cloud-based infrastructure comprising VMware Cloud on AWS and VMware Cloud Foundation forges a successful link between on-premises and cloud-based solutions,” says Hesselink. “That in turn enables GfK to efficiently develop its own modern applications and solutions.

“In market research, everything is data-driven. So, we need the best technological basis to efficiently process large volumes of data and consistently distill them into logical insights that genuinely benefit the client. 

“We transform data and information into actionable knowledge that serves as a sustainable driver of business growth. VMware Cloud on AWS is an investment in a platform that helps us be well prepared for whatever the future may hold.”
