Upskilling: How to win the battle for data and AI talent

By Gregory Herbert, Senior Vice President and General Manager, EMEA, Dataiku
From proper governance to monitoring the right metrics, how to win the battle for AI and data science talent – Gregory Herbert, SVP and GM, EMEA, Dataiku

To achieve the best possible business value from AI, companies are looking to scale it and shift from one-off, custom builds to a future where data, machine learning (ML), and AI are used so ubiquitously that they become part of everyday business.

True scale can only come to companies dedicated to teaching everyone how to analyse data with ML. However, many companies still face challenges in operating with AI at scale, including difficulty hiring people with AI skills and difficulty identifying good use cases.

Enter upskilling programmes. When an AI upskilling programme works well, it creates a virtuous cycle: business analysts acquire AI skills and create value, which in turn raises awareness of what AI can deliver. Companies that do this well generate a sustainable flow of AI/ML talent and drive ROI across many business units and functions.

Read on to learn how to increase the odds of success, including best practices for upskilling your current talent and monitoring their performance.

Stop looking for unicorns

Many companies try to hire people with both exceptional AI skills and exceptional business skills. Those people are so rare that they’re like unicorns, and if you wait to find them to fuel your AI efforts, you could be waiting a while. According to consulting firm QuantHub, there are three job openings for every data science candidate, and 83% of data science teams were hiring in the first half of 2022.

If companies rely solely on hiring or outsourcing data scientists, they are likely to experience delays in AI-related transformation programmes. Beyond this, they may miss out on opportunities to upskill business people and close talent gaps internally.

Getting domain experts involved is key to reducing risk and increasing ROI. A good upskilling programme can get 10 to 100 times more business domain experts involved in AI development by providing them with self-service training. McKinsey said it’s “nearly impossible for businesses to deliver impact with AI without business domain knowledge.”

Some managers may fear that upskilling will increase talent attrition, but the opposite is often true. Knowledge workers have more information about skills and talent markets than ever before, and the information imbalance has tilted toward workers. When they feel they’re falling behind, that’s when they leave.


Choose a strategic upskilling approach: Interdisciplinary or functional

We recommend upskilling paths for every data worker, from spreadsheet users to your most sophisticated data scientists, all supported by a common AI platform.

There are two main types of upskilling programmes. Interdisciplinary upskilling programmes group all workers with the same skills regardless of their business function. So, for example, Excel power users from finance learn alongside Excel power users from supply chain, and Python developers from marketing learn together with Python developers from manufacturing.

An advantage of this approach is that people get exposure to interdisciplinary teams and the groups are large.

The second approach is functional and more use-case specific. In functional upskilling, workers with similar skills in the same business function learn together, such as Power BI or Tableau users from finance alongside SQL programmers from finance.

The advantage of functional upskilling is that groups share common use cases and terminology, and they tend to drive quicker adoption and faster time to value for AI products.

Ultimately, the best upskilling approach for your organisation depends on your workplace culture, the size of the groups, and your adoption and time-to-value goals. However, it does pay to monitor group sizes: if the groups are too small, they may break up over time; if they’re too large, they may never truly gel. Starting with groups of about 100 seems to work well.

The role of governance in winning the battle for AI talent

Having all skill levels working on a common AI platform provides administrators with a single pane of glass for governing users, projects, products, data, computation, and models.

For many reasons, governance is key to winning the battle for talent. Firstly, beginner users can make costly mistakes, and proper governance helps to prevent this. Secondly, managers may not understand the scope of their workers’ AI knowledge, especially in AI/ML, where they may not be familiar with the techniques or technology their teams are using. A governance platform that aggregates real data about data worker performance enables data-driven identification of rising AI talent.

Lastly, a common platform helps managers to benchmark business units and functions against each other and identify talent in every corner of the company. When looking for where your next AI/ML stars will come from, it pays to cast a wide net and monitor performance broadly. Today’s AI/ML platforms are so usable that almost anyone who understands their data and business problems can use them effectively, which means that upskilling can translate into applied knowledge very quickly.

Monitor the right metrics

Many of the strategic metrics for measuring the success of an AI/ML upskilling programme are the same as those for other digital initiatives. Good examples include ROI, the percentage of the tech budget spent on AI/ML, productivity (such as AI/ML products per data worker per quarter), the portion of business leaders’ incentives tied to AI/ML ROI, and AI/ML talent retention and promotions.

However, it’s also important to track more operational and tactical metrics, including the percentage of data workers using AI/ML and the percentage using a common platform. Skill retention after three, six, 12, and 24 months is another important measure.
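
As a rough illustration of how this tracking could work in practice, the sketch below computes a few of these operational metrics from a hypothetical activity log. The column names and figures are invented for the example rather than taken from any particular platform.

```python
import pandas as pd

# Hypothetical activity log: one row per data worker per quarter.
# Column names and values are invented for illustration only.
log = pd.DataFrame({
    "worker_id":        [1, 2, 3, 4, 5, 6],
    "quarter":          ["2024-Q1"] * 6,
    "uses_ml":          [True, True, False, True, False, False],
    "on_platform":      [True, False, False, True, False, False],
    "products_shipped": [2, 1, 0, 3, 0, 0],
})

# Percentage of data workers using AI/ML, and percentage on the common platform
pct_using_ml = log["uses_ml"].mean() * 100
pct_on_platform = log["on_platform"].mean() * 100

# AI/ML products per data worker per quarter
products_per_worker = (
    log.groupby("quarter")["products_shipped"].sum()
    / log.groupby("quarter")["worker_id"].nunique()
)

print(f"Using AI/ML: {pct_using_ml:.0f}%")           # 50%
print(f"On common platform: {pct_on_platform:.0f}%") # 33%
print(products_per_worker)                           # 2024-Q1    1.0
```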

The portion of a workforce that adopts AI/ML has a big impact on the value generated. If too few people participate, a sustainable community never develops; people go back to their old ways or stop using advanced analytics altogether, and the AI/ML initiative can end up generating negative ROI.
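
To make the arithmetic concrete, here is a deliberately simplified, purely hypothetical calculation (the cost and value-per-user figures are invented) showing how low adoption can tip an initiative into negative ROI:

```python
# Purely hypothetical figures, used only to illustrate the adoption threshold effect.
programme_cost = 500_000        # annual cost of the upskilling programme
value_per_active_user = 10_000  # incremental annual value per active AI/ML user

def roi(active_users: int) -> float:
    """Simple ROI: (value generated - programme cost) / programme cost."""
    return (active_users * value_per_active_user - programme_cost) / programme_cost

print(roi(30))   # -0.4 -> adoption too low, negative ROI
print(roi(200))  #  3.0 -> broad adoption, strongly positive ROI
```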

Choose the right platform

A common AI platform enables interdisciplinary collaboration, high reuse, and automation across the entire AI product lifecycle. Many data scientists like to develop models by hand, so they resist adopting a platform that takes away the fun parts of modelling. The ideal platform lets those who prefer to code continue to build by hand while still getting big productivity gains. It should also be highly usable, so that workers of all skill levels, from business analysts to graduate-level data scientists, want to use it.
