
Targeted Training

Data can help design L&D programs as well as measure their success.

By Debbie Bolla

Many organizations are looking to data to help drive their training and development programs. In fact, according to EduMe, 90% of respondents report that high-quality data is integral to improving learning delivery in their organization. Data can help identify gaps and steer training programs in the right direction.

Pooja Singh Mehta, principal consultant of NIIT, a skills and talent development corporation, says today’s landscape of technology-enabled work processes and sophisticated performance tracking systems provides learning leaders with more data than ever before. “This data, when used effectively, can help support identification, prioritization, and personalization of training needs,” she explains.

Mehta says one of NIIT’s clients had success with this very type of approach. She provides an example: A mining company was building a training program for haul truck operators. In the past, the course of action would have been to spend many hours conducting interviews to understand areas for improvement. That’s no longer the case with the data now being tracked.

“Given the automation, we could look at very detailed performance data such as truck load efficiency, braking patterns, and many other variables. We could even isolate these performance gaps by weather conditions and other environmental factors,” she says. “This access to data can help us create very targeted training solutions.”

Targeted training is a key outcome when organizations use baseline data to pinpoint employee skill proficiency, patterns, and trends. “Instead of creating a learning program based on a topic, leaders can develop training programs that are targeting critical skills and behaviors that business stakeholders require and underpin their broader business goals,” says Sowmya Sudhindranath, chief services officer for ETU, an immersive learning software company. “Data can be used to identify skill gaps, prioritize areas for employee development, help shape program design, and clearly define the learning program goals.”

Iyad Uakoub, senior director of behavioral science at coaching management platform Sounding Board, recommends a training needs analysis, which can help organizations prioritize and ensure that training goals align with business goals. “A training needs analysis also helps to establish a benchmark, so you can effectively measure individual/organizational improvement. It can then help to determine what is the most effective learning solution for investment, be it coaching, e-learning, etc.,” he explains.

In addition to helping design and develop training programs, data plays a key role in measuring overall success. Organizations should look at both quantitative and qualitative data points in order to gain an overarching picture of the program’s effectiveness. Helpful quantitative measures include completion and satisfaction scores, engagement with training materials, results from skill assessments, and goal completion. Common qualitative metrics include learner feedback, learner self-reflection, and manager and business stakeholder feedback. There are many factors to consider when building a scorecard.

“One or two data points will not be the magic bullet for every training program,” says Nikki Eatchel, chief learning officer for Mursion, a virtual reality training platform. “However, we have found that data related to employees’ sense of self-efficacy and confidence pre- and post-training can help gauge the overall effectiveness of a training program.”

Sudhindranath also recommends tracking skill changes when incorporating new programs. “Employee skill levels before the training versus after training provide evidence of program effectiveness and are a greater predictor of performance on the job than knowledge levels or self-reported learner feedback,” she says. “Comparing performance between those who received training and those who did not can also be very insightful.”
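The comparison Sudhindranath describes can be sketched in a few lines of Python. All scores and group labels below are invented for illustration; a real analysis would pull assessment results from the organization’s LMS or assessment platform.

```python
from statistics import mean

# Hypothetical skill-assessment scores (0-100), invented for illustration.
trained_before = [52, 61, 48, 70, 55]
trained_after = [71, 78, 62, 85, 74]
untrained_before = [50, 63, 47, 68, 57]
untrained_after = [53, 64, 49, 69, 58]

# Average improvement within each group.
trained_gain = mean(a - b for a, b in zip(trained_after, trained_before))
untrained_gain = mean(a - b for a, b in zip(untrained_after, untrained_before))

# The difference between the two gains isolates the training effect from
# whatever improvement would have happened anyway.
training_effect = trained_gain - untrained_gain
print(f"Trained gain: {trained_gain:.1f}, untrained gain: {untrained_gain:.1f}, "
      f"estimated training effect: {training_effect:.1f}")
```

With samples this small the numbers are only directional; at scale, a significance test on the two gain distributions would strengthen the claim.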

Learner feedback can be one of the most impactful qualitative data points. Eatchel says providing participants with the opportunity to document self-reflection can make a big difference. “Rather than telling a learner, ‘do this, not that,’ learners will have better outcomes when they draw their own conclusions, ideally by being guided with specific open-ended questions tied to the organization’s and the individual’s learning objectives,” she explains. “Reflection allows the learner to take cognitive ownership of their learning and growth, and supports a deeper mode of thinking where they can unpack and have those ‘a-ha’ moments.”

Eatchel recommends the following reflection questions to encourage specific feedback from participants.

  • Did you feel like you accomplished your practice objective within this session?
  • How do you think the other participant in the conversation felt about your approach?
  • If you had the opportunity to do the activity over again, what would you do differently?

One of the most valuable aspects of learner feedback is actually acting on it. “While this is often collected, it is not analyzed as often,” Mehta explains. “Automation such as a sentiment analysis tool is helping us draw more insights from learner feedback.”
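The kind of automation Mehta mentions can be illustrated with a toy lexicon-based scorer. The word lists and comments below are invented, and a production tool would use a trained sentiment model rather than keyword counting; this sketch only shows the idea of turning free-text feedback into a number that can be tracked.

```python
# Toy lexicon-based sentiment scoring: count positive vs. negative words
# in each comment. Illustrative only; real sentiment analysis tools use
# trained language models.
POSITIVE = {"helpful", "clear", "engaging", "useful", "great"}
NEGATIVE = {"confusing", "boring", "slow", "irrelevant", "unclear"}

def sentiment(comment: str) -> int:
    """Return positive-word count minus negative-word count."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Invented learner comments for illustration.
comments = [
    "Great session, very clear and engaging",
    "The pacing was slow and the examples felt irrelevant",
]
scores = [sentiment(c) for c in comments]
```

Aggregating such scores by module or cohort is what lets L&D teams spot patterns in feedback that would otherwise sit unread.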

Sudhindranath agrees that qualitative data, such as learner comments and manager and business stakeholder feedback, can help add context to quantitative data around the program experience, engagement, and relevance. “L&D can leverage this feedback to make improvements to program content, delivery or audience profiles. Learner quotes also help frame the data story of program impact and value,” she says.

Learner feedback can help in the future design of programs as well. “We have seen the greatest success from clients who proactively evaluate learner engagement, and more importantly, incorporate that feedback to customize and expand programs and gain greater stakeholder buy-in,” Eatchel says.

And tracking metrics over time is key. “Whatever data point you choose to measure, you should look at behavior change over time,” notes Uakoub. “Many L&D leaders measure immediately after a program, but if the employees don’t retain the knowledge—as is too often the case with traditional training programs—that data point is flawed.”
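Uakoub’s point about measuring over time can be made concrete with a simple retention calculation. The checkpoint scores below are invented; the idea is to re-assess the same cohort at intervals and express each score relative to the immediate post-training result.

```python
# Hypothetical average assessment scores for one cohort, measured
# immediately after training and again at 30 and 90 days (values invented).
checkpoints = {"immediate": 82.0, "day_30": 74.0, "day_90": 70.0}

# Retention relative to the immediate post-training score: a value near
# 1.0 means the knowledge stuck; a steep decline flags the retention
# problem Uakoub warns about.
baseline = checkpoints["immediate"]
retention = {when: score / baseline for when, score in checkpoints.items()}
```

A single immediate measurement would have reported 82 and stopped; the follow-ups show how much of that result actually persisted.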

There are a few other challenges HR and L&D leaders need to factor in when designing their approach. Mehta says it’s critical that organizations take a composite view of the data in order to gain the most insights. “In many organizations, data is disconnected,” she explains. “For example, consumption data, feedback data, spend data, and learning operations data may be sitting in different reports. In such cases, it is hard to make correlations and drive meaningful insights. Organizations that have built data lakes are significantly more effective in leveraging data-based insights for learning.”
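The composite view Mehta describes amounts to joining the separate reports on a shared key. A minimal sketch, with all course names, fields, and values invented for illustration (a real pipeline would land these sources in a data lake or warehouse):

```python
# Hypothetical per-course records that would otherwise sit in separate
# reports (all names and values invented for illustration).
consumption = {"safety-101": {"completions": 120}, "leadership-201": {"completions": 45}}
feedback = {"safety-101": {"avg_rating": 4.2}, "leadership-201": {"avg_rating": 3.1}}
spend = {"safety-101": {"cost": 6000.0}, "leadership-201": {"cost": 9000.0}}

# Join the three sources on course ID into one composite view.
courses = {}
for source in (consumption, feedback, spend):
    for course_id, fields in source.items():
        courses.setdefault(course_id, {}).update(fields)

# Once joined, cross-source correlations become trivial to compute.
for row in courses.values():
    row["cost_per_completion"] = row["cost"] / row["completions"]
```

A metric like cost per completion spans spend data and consumption data, which is exactly the kind of correlation that is out of reach while the reports stay disconnected.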

Sudhindranath says the basics like completion and satisfaction are easy to measure, but when it comes to understanding skills and behaviors, it becomes much more difficult. “It is impractical to observe on-the-job performance at scale for most roles,” she says. One potential solution is an immersive simulation that records these types of metrics within the technology. “This is particularly valuable for human skills which are the hardest to measure on the job, but are critical during the rapid business transformation that is happening now.”

Coaching can also help track if knowledge and skills are being retained and used. “Engagement doesn’t guarantee that learners can apply new knowledge and skills. You need to go beyond engagement to measure how well employees can apply learning on the job,” says Uakoub. “Coaching can be a great addition to feedback-driven development programs because you get real-time feedback focused on individuals’ specific needs—needs that often stem from current challenges they’re facing on the job.”

As with any human capital management investment, it’s key to measure ROI. One way is to link outcomes back to business performance. “Mapping leadership skills and competencies to other objective measures of business outcomes can provide invaluable information to organizational stakeholders about how performance trends in training drive concrete business impact,” says Eatchel.
