This blog post is the first in a LEO Learning series on learning analytics and measuring learning impact, which will run throughout the year and culminate in an ebook of practical advice for learning professionals on how to evaluate the effectiveness of their training programmes. This LEO Learning-led series involves input from a number of experts and thought leaders across the company, as well as our learning partners at Rustici Software and Watershed.
This instalment comes to you from LEO Learning’s Imogen Casebourne and Gareth Jones, and Watershed’s Andrew Downes.
The world is moving and changing ever more quickly. As a result, organisations, and the individuals who work for them, are finding themselves having to adapt to constant change. Learning and Development (L&D) departments should play a vital role in helping people and organisations thrive in times of change. To do this, however, they must themselves be nimble and agile: ready to try new methods and tools, and to discard them quickly if they prove ineffective. Unfortunately, this is not always the case. Where marketing departments now use research, testing and impact measurement to select the most effective approaches and to demonstrate the value of marketing spend at board level, L&D has not yet followed far down the path of measuring learning impact.
RECOMMENDED READING | 'Measuring the Business Impact of Learning: The Definitive Guide'
Proving value with learning analytics
Learning analytics is a powerful new tool in the learning and development ‘toolbox’. Being able to quickly see the impact of different approaches lets designers experiment with new methods and rapidly assess whether they meet changing needs.
The increasing demand for better evaluation and analytics in learning has largely been driven by better data in other parts of the business. In particular, much like learning, the marketing function has moved from being a ‘black box’ where investments had an unknown impact to an area where organisations can very clearly and immediately see the direct benefits of specific investments.
As a result, marketing departments have moved from a ‘spray and pray’ approach to a much more evidence-driven strategy where they can predict the impact of their initiatives. Marketing has gone from one of the first things to be cut in a downturn to one of the last.
Now it’s the turn of L&D to go through a similar transition…
Winning a seat at the boardroom table
At Watershed, LEO Learning’s partner company and experts in learning data analytics, clients are keen to implement learning analytics to see which learning resources and platforms are being used, and by whom. In organisations where learning happens everywhere, there’s an urgent need to bring records of both formal and informal learning together in one place, ideally viewed through visually effective dashboards.
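As an illustration of how such records can be unified: the Experience API (xAPI), an open specification associated with both Rustici Software and Watershed, expresses any learning event, formal or informal, as a simple ‘actor, verb, object’ statement sent to a Learning Record Store (LRS). The sketch below is a minimal Python example, not a recipe for any specific product; the LRS endpoint, credentials and learner details are hypothetical.

```python
import requests  # third-party HTTP library

# A minimal xAPI statement: 'who did what to which activity'.
# Formal events (course completions) and informal ones (video views,
# forum posts) can all be expressed in this same format.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "A Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/leadership-101",
        "definition": {"name": {"en-US": "Leadership 101"}},
    },
    "timestamp": "2018-03-01T10:15:00Z",
}

# Hypothetical LRS endpoint and credentials.
response = requests.post(
    "https://lrs.example.com/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),
)
response.raise_for_status()
```

Because every source of learning activity can report into the same store in this shape, dashboards can then sit on top of a single, consistent record of learning.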
If you don’t know who’s using the resources and experiences you create, pay for and curate, then your approach to elearning is very similar to that ‘spray and pray’ approach of old-fashioned marketing. It’s likely that at least some of your courses, campaigns and platforms either aren’t being used or aren’t working. But how do you know which elements are ineffective?
L&D teams in organisations like Visa are using learning analytics to assert a far more strategic role in the business. They are gathering data on the key learning moments that help to develop great leaders. This is enabling the L&D team to develop a better understanding of how these leaders are learning, how often, and from what sources.
As we all know from experience, many senior executives still treat learning and development as a cost. When we can demonstrate – using a big data approach – that we know which capabilities to fine-tune in order to achieve business goals quicker, then we will be invited to take a seat at the boardroom table.
Proving the value of learning isn’t the whole story; as learning professionals, we already know that learning works. Learning analytics gets really interesting when we not only prove, but also improve, learning. This becomes possible with more detailed data and analysis that tells us not only whether our learning programmes, strategies and offerings are working, but which elements and approaches are most effective. These detailed analytics can also highlight practical and cultural blockers that need to be addressed.
Real-world learning analytics
Watershed has lots of examples of this principle at work:
- AT&T used learning analytics to prove that more engaging, higher fidelity compliance learning content was more effective than existing training. During the project, they were able to save 670,562 production hours and 160,380 employee course hours.
- Medstar uses learning analytics to evaluate the effectiveness of training on clinical metrics. Almost immediately after launch, they were able to identify a problem with a particular step of their simulation app when they noticed a surprisingly high number of clinicians missing that step.
- Nuance Communications uses learning analytics to report on completions of required training. The greater level of detail made possible with Watershed enabled Nuance to uncover learners trying to ‘game’ the system: first attempting an identical copy of the test embedded in the course to find the right answers, then sitting the pre-test.
- What’s wrong with Thursdays? Another client recently discovered that significantly more people drop out of courses on Thursdays than on any other day, and they are now exploring ways of rescheduling courses to reduce drop-out rates in future (a simplified sketch of this kind of weekday analysis follows this list).
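To make that last example concrete, here is a minimal sketch of a weekday drop-out analysis, assuming drop-out events have already been exported from an analytics platform; the records and timestamps are purely illustrative.

```python
from collections import Counter
from datetime import datetime

# Illustrative drop-out records: (learner id, timestamp of last activity).
# In practice these would be exported from your analytics platform.
dropouts = [
    ("u1", "2018-03-01T09:30:00"),
    ("u2", "2018-03-01T14:00:00"),
    ("u3", "2018-03-05T11:15:00"),
]

# Count drop-outs per weekday to surface patterns like the 'Thursday effect'.
by_weekday = Counter(
    datetime.fromisoformat(ts).strftime("%A") for _, ts in dropouts
)

for day, count in by_weekday.most_common():
    print(f"{day}: {count} drop-out(s)")  # e.g. Thursday: 2, Monday: 1
```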
Changing perceptions by measuring learning impact
Gathering learner data is a win-win situation. More than that, with big data making ever greater inroads into other areas of most organisations, L&D departments that don’t embrace a more data-driven approach risk being viewed as old-fashioned and slow-moving.
So what are you waiting for? We asked a number of organisations how they’re measuring learning impact and found that while some had already started using learning analytics to prove value, others were being held back by a variety of factors. You can view the results of that research here.
Enjoyed part 1 of LEO Learning’s measuring learning impact series? Click below for more:
Part 2 – ‘The top 10 components for measuring the business impact of learning’
Part 3 – ‘New approaches to learning measurement’
Part 4 – ‘How to define your business impact measurement strategy’
Part 5 – ‘Getting to grips with learning data’
Part 6 – ‘Demonstrating an evidence-based approach to business success’
Part 7 – ‘How to use learner analytics to your advantage’
Gareth Jones is LEO Learning’s Product Development Director.
Andrew Downes is a Learning and Interoperability Consultant at Watershed.