When you put your training course out into the world, how do you know it’s successful? What are you really trying to measure?
This is an age-old concern for the instructional designer. (And for the project owner. And especially for whoever is paying for the course.) Too often an instructional design team prematurely congratulates itself on a training course well done, just because it’s been launched—way before anyone knows how effective it was for the learners.
Here are three ways success is typically proclaimed in a training course, and the problems with relying on these signals:
- People participated! You built it. People showed up. Of course you want people to experience the course, but that’s only part of the plan. Sometimes learners are required, coerced, or even guilt-tripped into taking a course. Maybe there’s a competition, or an incentive for the team with the highest participation. These can be effective motivators, but they don’t necessarily result in actual learning.
- All pages were visited! The learner was not permitted to move ahead in the eLearning course until all the pages were clicked. And guess what, all the pages were clicked! Hooray! Not so fast. Clicking the “next” button is not an accurate measure of engagement or comprehension.
- The end-of-course assessment was completed! Even if you’re measuring scores—80% is a typical “pass rate” for final quizzes—a passing grade doesn’t mean much if there’s no indication of how that knowledge is actually applied in real life.
When measuring success, could it be that you’re asking the wrong question? Instead of “Did people show up?” or “Did they pass the test?” ask, “Did the learners’ behavior change after taking the training?”
- Mind the gap before, during, and after the training event. Identifying the gap is the first thing to do when designing a new learning experience. Figure out the distance between the current state and the desired state, and map out the journey to guide the learner. Clearly document the progress made, and make note of any new mini-gaps that open up along the way.
- Follow up with participants after training is launched. Conduct post-training interviews or surveys with participants. Ask for their feedback. Ask for stories about how their jobs are different or how their behavior has changed as a result of the training. Consider making this information available to other participants or those who will be taking the course.
- Build in opportunities for perpetual learning. Training does not need to be a discrete event. As part of the master training plan, create on-the-job shadowing or teach-back opportunities. Give learners templates and other resources to document their experiences and share their wisdom with others.
These three strategies focus on closing the gap identified at the beginning of a learning plan. What other strategies do you use to guide learners from their current level of knowledge and behaviors to the desired state? How do you measure success?