Assess Training Success!

What is at the heart of any training initiative? The ability to make a difference. That could mean an employee doing their job more efficiently, consumers buying a product after understanding its value, or a university student being prepared to pass that final exam. Training is done for a lot of reasons, but the one thing we all want to know is: Did it work?

Let’s look at how we can find out. First, a brief overview of some standards for any training evaluation project: most people in the training industry use the Kirkpatrick levels for training evaluation. I’ll break them down for you below.

Level 1:  Did they enjoy the training experience?
The first thing most people want to know is whether the people who took the course enjoyed the experience. This matters because if your learners were bored, angry, irritated, or frustrated because the technology didn’t work, you’re unlikely to see any impact from the training with all of those blockers standing in the way.

So how do we find this out? Most people use a survey that is emailed out after the course or that pops up at the end of the course itself.
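
If it helps to see this concretely, here’s a minimal sketch in Python of how you might roll up those post-course ratings; the 1-to-5 responses are made up for illustration.

# Minimal sketch: summarizing Level 1 "smile sheet" results.
# The 1-5 ratings below are hypothetical survey responses.
from statistics import mean

ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # 1 = hated it, 5 = loved it

avg = mean(ratings)
favorable = sum(1 for r in ratings if r >= 4) / len(ratings)

print(f"Average rating: {avg:.1f} / 5")
print(f"Favorable responses (4 or 5): {favorable:.0%}")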

Level 2:  Did they learn what they were supposed to?
Next, we want to know if learners actually gleaned what was intended from the course. Did they understand the material? After taking the training, can they describe or remember what they learned?
How do we determine this? Usually by a knowledge check or test at the end of the training.
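
Here’s a similar quick sketch of scoring a knowledge check; the questions, answer key, and 80% pass mark are all hypothetical.

# Minimal sketch: scoring a Level 2 knowledge check.
# Answer key, responses, and the 80% pass mark are hypothetical.
answer_key = {"q1": "B", "q2": "D", "q3": "A", "q4": "C"}
responses  = {"q1": "B", "q2": "D", "q3": "C", "q4": "C"}

correct = sum(1 for q, a in answer_key.items() if responses.get(q) == a)
score = correct / len(answer_key)

print(f"Score: {score:.0%}", "PASS" if score >= 0.80 else "RETAKE")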

Level 3:  Were they able to use the training?
After learners have picked up the critical objectives from the training course, we want to know if they actually did something with them. For example, say Company B is training its employees to push Button X instead of Button Y at the end of the day. Now that they are outside of the training environment and back at work, did they actually change the way they behaved? Did Button X get pushed, or were people still pushing Button Y?

How do we determine this? This one can be a little tricky. If you have a way to measure something like button pushes, it’s easy. But if you’re training people on soft skills, like politeness or customer-service EQ, it gets more complicated. You might need to involve observation, or some kind of pre- and post-training simulation exercise, to determine whether the learner’s behavior and skills changed.
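
For the easy case, where the behavior leaves a data trail, a minimal sketch might look like this; the button-push counts are invented for the example.

# Minimal sketch: did behavior change? Compare the share of Button X
# pushes before and after the training. Counts below are hypothetical.
before = {"button_x": 120, "button_y": 880}
after  = {"button_x": 830, "button_y": 170}

def x_share(counts):
    return counts["button_x"] / (counts["button_x"] + counts["button_y"])

print(f"Button X share before training: {x_share(before):.0%}")
print(f"Button X share after training:  {x_share(after):.0%}")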

Level 4: Did we make a difference?
Now that your group of learners has enjoyed the training, remembered it, and demonstrably used it, the next question is: Did we make an impact? This is when you can look at metrics and data and say, “Wow, look at the increase in Button X pushes! We’ve had a 95% increase in Button X usage.” Assume Button X actually saves you $10 every time an employee pushes it. Do a little math, and you just saved your company a lot of money!

How do we determine this? This one usually relies on cold, hard data. You want numbers. Look at what you want to achieve: What is the end result? Are you trying to save time, cut steps, or increase volume? Get the data that shows it, and then add the money factor. If this training initiative shaved 2 hours off an employee’s weekly workload and 500 employees took the training, that’s 1,000 hours a week. Say the average employee’s salary is $20/hour. You can tell your boss that your training idea saved the company $20,000 a week! And who wouldn’t want to give you a promotion?
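
Here’s that math in a short sketch, using the hypothetical figures from the example above (2 hours saved per week, 500 trained employees, $20/hour).

# Minimal sketch: translating Level 4 results into dollars.
# All figures are the hypothetical numbers from the example above.
hours_saved_per_employee_per_week = 2
employees_trained = 500
avg_hourly_wage = 20  # dollars

weekly_hours_saved = hours_saved_per_employee_per_week * employees_trained
weekly_savings = weekly_hours_saved * avg_hourly_wage

print(f"Hours saved per week: {weekly_hours_saved}")   # 1000
print(f"Dollars saved per week: ${weekly_savings:,}")  # $20,000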

Now that we have the basics down, let’s examine an actual case study to give you some more specifics.

Case Study:  The call center.

Call centers are a mecca of training initiatives: there is always an opportunity for employees to do their jobs more quickly, more easily, and with fewer support resources. Here is what happened when I was tasked with revising the new-hire training for a call center that supported customers calling in with their tech issues.

Current state: Employee turnover was high, customer satisfaction scores (also known as CSAT) were dropping, calls took longer than they were supposed to, and managers and trainers were constantly running just-in-time training initiatives and hoping for the best. How can a call center support its customers and make a profit in this shape?

Answer: They can’t.

What was the solution? After going in, doing a job analysis, and interviewing groups of employees, I identified the problem: new hires weren’t adequately prepared to do the job. So I revised the new-hire training so that it was more engaging, covered all of the top call drivers, and had new employees doing the job in the safety of the training environment (via tool simulations and mock calls) before they got on the floor to take calls.

Then it was time to measure.  Did it work?

Remember that the current state had a few measurables that could be tied to training: 1) CSAT was poor, and 2) calls took too long. Both were data points that could be used for an ROI analysis. So after a month on the floor, I compared the new batches of call-center employees who had gone through the revamped training with the previous few groups who had gone through the old training. I used similar sample sizes (around 100 each) and the same time period to make an apples-to-apples comparison.
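
For the curious, here’s a rough sketch of that kind of cohort comparison; the CSAT scores are randomly generated stand-ins, and the t-test (via scipy) is just one reasonable way to check that a difference isn’t noise.

# Minimal sketch: comparing an old-training cohort to a new-training
# cohort on CSAT. The scores below are randomly generated stand-ins;
# real data would come from the call center's reporting system.
import random
from statistics import mean
from scipy.stats import ttest_ind  # Welch's t-test

random.seed(42)  # reproducible fake data
old_cohort = [random.gauss(72, 8) for _ in range(100)]  # old training
new_cohort = [random.gauss(79, 8) for _ in range(100)]  # revamped training

result = ttest_ind(new_cohort, old_cohort, equal_var=False)
print(f"Mean CSAT old: {mean(old_cohort):.1f}, new: {mean(new_cohort):.1f}")
print(f"p-value: {result.pvalue:.4f}")  # small p => unlikely to be chance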

What I found was that for the new batch of employees, CSAT scores had gone up by 10% and their calls were 2 minutes shorter. The only thing that had changed was the training. This was great news. Customers were happier because their issues got resolved faster and more accurately, and the business was happier because those metrics are what it lives and breathes by. Not only had we shown a measurable difference in quality, but we could now look at how much we were saving per call. Assuming the employees made $15/hour on average, with 100 employees those 2 minutes per call were going to add up. All this money saved, just by introducing a new training program tailored to employee needs and doing an effective training evaluation after it was implemented.
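
To see how those 2 minutes add up, here’s a back-of-the-envelope sketch; the calls-per-agent figure is an assumption I’m adding for illustration, not a number from the case study.

# Minimal sketch: dollar value of shaving 2 minutes off each call.
# Wage ($15/hr) and headcount (100) come from the case study; calls
# per agent per day (40) is a hypothetical figure for illustration.
minutes_saved_per_call = 2
hourly_wage = 15              # dollars
agents = 100
calls_per_agent_per_day = 40  # assumption

savings_per_call = minutes_saved_per_call * (hourly_wage / 60)  # $0.50
daily_savings = savings_per_call * calls_per_agent_per_day * agents

print(f"Savings per call: ${savings_per_call:.2f}")
print(f"Savings per day (all agents): ${daily_savings:,.2f}")   # $2,000.00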

Do you want to know more about conducting an evaluation of a training project? Trying to figure out which data to use or how to get started? Contact Enspire for a consultation today!