Recently, I was discussing a company's internal training program with its HR Director. Administrative professionals are required to earn a certain number of "credits" per year through the internal program. A full class schedule is offered in the Fall and Spring, with a partial schedule during the Summer. The topics are varied and include general software training, such as how to use styles in Microsoft Word; job-specific applications, such as how to write a certain report more quickly and efficiently; and soft skills, such as organization and time management. More than 70% of the training, though, falls into the first category: software-specific.
As we discussed this in more detail, I learned that one of the problems with the program was the unequal results it seemed to produce. Some attendees felt that the courses were far below their skill level and that it was truly painful to sit through the same Microsoft Excel course year after year without learning anything new. Yet the HR Director had come to find out that some employees, who had been with the organization for years, did not possess even the most basic software skills.
The HR Director wanted to know how this had happened and what could be done to correct it. After the time and energy the organization had devoted to creating and implementing the program, which had been in place for a number of years, it was fair to say that the return on the company's investment was zero, at best. If anything, the program was actually a loss. The company paid for instructors, both high-cost outside trainers and internal IT staff whose time could have been spent on other things. Employees were taken away from their "real" work for at least 12 hours a year (the minimum number of credits each employee had to earn). And then there was the lost potential of an entire workforce that could have actually learned the material and performed at a much higher level.
Turning to the first question--how had this happened? Certainly, there are a lot of ways to point fingers. The employees didn't take the classes seriously; they didn't have the initiative to develop their professional skills; and they didn't speak up if they were not able to grasp the material or follow along during the instruction. And what about the trainers? Shouldn't they have known that their students had drifted off into space five minutes after the class began? Shouldn't they have seen the dazed look in the students' eyes? Shouldn't they have questioned why the same advanced-level employees were returning to a class they'd already taken several times? Shouldn't they have pushed for a more advanced curriculum?
Should've, could've, would've. Yes, all of those things should have happened. But they didn't. There is one other thing, though, that didn't happen. And this, in my opinion, was the root cause of the program's dysfunction.
There was no assessment.
Students got credit regardless of whether they left with any increased knowledge. Students couldn't test out of courses that were below their skill level. Basically, employees could play Scrabble on their computers during class, as long as they showed up for 12 hours of classes each year.
This, in my opinion, was the fatal flaw of the program. As Stephen Covey said:
"Accountability breeds response-ability."
There were no checks in place to hold employees accountable for the knowledge they'd just been given. And the employees who did absorb the knowledge were rewarded with a big, fat nothing. There was no "A+" or gold star at the end of the tunnel. They just returned to their desks and hoped that the information they'd just learned would help them in some way. Wow, what a waste!
So, how can it be fixed? Easy! Require attendees to take and pass a skills assessment at the end of the class before they are awarded credit. If a large percentage of the class fails the assessment, the training and the trainer need to be reevaluated--it's likely that the lesson was not well communicated. But if most students do pass the assessment, the ones who don't should not be awarded credit.
But they should be given another chance. Because, remember, the goal is to get students to actually understand the material--not to trick them into passing or failing a test. So, if a student does not pass the assessment, they (and all students) should be given take-away materials that they can reference and, when ready, use to try again. The take-aways should be detailed instructions--probably more detailed than the class itself--that can serve as a reference later. Or video tutorials can be made available for students to watch. Videos can be created internally with a product such as Camtasia Studio or purchased from a commercial site such as Lynda.com. The videos can be hosted on the company's SCORM-compliant learning management system, an internal server or intranet, or even YouTube. The point is to get the information to the students so they can study and then retest--successfully.