The 100 Percent Solution

Stanley E. Malcolm, Ph.D.

(Published as a "Viewpoint" in Training, July 1998, p. 72)

It is about time we stepped back and took a fresh look at our approach to measuring learning's impact. What we'll see is that we need to focus more on course design and partnership with our business sponsors than on a traditional measurement strategy.

At the course level, the standard approach to measurement has long been Kirkpatrick's four levels - with "smile sheets" at level one and assessment of organizational impact at level four. Most organizations use smile sheets to determine if students "liked" a course, that is, perceived it as worthwhile. Many test students to see what they learned. Few measure skill application on the job, and of those few, fewer still do so routinely. A very rare few indeed feel able to measure organizational impact and do so only rarely.

While, to a certain extent, measurements at Kirkpatrick's four levels tell course developers different things, by and large they represent a hierarchy of perceived difficulty to implement. They also represent an increasingly direct relationship to business impact, which, after all, is the point of training in a corporate setting. So, the things that seem easier to measure (perceptions and learning) say less about business impact than application and, well, impact.

Generally, training organizations have chosen to measure perception and learning while inferring application and impact. I contend that this satisfies nobody: Trainers have lingering doubts about the impact of their efforts. In the worst cases, they lose sight of the real point of the measures, contenting themselves with achieving "satisfaction" and learning. And business managers remain skeptical, often treating training as an expense to be cut - for if they "believed" in training's effectiveness, would they cut the budget?

The problem in measuring course impact is not in Kirkpatrick's levels; it is in the design of the courses themselves. In "Reengineering Corporate Learning" I estimate that over 80% of critical job learning happens on the job. If you're skeptical of that figure, just ask yourself where you learned the critical skills you apply every day. Did you learn them in a classroom?

If less than 20% of critical job learning happens in our traditional courses, how can trainers be confident that they've made a valuable contribution to business success? After all, the trainers were "absent" when over 80% of the learning took place! Doesn't this also explain managers' limited expectations of employees returning from class?

I propose a simple solution - design the entire 100% of learning, but continue to deliver only 20% or less by traditional means. The remaining 80% or more should properly be learned on the job - but in a way that has been designed or "structured" to make it consistent - from office to office, supervisor to supervisor, and day to day. Training's role is to provide the structure for learning, not to deliver it. The means might include coaching and assessment guides for supervisors, self-instructional materials for employees, intranet-based reference materials, and electronic performance support systems that embed business processes, advice, and learning granules accessible in the context of performing work. Our initiatives in the area of knowledge management are a response to the realization that most learning happens on the job, not in the classroom: we're learning all the time, and need to learn all the time to be competitive. We don't just learn from "teachers."

To drive home the point of how radically different the "100%" view is from our traditional approach to course design, let me propose the following simple rule: The "course" isn't over until the learning has been successfully applied on the job.

For example, the sales course isn't over until the learner has sold 500 of the new model widgets. As simple as that rule sounds, its effects are radical. How many of the courses you currently offer conform to it? If designers adopted it, wouldn't much of the frustration of measurement vanish?

Imagine a world where business managers would see firsthand the impact of learning on business performance. In such a world, trainers would play an important role in structuring on-the-job learning. More importantly, that role would be performed in much closer partnership with the business beneficiaries of learning. On-the-job learning components would be an integrated, essential element of a learning strategy, not an afterthought or ignored completely. The performance goals of learning would be shared by all, and measurement would be - relatively speaking - a piece of cake.

About the Author: Stan Malcolm serves as a coach to companies wishing to develop or review their strategies for performance support, learning technologies, and/or corporate learning issues generally. Formerly, he headed learning technologies and performance support initiatives at Aetna, Inc. He can be reached at Stan@Performance-Vision.com or 860-295-9711.
