HMM Measurement Strategy Summary

THE LEARNING PHILOSOPHY BEHIND HARVARD MANAGEMENTOR
Harvard Business Publishing’s learning experts and instructional
designers have shifted the emphasis of Harvard ManageMentor
from knowledge gain to behavior change. Content has been written
with more real-world examples, presented in goal-oriented lessons
focused on single management or leadership tasks. Each lesson provides learners
with the opportunity to start applying the concepts to their own work situation
during the learning experience, rather than waiting until the end or treating
this step as an optional activity. An entirely new module, On-the-Job, allows
the learner to pick a performance goal and develop an action plan to focus on
that goal over a specified period of time.
The screenshots below demonstrate the feedback, reflection, and action-planning
components of Harvard ManageMentor.
FEEDBACK: Practice Exercise
Learners get immediate feedback on the choices they make in a practice
exercise. Exploring the outcomes of their choices in each scenario helps them
prepare for similar situations in the workplace.
FEEDBACK: End of Topic Assessment
A multiple-choice, scenario-based assessment measures learners’
comprehension of how to apply the material presented in the topic.
REFLECTION: Opportunities to Apply Concepts
At the end of each lesson, learners are asked to reflect on key
insights from their review of the material. They are also asked to
identify one or two opportunities for applying the concepts on the job.
These reflections are carried forward into the On-the-Job experience to help
learners personalize their action plans.
ACTION PLANNING: On-the-Job Application
Once the learner has completed all the lessons in a module, they enter the
On-the-Job component to assess themselves on the key skills in a given topic
and choose one to focus on over the course of 90 days. They then create an
action plan to support their progress.
HARVARD MANAGEMENTOR MEASUREMENT STRATEGY
The greater emphasis on application delivers more impact for
individual learners’ development and also for organizations.
But how does one know how many people are using the
resource, what they are working on, how they are applying
their learning, and how proficient they have become? Harvard
ManageMentor’s measurement strategy has been designed to
answer these questions and, ultimately, to help organizations
understand how learners are applying new skills on the job
and the results that learning is driving.
ONE VIEW: Key Metrics for Administrators
One View is a dashboard that shows how effectively learners are applying
skills on the job and provides insights that help learning professionals
refine their programs and demonstrate progress and business impact to
stakeholders.
From the dashboard, the administrator can immediately drill into the
details and sort, filter and analyze the usage and the impact of their
investment in Harvard ManageMentor.
Administrators can look at utilization, see which tools are most popular, and
analyze which topics achieved the highest learning impact scores, among other
metrics, and can drill down into each component.
MEASURING LEARNING IMPACT
According to industry research, up to 60% of knowledge is lost if it is not
applied within 48 hours. This “scrap” learning is one of the
largest costs of any training program. Harvard ManageMentor
minimizes scrap learning by including a 90-day “On-the-Job”
component to support each topic. Two brief surveys anchor the On-the-Job
component (the first at day 1 and the second at day 90) to assess the impact
of the learning. This ongoing support for and
reinforcement of learning over time is a proven way to ensure the
learning sticks.
“Learning Impact” is a new metric in Harvard ManageMentor that
measures business results across three key impact drivers from
these surveys:
Knowledge Impact: Did you gain new knowledge and skills from the topic?
Job Impact: Are the knowledge and skills you learned applicable on the job?
Results: Have the knowledge and skills you learned directly improved performance on the job?
Learning Impact combines these three drivers into a single,
convenient measure of learning effectiveness. The overall Learning
Impact score is available in the One View dashboard (bottom-right
in the above screenshot). A 15% Learning Impact indicates that
learners self-report an average 15% gain across Knowledge Impact,
Job Impact, and Results after completing the On-the-Job section of
the topic.
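For illustration, here is a minimal sketch of how such a score could be rolled up, assuming Learning Impact is a simple average of the three self-reported percentage gains; the function and field names are hypothetical and not part of the Harvard ManageMentor product.

```python
from statistics import mean

def learning_impact(knowledge_gain: float, job_impact: float, results: float) -> float:
    """Hypothetical roll-up: average the three self-reported percentage gains
    (Knowledge Impact, Job Impact, Results) into one Learning Impact score."""
    return mean([knowledge_gain, job_impact, results])

# Example: a learner reports 20%, 15%, and 10% gains on the three drivers,
# which averages to a Learning Impact of 15%.
print(f"Learning Impact: {learning_impact(20, 15, 10):.0f}%")
```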
By itself, Learning Impact is a broad index of learner performance.
When combined with the additional Key Impact Drivers (Net
Promoter, Business Alignment, Courseware, Support Tools, and
Management Support), it provides a powerful framework for
exploring the relative business impact of learning across program
and just-in-time use cases.
LEARNING IMPACT CASE STUDY: The Cost of “Scrap” Learning
Historically, Jane’s executive team has been satisfied with knowing
that learners completed programs and with an overall sense of
knowledge gain (Figure 1). But sales at the company have slowed
year over year, and the learning organization is under increasing
pressure to quantify its value to the company. Jane feels that she
has a good story to tell about the new finance program for R&D.
But in the current sales climate, she would feel even better
providing the executive team with recommendations to improve the
business impact of the program.
Figure 1
Jane opens the Learning Impact report to explore Budgeting and
Decision Making more closely, since these two received the most
activity over the past few months. Overall companywide learning
impact has hovered around 4% for the past few months. Jane
notices that Budgeting and Decision Making have very different
learning impact numbers. At 8%, Budgeting is twice the average. In
contrast, Decision Making is only 3% – a full percentage point lower
than the average.
Jane starts by scanning the list of Key Impact Drivers (see above) and notices
that, for Budgeting, the average scores for Management Support are well above
the benchmark.
Jane switches topics to review Decision Making, and immediately
notices the lower learning impact trend in the graph. She looks up
the Management Support score in the table and confirms that
learners gave Decision Making a lower mark.
Jane recalls from the Progress and Completion report that learners
in the Decision Making topic completed the On-the-Job section less often than
the average across all topics. At the time, she hypothesized that
this was largely due to timing or because the topic was presented
as supplemental learning in the new finance program.
Knowing that learners self-reported a lower Management Support rating for
Decision Making, Jane questions her original hypothesis.
She suspects that without strong support from their managers,
many of the learners felt uncomfortable investing the time to
complete the On-the-Job section of the topic.