The shocking math behind learning retention!
Tips, tricks, and thought leadership to improve impact measurement.
Good morning, afternoon, or evening! It’s Saturday, May 3, and the ATD International Conference is just around the corner! Who is going to be there?
The big day for those who want to nerd out on data and measurement is Sunday, May 18.
The day kicks off with David Vance speaking on the new International Standards for Measurement. I’ll be speaking at 11 am on using data and metrics to build stakeholder relationships, Rachel Armstrong has a session on quantifying impact, and my good friend Colin Hahn wraps up the day on measuring the intangibles.
If you’re joining the ATD fun in DC, please respond to this email so we can connect in person!
And now … onto our resources, the shocking math behind learning retention!
Before we dive in, here’s what’s happening on our Free Measurement Events Calendar!
Coming up next: The month of May is full of fun opportunities. Our monthly measurement meetup group is coming back together May 5 (and I’ll be offering participants a chance to test out a new digital coach that I’ve been building). Then, Julie Dirksen has invited me to do a behavioral breakdown where we’ll explore with a live audience why people don’t respond to surveys and what we can do about it. Finally, the Human Capital Lab is hosting a community conversation focused on impact on May 8.
New data to improve learning impact!
Our friends at Actionable.co just launched their latest Actionable Insights report and… wow!
It's called Make Learning Stick: How to drive engagement on your leadership programs.
If we want to make a measurable impact through learning and development, we must first design and deliver programs that people engage in. No engagement, no impact! I appreciate the latest Actionable.co report because it highlights a few simple things that have been proven through rich, valid data to make learning transfer to behavior change.
In the report, Chris Taylor, founder of Actionable.co, shares a few wild stats:
Very few leadership development programs encourage participants to commit to changing their behavior as part of the learning experience. The programs that do see a 61% increase in engagement.
Programs that include an accountability buddy who checks in with participants once a week see a 108% increase in participant engagement.
When a facilitator comments on participants’ work or progress within the program three or more times a month, participant engagement improves by 27%.
Participants engage with their commitments an average of 40% more often when their peers can see what they're working on.
Improving any one of the four areas above (behavior change commitments, accountability buddy activity, facilitator activity, peer visibility) could significantly improve your participants’ follow-through and, by extension, your ability to prove the impact of your programs.
If you want to make a greater impact in your organization or community, consider reading this report!
An interesting debate …
Recently, I had a friendly debate with a colleague on LinkedIn. He suggested that L&D could benefit from having a research and development budget, like so many other departments, to figure out what works and what doesn’t with learning in their organization. I believe this would be a waste of resources. We already know what works in L&D; we just need to do a better job of applying it!
How many times have you been told not to do something that you know would make your program more effective…
Performance needs analysis
Defining expected behavior change up-front
Leaning into performance support versus formal training
Exploring which KPIs should change if the learning initiative is successful
And many more!
We can do great work when we operate in environments that support and encourage us to do what we already know works!
Oftentimes we don’t have robust data to prove to our stakeholders and training requestors that it is worth investing in the right thing.
Here are a few of my favorite data points from Chris’s report!
Participants with high levels of Social Support reported 143% greater behavior change than those with no Social Support (78.3% vs 32.2%). This powerful statistic demonstrates the dramatic impact of proper support mechanisms on learning outcomes.
Despite their high impact (and virtually non-existent cost), surprisingly few learning programs include accountability partners. This is a significant missed opportunity in most L&D programs.
Participants who opted in to nudges 6+ days/week, encouraging them to follow through on their behavior change commitments, saw a 61% increase in engagement versus those who received 5 notifications/week. Yet few people select this high-frequency option. This is yet another missed opportunity to support participants in changing their behavior through learning.
I hope you’ll find useful ammunition to make doing the right thing easier in your learning function!
Thank you for reading all the way down! Let us know what you think of this week’s resources! Inspiration and improvement are our goals here at The Weekly Measure :)
See you in your inbox next weekend!
~ Dr. Alaina