7 Metrics that Matter – Are You Tracking Them?
Metrics really do matter. In previous blog posts we covered the importance of setting up effective learning environments and designing group and one-on-one sessions to reach different learning styles. This post focuses on probably the most important topic of all – what happens when learning becomes a key success measure for your program? How can your curriculum and learning strategy become one of your most important competitive advantages in the next few years? What individuals and their families learn in prevention, treatment, or recovery programs can be measured and linked to program retention, recidivism, and long-term recovery rates. Do you currently highlight your curriculum, tools, and learning metrics as a key differentiator for your program? Is it time to make it a priority? Let’s start with the basics – definitions for the key metrics we’ll be highlighting today, followed by a simple sketch of how they might be tracked.
1. Engagement – the degree of participation, attention, curiosity, interest, optimism, and passion that individuals show when they are learning or being taught. Learning tends to suffer when students are bored, dispassionate, disaffected, or otherwise “disengaged.”
2. Satisfaction – the level of perceived value derived from a learning experience as it relates to the expected or desired outcomes. Satisfaction can be measured for areas like the environment, instructor, instructional strategies used, course design, course content, materials or tools used, etc.
3. Learning Comprehension – the ability to process information or knowledge and its meaning and to integrate it with what one already knows.
4. Critical Thinking (Problem-Solving) – the process of making an objective analysis and evaluation of facts or a situation in order to form a judgment.
5. Learning Retention – the ability to store information in long-term memory so that it can be readily retrieved and used in a future situation.
6. Achievement (Outcomes) – the results achieved from a learning experience and the impact of those results on what one achieves (or fails to achieve) later in life.
7. Program Retention – the ability of a program to keep its participants over a specified period of time or until the individual meets the requirements to complete the program.
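If you want to make these definitions concrete, even a simple script or spreadsheet can get you started. Below is a minimal sketch, in Python, of what a per-participant, per-session metrics record might look like. Everything in it – the field names, the 1–5 rating scales, the quiz scoring – is an illustrative assumption, not a validated instrument or an R1 Learning standard.

```python
# Hypothetical sketch: a per-session record for the session-level metrics.
# Field names and scales are illustrative assumptions, not a standard.
from dataclasses import dataclass
from statistics import mean
from typing import Optional


@dataclass
class SessionMetrics:
    participant_id: str
    session_id: str
    engagement: Optional[int] = None         # e.g., facilitator rating, 1-5
    satisfaction: Optional[int] = None       # e.g., post-session survey, 1-5
    comprehension: Optional[float] = None    # e.g., quiz score, 0-100
    critical_thinking: Optional[int] = None  # e.g., applied-scenario rubric, 1-5
    retention_check: Optional[float] = None  # e.g., follow-up quiz, 0-100


def average_engagement(records: list[SessionMetrics]) -> float:
    """Average engagement across the sessions that recorded a rating."""
    rated = [r.engagement for r in records if r.engagement is not None]
    return mean(rated) if rated else 0.0


# Example: two group sessions logged for one participant.
log = [SessionMetrics("p001", "session-01", engagement=4, satisfaction=5),
       SessionMetrics("p001", "session-02", engagement=5, comprehension=82.0)]
print(average_engagement(log))  # 4.5
```

The tool matters far less than the decision to record something for every session, so the data exists when you want to connect learning to outcomes. Achievement and program retention live at the program level rather than the session level, so they would sit in a separate record.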
Do these metrics matter? A question I’ve been asking since the first days of R1 is this: why hasn’t learning been a key strategy in the fight against substance use, addiction, and the opioid crisis? I don’t mean education in the sense of communicating the scale of the national and global crisis, or stigma-reduction efforts – both of which are so important. What I’m referring to here is how we teach individuals and their families about addiction and recovery in prevention, treatment, and recovery settings. Why don’t some of the best addiction programs in our country have a curriculum map and a learning plan for all of the individuals who come into their care? Are we taking the time to design and build standardized modules based on sound learning theory for each topic identified in the curriculum map? Are there clear and visible learning objectives for each group session? Are we testing individuals to measure what they are learning? Do we have a clear plan for how we’re going to teach, track, and measure – and, with that information, understand the impact on outcomes?
What’s the opportunity? Recovery and treatment programs are treating individuals for long periods, including hundreds of hours in group settings. Why are so few programs structuring, measuring, and sharing the results of the learning that happens for individuals in their care? While this is a little unsettling, this is also a very exciting time! This is why I wake up every morning. Here lies the opportunity. Here lies your opportunity.
Where are you on measuring learning? So, what could we start to measure? Let’s go through each metric and see which ones you currently measure in your program and which ones you don’t – yet. The point is not to make any judgments, but just to pause and think about each metric for a moment. Future blog posts will deal with practical approaches to address each metric, but for now, just think about each of them for yourself.
Engagement. When you think about engagement, what do you think of? What are some key factors of engagement from your own experience? Have you ever defined it for your program or groups and determined how you could even measure it? As I mentioned in an earlier post, we don’t believe that any of the other learning metrics will matter if individuals aren’t engaged in their learning. If individuals are not present, listening, participating, and processing the information we are trying to transfer to them, they can’t learn it, apply it, or retain it. This one is critical and at the top of our list. Do you currently do any measurement of engagement?
Satisfaction. You may be tracking satisfaction as individuals leave your program or in follow-up surveys. Typically, programs track questions such as “Was the information relevant?” “Was it helpful?” “Was the instructor knowledgeable?” “Did you find the materials useful and professional?” “Would you recommend your experience to others?” All nice to know, but is this the extent of what we want to measure? What do you do now to measure satisfaction? Who do you share the results with, and what do they do with that information?
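To make that tangible, here is a small sketch of how a post-session satisfaction survey could be scored. The questions mirror the examples above; the 1–5 Likert scale and the category keys are assumptions you would adapt to your own survey.

```python
# Hypothetical sketch: scoring a short post-group satisfaction survey.
# The 1-5 Likert scale and category keys are assumptions.
from statistics import mean

SURVEY_QUESTIONS = {
    "relevance": "Was the information relevant?",
    "helpfulness": "Was it helpful?",
    "instructor": "Was the instructor knowledgeable?",
    "materials": "Did you find the materials useful and professional?",
    "recommend": "Would you recommend your experience to others?",
}


def score_survey(responses: dict[str, int]) -> dict[str, float]:
    """Return each item's score plus an overall average on the 1-5 scale."""
    item_scores = {key: float(value) for key, value in responses.items()
                   if key in SURVEY_QUESTIONS}
    overall = mean(item_scores.values()) if item_scores else 0.0
    return {**item_scores, "overall": overall}


# Example: one participant's responses after a group session.
print(score_survey({"relevance": 5, "helpfulness": 4, "instructor": 5,
                    "materials": 3, "recommend": 4}))  # overall 4.2
```

Even this level of structure lets you ask better questions than a stack of paper forms – for example, whether satisfaction with materials lags satisfaction with instructors.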
Learning Comprehension. Did individuals actually learn something? Did they learn what you intended them to learn given the learning objectives you identified? How do you know this? Did they tell you? Did you observe this? Did you test them for retention? What do you do today to understand what individuals are learning in your groups?
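One common way to answer “did they learn what we intended?” is a short quiz given before and after a session, tied to its stated learning objectives. The sketch below shows the arithmetic; the 80% mastery threshold is an illustrative assumption, not a clinical or accreditation standard.

```python
# Hypothetical sketch: comparing pre- and post-session quiz scores (0-1 scale).
# The 0.80 mastery threshold is an illustrative assumption.
def comprehension_gain(pre_score: float, post_score: float,
                       mastery_threshold: float = 0.80) -> dict:
    """Summarize learning gain for one participant on one module."""
    return {
        "gain": round(post_score - pre_score, 2),
        "met_objective": post_score >= mastery_threshold,
    }


# Example: a participant scores 45% before the group and 85% afterward.
print(comprehension_gain(0.45, 0.85))  # {'gain': 0.4, 'met_objective': True}
```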
Critical Thinking/Problem-Solving. Did you design the learning so that individuals can relate it to their own experiences or, better yet, actually practice and apply it? Did you allow them to experience real-life settings (or role play) to see what they might do differently with the information they’ve learned? Did this information and experience better prepare them for individual counseling or group process sessions? We think this is a critical link to better outcomes and something we’d like to help programs build a sound approach to. How do you design for and measure critical thinking today?
Learning Retention. Did individuals not only learn something and have an opportunity to apply or practice it, but actually retain it for future use? The goal is that individuals will learn something about themselves or an important recovery topic and be able to use what they learned after they’ve left the program. Is the learning experience you are creating effective and memorable so that the new information can be used later when individuals need it? How do you follow up to see what learning from your program individuals have retained over time?
Achievement. This is one we’re still working to better understand. What does achievement look like for recovery and treatment settings? We think the answer will lie in the choices individuals make once they leave your program and are back in their real-life environment. What will they recall? What choices and actions will they take based on what they learned? How have they applied it in a real-life setting? How will the knowledge they retain affect their future choices?
Program Retention. Higher program retention and extended time in recovery are consistently associated with better recovery outcomes. All of the learning metrics above will have an impact on program retention. Think about it – if I’m in treatment or a recovery setting and I’m engaged, learning something of value, and can see applying it to my own experience, I’ll probably stay another day and see what else these guys can teach me. The alternative (to be blunt) might be: I’m bored and sitting here thinking, “Why am I even here? I’d rather be someplace else.” And so that’s where I end up – somewhere else, not getting help, and sticking with those same unhealthy behaviors.
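For programs that want a single number to watch, a basic day-N retention rate can be computed from admission and discharge dates. The sketch below assumes one possible definition – still enrolled (or completed) N days after admission – which you would adjust to match however your program already reports retention.

```python
# Hypothetical sketch: day-N program retention from admission/discharge dates.
# The definition of "retained" here is an assumption; match it to your program's.
from datetime import date
from typing import Optional


def retained_at(admit: date, discharge: Optional[date], day_n: int,
                as_of: date) -> bool:
    """True if the participant was still enrolled N days after admission."""
    milestone = admit.toordinal() + day_n
    if discharge is None:                      # still enrolled today
        return as_of.toordinal() >= milestone
    return discharge.toordinal() >= milestone


def retention_rate(cohort: list[tuple[date, Optional[date]]], day_n: int,
                   as_of: date) -> float:
    """Share of a cohort retained through day N (0.0-1.0)."""
    flags = [retained_at(admit, discharge, day_n, as_of)
             for admit, discharge in cohort]
    return sum(flags) / len(flags) if flags else 0.0


# Example: 30-day retention for a three-person cohort, checked on July 1.
cohort = [(date(2023, 5, 1), None),               # still enrolled
          (date(2023, 5, 1), date(2023, 5, 20)),  # left at day 19
          (date(2023, 5, 1), date(2023, 6, 15))]  # left at day 45
print(round(retention_rate(cohort, day_n=30, as_of=date(2023, 7, 1)), 2))  # 0.67
```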
Where did you come out? This is a lot to think about. But pretty interesting, right? When I ask programs what they are designing for, testing for, and tracking, the result is often discomfort and a quick change of the subject to something other than learning metrics. That’s OK for now. I’m looking forward to having the same conversations two or three years from now with a different level of understanding and rigor and a rich discussion about findings and results related to learning metrics. Opening the door on these conversations is an R1 Learning goal. We designed the Discovery Cards to increase learning by creating a highly engaging experience that appeals to all five learning styles. (Learn more about learning styles in our previous post.) We hope you’ll engage with us. Do reach out and contact us if you want to talk about learning metrics – we’d love to hear what you’re doing and how we can support you.
We’ll go into each of these metrics in more detail in future posts. In the meantime, please help us by taking any of the following actions – and as always, stay tuned!
1. Share this blog post with others. (Thank you!)
2. Start a conversation with your team. Bring this information to your next team meeting or share it with your supervisor. Change starts in conversations. Good luck! Let us know how it goes.
3. Visit www.R1LEARNING.com to learn more about R1, the Discovery Cards, and how we’re creating engaging learning experiences through self-discovery.

Copyright 2023 R1 Publishing LLC / All Rights Reserved. Use of this article for any purpose is prohibited without permission.