If we’re going to design for impact and then measure the outcome, how do we do this? It starts with what we’re trying to achieve, and then it comes down to applying what we know of learning science to the design of a solution. We need to be systematic in process, while practical in application.
Initial Analysis
What's needed upfront is an appropriate analysis. Too often, we decide to offer a course when the solution lies elsewhere. Even before that, we need to ensure there actually is a problem to be solved, and that we're targeting the right one, before designing an intervention.
A performance consulting perspective is useful here. In such an approach, you find the core difference between the performance currently being delivered and what is necessary and/or desirable. If someone comes saying they need a course on X, the response is not “yes” or “no”, but instead “yes, and…”. What comes after “and” is a request for either the metric that's not up to speed or how you'll know you've succeeded. The point is to identify an actual need, focused on outcomes the organization requires. Technically, this is a ‘gap analysis’, identifying the core problem.
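To make this concrete, here's a minimal sketch, in Python, of framing a request as a gap against a metric. The metric, numbers, and class name are hypothetical illustrations, not a prescribed tool:

```python
from dataclasses import dataclass

@dataclass
class PerformanceGap:
    metric: str     # e.g., "first-call resolution rate"
    current: float  # performance being delivered now
    target: float   # performance the organization needs

    @property
    def gap(self) -> float:
        # The core difference between current and desired performance.
        return self.target - self.current

# Hypothetical numbers: a support team resolving 72% of calls on
# first contact, against a target of 85%.
fcr = PerformanceGap("first-call resolution rate", current=0.72, target=0.85)
print(f"Gap in {fcr.metric}: {fcr.gap:.0%}")  # Gap: 13%
```

If no such gap can be articulated, that's a signal the request may not reflect a real need.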
From there, the ideal next step is to identify the root cause of the performance gap. In a root cause analysis, we look to identify why the problem is occurring. There are many reasons performance can lag, including the wrong rewards, the wrong information, insufficient resources, and a lack of skill. Not all of these can be addressed with training or job aids, so identifying the source of the problem is necessary.
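As a rough illustration, you can think of root cause analysis as routing each cause to a different class of intervention. This mapping is a hypothetical sketch built from the causes named above, not an established taxonomy:

```python
# Hypothetical mapping from root cause to intervention type.
INTERVENTIONS = {
    "wrong rewards":          "revise the incentive structure (not a training problem)",
    "wrong information":      "fix the information source or communication channel",
    "insufficient resources": "provide tools, staffing, or time",
    "lack of skill":          "training, with practice",
}

def recommend(root_cause: str) -> str:
    # Default to more investigation rather than defaulting to a course.
    return INTERVENTIONS.get(root_cause, "investigate further before designing anything")

print(recommend("wrong rewards"))  # not solvable by a course
print(recommend("lack of skill"))  # here a learning experience is warranted
```

Note that only one of the four causes points to training at all.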
Once the need for a course or job aid is established, so that the source of the gap can be addressed, you move on to designing the solution. To be clear, in many situations we may need to have an impact on factors other than business performance. There may be times we need to change behavior without measuring whether that behavior affects the business, or we just want to know learners can do new things at the end of the training. Those aren't ideal, but they're real, and we should similarly be designing for impact on those goals, regardless.
PXD
Once we've identified the source of the problem, we can design an intervention. Given that the solution isn't always a learning experience, in theory we could talk about performance experience design (PXD), encompassing both learning experiences and performance support. In practice, of course, we tend to do one or the other, or a combination. It could be a situation calling for a job aid, and there are design precepts for that. At other times, a learning experience is called for.
Regardless, we want to apply the best design principles. Here we draw upon principles to create a first best guess. What isn't always recognized is that principles give us a good basis, but they're unlikely to exactly match any particular situation. Extra tweaks are almost always necessary to appropriately contextualize the solution to the situation at hand.
Thus, we should also design iteratively. For a job aid, we should first identify the necessary elements and test them before finalizing the design. For learning experience design, we should similarly prototype the core practice experiences (à la a variety of iterative design models) before adding in the necessary models and examples, as well as the ‘emotional’ hooks. Note that the practices themselves should be engaging, and we should be testing and tuning them along with the overall experience.
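In the abstract, that iteration is just a prototype-test-refine loop. Here's a minimal sketch; all the callables are placeholders for whatever your pilot and revision process actually looks like:

```python
# Generic prototype-test-refine loop; run_pilot, refine, and acceptable
# are hypothetical stand-ins for your actual pilot and revision activities.
def iterate_design(prototype, run_pilot, refine, acceptable, max_cycles=3):
    for _ in range(max_cycles):
        feedback = run_pilot(prototype)   # e.g., trial the core practice
        if acceptable(feedback):          # e.g., engaging and effective
            return prototype
        prototype = refine(prototype, feedback)
    return prototype  # best effort within the budgeted cycles
```

The `max_cycles` cap reflects the practical reality discussed below: you rarely get unlimited rounds of revision.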
Just as with analysis, the design may also be focused on a less-than-complete solution. While ideally we'd have a ratio closer to 80/20 for practice versus content, in many cases budget expectations reverse that ratio. We also may not be able to develop the practice we'd ideally want. However, we can, and should, create at least mini-scenarios, which are just better-written multiple-choice questions. While branching scenarios or full simulation games might be desirable, there are times when we trade off principle for the practical.
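To see what that ratio means in seat time, here's a quick worked example with a hypothetical 60-minute course:

```python
def allocate_minutes(total_minutes: int, practice_share: float) -> dict:
    # Split seat time between practice and content presentation.
    practice = round(total_minutes * practice_share)
    return {"practice": practice, "content": total_minutes - practice}

print(allocate_minutes(60, 0.80))  # ideal:    {'practice': 48, 'content': 12}
print(allocate_minutes(60, 0.20))  # reversed: {'practice': 12, 'content': 48}
```

The reversed ratio leaves learners only a quarter as much practice time, which is why even mini-scenarios matter.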
Testing and refinement
The rationale for iteration is predicated on a recognition that there should be testing and tuning. Just as with analysis and design, testing also has a range of practicality. Ideally, we'd conduct an A/B test with a representative audience. However, there are times when this isn't practical. Pilots are valuable smaller steps, where we trial parts of the solution with folks close to the representative audience. Expert, stakeholder, and/or audience reviews, short of actual testing, are shortcuts that have value even if they're not as fully informative.
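For illustration, here's what the comparison behind an A/B test looks like with made-up numbers, using a standard two-proportion z-test (a general statistical method, nothing specific to learning analytics):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a: int, n_a: int, successes_b: int, n_b: int):
    # Two-sided two-proportion z-test on pass rates for two versions.
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical pilot: 100 learners per version, pass rates of 62% vs. 75%.
z, p = two_proportion_z(62, 100, 75, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 suggests a real difference
```

Even a rough comparison like this beats declaring success on completion rates alone.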
Similarly, while we should budget time for several rounds of testing and refinement, we might have to settle for one cycle of revision. That can be for practical reasons, such as project timelines and budget, or for principled ones, such as when the problem is well understood and common.
However, the ultimate goal of impact suggests moving upward along all these dimensions. In the long term, the ability of learning & development to truly assist organizations will come from identifying and properly addressing real needs based upon evidence, not assumption. That's a strategic goal worth pursuing.
In conclusion, designing for learning impact requires a systematic approach rooted in understanding organizational needs, conducting a thorough analysis, and applying principles of learning science to interventions. By identifying performance gaps and addressing root causes, we can create effective solutions, whether they involve learning experiences or performance support tools. Iterative design and testing are crucial for refining interventions and maximizing their impact. While practical constraints may limit the extent of testing and refinement, the ultimate goal remains the same: to drive meaningful change and improvement in organizational outcomes.

For a deeper understanding of how to design for learning impact, I encourage you to download our eBook, Designing for Learning Impact: Strategies and Implementation. This resource provides comprehensive insights and practical strategies to help you create impactful learning interventions.