
Performance-Focused Job Aid Design

[Graphic: the performance-focused job aid design process, highlighting designing and testing job aids before training, addressing cognitive gaps, and iterating for improvement.]

In our previous blog, Performance-Focused Analysis, we explored the process of addressing performance gaps through a comprehensive approach, including performance-focused analysis, gap analysis, root cause analysis, and targeted interventions. We emphasized the importance of understanding these gaps to optimize learning and development strategies for achieving tangible results.

Building on that foundation, let’s examine the next essential phase: when the solution is performance support, whether alone or combined with training, the job aid should be designed first. Any training design that incorporates the job aid should be developed only after the job aid has been tested and works as intended.

The performance perspective says that we should make sure folks have the resources to succeed. That includes our cognitive resources, and job aids provide this support. Job aids tap into ‘distributed cognition’, whereby our thinking isn’t just in our heads but is distributed across the world. As Edwin Hutchins documented in his quite-academic book Cognition in the Wild, we can distribute our thinking across tools and people. This provides multiple benefits, including complementary strengths. While our minds are good at pattern-matching and meaning-making, we’re bad at remembering rote and arbitrary information, such as transport schedules, and we can’t reliably execute the same steps the same way every time. Technology has the opposite profile, excelling at rote storage and repeatable execution, and a performance perspective not only respects that but leverages it.
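To make distributed cognition concrete, here’s a minimal sketch in Python of offloading rote memory to a tool. The shuttle schedule data and names are invented for illustration; the point is simply that the arbitrary information lives in the world, not in anyone’s head.

```python
# A job aid as distributed cognition: rather than asking performers to
# memorize arbitrary information (like a transport schedule), the tool
# stores it and retrieves it on demand. All data here is made up.

DEPARTURES = {
    ("airport_shuttle", "terminal_a"): ["06:15", "07:45", "09:15"],
    ("airport_shuttle", "terminal_b"): ["06:30", "08:00", "09:30"],
}

def next_departures(route: str, stop: str) -> list[str]:
    """Look up departures so nobody has to remember them."""
    return DEPARTURES.get((route, stop), [])

print(next_departures("airport_shuttle", "terminal_a"))
# ['06:15', '07:45', '09:15']
```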

Most job aid design isn’t documented by academic research, but rather by the insight of talented performance consultants. Founders of the Performance Improvement movement, such as Geary Rummler, Joe Harless, and others, have posited the importance of such efforts. Less guidance exists about exactly how one does this design. Allison Rossett with Lisa Schafer, and Conrad Gottfredson with Bob Mosher, are two pairs of authors who have focused on aspects of this approach.

Cognitive Gap

Rossett & Schafer posit two different types of aids, planners and sidekicks. The distinction is between ones you use in the moment – sidekicks – and ones that are used before or after an event: planners. Both author pairs emphasize the process, focusing on being clear about the need.

Rossett & Schafer suggest three overarching categories of support: executing steps, making decisions, and accessing information. They then acknowledge how technology has complexified these categories. Gottfredson & Mosher go further, talking a bit about the core cognitive challenge. This is, we think, a useful approach. As previously mentioned, our cognitive architecture has limitations. We struggle with a variety of barriers, including limits to attention and working memory, struggling to remember steps or whether they’ve been performed, accurately capturing the full sweep of sensory data, accurate recall of arbitrary or large volumes of information, and more. Our solutions should reflect the particular source of the problem.

Appropriate Solution

Common solutions have emerged over time. We’ve used auditory and visual recording to capture what our senses can’t accommodate. We’ve used calculators to reduce overhead in solving data problems. Checklists have been touted by no less than Atul Gawande as a tested solution for checking steps. Step-by-step guides have uses from assembling furniture to dealing with flight emergencies. And so on.
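For instance, the logic of a checklist, confirming each step and flagging anything skipped, is simple enough to sketch in a few lines of Python. The steps below are placeholders; real ones would come from task analysis and testing:

```python
# A minimal interactive checklist in the spirit of Gawande's work: it
# confirms each step was performed and flags anything skipped.
# The steps are placeholders, not a validated checklist.

STEPS = [
    "Confirm patient identity",
    "Verify procedure and site",
    "Check equipment availability",
]

def run_checklist(steps: list[str]) -> list[str]:
    """Walk through each step; return any that weren't confirmed."""
    missed = []
    for step in steps:
        answer = input(f"{step} - done? [y/n] ").strip().lower()
        if answer != "y":
            missed.append(step)
    return missed

if __name__ == "__main__":
    skipped = run_checklist(STEPS)
    if skipped:
        print("STOP - unconfirmed steps:", ", ".join(skipped))
    else:
        print("All steps confirmed.")
```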

First, don’t reinvent the wheel unnecessarily. Look to existing examples for possible solutions. Unless this is a totally new situation, which is unlikely, someone’s produced a solution you can plagiarize (as far as your lawyers will let you). One source may be the tools the performers have created themselves. They may require polish, but they might be a good foundation. Performer-created tools can also indicate a situation that could be redesigned to incorporate the tool, or to eliminate the need for one.

Creating usable tools requires paying attention to the details. In addition to the writing, which should be structured and simple, the visual design needs to be considered. Not all job aids are visual; if it’s audio support (think of your GPS), you need to attend to vocal clarity and signal perception. With digital support, tools can also be interactive, so the design of the interaction matters as well.

Iteration & Evaluation

One thing that both sets of authors, and good practice, dictate is that your first solution is unlikely to be ideal. Atul Gawande documented how much testing and refinement his checklist required before the final version achieved its notable results. Particularly given the lack of prescriptive categories and design templates, and the consequent emphasis on process, testing and tuning are important components.

Here, as with learning design, we should have metrics we’re iterating toward. The tool ultimately should improve performance, whether by making things faster or more accurate, or whatever improvement you’ve determined. (A noted analysis of a proposed new phone system solution ultimately indicated that it would take longer than the existing approach!) The performance metrics identified in your objectives should be evaluated here as the basis for improving the design.
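The evaluation itself can be as simple as comparing the metric with and without the aid. A sketch, with fabricated numbers standing in for observed task data:

```python
# Comparing performance with and without a job aid. The numbers are
# fabricated; in practice they'd come from observing representative tasks.

baseline = {"seconds_per_task": 210.0, "error_rate": 0.12}
with_aid = {"seconds_per_task": 150.0, "error_rate": 0.05}

time_saved = (baseline["seconds_per_task"] - with_aid["seconds_per_task"]) / baseline["seconds_per_task"]
fewer_errors = (baseline["error_rate"] - with_aid["error_rate"]) / baseline["error_rate"]

print(f"Time saved per task: {time_saved:.0%}")   # 29%
print(f"Errors reduced by:   {fewer_errors:.0%}")  # 58%
```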

Methods from usability testing are useful here, including protocol analysis: having folks verbalize while they perform, to surface any cognitive barriers. Choosing sample tasks that represent the necessary performance and analyzing the steps taken can also be useful. Jakob Nielsen found that only a few users are necessary to find most problems, and iterating between expert review and user testing has also been shown to be effective.
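Nielsen’s claim comes from a model he developed with Tom Landauer: the proportion of usability problems found by n testers is roughly 1 - (1 - L)^n, where L is the chance a single tester surfaces a given problem (about 0.31 in their data). A few lines show why five users go a long way:

```python
# Nielsen & Landauer's problem-discovery model: share of usability
# problems found by n testers is 1 - (1 - L)^n, with L ~ 0.31.

L = 0.31  # per-user discovery rate from their published studies

for n in range(1, 8):
    found = 1 - (1 - L) ** n
    print(f"{n} users: ~{found:.0%} of problems found")
# e.g. 5 users: ~84% of problems found
```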

Even with the testing (which should be done with learning solutions as well, as we’ll see), job aids typically are a less costly solution than training. Done properly, with appropriate roll-out and messaging, they put the information to succeed in the world, rather than in the head. They make sense where our cognitive gaps are likely to create issues. They should be your preferred solution, when possible.

In summary, designing effective job aids is a crucial step in enhancing performance and supporting our cognitive capabilities. By leveraging existing tools, focusing on clear and simple design, and iterating based on feedback, we can create resources that significantly improve efficiency and accuracy. The insights from leading performance consultants highlight the importance of diagnosing cognitive challenges and choosing appropriate solutions to address them. For a comprehensive exploration of these concepts, and to delve deeper into strategies for optimizing performance, download our eBook, Rethinking Learning: Focus on Performance. It provides an in-depth look at performance support and training, offering practical guidance to help you succeed in your performance improvement initiatives.
