MastersinCommunications.com spoke with professionals in the field of technical communication regarding various projects they were involved in, as a way of introducing the required tasks and skills to those exploring this growing field. The development of every communication product or project should include some combination of the following steps.
Identify Goals, Audience, and Scope
The organizers of an international mountain bike race were looking for a way to increase their visibility and number of riders. They hired a technical writer and designer to create a “race bible,” a guide addressing essential logistical, practical, and procedural information for the event. Once the document had been completed, they hired a usability tester and asked him to ensure the race bible served its purpose: helping riders become familiar with the 100-mile course and increasing successful completion of the race.
This case study takes an in-depth look at both the process of usability testing and the creation of a technical communication product, the mountain bike race bible. For this project, the organization hired two separate people to complete these tasks. The technical writer oversaw the design and content of the race bible, while the usability tester helped with strategy, assessment, and revision. If the technical writer had had experience as a usability tester, it is possible he could have completed both phases of the project on his own.
Strategy, Planning, and Research
After receiving the assignment from the organizers, the first step for the usability tester was to meet with the technical writer/designer to discuss the strategy behind the content and design he created for the document. Together, they developed a two-year plan for testing, editing, and release of the final product:
- Year 1/Pre-Race: A small group of local riders, already familiar with the course, would be invited to use a prototype of the race bible to go through the course and give feedback on their experience. The document would then be revised, if necessary.
- Year 1/Race: Registered racers would be randomly placed in two groups – one to receive the bible and the other to race without that resource as in previous years.
- Year 1/Post-Race: Riders from both groups would be invited to participate in an online survey to give feedback on their race experience.
After analysis and editing, the final race bible would be released to all riders who registered for the Year 2 race.
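The Year 1 race comparison amounts to a simple controlled test: did the group that received the race bible finish at a higher rate than the group racing without it? As a minimal sketch of how that analysis might look (the rider counts and completion numbers below are purely hypothetical, not figures from the case study), a two-proportion z-test compares completion rates between the two groups:

```python
import math

# Hypothetical Year 1 results (illustrative numbers only, not from the case study):
# riders who finished the course out of those registered in each group.
with_bible = {"finished": 84, "riders": 100}     # group that received the race bible
without_bible = {"finished": 70, "riders": 100}  # control group, raced without it

def completion_rate(group):
    """Fraction of registered riders who completed the race."""
    return group["finished"] / group["riders"]

def two_proportion_z(a, b):
    """Two-proportion z-statistic for the difference in completion rates."""
    p1, n1 = completion_rate(a), a["riders"]
    p2, n2 = completion_rate(b), b["riders"]
    pooled = (a["finished"] + b["finished"]) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

print(f"With bible:    {completion_rate(with_bible):.0%}")
print(f"Without bible: {completion_rate(without_bible):.0%}")
print(f"z-statistic:   {two_proportion_z(with_bible, without_bible):.2f}")
```

A z-statistic near or above 2 would suggest the difference in completion rates is unlikely to be chance alone, giving the organizers evidence that the race bible helped riders complete the course.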
The technical writer had written the original document based on discussions with mountain bike racers, guides from races around the country, and the information provided to participants in the Tour de France.
Once the initial draft of the guide was complete, the Year 1 pre-race participants were invited to the first phase of testing. They were all local riders who were familiar with the course. Some had participated in the race in previous years, while others were hobby riders, not racers. Once they had gone through the course with the race bible, they were interviewed as a group so that their responses could build on one another. A few people were unable to attend and were interviewed individually.
Based on the feedback received from the Year 1 pre-race participants, the race bible was reviewed by the technical writer and usability tester, edited, and produced in hard copy for the group of registered racers selected to test the guide during the race.
The Year 1 post-race surveys were developed based on textual analysis of the prototype race bible, as well as on racing guides from other organizations around the U.S. and from the Tour de France. The Year 2 post-race survey was based on the information and feedback that emerged from the Year 1 post-race survey.
Test, Review, Revise
The usability tester identified three primary categories of content that determined the effectiveness of the race bible: logistical information, race-specific information, and non-essential/supplementary information.
- Logistical information included directions to the race site, parking, race staging, and location of aid stations along the route.
- Race-specific information included a course elevation profile, terrain maps, and an overview and narrative description of each phase of the course.
- Non-essential information included tips on housing and dining in the area, a list of previous race winners, and race sponsors.
The survey questions developed around those categories helped the usability tester and technical writer revise the Year 1 prototype into the Year 2 race bible.
Production and Evaluation of Product Effectiveness
The final version of the race bible was printed and mailed to each registered rider in a format that was easy to access during the race. A PDF was also emailed to registrants in case they preferred an electronic version. While a live, online version had been considered, it was ruled out because most of the race took place in backcountry mountainous terrain with limited cellular access.
After each annual race, registrants received an email inviting them to participate in a survey on how the race bible could be improved. This practice was discontinued after the first three years, as feedback indicated the race bible was serving its purpose.
Usability testing comes in a variety of formats, from highly technical labs to casual conversations with participants. However, there are always three general steps: developing a testing plan, recruiting participants, and analyzing and reporting the findings. Whether the testing is for a race bible, a new software product, or a social media campaign, it allows the tester to identify and fix problems before the product is released. You can find more about usability testing at usability.gov.
Additional Technical Communication Case Studies:
This case study explores the creation of onboarding materials for new employees hired by a growing non-profit organization. It discusses the scope and goals of the project, along with the skills needed to complete the training program.
In this case study, a technical writer is tasked with building an online tool to help employees use a new software system. This project involved interviewing staff members, designing the tool’s interface, and writing step-by-step instructions.