Incorporating usability testing throughout the entire design process, especially before launch, allows you to catch glitches and make design changes before anyone sees the site live. When more than minor adjustments need to be made to your site, it’s much better to have completed them before the public sees it.
For Christina Mullins, a Contracting Officer at the Public Buildings Service in the General Services Administration (GSA)’s Region 3, based in Philadelphia, usability testing was a new frontier, and one that quickly proved valuable.
“For me, it revealed a lot of information that I would never have even thought of,” she said. “As a member of the team so involved in the content, we thought ‘of course you would put this here’, but as you talk with people who were independent, you see people think differently.”
Mullins recently led a project to create an internal agency website called the “GSA Acquisition Portal,” designed to be a “one-stop shop” resource site for GSA’s acquisition workforce, and was asked by a colleague to investigate usability testing.
She connected with Jonathan Rubin, Program Manager for the DigitalGov User Experience Program, and with a little guidance, was able to design a simple but effective test based on completing 12 tasks the project team felt people would want to accomplish with the site.
The website project team, consisting of members from various backgrounds and offices across the agency, reviewed, prioritized, and finalized the test script. Volunteer testers were recruited from a wide range of positions to represent the breadth of GSA employees who would use the site. The tests were performed on semi-functional pages in a staging environment, completely virtually.
“I’m in Philadelphia and the team members are dispersed around the country,” she said. “For each test participant it took no more than an hour, and it was done completely remotely with [screen-sharing software].”
The results were eye-opening.
From information architecture and appearance of “buttons” to the order of lists and assumed meanings of words, the testers highlighted many issues they found confusing. As a result, Mullins and her team were able to make crucial adjustments before launch.
Take a look at some of the “before and after” images below:
Test participants had difficulty finding the FY 2010 Procurement Management Review (PMR) Report, because they did not consider it a “best practice.” Once participants navigated to the “Best Practices” supporting sub-page, they again stated that the links there were not really “best practices.” “Reports & Publications” is a better category description.
During the test, many people scrolled right past the hot buttons, sometimes not noticing them. The design was changed to look more like “buttons” users would think to click on.
Some of the feedback suggested expanding the Quick Links section to cover a wider set of activities, and several new links were identified and added. Test participants also commented that the order of the links was confusing. The order was initially set to follow the acquisition process from solicitation to contract award, but testers found alphabetical order more intuitive.
Mullins also noted the testing served as a motivator for her team.
“We had a lot of people… provide positive feedback, which was motivation for the team in general,” she added. “After working on the project for four or five months, you start to lose momentum and [the testing] was great to keep us going and assure that we were moving in the right direction.”
For those looking to perform usability testing on their agency projects, Mullins said it’s important to “get buy-in” from others on the team and in management. From there, she offers the following advice:
“Go read a case study and drink the Kool-Aid, then the next thing to focus on is writing a good test plan, and make sure you build enough time into your schedule to make it happen.”
For more information on usability testing, visit the DigitalGov User Experience Program page or join the DigitalGov User Experience Community of Practice.