This past year, I led an effort to redesign the staff Intranet site for the National Center for Complementary and Integrative Health (NCCIH), at the National Institutes of Health (NIH). After months of surveying, planning, and testing (see the part 1 blog post, How Do You Redesign a ‘Dinosaur’? Redesigning an Intranet Site: the Beginning Stages), the site was launched in Fall 2017. I learned several helpful lessons along the way that I wanted to share:
After spending 22 years in the U.S. Army, including 3 years as a recruiter, Julie Jackson realized that not only was she qualified to work in usability, but she also had a knack for it—especially because of her ability to strike up a conversation with nearly anyone, anywhere. Julie shares how her training in the Army has helped in her approach to usability testing, and gives a peek inside how usability testing works for USAJOBS.
At the beginning of 2017, the ITIF (Information Technology and Innovation Foundation) released a report that benchmarked 300 federal websites in four areas: page-load speed, mobile friendliness, security, and accessibility. Some sites fared better than others, but the report highlighted that our federal sites (DigitalGov included) have a ways to go in these areas. Looking at these four metrics is important as they directly impact our customers’ first perceptions of the quality of our government’s digital services.
The Information Technology & Innovation Foundation (ITIF) recently published a report, Benchmarking U.S. Government Websites, that looks at the performance, security, and accessibility of the top 297 government websites. ITIF is a think tank in Washington, D.C. whose mission is to formulate, evaluate, and promote policy solutions that accelerate innovation in technology and public policy. Over the past 90 days, government websites were visited over 2.55 billion times. According to the Analytics Dashboard, 43.
The U.S. Web Design Standards are a library of design guidelines and code to help government developers quickly create trustworthy, accessible, and consistent digital government services. Last month, we announced the 1.0 release of the Standards, a milestone that signals the Standards are a stable, trustworthy resource for government designers and developers. By using the well-tested and easy-to-implement code from the Standards, developers can quickly create new websites or have a leg-up in updating existing services to have a modern, consistent feel.
USA.gov’s Analytics Success: using analytics data to inform design and responsiveness to create a better experience for the user Last year, the USA.gov team found themselves facing a challenge. We were in need of a new content management system for our websites, USA.gov and Gobierno.USA.gov, which help people find and understand the most frequently requested government information. We wanted to align the content on those websites with content in the knowledge base used by our contact center; up until this point, the information in those two places had been similar but not identical.
Many content managers in the digital world understand the irrepressible desire to improve, fix, edit, add, and move things around. It’s our job, after all, to nurture the ongoing process of creating, updating, and testing. But, there are those sites or pages that never seem to make it to the high-priority list. For our Web team, this was our Center’s staff Intranet site. Our Web team recognized that the Intranet was in need of attention.
What is mobile-friendly? Mobile-friendly simply means your visitors can use phones and tablets to visit your website and have a user-friendly experience. Many of us get toward the end of mobile site development and really do not know if what we created is “mobile-friendly.” We think we have followed all of the mobile best practices and performed usability testing. However, do we have something concrete to quantitatively certify that we are mobile-friendly?
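The question of a concrete, quantitative check is a fair one. As a minimal sketch (not a certification), an automated screen can look for basic signals of mobile-friendliness in a page’s markup; the function name and the two signals chosen here are illustrative assumptions, and real audits also measure tap-target size, font size, and rendered content width:

```python
import re

def mobile_friendly_signals(html: str) -> dict:
    """Screen an HTML page for two basic mobile-friendliness signals.

    This is a rough first pass only, not proof of mobile-friendliness.
    """
    # Signal 1: a responsive viewport meta tag is declared.
    has_viewport = bool(re.search(
        r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE))
    # Signal 2: a hard-coded pixel width on the body, a common marker
    # of a fixed-width, desktop-only layout.
    fixed_width = bool(re.search(
        r'body\s*{[^}]*width\s*:\s*\d+px', html, re.IGNORECASE))
    return {"viewport_meta": has_viewport, "fixed_width_layout": fixed_width}

page = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
print(mobile_friendly_signals(page))
```

A check like this can run in a build pipeline, but it should feed into, not replace, usability testing on real devices.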
Too often, our clients (stakeholders) confuse usability and accessibility with each other. They shouldn’t, because while the two are related, they are very different. So, how do you bring these two concepts together? They should really be working side-by-side throughout the ENTIRE process. This might seem like a no-brainer but it can be a challenge. First things first, Section 508 of the U.S. Rehabilitation Act is a LAW.
I recently wrapped up a series of user interviews as part of a review of our judiciary-wide intranet in order to provide better digital services to our customers (and yes, our internal users are our customers, not just the general public). As I prepare to delve back into determining user and content needs for a more varied audience and wider platform, I thought it might be helpful to share lessons learned during my recent effort and any new strategies that might be helpful for anyone getting ready to jump into their users’ brains.
Your audience is not homogenous. No matter the agency, target audiences are not only diverse, they are diverse on a multitude of factors. Recently, evolving trends in multicultural marketing have gained attention as organizations adjust their marketing and outreach strategies to meet 21st century realities. Marketers who recognize the need for a coherent, effective multicultural strategy have turned to the Total Market Approach (TMA). A coalition of marketing agencies, clients and associations led by AHAA: The Voice of Hispanic Marketing released an industry-sanctioned definition of TMA in September.
Much is being said and written about the coming Mobilegeddon/Mopocalypse on April 21st—the day Google’s ranking algorithm will begin boosting results for mobile-friendly sites and penalizing mobile-unfriendly sites. While some agency websites are mobile-friendly, a great many are not. We would do well to pay attention—almost 25% of traffic on government websites comes from mobile devices. And if responding to the UX needs of 25% of site visitors is not argument enough, perhaps the Google algorithm update will convince agencies that it’s time to upgrade.
It’s a foregone conclusion that usability studies are effective in identifying weak points within a website, but what about testing people who are visually impaired? How hard is it to accommodate them? There are some additional challenges that you may encounter when conducting testing with people with disabilities; however, these challenges should not be considered overwhelming. I spoke with Peter McNally, a Senior Usability Consultant at the User Experience Center at Bentley University, to get his take on usability testing with users who have visual impairments.
At the U.S. Agency for International Development (USAID), our new open data policy will begin making more Agency-funded data broadly accessible to the public. It completely changes the way we do business, and it also means that in the coming years, the amount of data we host on our open data website (known as the Development Data Library) will dramatically increase. So the question is: when we’re done overhauling our website, how will the user make sense of all that information to find exactly what they’re looking for?
To improve your digital systems with user experience (UX), you need people. And to get people in government, you need position descriptions. While DigitalGov has collected a wide variety of position descriptions, I thought I would create a post specifically on UX positions, and explain the difference between these jobs. Yes, there is overlap. But this is still an excellent place to get started. I am indebted to the helpful heroes at USAJOBS for scouring through their vast job database to find these examples.
Anything built should be built right. It doesn’t matter if it’s built of wood, carbon nanotubes or code. So it’s encouraging that the practice of User-Centered Design—getting customer feedback at every stage of a project—is catching on with APIs as well. When we think APIs, we mostly think of developers and not designers. But the experience of those who want to use your APIs isn’t just dependent on the strength and elegance of your API.
How do you define user experience (UX)? That was the question posed to more than 100 people at the GoodGovUX event at the Artisphere in Arlington, Virginia, on February 24th. Attendees learned how government can improve the user experience of digital products, from intranets to forms to good ol’ fashioned websites. GoodGovUX co-founder Keith Deaven collected responses from the crowd, which was a diverse mix of people working in private industry, federal, and local governments.
Audit. It’s a word that generally has no positive connotations whatsoever. We hear the word audit and we think of tax audits or timesheet audits, etc. The word normally strikes fear or dread in the hearts of most mortals. But it is also a task that all websites will need to perform from time to time, and hopefully after reading today’s column you can view content audits as positive opportunities and not as dreadful chores.
User Experience (UX) is the comprehensive experience a person has when using a product or application, and usability is the ease of use (or lack thereof) when using it. Many of us have discovered the vast advantages of evaluating usability on our own; however, getting others to jump on board is often a different story. The most difficult part of integrating an effective UX program in your organization is getting the initial buy-in from developers and stakeholders.
Users don’t like surprises. Unexpected or unwanted content undermines the credibility of your agency and frustrates users who come to your website looking for specific information. Using links appropriately in your website content is one way to build trust with users, according to an article by Kara Pernice of the Nielsen Norman Group. Here’s a real life example: If the link above led to an article about 3D printing, you’d probably be pretty annoyed right now.
The word accessibility breeds misconceptions. Why? Because accessibility is something that scares you. Accessibility is hard. Accessibility needs people with specialized expertise. Accessibility problems often depend on the context of the website or Web application in question. Accessibility takes time. Accessibility is a legal mandate. Accessibility is a moral obligation. These statements are both true and misconceptions. The misconceptions happen when you try to solve accessibility problems with just accessibility solutions.
Being able to design a website that users love is not too far away from being able to read their minds. While designers can’t read minds, that doesn’t stop them from using their website’s top tasks to make it seem like they can. A website’s top tasks include 5-10 tasks (depending on the scope of the site) that the majority of the website’s users want or need to do on the site.
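The teaser above describes top tasks without showing how they are found. As a minimal sketch, assuming a top-tasks survey where each user votes for the tasks that matter most to them (the function name and sample tasks are hypothetical), identifying top tasks is essentially a tally of those votes:

```python
from collections import Counter

def top_tasks(votes, n=5):
    """Tally task-preference votes and return the n most requested tasks.

    In a top-tasks survey, users pick their most important tasks from a
    long candidate list; the handful of tasks that attract the bulk of
    the votes become the site's top tasks.
    """
    tally = Counter(votes)
    return [task for task, _ in tally.most_common(n)]

votes = ["renew passport", "check status", "renew passport", "find forms",
         "check status", "renew passport", "contact us"]
print(top_tasks(votes, n=2))
```

In practice the long tail matters too: tasks that attract almost no votes are candidates for de-emphasis, which is the other half of a top-tasks redesign.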
In Design Secrets of the World’s Best e-Government Web Sites, the Asia-Pacific online communications powerhouse FutureGov singles out eight national e-government portals as the best-designed in the world, and identifies the best practices these sites exemplify. “Ultimately, these websites are the best in the world because they are designed to be practical, simple, quick and adaptable,” writes Joshua Chambers, editor of FutureGov. “One core principle stands out above all others: a well-designed government website must make it as easy as possible for citizens to find the information and services that they need.
In one sense, almost any type of user research is crowdsourced—you’re talking to people and using that information to improve your system. But in a true sense, crowdsourcing is more than just collecting information; it’s collaborating on it. We want to have real conversations, not one-time emailed suggestions without followups. So here are a few tidbits on crowdsourcing User Experience (UX) for your site, mobile app, API or whatever else you’ve got cooking:
This past year DigitalGov University has hosted at least one usability event per month, and we thought we’d give you a round-up of those events. After all, November 13th was World Usability Day. Since this year’s World Usability Day theme is Engagement, it would be great to take a look at the event recap article, Improving the User Experience with Usability.gov. The folks at Usability.gov took a user-centered approach to refresh their site and make the design more engaging.
The cream of the crop of the top of the mountain of ALL of the surveys I run has to be the Federal User Experience (UX) Survey. It’s the second time I’ve had the privilege of running it with Jean Fox, research psychologist extraordinaire from the Bureau of Labor Statistics. When I start thinking about learning what all of my UX colleagues are doing, and designing solutions for them based on real data, I start clasping my fingers together like Mr.
Bob goes to a popular federal government site, using his assistive technology, and starts reading a teaser for an article. Just below the teaser, there’s an embedded video on the page. He presses the tab key, trying to navigate to a link for the full article, but suddenly he’s trapped—he can’t tab past the video. He’s stuck, and he can’t access the content. Frustrated, Bob leaves the site.
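Bob’s problem is what WCAG calls a keyboard trap: focus enters a component and Tab never moves it out. As a toy model (the page elements and the traversal function here are invented for illustration; real testing uses a tab-through on an actual DOM or an automated accessibility checker), the trap can be detected by simulating Tab presses over the page’s focus order:

```python
def find_keyboard_trap(focus_order, max_tabs=None):
    """Simulate pressing Tab through a page's focusable elements.

    focus_order maps each element id to the id focus moves to on Tab.
    A well-behaved page cycles through every element; a trap is a cycle
    that excludes some elements, like the embedded video that swallows
    Bob's focus. Returns the set of element ids that can never receive
    focus starting from the first element.
    """
    elements = list(focus_order)
    start = elements[0]
    reached = {start}
    current = start
    for _ in range(max_tabs or len(elements) * 2):
        current = focus_order[current]
        reached.add(current)
    return set(elements) - reached

# The video player receives focus, but Tab never leaves it:
page = {"teaser": "video", "video": "video",
        "read_more": "footer", "footer": "teaser"}
print(sorted(find_keyboard_trap(page)))
```

Everything after the video is unreachable by keyboard—exactly why Bob never gets to the full article.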
Whether they pop up while perusing an e-commerce site or land in your inbox after your bumpy flight in from Chicago, surveys are used in many different industries to gauge customer satisfaction and glean insight into user motivations. They are a useful tool in the kit of a user experience designer or anyone who is involved with improving the usability of a product. Surveys seem deceptively easy to create, but the reality is that there is an entire industry and an academic field based on survey design.
Editor’s note: Building off the great discussion started around Customer Experience, we’re looking at the difference between User Acceptance Testing and Usability Testing. If you develop software, you’ve probably heard of User Acceptance Testing. You may also have heard the term Usability Testing. Same thing, right? Nope. And confusion here can cause big problems. Last year I was developing a mobile game for Android—think Whack-A-Mole meets mutant veggies. Eight months into the project we decided to do some user acceptance testing to find some bugs before launch.
After an agency-wide redesign of program websites that targeted the public and prioritized a common “look and feel,” the federal Office of Child Support Enforcement at the Administration for Children and Families had a visually appealing website. The problem: Key stakeholders—state and tribal child support agencies, employers, and other partners who deliver program services and access the site daily—complained they could no longer easily find needed information. Their feedback prompted us to facilitate a UX-minded focus group to recommend improvements that met both users’ business needs and the redesign goals.
Most people associate the term “heat map” with something they see during the weather forecast on the nightly news, those colorful maps that vividly illustrate how hot it’s going to be during an impending heat wave. The term “heat map” may not usually, however, conjure up images of a widely used Web usability tool; but for those who manage the Environmental Protection Agency’s (EPA) website, that is exactly what the phrase brings to mind.
When it comes to Web and software design, the pen(cil) is often mightier than the Design Suite. What I mean is: Tech is cool, but don’t fall under its spell. It’s often when you remove the technological layers between you and your thoughts that the best ideas sprout. You’ve heard of great ideas that started on bar napkins, right? One way that low-tech beats high-tech is when it comes to conceptualizing early-stage design ideas.
Imagine this: You just found a great online tool that can help you do your federal job 100% better. You’re all ready to download it and start conquering the world when someone asks, “Have you checked the Terms of Service?” You’re not sure what they’re talking about, what a Terms of Service is, or why you need one. Let’s answer this and more in our Terms of Service Flowchart (click the image to the right to download your own PDF copy of this chart for reference) and our Terms of Service FAQ:
For a small shop with a small staff, limited time, and a small budget, redesigning a website (and testing that redesign for usability) can be daunting. At least it seemed so to us when we redesigned the National Oceanic and Atmospheric Administration (NOAA)’s National Ocean Service website in November of 2013. We met the challenge by keeping things simple. One solution was to adopt the popular, open-source Twitter Bootstrap framework, which is very flexible and well documented.
User testing isn’t just for websites—it’s for any product that has an audience. Which is everything, really. And that includes print materials, signage and infographics as well. Focusing on the User Experience is especially vital for the U.S. Food and Drug Administration (FDA), which is committed to effectively communicating about products that affect the public on a daily basis. Brian Lappin works for the Risk Communication Staff at FDA. His team supports the agency in making sure that all types of communications—video, graphic and Web—are easily understood.
Most analytics tools can tell you how many times a link on your page is clicked, but a mere list of top links won’t help you draw conclusions about the page. A tool called a heatmap turns that data into a visualization, so you can more easily see how people are interacting with the design. With it, you can find out some really important stuff: whether the page design plays a part in clickthroughs, where on the page your users are moving, and what on your page might be worth featuring or not featuring.
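The core move a heatmap makes—turning a list of clicks into a picture—is spatial aggregation. A minimal sketch (the function and grid size are assumptions for illustration; commercial tools do this at far finer resolution, plus the color rendering) bins raw click coordinates into a grid that a visualization layer could then color by count:

```python
from collections import Counter

def click_heatmap(clicks, page_width, page_height, cols=4, rows=4):
    """Bin raw (x, y) click coordinates into a rows x cols grid.

    Each cell counts how many clicks landed in it; a renderer would
    then color cells from cool (few clicks) to hot (many clicks).
    """
    grid = Counter()
    for x, y in clicks:
        # Clamp to the last cell so clicks on the far edge still count.
        col = min(int(x / page_width * cols), cols - 1)
        row = min(int(y / page_height * rows), rows - 1)
        grid[(row, col)] += 1
    return grid

# Three clicks cluster near the top-left of a 1000x800 page:
clicks = [(120, 90), (140, 110), (180, 60), (900, 700)]
hot = click_heatmap(clicks, page_width=1000, page_height=800)
print(hot.most_common(1))  # the top-left cell dominates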
Over the last 18 months, the intrepid Mobile Gov team has worked with you to prioritize a set of guidelines and recommendations for good mobile user experience; categories are ranked by priority and tagged by user experience concepts such as information architecture, content, functionality, design, trustworthiness, and user context. The primary purpose of this set is to put the user’s main task up front. Thus, while you’re testing your mobile site’s individual functionalities, don’t forget to make sure that your users can reasonably complete their tasks.
One of the most important jobs for an organization is to think about the entire ecosystem of their brand and what the user experience is across each channel. Whether it is through accessing information on your site through various devices, calling a help line, engaging through social media, and/or having a face-to-face conversation, there may be any number of combinations for how people interact with your organization. And the expectation is that the tone, interactions, functions, and visual design will all be cohesive.
Usability and accessibility are slightly different lenses to assess user experience. It is possible to be strong in one area and weak in the other. Using either approach alone could result in an inaccurate view of your site’s user experience. Evaluating your website with both usability and accessibility in mind gives all users the best possible user experience. What is Usability? Usability relates to the how easy things are to use.
Plan and analyze. Write and design. Test and refine. As Web Manager for Usability.gov, I have found that taking a user-centered approach is vital each time you improve or build a digital product, especially when the content is about improving user experience. In our recent reboot of Usability.gov we put our own advice to the test by evaluating the existing site and analyzing the extensive feedback on the concepts for the redesign.
Digital metrics are critical for measuring, analyzing, and reporting on the effectiveness of your Web, mobile, social media, and other digital channels. Every agency should have a metrics strategy to measure performance, customer satisfaction, and engagement, and use the data to make continuous improvements to serve its customers. Part 1: Common Metrics: Guidance, Best Practices, and Tools Part 2: Reporting Requirements and Common Tools Part 3: Rationale and Framework for Common Metrics and Measures Part 4: Case Studies, Training, and Additional Resources Part 1: Common Metrics—Guidance, Best Practices, and Tools Agencies should ensure that they collect, analyze, and report on a minimum baseline set of performance and customer satisfaction measures.
When redesigning a site, it’s easy to place menu items, text and other content wherever you can make them fit. It’s harder to take a step back and ask the strategic question: Is this the best place for this? A good rule of thumb is to never make any changes randomly—base your decisions on user data. The DigitalGov User Experience Program team evaluated Business.USA.gov on June 1, 2012, and their usability recommendations were adopted by the Business.
If you want to make a website more efficient and user friendly, then it’s not enough just to have your most valuable information on the site. People are busy—they want to find what they’re looking for, and they want it fast. You don’t always need to redesign an entire site to make things easier to find. Sometimes, a few small changes can do the trick. The DigitalGov User Experience team looked at Army.
More and more people use search as their primary means of finding what they are looking for. When users get confused by the search results, or can’t immediately find what they are looking for, they’re going to get frustrated. They may even leave the site for good. The DigitalGov User Experience Program helped test Regulations.gov on October 5, 2012, to find three high–priority, fixable problems that could make the user experience much easier and more pleasant.
When users interact with a website to find information, it is important that we help them find their way by using plain language, clear terminology and visible help text. On December 7, 2012, the DigitalGov User Experience Program helped test the U.S. General Services Administration’s Contract Vehicle Navigator website. This Navigator site helps contracting officers find contracts that best meet their needs. Through usability testing, three key problems were identified.
Websites allow newer government programs to establish a visual identity that introduces them to users and conveys the importance of their work. On April 18, 2012, the DigitalGov User Experience Program helped test GSA’s Federal Risk and Authorization Management Program (FedRAMP) site, which at that point was less than six months old. Three immediate needs were identified. Problem 1: Purpose of Program Not Clear The homepage text was filled with jargon and acronyms, and provided no clear guidance for the user to understand why they should engage with FedRAMP.