🧪 Usability Testing

Overview

I led user testing efforts to improve SBA.gov and ensure the site supported a wide range of users, including those navigating the site with assistive tools.

  • My Role: UX Designer & Testing Lead
  • My Responsibilities: I designed and facilitated usability sessions with the National Federation of the Blind, wrote the test scripts, and coordinated participant logistics. I also led the redesign work that addressed the issues we identified.
  • Timeline: 6 months
  • The Team: 1 designer, 2 developers
  • Tools Used: Zoom, screen reader software, testing scripts

Framing the Challenge

Problem

We found that key content and form interactions on SBA.gov weren't intuitive for all users.

Pain point #1

Important password instructions were positioned after the input field, which many users missed.

Pain point #2

Image descriptions lacked useful context, making it difficult for some users to understand visual elements.

Pain point #3

Standard testing environments didn't reflect the tools or conditions users actually relied on in real life.

Hypothesis

By inviting users to test the site on their own devices, with their own tools, we could surface more meaningful insights and identify issues affecting a wide range of users, including those relying on assistive technology.

Goal

Improve the clarity and structure of content and form elements to ensure all users can complete tasks with ease and confidence.

Process & Approach

Key contributions

  • Facilitated usability testing with members of the BUILD (Blind Users Innovating and Leading Design) program from the National Federation of the Blind
  • Wrote test scripts, designed session formats, and handled all participant communications
  • Collaborated with developers and content strategists to apply findings and make meaningful changes to the site

Discovery

  • Partnered with the National Federation of the Blind to recruit participants who brought their tools and techniques for site navigation
  • Captured real-time challenges and feedback through recorded sessions and detailed observations
  • Focused on how users interacted with forms, read content, and interpreted visual elements

Design

  • Reordered password field instructions so they appear before the input, allowing all users to see and hear them at the right time
  • Rewrote image descriptions to include clearer, more informative context for users relying on non-visual cues
  • Established a user feedback loop by checking updates with participants and incorporating their suggestions into final designs
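The reordering described above follows a common accessible-forms pattern: render the instructions before the input and link them with `aria-describedby`. A minimal sketch of that pattern, assuming a hypothetical `passwordField` helper and ids (this is illustrative, not SBA.gov's actual markup):

```typescript
// Sketch of the "instructions before the input" pattern.
// The field name, hint id, and helper function are hypothetical examples.
function passwordField(hint: string): string {
  const hintId = "password-hint"; // id linking the hint text to the input
  return [
    `<label for="password">Password</label>`,
    // The hint appears BEFORE the input, so a screen reader moving
    // linearly through the page announces the requirements first.
    `<p id="${hintId}">${hint}</p>`,
    // aria-describedby also ties the hint to the input, so it is
    // re-announced when the field itself receives focus.
    `<input type="password" id="password" aria-describedby="${hintId}">`,
  ].join("\n");
}

const html = passwordField("Use at least 12 characters, including a number.");
```

With this ordering, users hear the requirements before typing, and the `aria-describedby` association repeats them on focus rather than leaving them stranded after the field.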

Challenges & Solutions

Challenge #1

Users missed important instructions when filling out forms

Solution #1

Moved guidance above the field to ensure it was encountered at the right time

Challenge #2

Visual content lacked clarity for blind and low-vision users

Solution #2

Enhanced image descriptions in collaboration with the content team to improve understanding and usability

Challenge #3

Traditional testing methods felt too generic or disconnected from real user experiences

Solution #3

Invited users to bring their own devices and tools to create a more authentic, personalized testing environment

Outcomes

Deliverables

  1. User testing scripts and session recordings
  2. Updated design documentation and site revisions
  3. Revised content guidance for image descriptions and form fields

Results & Future

  • Improved form clarity and content structure for a broader range of users
  • Reduced confusion during registration flows and increased successful form completion
  • Fostered greater user confidence by ensuring critical information was delivered in the right order and format

Key Takeaways & Learnings

Design must reflect real-world use

Testing with real tools and real users uncovered insights that standard methods couldn't.

Content clarity is just as critical as layout

Small changes in how and when information appears can make or break a task flow.

Users are experts in their own experiences

Empowering users to guide improvements led to better, more thoughtful design decisions.