Discussion of Usability Test Results
Research Question: In Programs and Services, do users know what they will find in each section? Which sections can be improved?
Overall, users had positive experiences with the Hennepin County Library website and used the Programs and Services section productively. Generally, users knew what they would find within each Programs and Services menu item, and the usability test yielded data showing which aspects of the section can be improved. Results from this usability test indicate users generally appreciated the site and found it useful for completing their tasks.
Quantitative data measurements of efficiency and effectiveness indicate:
- Users successfully located printing services.
- Most users found it difficult to locate resume help at a specific library.
- Users enjoyed the events calendar but often struggled to search it.
- Users did not use the website search function successfully and generally avoided this tool.
- Many users searched for information about programs and services in the Locations section.
- Users wanted similar events to be linked, whether at the same library or at different libraries.
Preferences and self-reported data
At the end of our usability test, participants circled words that described their attitude toward the Hennepin County Library website (figure 2.0). These product reaction cards identify adjectives that describe each user’s experience during the usability test. Five participants each chose five words, and only one of the 25 words was negative, or 4%. The negative word was confusing; however, it should be noted that the same participant also chose reliable, trustworthy, simplistic, and organized to describe their satisfaction.
Our data measures efficiency as a dimension of usability. In the Performance section (to come), we will discuss how our results measure efficiency quantitatively through comparison to our benchmarks for average time on task, insufficient information that causes users to work harder than they need to, and task completion rates. Here in the Preference section, users’ self-reported data measures efficiency qualitatively. Data from the reaction cards indicates efficient web design―users found the site accommodating to their needs and facilitative to their goals.
The ease of use throughout the site and the utility of the design are reflected in self-reported attitudes of straightforward and informative, which were attitudes of three out of five participants. These attitudes indicate the site allowed participants to finish their tasks with ease. Inefficient options, such as difficult, distracting, technical, or primitive, were not selected in this usability test. Users saw the site as an efficient means to their goals as well as modern (chosen twice) and trendy (chosen once) in design.
Our data also measures effectiveness as a dimension of usability. In the Performance section, we will discuss how our results measure effectiveness quantitatively through comparison to our benchmarks for post task difficulty ratings, error occurrences, and completion rates for tasks. Here in the Preference section, users’ self-reported data measures effectiveness qualitatively. Data from the reaction cards indicates effective web design―users found the site accommodating to their needs and facilitative to their goals.
The integrity of the site and the approachable nature of the design are reflected in the self-reported attitudes of organized and understandable; organized described the attitude of all five participants. These attitudes indicate the site contained sufficient, accessible information, allowing participants to finish their tasks with ease. Ineffective options, such as overwhelming, unhelpful, inconsistent, or cumbersome, were not selected in this usability test. Users saw the site as an effective means to their goals as well as trustworthy (chosen twice) and intuitive (chosen once) in design.
Performance and observed data
In the product reaction cards exercise, users reported that the site was both efficient and effective, but our analysis takes a triangulation approach to answer our research questions. We will use multiple methods and different metrics:
- Questionnaire responses
- Participant comments
Critical Issue #1: Events not connected
Events did not link to similar or related events, though users looked for such connections. For scenario 2, users wanted help updating their resume at Ridgedale Library. All five participants found a job-related event; however, the event they found was not 20-Minute Resume and Job Search Assistance at Ridgedale Library. One participant found Job Search Assistance, ending the task without completing it.
Participants 3 and 5 commented that they wanted similar events to reference one another. Participant 5 commented, “Having options if it’s available at more than one library would be easier.”
For instance, Job Search Assistance at Brookdale Library did not link to Job Search Assistance at Ridgedale Library. Similarly, Microsoft Word Basics for Building Skills at Ridgedale Library did not link to 20-Minute Resume and Job Search Assistance at Ridgedale Library.
All five participants found events for either Job Search Assistance or Ridgedale Library. If these events had linked to one another and to 20-Minute Resume and Job Search Assistance in the “View Details” page (figure 2.7), users could have completed the task more efficiently and effectively.
Critical Issue #2: Preliminary search results
When users searched the Events Calendar, today’s events populated the search results before users entered search criteria in the “Search events for” field. Participants 1 and 2 ended their search there, without adding a keyword to the search box (figure 2.8). Participant 2 used this function (unsuccessfully) for two tasks.
No user who received this preliminary search result went on to narrow the search. Participant 2 was not able to locate resume help and gave up on this task. That participant noticed the highlighted date upon an unsuccessful search and commented, “I need to know the date to find the events I’m looking for.”
Every user said they or a friend would be likely to use this service. Participant 1 commented, “I could find it if I took more time. This probably isn’t a common offering, that’s why it was not easily accessible.”