Comparative Usability Study
Students fail or drop out of online courses at a significantly higher rate than in face-to-face classes. Among faculty (and students) there’s a perception that online courses are “easier,” so students who are unsuccessful in these courses are often labeled lazy or lacking in commitment to their education. Having worked with many students who were studying both online and off, I set out to understand what other challenges might contribute to their attrition and to their inability to complete coursework they were otherwise prepared for. My previous work with undergraduate students led me to pursue a usability study to see “if the system was in the way.”
For this project, I was a UX research team of one. I was responsible for the concept, design, and delivery of this usability study, as well as reporting and presenting the results. You can read the formal dissertation with citations and references here.
This study required:
- Deep understanding of learning theory, ethnographic research design, user-centered design, and usability principles
- Lab and contextual interviewing
- Development of research plan and materials, including moderator scripting
- Qualitative data analysis to identify themes and common user pain points
- Synthesis of research findings into actionable results
- Combining quantitative metrics and qualitative feedback
On a university campus, it’s easy to find students deep in conversation, athletes on crutches, and delivery people with carts using a ramp instead of the stairs to enter a building. Incorporating universal design from the beginning yields structures that are easily used by everyone, regardless of ability. Continuing to design environments that cannot be used by people with a variety of assistive technology devices is the equivalent of a restaurant owner hastily adding a wheelchair ramp at the rear entrance of a building because it is the only door wide enough. In this token compliance with the law, the wheelchair user must separate herself from her group, go to the rear entrance, find out that the door is locked, return to the front to request that the door be opened, and travel through the kitchen in order to rejoin her friends for dinner. This is the user experience that students using assistive technology to access the internet can encounter when a course is not designed with their needs in mind.
The findings of this study could serve as an initial blueprint for designing more accessible online courses, and provide additional impetus for course designers to adopt a minority identity view of disability in the same manner that many hold an inclusive multicultural view of ethnicity. At the institutional level, taking a proactive rather than reactive stance in serving students with disabilities in higher education demonstrates a commitment to providing a quality, equitable education to all enrolled students.
To look at the user experience of students in online courses and how their perception of that experience relates to grades and/or attrition, I decided on a holistic investigation of student abilities and experiences.
Primary Questions: Does the user interface design of an online course (or course management system) affect performance or persistence of students with and without disabilities? How could we identify common usability factors for students with and without disabilities that affect the decision to complete or withdraw from an online course?
To break that down, I wanted to understand:
- How do different usability factors influence the experience of students with and without disabilities in an online course?
- How do student characteristics such as self-efficacy, spatial visualization ability, and information literacy affect student perception of usability in an online course?
- Which usability factors are most difficult to overcome for students with low spatial visualization ability or low self-efficacy in online tasks?
All of the course systems in use at the participating institutions used a variation of a standard online course template with navigation links on the left of the main content. All included breadcrumb navigation at the top of the screen. Layout was similar to the one shown below:
The majority of participants were unable to easily navigate their course to locate specific course information as requested. The large chunk of text in the main content area was generally ignored as a source of relevant information, even when it included instructions for accomplishing the task as directed.
Cognitive Dissonance: Experience vs. Opinion
Based on completion rates for directed tasks, I found that the user interface presented a barrier to accessing – and therefore learning – course content in the areas that did not comply with standard website design practice. However, the participants generally did not view or understand their access difficulties as usability barriers or failings in the course site design. Despite encountering errors that directly inhibited their progress through the course site, most participants described the system as easy to use and indicated that they would not make any changes for improvement.
Self-Efficacy and Self-Perception
Self-efficacy as reported by the participants on the Web User’s Self-Efficacy scale did not accurately indicate actual technological skill. However, when combined with (or preceded by) computer literacy instruction, technological and web-based problem-solving skills appeared to improve. This disconnect between reported self-efficacy and actual ability held for all but one participant.
Cognitive Load Effects: “I didn’t see that there”
Cognitive load proved to be an absolute barrier to successful task completion for the participant with the lowest technology literacy, and a source of frustration for users with more technology experience. Usability challenges such as navigational links buried within blocks of text increased extraneous cognitive load and consequently inhibited participants’ ability to access course information effectively. Although all of the institutions’ course management systems conformed to the standard layout described earlier, this standard presents an undue hindrance to online learners when individual course sites do not reflect common web design and user experience conventions established in practice online. Participants tended to overlook the main text area entirely if it consisted of a single text block. All of the participants looked for navigation information along the top and left sides of the web page, but were confounded by confusing navigation labels and nonstandard locations of navigation information.
It is important to note that frustrating experienced users may result in similar attrition as that of learners with lower technology literacy. Although experienced users are presumably equipped to adjust better to design flaws, they may also reject a flawed course system altogether to remove an unwanted source of stress. They may stop logging into online courses because memorizing information locations and unusual naming conventions takes up valuable mental processing bandwidth that should be utilized for developing new mental schemas needed for learning course content.
Conclusions & Recommendations
Not knowing how to do something that you believe “everyone else” can do is an emotional experience. One participant was distraught over what she perceived as her own shortcomings in completing the assigned tasks. There is high value attached to the ability to learn for those who seek learning. Online learners are not just faceless students who may never be involved in campus life – they are people with lives, work setbacks, money problems, triumphs, and failures. Success and self-efficacy reinforce one another, and it is certainly desirable to support this success cycle. Design research in conjunction with formal user experience and usability testing for online course interface design could be the key to a more rewarding and inclusive experience for students who represent the broad range of ability and experience in our nation’s colleges and universities.
Recommendation 1: Students should be objectively assessed for computer literacy before being allowed to enroll in an online course for the first time. The IC3 Digital Literacy exam from Certiport is an affordable and fairly accurate way to do this.
Recommendation 2: The design of online courses should be modified and constantly updated to the usability and accessibility standards of general web applications.
Recommendation 3: Faculty teaching online should receive training and ongoing support to understand the difference between face-to-face and online teaching. Such training should also emphasize the need for faculty to demonstrate their own digital literacy.