Forms and disclosures are a central component of business and customer interactions. However, they often lack good visual organization or clear and concise language, highlighting a distinct need for more extensive usability testing and research. In particular, eye tracking serves as an excellent tool for evaluating and improving paper and electronic forms. In this paper, we present numerous examples of the benefits of eye tracking for form usability as well as practical considerations for conducting eye tracking on paper forms. In addition, we provide two case studies of paper form eye tracking. One involves a paper diary designed to track users’ television viewing habits and the other is a multi-page government form. Our experiences suggest that paper forms are amenable to traditional usability testing practices and also benefit from the additional insights gained through eye tracking.
Business forms and disclosures have become central to customer relationship management. Forms are used to solicit information from the customer in a standardized manner, and disclosures are used to communicate rights, facts, risks, and other important information to the customer. Anyone who has ever completed a medical history form at a hospital or accepted the terms and conditions of a new software download has interacted with a business form or disclosure and, more than likely, with the unnecessary jargon, redundancy, ambiguity, and obscurity that characterize many of these documents. It should therefore come as no surprise to learn that most consumers are unable or unwilling to read these documents [1].
Although some may speculate that businesses deliberately make forms opaque, there are just as many benign causes related to the difficulty of organizing and presenting large amounts of information to a wide variety of audiences. In fact, this situation led the U.S. Government to pass the Plain Writing Act of 2010 [10], which requires federal executive agencies to use plain writing in all documents they issue, to enhance citizen access to Government information and services (Footnote 1). The latter part of this requirement is essential because it highlights the need to collect and assess information from users on their experience with forms and their language.
Typically, when we consider the application of usability testing, it is in regard to complex technologies such as websites, software, or applications, but usability can apply to any context where specific users are interacting with a product or information in order to reach a specific goal [5]. In this respect, forms and disclosures are no different from applications or websites. In general, the goals of forms and disclosures are to extract accurate information from users and to convey important, necessary information. In order to meet these goals, these documents must comply with the same usability principles that are used in more complex applications. Peter Morville [8], in particular, expanded upon the concept of usability and illustrated the facets of user experience. Table 1 shows each facet, its general application, and how it applies to form usability.
This finding highlights the importance of designing for the desirability and credibility of a form, as well as how eye tracking helped inform this finding. If respondents do not notice the name and logo of the sender, they are unlikely to find mail material trustworthy, and they will often quickly discard it. And without a desirable cover, respondents are unlikely to open the diary to get started with the process. Here, eye tracking was able to inform designers about the elements on the cover that were most likely to be missed. As a result, we recommended moving the logo and name to the center of the cover, where they would be more visible.
Example Step 3 Findings. Step 3 asked participants to answer questions about who watches TV in their household by entering this information in specified slots at the top of the page. The “Old” diary and “Prototype” diary contained an example directly below the fillable slots, while the “New” diary contained an example below the fold. Unlike Steps 1 and 2, which progressed linearly from top to bottom and left to right, Step 3 forced participants to start at the bottom of the page and move to the top in order to progress in the correct sequence. The “Old” and “Prototype” diaries performed poorly in regard to noticeability. Eye tracking showed that 45 % of participants did not fixate on the example until after they had completed the fillable areas, a finding that exemplifies the importance of the “findable” facet of user experience. We recommended that the example be placed before the actual content, where users will be more likely to notice it before they are asked to input information.
Example Step 4 Findings. Eye tracking also provided relevant insights into how participants processed the fillable fields in Step 4. The “Old” and “New” diaries had a column order that listed the station and channel number earlier in the sequence; subsequent columns asked participants to enter the name of the program and the people in the household who watched it. Gaze plots indicated that participants were confused by this order: they did not look in an orderly left-to-right pattern as one would expect, a result that demonstrated the inefficiency of processing information in this layout. The “Prototype” diary used a different column order, which had participants enter the name of the show before the other information. Gaze plots revealed a more F-shaped pattern [8], with participants looking down the page and then in a linear left-to-right pattern when entering information, a marked improvement in processing efficiency. We recommended the ordering of the “Prototype” diary, with the television show entry field occurring earlier in the progression (Fig. 2).
Practical Considerations. Our first challenge in conducting the study was selecting the placement of the eye tracker. We initially piloted the setup with an older and larger eye tracker (Tobii X120) mounted above the form, but found that participants’ eyelids were likely to block the eye tracker from collecting data. Before data collection began, a newer and more compact eye tracker (Tobii X2-60) was released. Its smaller size allowed us to place it below the form, and this placement resulted in a considerably higher rate of capture.
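The capture rate referred to here can be quantified as the fraction of recorded samples for which the tracker returned valid gaze data. The sketch below is illustrative only: the sample counts are invented for demonstration, as the study reports only a qualitative improvement between placements.

```python
# Illustrative sketch: comparing gaze capture rates between tracker placements.
# The sample counts are hypothetical, not values reported in the study.

def capture_rate(valid_samples: int, total_samples: int) -> float:
    """Fraction of recorded samples containing valid gaze data."""
    return valid_samples / total_samples

above_form = capture_rate(540, 1200)   # hypothetical above-form placement
below_form = capture_rate(1020, 1200)  # hypothetical below-form placement
print(f"above: {above_form:.0%}, below: {below_form:.0%}")
```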
Other challenges with incorporating eye tracking into this study related to the interactive nature of the form. Responding to the form produced head movement, and during this interaction the eye tracker was less likely to collect eye movement data. As a result, we focused on reporting results from data collected while the participant was not moving his or her hand to respond. Participants also tended to block the eye tracker while they wrote on the form. To overcome this, we delineated an area below the form so participants had a cue as to where not to place their hands.
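In practice, restricting the analysis to periods when the participant was not writing amounts to filtering the gaze stream by time. The following is a minimal sketch of that idea, assuming writing intervals identified from session video; the sample and interval formats are our own simplification, not the Tobii software's data structures.

```python
# Minimal sketch: exclude gaze samples recorded while the participant's hand
# occluded the tracker or the head moved to write. The tuple formats below
# are illustrative assumptions, not an actual eye-tracking file format.

def filter_gaze_samples(samples, writing_intervals):
    """Keep samples whose timestamp falls outside every writing interval.

    samples: list of (timestamp_ms, x, y) gaze points
    writing_intervals: list of (start_ms, end_ms) periods of hand movement
    """
    def is_writing(t):
        return any(start <= t <= end for start, end in writing_intervals)
    return [s for s in samples if not is_writing(s[0])]

samples = [(0, 0.20, 0.30), (120, 0.25, 0.31), (480, 0.60, 0.70), (900, 0.61, 0.72)]
writing = [(100, 500)]  # hypothetical writing period from 100 ms to 500 ms
print(filter_gaze_samples(samples, writing))  # keeps the samples at 0 ms and 900 ms
```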
Introduction. This multi-page form is completed to report discrepancies in personal finances to the Federal Government. The purpose of this study was to determine the issues associated with completing the form and provide recommendations to correct those issues. Full details of this study have been presented by our colleagues [12].
Stimulus. Before the form began, there were three pages of instructions with text and tables. The first part of the form was a series of “yes” and “no” questions to help respondents self-assess whether they should continue with completing the form. The second part of the form asked respondents to enter their personal information and the information for a family member. The third part of the form was a grid that asked participants to enter amounts for different items associated with their income and expenses. The fourth and last part of the form asked for the respondent’s signature.
Method. Nine people (four male, five female) with an average age of 42 and diverse backgrounds from the Washington, DC, area participated in this study. Eight of the nine participants reported that they typically complete this form themselves by hand or with computer software; one participant reported completing the form with the assistance of a professional. The form was mounted on a flat vertical surface for viewing and writing, above a Tobii X2-60 eye tracker. In addition to the eye tracking, we collected conventional performance and self-report measures.
Before starting the task, participants completed a questionnaire that asked about their past experiences with this type of form. Participants were provided with scenario information so they did not enter their own personally identifiable information. Participants were also provided with supporting documents that helped them complete the form.
Example Findings. Eye-tracking gaze plots demonstrated that participants read most information on the first page of instructions, skimmed through the second and third pages of instructions, and then began working on the form. An analysis of the quantified fixation data demonstrated that, although most of the first page instructions were fixated, certain areas of the page, particularly the information that came later on the first page, tended to be skipped.
Most of the sections on the second page of instructions were not fixated. However, the center of the second page, which was a numbered section, had higher counts of fixations. This is consistent with research that has shown that users tend to read numbers and bulleted, bolded items [4, 9]. Participants had very minimal fixations on the third page of instructions.
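Quantified fixation data of this kind can be produced by tallying fixation points against rectangular areas of interest (AOIs) on each page. The sketch below assumes normalized page coordinates and uses illustrative AOI names; it is not the analysis pipeline used in the study.

```python
# Hypothetical sketch of tallying fixations per page region (area of interest).
# AOI names and coordinates are illustrative only.

def count_fixations_by_aoi(fixations, aois):
    """Count fixations landing in each AOI.

    fixations: list of (x, y) points in normalized page coordinates
    aois: dict mapping AOI name -> (x0, y0, x1, y1) bounding box
    """
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                break  # assign each fixation to at most one AOI
    return counts

aois = {"instructions_top": (0.0, 0.0, 1.0, 0.5),
        "numbered_section": (0.0, 0.5, 1.0, 1.0)}
fixations = [(0.2, 0.1), (0.4, 0.3), (0.5, 0.8)]
print(count_fixations_by_aoi(fixations, aois))
```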
Participants’ self-reported comments during the debriefing interview supported the eye-tracking and performance data about the use of instructions. One participant commented on the length of the instructions by saying, “I feel like the instructions were so long and perilous that I could really only retain about 10 %.” On the satisfaction questionnaire, participants, on average, responded that they read “some of the instructions.” In debriefing, we asked participants about the way they usually interact with tax form instructions. Responses indicated that, consistent with the session observations and eye-tracking data, participants tended to skim and skip instructions. For example, one participant said: “[I used them] the way I usually do. I read the first page and then I skip the rest. It’s typical of how I do it. I read about 30 %, and I know that the information is there and that I can go back. But I don’t ever finish the whole instruction booklet.” Another participant said: “I usually skim through it. Usually [these types of] forms are laid out the same way.”
The evidence suggested that the instructions were not being used as intended, highlighting the importance of the “usable” facet of user experience. The instructions were presented in narrative format, which implied that respondents should read through the full instructions before they start the form. However, our research suggested that most respondents will visually scan the information on these pages before they start. The later the information appears in the instructions, the less likely it is to be read. Because the instructions are lengthy, they are more likely to be used as a reference while working on the form, and less likely to be read before getting started.
We recommended moving instructions to the sections of the form where the specific information applies. For example, we recommended placing a condensed version of the instructions for completing the line items next to their respective fillable lines in the form. We also recommended that the remaining information in the instructions be reformatted to facilitate an efficient scan pattern, such as chunking related pieces of information into bulleted lists with bolded items.
Practical Considerations. We faced similar challenges to those in our first case study. In this study, we also used the more compact eye tracker (Tobii X2-60) and placed it below the form. In the first case study, the form was mounted at a 45° angle; in this study, we mounted the form to a flat vertical viewing surface at a 90° angle. This resulted in a trade-off: we collected a higher rate of eye movement data samples, but the form was more difficult to write on. Because the emphasis in this study was on the use of the instructions, we proceeded with the vertical setup. As in the first case study, we focused on reporting results from data collected while the participant was not moving his or her hand to respond.
Despite playing an important role in organizations and customer relationships, forms and disclosures are often poorly organized and difficult to complete. Principles of usability testing commonly employed for more interactive and complex applications provide significant value in improving the design and organization of forms. In particular, the use of eye tracking provides an additional level of insight into users’ attentional allocation and progress through a form or disclosure. In this paper, we have summarized and synthesized relevant literature related to electronic forms, presented two case studies demonstrating the efficacy of usability testing paper forms, and emphasized the value derived from including eye tracking in these tests. In addition, we have provided practical guidance for user experience researchers considering eye tracking for paper forms. In summary, we have shown that forms, regardless of medium, can benefit from the same usability methods and measures implemented in more interactive environments. In the future, we hope businesses and organizations recognize the value of usability testing, and the additional insight eye tracking delivers, in creating useful, usable, accessible, credible, findable, and desirable forms.
Footnote 1: The Plain Writing Act of 2010 is not the first declaration of this type, but it does represent a high-profile, high-impact piece of legislation.