Abstract
In this paper, I review the findings of ongoing research in usability and user experience analysis. In particular, I first discuss how practicing designers and usability evaluators in their own workplaces use findings from usability testing to drive design decisions within a decision-making space. Second, I investigate how designers and evaluators, consciously or unconsciously, alter raw usability findings when they develop their recommendations. Finally, I explore what these findings might mean for usability education. Ultimately, I ask whether these usability evaluators and designers do what we think they should be doing.