Any usability testing method in which the evaluator and the user participant are not in the same location. Remote evaluation may be moderated, with the evaluator observing the participant in real time, or unmoderated (automated), with the participant working without direct observation or interaction.
The term "remote evaluation" covers a wide range of specific methods that collect many kinds of data. At one extreme, there is little difference from in-person, task-based lab testing, except that the moderator and participant are not in the same place. At the other, there are no user tasks at all, and the data collected is aggregated analytics.
Bolt and Tulathimutte diagrammed these methods on two axes: Qualitative (moderated) vs. Quantitative (unmoderated), and Concrete vs. Conceptual (how closely the method reveals actual behavior on a completed interface).
- They group together all qualitative (moderated) methods using remote screen sharing and audio: a participant and moderator work together in real time. Tools include Adobe Connect, GoToMeeting, NetMeeting, LiveLook, UserVue, Skype, WebEx, Glance, and Yuuguu.
- They arrange quantitative (unmoderated) methods for collecting task data along a range from concrete to conceptual:
- Testing on live sites/apps. Tools include UserZoom, RelevantView, WebEffective, Webnographer
- Testing wireframes. Tools include Chalkmark, Usabilla
- Testing conceptual artifacts, such as online card sorting. Tools include OptimalSort, WebSort
- Quantitative methods without tasks include:
- User analytics on live sites. Tools include ClickTale, ClickHeat
- A/B/C testing on live sites
- Surveys
Related Links
Formal Publications
Bolt, N. and Tulathimutte, T. (2010). Remote Research: Real Users, Real Time, Real Research. Rosenfeld Media.
Web Resources
- Andreasen, M.S.; Nielsen, H.V.; Schrøder, S.O.; and Stage, J. (2007). What Happened to Remote Usability Testing? An Empirical Study of Three Methods. Proceedings of ACM CHI 2007 Conference on Human Factors in Computing Systems.
Compared a remote synchronous condition, in which testing is conducted in real time but the test monitor is spatially separated from the test subjects, with two remote asynchronous conditions, in which the test monitor and the test subjects are separated both spatially and temporally. The results show that the remote synchronous method is virtually equivalent to the conventional method. The asynchronous methods are considerably more time-consuming for the test subjects and identify fewer usability problems, yet they may still be worthwhile.
- Remote Usability, Bolt | Peters blog
- Remote Online Usability Testing: Why, How, and When to Use It, by Dabney Gough and Holly Phillips, Boxes and Arrows.
- Soucy, Kyle. "Unmoderated, Remote Usability Testing: Good or Evil?" UXmatters, January 18, 2010.
- de la Nuez, Alfonso. "An Attainable Goal: Quantifying Usability and User Experience." User Experience Magazine, Volume 7, Issue 3, 2008.
- Farnsworth, Carol. "Getting Your Money Back: The ROI of Remote Unmoderated User Research." User Experience Magazine, Volume 7, Issue 3, 2008.
- Tullis, Tom. "Automated Usability Testing: A Case Study." User Experience Magazine, Volume 7, Issue 3, 2008.
- Albert, William; Tullis, Tom; and Tedesco, Donna. Beyond the Usability Lab: Conducting Large-Scale User Experience Studies. Morgan Kaufmann, January 2010.
Resources from UPA
Markel, Joanna; Rosehan, Serena. Making Method Work for You: How Remote Contextual Inquiry Got Us Up-Close with Users. UPA 2008 Conference.
Mitchell, Peter P. An Inventory and Critique of Online Usability Testing Packages. UPA 2002 Conference.
de la Nuez, Alfonso; Tedesco, Donna; Aseron, Rob; Tullis, Tom; and Albert, Bill. Unmoderated Usability Testing: Experiences from the Field. UPA 2009 Conference.
Pressman, Eric. Usability TV: Techniques and Tips for Broadcasting Usability Tests to Remote Observers on a Budget. UPA 2002 Conference.
Sapienza, Filipp, PhD. Working with Immigrant and Trans-National Users in Usability Evaluation. UPA 2008 Conference.
Semen, Timothy S., and McCann, Tom. The Trials, Tribulations and Triumphs of Online Usability Testing. UPA 2001 Conference.
Tullis, Tom; Fleischman, Stan; McNulty, Michelle; Cianchette, Carrie; Bergel, Marguerite. [http://home.comcast.net/~tomtullis/publications/RemoteVsLab.pdf An Empirical Comparison of Lab and Remote Usability Testing of Web Sites.] UPA 2002 Conference.
Wei, Carolyn; Barrick, Jennifer; Cuddihy, Elisabeth; Spyridakis, Jan. Conducting Usability Research through the Internet: Testing Users via the WWW. UPA 2005 Conference.
Sources and contributors:
Whitney Quesenbery