NEEShub – Heuristic Evaluation

Published on March 10, 2014

Author: ekerrigan

Source: slideshare.net

Assignment 05: Heuristic Evaluation
Team 02: Omid Farivar | Evan Kerrigan | Benjamin Nayer
SI 622: Evaluation of Systems and Services
March 17, 2011

Agenda
1. Project Introduction
2. Methods
3. Findings & Recommendations
4. Discussion

What is NEES?
‣ The Network for Earthquake Engineering Simulation is an NSF-funded research organization
‣ Serves as a network to coordinate and share data and research among earthquake, civil, and structural engineers and 14 laboratories throughout the U.S.
‣ Offers multiple services, including NEEShub, which is our focus

Project Warehouse? NEEShub?
‣ A suite of software and tools to store, share, and analyze data—tailored to the needs of the NEES community
‣ At its heart is the “Project Warehouse,” which serves as the system’s repository for uploading, managing, and sharing research data
‣ Also offers an array of tools for visualizing and analyzing data outside of the Project Warehouse

Heuristic Evaluation
(photo cc flickr user delilah021)

Nielsen’s Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.

Individual Evaluations
‣ Interviews & personas guided identification of different user types—researchers, data viewers, data creators, and “uploaders”
‣ Selected separate, pre-existing projects to analyze individually
‣ Created and edited projects and experiments
‣ Conducted several individual site passes
‣ Rated issue severity and importance to success

Rating Scale

Ranking | Meaning
0 | Team does not agree that the issue impacts system usability
1 | Cosmetic problem only; need not be fixed unless extra time is available on the project
2 | Minor usability problem; fixing this should be given low priority
3 | Major usability problem; important to fix, so should be given high priority
4 | Usability catastrophe; imperative to fix before the product can be released

Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.

Team Evaluations
‣ Assembled as a team to discuss, explain, and defend individual findings
‣ Collectively re-rated issue severity
‣ A score of ‘0’ was assigned to items that the team felt were not truly usability issues

Findings & Recommendations
(photo cc flickr user stefanoodle)

Finding 01: Inaccurate Presentation of Folder Size and Contents
‣ Project experiment folders in the file browser are labeled as very large (10s–100s of GB)
‣ When accessed, the folders appear to be empty—is the content loading? deleted? restricted?
‣ Violates visibility of system status by providing inaccurate system information: users cannot tell whether something went wrong

Recommendation
‣ Present correctly labeled folder sizes
‣ Use a status spinner to indicate when the system is loading
‣ Offer helpful error notifications, particularly for permission issues
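The notification bullet above can be sketched as a small state-to-message mapping, so the file browser never shows a silently empty folder. The states and wording below are hypothetical illustrations, not taken from NEEShub:

```python
from enum import Enum

class FolderState(Enum):
    """Possible reasons a folder shows no contents (hypothetical states)."""
    LOADING = "loading"
    EMPTY = "empty"
    RESTRICTED = "restricted"
    ERROR = "error"

# Hypothetical user-facing messages; real wording would come from the NEES team.
MESSAGES = {
    FolderState.LOADING: "Loading folder contents...",
    FolderState.EMPTY: "This folder is empty.",
    FolderState.RESTRICTED: "You do not have permission to view this folder.",
    FolderState.ERROR: "Something went wrong while loading this folder.",
}

def folder_status_message(state: FolderState) -> str:
    """Return the message the file browser should display for a given state."""
    return MESSAGES[state]
```

Distinguishing "restricted" from "empty" directly addresses the finding: the user no longer has to guess why a 100 GB folder appears to contain nothing.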

Finding 02: Inconsistent Filetype Upload Behavior
‣ Users can upload various filetypes to project experiments—videos, images, etc.—but the documentation about acceptable filetypes is inconsistent
‣ Provides unhelpful error responses when uploading incorrect formats
‣ Violates the following heuristics: consistency and standards; error prevention; and help users recognize, diagnose, and recover from errors

Recommendation
‣ Provide clear, consistent contextual help and documentation
‣ Validate formats during file upload
‣ Provide clearer, more accurate error messages
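A minimal sketch of the validation and error-message bullets: check the file's extension against an allow-list before accepting the upload, and return a message that names both the rejected type and the accepted ones. The extension list here is hypothetical; the real set of NEEShub-accepted filetypes would need to be confirmed:

```python
import os

# Hypothetical allow-list; substitute the formats NEEShub actually accepts.
ALLOWED_EXTENSIONS = {".csv", ".txt", ".jpg", ".png", ".mp4"}

def validate_upload(filename: str) -> tuple[bool, str]:
    """Return (ok, message) so the UI can show a clear, specific error."""
    ext = os.path.splitext(filename)[1].lower()
    if ext in ALLOWED_EXTENSIONS:
        return True, "OK"
    allowed = ", ".join(sorted(ALLOWED_EXTENSIONS))
    return False, (f"'{ext or filename}' is not an accepted filetype. "
                   f"Accepted types: {allowed}.")
```

Validating at upload time satisfies the error-prevention heuristic, and the explicit list of accepted types in the message helps users recover without consulting (inconsistent) documentation.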

Finding 03: Uploading Sensor Data Places an Excessive Recall Burden on Users
‣ Uploading sensor data to an experiment is done by populating a blank spreadsheet; the specific file format requirements and other formatting constraints are shown only before the user downloads it
‣ The spreadsheet itself contains none of this information, forcing the user to remember it
‣ Violates the recognition rather than recall heuristic

Recommendation
‣ Add formatting requirement instructions within the provided spreadsheet
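One way to implement this: generate the template so its first row carries the formatting rules, putting recognition over recall. The column names and rules below are hypothetical stand-ins; the real requirements would come from the Project Warehouse documentation:

```python
import csv
import io

# Hypothetical column names and formatting rules for illustration only.
COLUMNS = ["timestamp", "sensor_id", "value"]
RULES = "timestamp in ISO 8601; value as a decimal number"

def make_sensor_template() -> str:
    """Build a blank sensor-data spreadsheet (CSV) that carries its own
    formatting instructions, so users need not recall them after download."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([f"# Formatting requirements: {RULES}"])  # instructions row
    writer.writerow(COLUMNS)                                  # header row
    return buf.getvalue()
```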

Finding 04: Difficulty Performing Multiple File Searches
‣ Each project’s File Browser has a search function, which works normally at first
‣ Subsequent searches return no new results—the query simply fails, and reloading the page is the only solution
‣ No error message or feedback—it just fails!
‣ Violates the error prevention, error recovery, and system visibility heuristics

Recommendation
‣ Investigate why the issue occurs and develop a fix
‣ Display the current search terms prominently and provide error messages when queries fail

Finding 05: Unclear Options in File Browser Search Menu
‣ An ambiguous drop-down menu next to the search box contains options to “Find by: ‘Title’ or ‘Name’”
‣ There is no ‘Title’ column in the file browser, so it is unclear what that option means; searching with either option produces no apparent difference
‣ Violates the consistency and standards heuristic

Recommendation
‣ Two possibilities:
  • Replace the ‘Title’ option with a clearer, more meaningful filter (e.g., ‘timestamp’)
  • Remove the drop-down menu altogether

Finding 06: Difficulty Determining the State of Upload Progress
‣ File uploaders are expected to handle large amounts of data
‣ No progress feedback is provided when uploading files
‣ Users can’t navigate away from the uploader without losing the upload lightbox
‣ Violates the visibility of system status heuristic

Recommendation
‣ Implement a progress bar to display upload status—time elapsed, time remaining, etc.
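The numbers behind such a progress bar reduce to a small calculation: percent complete from bytes sent, and an estimated time remaining from the observed transfer rate. A minimal sketch (the function name and units are our own, not NEEShub's):

```python
def upload_progress(bytes_sent: int, total_bytes: int, elapsed_s: float):
    """Return (percent_complete, est_seconds_remaining) for a progress bar."""
    percent = 100.0 * bytes_sent / total_bytes
    rate = bytes_sent / elapsed_s if elapsed_s > 0 else 0.0  # bytes per second
    remaining = (total_bytes - bytes_sent) / rate if rate > 0 else float("inf")
    return percent, remaining
```

Reporting `inf` until the first bytes arrive avoids showing a misleading estimate, which matters for the large uploads this finding describes.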

Finding 07: Usability Strengths of NEEShub
‣ Extensive documentation is provided, especially when creating and editing projects or experiments
‣ Conspicuous help button for each data field
‣ Consistent iconography, such as the help buttons and filetype icons in the file browser
‣ Navigation aids:
  • “Breadcrumb”-style secondary navigation bar
  • Similar display in the file browser

Caveats & Limitations
(photo cc flickr user yanivg)

Caveats & Limitations
‣ We had the minimum optimal number of evaluators
‣ A larger and more diverse set of evaluators may have helped
‣ Prior experience with NEEShub may introduce biases
‣ Our decision to select different, pre-existing projects may have resulted in less-detailed investigations of specific pages
‣ Some of the issues—particularly the file-upload ones—may be less significant usability problems, since an external tool is used for file management

Q&A
Team 02: Omid Farivar | Evan Kerrigan | Benjamin Nayer
