Abstract
One key drawback when evaluating usability return on investment (ROI) is that the assessment criteria are often subjective, making it difficult for members of a development team to buy into the need to support usability-derived redesign recommendations. It is thus necessary to convey to the development team the importance of designing for usability in a format that is universally understandable. The use of measurable usability requirements to assess usability ROI was found to be an effective approach to align design with operational performance and, at the same time, justify the need for redesign to the development team. This approach should result in better development team cohesion, as well as superior end-product performance that captures and supports the needs of end users and other stakeholders alike. In the current effort, this alignment process is described, and the utility of the approach is demonstrated through its application in a field case study of the successful design of a software application.
© 2011 Springer-Verlag Berlin Heidelberg
Champney, R.K., Kokini, C., Stanney, K.M. (2011). Making the Design Process More Usable: Aligning Design with User Performance. In: Marcus, A. (eds) Design, User Experience, and Usability. Theory, Methods, Tools and Practice. DUXU 2011. Lecture Notes in Computer Science, vol 6769. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21675-6_4
Print ISBN: 978-3-642-21674-9
Online ISBN: 978-3-642-21675-6