CT026-3-2 Human Computer Interaction: Level 2

The document discusses human-computer interaction evaluation. It describes different evaluation paradigms including formative and summative evaluation. Iterative evaluation is emphasized as an ongoing process. Key factors to consider in evaluation are outlined in the DECIDE framework including goals, questions, methodology, and practical and ethical issues.


CT026-3-2

Human Computer Interaction



Evaluation
Level 2

Prepared by: RHR
First Prepared on: December, 2007
Last Modified on:
Quality checked by: MOH
Copyright 2004 Asia Pacific Institute of Information Technology
Topic & Structure of the lesson

• Why evaluations?
• When to evaluate?
• Evaluation paradigm
• DECIDE : A framework to guide evaluation
• Pilot studies

CT026-3-2 Human Computer Interactions 2


Learning Outcomes

At the end of this lecture, you should be able to:

• Describe the evaluation paradigms & techniques used in interaction design.
• Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations.
• Use the DECIDE framework in your own work.

Two Main Types of Evaluation

• Formative evaluation
  – Done at different stages of development
  – Checks that the product meets users’ needs
  – Focuses on the process
• Summative evaluation
  – Assesses the quality of a finished product
  – Focuses on the results
Two Main Types of Evaluation

“When the cook tastes the soup, that’s formative. When the guests taste the soup, that’s summative.”

Iterative Evaluation

Iterative design and evaluation is a continuous process that examines:
• early ideas for the conceptual model
• early prototypes of the new system
• later, more complete prototypes

Evaluation enables designers to check that they understand users’ requirements.

[Diagram: Original Product Concept → Parallel Design Sketches → First Prototype → Iterative Design Versions → Final Released Product, with evaluation feeding back at each stage]
Why evaluation?

“Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.”

See www.AskTog.com for topical discussion about design and evaluation.

When to evaluate?

• Throughout the design phases

• Also at the final stage – on the finished product

• Design proceeds through iterative cycles of ‘design – test – redesign’

• Triangulation involves using a combination of techniques to gain different perspectives

Evaluation Paradigm

Any kind of evaluation is guided, explicitly or implicitly, by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an ‘evaluation paradigm’.

Four Evaluation Paradigms

• ‘quick and dirty’

• usability testing

• field studies

• predictive evaluation


Quick And Dirty

• ‘Quick & dirty’ evaluation describes the common practice in which designers informally get feedback from users to confirm that their ideas are in line with users’ needs and are liked.

• Quick & dirty evaluations can be done at any time.

• The emphasis is on fast input to the design process rather than carefully documented findings.
Usability Testing
• Usability testing involves recording typical users’
performance on typical tasks in controlled
settings.
• As the users perform these tasks they are watched
& recorded on video & their key presses are logged.
• This data is used to calculate performance times,
identify errors & help explain why the users did
what they did.
• User satisfaction questionnaires & interviews are
used to elicit users’ opinions.
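The logged performance data described above can be reduced to simple summary metrics. A minimal sketch in Python — the session records, field names, and values below are illustrative assumptions, not part of the module:

```python
from statistics import mean

# Hypothetical log: one record per user for the same task, with the
# task-completion time and error count extracted from video/keystroke logs.
sessions = [
    {"user": "P1", "seconds": 142, "errors": 3},
    {"user": "P2", "seconds": 98,  "errors": 1},
    {"user": "P3", "seconds": 121, "errors": 0},
]

def task_metrics(records):
    """Summarise performance times and error counts for one task."""
    return {
        "mean_seconds": mean(r["seconds"] for r in records),
        "mean_errors": mean(r["errors"] for r in records),
        "error_free_rate": sum(r["errors"] == 0 for r in records) / len(records),
    }

print(task_metrics(sessions))
```

Satisfaction questionnaires and interviews would be analysed separately; this sketch covers only the logged performance measures.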


Field studies

• Field studies are done in natural settings.

• The aim is to understand what users do naturally and how technology impacts them.

• In product design, field studies can be used to:
  – identify opportunities for new technology
  – determine design requirements
  – decide how best to introduce new technology
  – evaluate technology in use.
Predictive Evaluation

• Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
• Another approach involves theoretically based models.
• A key feature of predictive evaluation is that users need not be present.
• It is relatively quick & inexpensive.

Evaluation Techniques

• observing users
• asking users their opinions
• asking experts their opinions
• testing users’ performance

DECIDE: An Evaluation Framework

• Determine the goals the evaluation addresses.

• Explore the specific questions to be answered.

• Choose the evaluation paradigm and techniques to answer the questions.

• Identify the practical issues.

• Decide how to deal with the ethical issues.

• Evaluate, interpret and present the data.
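The six steps above lend themselves to a simple planning checklist. A hedged sketch in Python — the `EvaluationPlan` class and all field values are hypothetical illustrations, not part of the framework itself:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    # One field per DECIDE step (the final step, evaluating the data,
    # happens after the study and is not planned here).
    goals: list
    questions: list
    paradigm: str        # e.g. "usability testing", "field study"
    techniques: list
    practical_issues: list
    ethical_issues: list

    def missing_steps(self):
        """Return the DECIDE steps that have not been filled in yet."""
        steps = {
            "Determine goals": self.goals,
            "Explore questions": self.questions,
            "Choose paradigm & techniques": self.paradigm and self.techniques,
            "Identify practical issues": self.practical_issues,
            "Decide ethical issues": self.ethical_issues,
        }
        return [name for name, value in steps.items() if not value]

plan = EvaluationPlan(
    goals=["Improve the usability of an existing product"],
    questions=["Where do users hesitate?"],
    paradigm="usability testing",
    techniques=["observation", "questionnaire"],
    practical_issues=[],                  # still to be filled in
    ethical_issues=["informed consent form"],
)
print(plan.missing_steps())
```

Running the sketch reports the steps still left blank, mirroring how the framework is meant to be walked through before a study begins.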

Determine The Goals

• What are the overall goals of the evaluation?

• Who wants it and why? Which stakeholder: end user, database admin, code cutter?

• The goals influence the paradigm for the study.

Examples of Goals

Some examples of goals:
– Identify the best metaphor on which to base the design
– Check to ensure that the final interface is consistent
– Investigate how technology affects working practices
– Improve the usability of an existing product


Explore The Questions
• All evaluations need goals & questions to guide them
so time is not wasted on ill-defined studies.
• For example, the goal of finding out why many
customers prefer to purchase paper airline tickets rather
than e-tickets can be broken down into sub-questions:
– What are customers’ attitudes to these new tickets?
– Are they concerned about security?
– Is the interface for obtaining them poor?



Choose Paradigm & Techniques

• The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented.

• For example, field studies do not involve testing or modeling.


Identify Practical Issues

For example, how to:
– select users
– stay on budget
– stay on schedule
– find evaluators
– select equipment

Decide On Ethical Issues
• Develop an informed consent form.
• Participants have a right to:
  – know the goals of the study
  – know what will happen to the findings
  – privacy of their personal information
  – not be quoted without their agreement
  – leave when they wish
  – be treated politely


Evaluate, Interpret & Present Data

• How data is analyzed & presented depends on the paradigm and techniques used.
• The following also need to be considered:
  – Reliability: different evaluation processes have different degrees of reliability
  – Biases: is the process creating biases? (An interviewer may unconsciously influence responses.)
  – Ecological validity: is the environment of the study influencing it? (In a controlled environment, users are less relaxed.)
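The reliability concern can be made concrete by checking how consistently two observers code the same data. One common measure (not named in the slides) is simple percent agreement; the observer codes below are hypothetical:

```python
def percent_agreement(codes_a, codes_b):
    """Fraction of items that two evaluators coded identically."""
    if len(codes_a) != len(codes_b):
        raise ValueError("both evaluators must code the same items")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical codes assigned by two observers to eight video segments.
observer1 = ["error", "ok", "ok", "error", "ok", "ok", "error", "ok"]
observer2 = ["error", "ok", "error", "error", "ok", "ok", "ok", "ok"]
print(percent_agreement(observer1, observer2))  # 0.75
```

Low agreement suggests the coding scheme, or the evaluation process itself, is not reliable enough to support firm conclusions.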
Pilot Studies
• A small trial run of the main study.
• The aim is to make sure your plan is viable.
• Pilot studies check:-
– that you can conduct the procedure
– that interview scripts, questionnaires,
experiments, etc. work appropriately
• It’s worth doing several to iron out problems before
doing the main study
• Ask colleagues if you can’t spare real users
Heuristic Evaluation

• A heuristic is a guideline, general principle, or rule of thumb that can guide a design decision or be used to critique a decision that has already been made.
• The general idea behind heuristic evaluation is that several evaluators independently critique a system to come up with potential usability problems.
Heuristic Evaluation

To aid the evaluators in discovering usability problems, there is a list of 10 heuristics which can be used to generate ideas:
• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation
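Because evaluators critique the system independently, their findings are merged afterwards to see which problems were spotted most often. A minimal sketch of that merging step — the evaluators and their findings are invented for illustration, though the heuristic names come from the list above:

```python
from collections import Counter

# Hypothetical findings: each evaluator independently lists the heuristics
# they believe the interface violates.
findings = {
    "evaluator1": {"Visibility of system status", "Error prevention"},
    "evaluator2": {"Error prevention", "Help and documentation"},
    "evaluator3": {"Error prevention", "Visibility of system status",
                   "Consistency and standards"},
}

def merge_findings(per_evaluator):
    """Union of reported problems, most frequently spotted first."""
    counts = Counter()
    for problems in per_evaluator.values():
        counts.update(problems)
    return counts.most_common()

for problem, n in merge_findings(findings):
    print(f"{n} evaluator(s): {problem}")
```

Problems flagged by several evaluators independently are stronger candidates for genuine usability issues than those flagged by only one.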

Key Points

• An evaluation paradigm is an approach that is influenced by particular theories and philosophies.

• Four categories of techniques were identified: observing users, asking users, asking experts and user testing.

Key Points

• The DECIDE framework has six parts:
  – Determine the overall goals
  – Explore the questions that satisfy the goals
  – Choose the paradigm and techniques
  – Identify the practical issues
  – Decide on the ethical issues
  – Evaluate ways to analyze & present data
• Do a pilot study.

Next Session

Topic and Structure of next session

Evaluation

