Module 1 Overview
Overview of Assessment
A. Domains of Learning
1. Cognitive
Analysis: …organizational structure may be understood. Distinguishes between facts and inferences. Examples: …department and selects the required tasks for training. Keywords: analyzes, breaks down, compares, contrasts, diagrams, deconstructs, differentiates, discriminates, distinguishes, identifies, illustrates, infers, outlines, relates, selects, separates.
Synthesis: Builds a structure or pattern from diverse elements. Puts parts together to form a whole, with emphasis on creating a new meaning or structure. Examples: Write a company operations or process manual. Design a machine to perform a specific task. Integrate training from several sources to solve a problem. Revise a process to improve the outcome. Keywords: categorizes, combines, compiles, composes, creates, devises, designs, explains, generates, modifies, organizes, plans, rearranges, reconstructs, relates, reorganizes, revises, rewrites, summarizes, tells, writes.
Evaluation: Makes judgments about the value of ideas or materials. Examples: Select the most effective solution. Hire the most qualified candidate. Explain and justify a new budget.
2. Affective
This domain includes the manner in which we deal with things emotionally, such
as feelings, values, appreciation, enthusiasms, motivations, and attitudes. The five
major categories listed in order are:
Receiving phenomena: Awareness, willingness to hear, selected attention. Examples: Listen to others with respect. Listen for and remember the name of newly introduced people. Keywords: asks, chooses, describes, follows, gives, holds, identifies, locates, names, points to, selects, sits, erects, replies, uses.
ASSESSMENT IN LEARNING 2
Module 1 Maria Cecilia Carnaje-Sualog, PhDc
3. Psychomotor
…achieved by practicing. Keywords: copies, traces, follows, reacts, reproduces, responds.
Mechanism: This is the intermediate stage in learning a complex skill. Learned responses have become habitual and the movements can be performed with some confidence and proficiency. Examples: Use a personal computer. Repair a leaking faucet. Drive a car. Keywords: assembles, calibrates, constructs, dismantles, displays, fastens, fixes, grinds, heats, manipulates, measures, mends, mixes, organizes, sketches.
Complex Overt Response: The skillful performance of motor acts that involve complex movement patterns. Proficiency is indicated by a quick, accurate, and highly coordinated performance, requiring a minimum of energy. This category includes performing without hesitation, and automatic performance. For example, players often utter sounds of satisfaction or expletives as soon as they hit a tennis ball or throw a football, because they can tell by the feel of the act what the result will produce. Examples: Maneuvers a car into a tight parallel parking spot. Operates a computer quickly and accurately. Displays competence while playing the piano. Keywords: assembles, builds, calibrates, constructs, dismantles, displays, fastens, fixes, grinds, heats, manipulates, measures, mends, mixes, organizes, sketches. NOTE: The keywords are the same as for Mechanism, but with adverbs or adjectives indicating that the performance is quicker, better, more accurate, etc.
Adaptation: Skills are well developed and the individual can modify movement patterns to fit special requirements. Examples: Responds effectively to unexpected experiences. Modifies instruction to meet the needs of the learners. Performs a task with a machine that it was not originally intended to do (the machine is not damaged and there is no danger in performing the new task).
As mentioned earlier, the committee did not produce a compilation for the
psychomotor domain model, but others have. The one discussed above is by
Simpson (1972). There are two other popular versions:
1. Multiple Choice
In testing, multiple-choice is the most widely
used selection type test, perhaps because
these questions can be used to test such a
wide range of instructional objectives, are
easy to conduct and score, and are less
prone to bias and subjectivity (Laprise,
Dissection of a multiple-choice question (Part / Description / Common problems):

1. Stem. Description: …to be completed, decision to be made, or problem or situation to be resolved. Common problems: Conf…
2. Alternatives. Description: the alternatives from which the learner selects the correct answer(s). Common problems: confusing or ambiguous language; no clear right answer; poorly written or implausible distractors.
o … and alternatives
o key words that appear only in the stem and distractors
o two distractors that have the same meaning
o longer written alternatives indicate a correct answer
o do not give the answer away in the stem
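The two parts dissected above, a stem plus a set of alternatives with one keyed answer, can be sketched as a small data structure with dichotomous (right/wrong) scoring. The class and field names below are illustrative, not from the module:

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str                # the task, decision, or problem to be resolved
    alternatives: list[str]  # the options from which the learner selects
    key: int                 # index of the keyed (correct) alternative

    def score(self, chosen: int) -> int:
        """Dichotomous scoring: 1 point for the keyed answer, 0 otherwise."""
        return 1 if chosen == self.key else 0

# Hypothetical item drawn from the domains discussed earlier in this module.
item = MultipleChoiceItem(
    stem="Which domain of learning deals with feelings, values, and attitudes?",
    alternatives=["Cognitive", "Affective", "Psychomotor"],
    key=1,
)
print(item.score(1))  # keyed answer: prints 1
print(item.score(0))  # distractor: prints 0
```

Storing the key as an index also makes it easy to audit distractor quality later, e.g., flagging items where one distractor is almost never chosen.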
2. Short-Answer Tests
Essay and short answer items encourage
students to strive toward understanding a
concept as an integrated whole, permit
students to demonstrate achievement of
such higher level objectives as analyzing
given conditions and critical thinking, allow
expression of both breadth and depth of
learning, and encourage originality,
creativity, and divergent thinking (Jordan,
2012).
Strategies to try: Short-answer items can
take a variety of forms: definitions,
descriptions, short essays, or mixtures of
the three. Short essays can require
students to apply their knowledge to a
specific situation. Gronlund (1998) and
Nitko (2001) also identified several tips for
constructing short-answer items:
BENEFITS:
1. Clarify learning outcomes
2. Ensure content coverage
3. Match methods of instruction
4. Help in assessment planning and blueprinting
E. Validity
Validity is the most important issue in
selecting a test. Validity refers to what
characteristic the test measures and how
well the test measures that characteristic.
Validity tells you if the characteristic
being measured by a test is related to
job qualifications and requirements.
Validity gives meaning to the test
scores. Validity evidence indicates that
there is linkage between test
performance and job performance. It
can tell you what you may conclude or predict about someone from his or her score on the test.
Construct-related validation requires
a demonstration that the test measures
the construct or characteristic it claims
to measure, and that this characteristic
is important to successful performance
on the job.
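The linkage between test performance and job performance described above is conventionally summarized as a correlation coefficient (criterion-related validity evidence). A minimal sketch, assuming Pearson's r as the summary statistic; the scores and ratings below are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

test_scores = [62, 70, 75, 81, 88, 93]        # hypothetical selection-test scores
job_ratings = [2.9, 3.1, 3.4, 3.6, 4.0, 4.3]  # hypothetical supervisor ratings

r = pearson_r(test_scores, job_ratings)
print(round(r, 2))  # the closer to 1.0, the stronger the validity evidence
```

A coefficient near zero would mean the test scores tell you little about job performance, which is exactly the conclusion-drawing power the paragraph above describes.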
F. Reliability
Reliability refers to how dependably or
consistently a test measures a
characteristic. If a person takes the test
again, will he or she get a similar test score,
or a much different score? A test that yields
similar scores for a person who repeats the
test is said to measure a characteristic
reliably.
How do we account for an individual who
does not get exactly the same test score
every time he or she takes the test? Some
possible reasons are the following:
References
Abu-Dawwas, W. (2014). A new
scoring method for multiple choice tests in
learning management systems.
International Journal of Academic
Research, 6(1), 274–280.
doi:10.7813/2075-4124.2014/6-1/A.36
Azer, S. A. (2003). Assessment in a
problem-based learning course: Twelve
tips for constructing multiple choice
questions that test students’ cognitive
skills. Biochemistry and Molecular Biology
Education, 31(6), 428–434.
doi:10.1002/bmb.2003.494031060288
Baty, P. (2006, June 30). Class is allowed
to set exam. Times Higher Education
Supplement. New York, NY.
…ln/v10n1/assessment-and-collaboration-online-learning
Taşci, T., Parlak, Z., Kibar, A., Taşbaşi,
N., & Cebeci, H. İ. (2014). A novel agent-
supported academic online examination
system. Journal of Educational
Technology & Society, 14(1), 154–168.
Wilson, E. V. (2004). ExamNet
asynchronous learning network:
Augmenting face-to-face courses with
student-developed exam questions.
Computers & Education, 42(1), 87–107.
doi:10.1016/S0360-1315(03)00066-6
Wright, N. A., Meade, A. W., & Gutierrez,
S. L. (2014). Using Invariance to Examine
Cheating in Unproctored Ability Tests.
International Journal of Selection and
Assessment, 22(1), 12–22.
doi:10.1111/ijsa.12053