
Unit 3 Lesson 3

Item Analysis and Validation

Ed 7 (Assessment in Learning 1)

Prof. Ruby R. Briones
Course Professor
Objectives

• Explain the important terms in item analysis
• Differentiate validity and reliability
• Conduct an item analysis given a set of raw data
Outline

• Item analysis
• Difficulty Index
• Index of Discrimination
• Steps in Conducting an Item Analysis
• Sample Item Analysis computation
• Validity
• Reliability
ITEM ANALYSIS
(Thompson, 2021)

• Item analysis refers to the process of statistically analyzing assessment data to evaluate the quality and performance of your test items.
• This is an important step in the test development cycle, not only because it helps improve the quality of your test, but because it provides documentation for validity: evidence that your test performs well and that score interpretations mean what you intend.
Goals of Item Analysis

• Identify the items which are not performing well (as determined by the difficulty index and the index of discrimination).
• Find out why those items are not performing well: the items may be too difficult or too easy, confusing, non-discriminating, miskeyed, or perhaps biased against a minority group.
Difficulty Index

• The level of difficulty of an item is found by computing its difficulty index.
• It is the number of students who answered the item correctly divided by the total number of students who took the test.

Difficulty Index = No. of students with correct answer ÷ Total no. of students

Example: If 25 of the 70 students who took the test did not answer item #1 correctly, then

DI = (70 - 25) ÷ 70 = 45 ÷ 70 ≈ 0.64
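The example above can be checked with a few lines of Python; the function name is illustrative, and the numbers are the ones from the example.

```python
# Difficulty index = number of correct answers / number of examinees.
def difficulty_index(num_correct, num_examinees):
    return num_correct / num_examinees

# From the example: 70 examinees, 25 of whom got item #1 wrong.
di = difficulty_index(70 - 25, 70)
print(round(di, 2))  # 0.64
```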
Index of Discrimination

• Difficult items tend to discriminate between those who know and those who do not know, while easy items cannot discriminate between the two groups.
• The index of discrimination is a measure that tells whether or not an item can discriminate between these two groups of students.
Steps in Conducting an Item Analysis

1) Arrange the answer sheets of the students from highest to lowest score.

2) Identify the upper 25% class and the lower 25% class.
Example: If there are 60 students who took the exam, then
60 × 25% = 60 × 0.25 = 15
Therefore 15 students belong to the upper 25% class and 15 students belong to the lower 25% class.

3) The answers of these 30 students will be the data for the item analysis.
Steps in Conducting an Item Analysis (continued)

4) Each item in the test will be subjected to item analysis. For each item, count the number of correct responses on the answer sheets of the 15 upper-class students and the 15 lower-class students.

5) Compute the difficulty index of each item for the upper class (DU).
6) Compute the difficulty index of each item for the lower class (DL).
7) Compute the index of discrimination: ID = DU − DL.
8) Take ACTION on each test item: RETAIN, REVISE, or REMOVE.
9) Revisit the TOS after removing or discarding some items, because some objectives may no longer be represented. Those items need to be replaced for the TOS to be followed, and the newly replaced test items should be subjected to item analysis again.
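Steps 1–7 can be sketched in Python. The data layout (a list of per-student records holding a total score and a 1/0 answer per item) and all names below are illustrative assumptions, not part of the lesson.

```python
# A minimal sketch of steps 1-7, assuming each student record holds a
# total score and a dict of item -> 1 (correct) / 0 (wrong).
def item_analysis(students, item):
    ranked = sorted(students, key=lambda s: s["total"], reverse=True)
    n = round(len(ranked) * 0.25)              # size of each 25% group
    upper, lower = ranked[:n], ranked[-n:]
    du = sum(s["answers"][item] for s in upper) / n   # difficulty index, upper class
    dl = sum(s["answers"][item] for s in lower) / n   # difficulty index, lower class
    return du, dl, du - dl                     # DU, DL, index of discrimination

# Eight students; with 8 x 0.25 = 2, the top 2 and bottom 2 form the groups.
students = [{"total": t, "answers": {"q1": a}}
            for t, a in [(50, 1), (45, 1), (40, 1), (30, 0),
                         (25, 0), (20, 1), (15, 0), (10, 0)]]
du, dl, idx = item_analysis(students, "q1")
print(du, dl, idx)  # 1.0 0.0 1.0
```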
Table of Action for values of Difficulty Index and Index of Discrimination

Difficulty Index          Index of Discrimination                        Action
------------------------  ---------------------------------------------  ------
Difficult                 Not discriminating (ID ≤ 0.19)                 REMOVE
(0.0 to 0.40)             Moderately discriminating (0.20 ≤ ID ≤ 0.29)   REVISE
                          Discriminating (ID ≥ 0.30)                     RETAIN

Moderately Difficult      Not discriminating (ID ≤ 0.19)                 REVISE
(0.41 to 0.60)            Moderately discriminating (0.20 ≤ ID ≤ 0.29)   REVISE
                          Discriminating (ID ≥ 0.30)                     RETAIN

Easy                      Not discriminating (ID ≤ 0.19)                 REMOVE
(0.61 & above)            Moderately discriminating (0.20 ≤ ID ≤ 0.29)   REVISE
                          Discriminating (ID ≥ 0.30)                     RETAIN
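The lookup can be sketched as a small Python function. The cut-off values are copied from the table; the function name is illustrative, and any ID below 0.20 is treated as "not discriminating".

```python
# A sketch of the action table: cut-offs taken from the slide, with
# ID < 0.20 treated as "not discriminating".
def action(difficulty, discrimination):
    if discrimination >= 0.30:
        return "RETAIN"                    # discriminating
    if discrimination >= 0.20:
        return "REVISE"                    # moderately discriminating
    # Not discriminating:
    if 0.41 <= difficulty <= 0.60:         # moderately difficult item
        return "REVISE"
    return "REMOVE"                        # difficult or easy item

print(action(0.50, 0.10))  # REVISE
print(action(0.35, 0.35))  # RETAIN
print(action(0.70, 0.05))  # REMOVE
```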
Interpretation

Difficulty Index                       Index of Discrimination
Difficult (0.0 to 0.40)                Not discriminating (ID ≤ 0.19)
Moderately Difficult (0.41 to 0.60)    Moderately discriminating (0.20 ≤ ID ≤ 0.29)
Easy (0.61 & above)                    Discriminating (ID ≥ 0.30)

• Interpretation:
1) All items which are "moderately difficult" are either RETAINED or REVISED.
2) A "difficult" item is RETAINED only if it is "discriminating". Why?
3) An "easy" item is RETAINED if it is either "moderately discriminating" or "discriminating". Why?
4) When is an item removed or discarded? Why?
Validation

• Purpose: to determine the characteristics of the whole test itself (validity and reliability).
• It is the process of collecting and analyzing evidence to support the meaningfulness and usefulness of the test.
• Validity is the extent to which a test measures what it is supposed to measure.
• Validity is the appropriateness, correctness, meaningfulness, and usefulness of the specific decisions a teacher makes based on the test results.
Evidence for Test Validation

• Content-related evidence of validity – refers to the content and format of the instrument (example: a test/exam).
• Criterion-related evidence of validity – refers to the relationship between scores obtained using the instrument and scores obtained using one or more other tests (criteria).
• Construct-related evidence of validity – refers to the nature of the psychological construct or characteristic being measured by the test.
RELIABILITY

• It refers to the consistency of the scores obtained: how consistent they are for each individual from one administration of an instrument to another and from one set of items to another.
• Formula: KR-20 or KR-21 (Kuder-Richardson 20 or 21).
• A valid instrument is also reliable; however, a reliable instrument is not necessarily valid.
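The KR-20 coefficient mentioned above can be computed directly from an item-score matrix. This is a minimal sketch assuming rows are students, columns are items scored 1/0, and the population variance of the total scores is used; the function name is illustrative.

```python
# KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)),
# where k = number of items, p_i = proportion correct on item i,
# q_i = 1 - p_i.  scores: rows = students, columns = items (1/0).
def kr20(scores):
    k = len(scores[0])                          # number of items
    n = len(scores)                             # number of students
    totals = [sum(row) for row in scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in scores) / n   # proportion correct, item j
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var)

# Four students, three items (made-up illustrative data).
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(kr20(scores))  # 0.75
```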
Change is the end result
of true learning.

-Leo Buscaglia-
Answer the Exercise on Item Analysis
References

• Thompson, N. (2021). What is Item Analysis? https://assess.com/what-is-item-analysis/
• Navarro, R. L., et al. (2017). Assessment of Learning 1 (3rd Edition). Lorimar Publishing Inc., QC, Metro Manila, Philippines. pp. 85-95.
