HPGD2303 CGS04254626
HPGD2303
EDUCATIONAL ASSESSMENT
Table of Specifications
Topic 4 (2 hours, 20% of hours, 6 items)
  Comprehension: 3 items, 15 marks (Items 6, 7, 9)
  Application: 2 items, 10 marks (Items 8, 10)
  Analysis: 1 item, 5 marks (Item 11)

Topic 6 (3 hours, 30% of hours, 5 items)
  Comprehension: 2 items, 10 marks (Items 12, 14)
  Application: 1 item, 5 marks (Item 13)
  Analysis: 2 items, 10 marks (Items 15, 16)

Topic 9 (4 hours, 40% of hours, 4 items)
  Comprehension: 2 items, 10 marks (Items 17, 18)
  Application: 1 item, 5 marks (Item 19)
  Analysis: 1 item, 5 marks (Item 20)

Total (10 hours, 100%, 20 items)
  Comprehension: 9 items, 45 marks
  Application: 7 items, 35 marks
  Analysis: 4 items, 20 marks
This distribution mirrors the relative importance of each topic within the course,
ensuring that the test covers both foundational and advanced aspects of educational
assessment.
Overall, the test paper is structured so that approximately 45% of the items assess
comprehension, around 35% assess application, and about 20% assess analysis. This
balanced distribution ensures a comprehensive evaluation of student proficiency in
educational assessment, covering both fundamental concepts and advanced critical thinking
skills.
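The 45/35/20 split quoted above follows directly from the item counts in the Table of Specifications (9, 7, and 4 items out of 20); a quick arithmetic check:

```python
# Item counts per cognitive level, taken from the Table of Specifications.
items = {"comprehension": 9, "application": 7, "analysis": 4}
total = sum(items.values())  # 20 items in the paper

for level, count in items.items():
    print(f"{level}: {count / total:.0%}")
# comprehension: 45%
# application: 35%
# analysis: 20%
```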
4. Overall Justification:
The selected topics and the distribution of teaching hours and test items are designed
to provide a holistic evaluation of a student’s understanding of educational
assessment.
The breakdown aligns with the course outcomes by not only testing recall of
information but also requiring students to apply and analyze the concepts learned.
This structured approach ensures that the test paper is both comprehensive and
balanced, effectively covering theoretical foundations and practical applications of
assessment principles.
QUESTION 2: TEST PAPER AND COMMENTARY
Section A: COMMENTARY (Essay Format)
Introduction
The test paper for HPGD2303 is designed to assess a range of competencies in
educational assessment using 20 multiple-choice questions (MCQs) distributed across four
key topics. The purpose is to evaluate both foundational and higher-order cognitive skills—
comprehension, application, and analysis—through items constructed at a moderate difficulty
level. This ensures that the questions are engaging, thought-provoking, and accessible,
without being so easy that answers can be guessed without genuine understanding.
4. Background/Profile of Test-Takers
The target audience for this test paper comprises HPGD2303 course participants and
teacher education students who already possess foundational knowledge in educational
assessment. These individuals are expected to be familiar with basic concepts and are
prepared to engage with both theoretical and practical aspects of assessment. The test is
designed to be accessible and supportive, reinforcing learning through clear, straightforward
questions.
5. Effectiveness of the 20 MCQs in Measuring Understanding
The 20 MCQs are distributed across four key topics—Roles of Assessment, Constructing
Objective Test Items, Authentic Assessment, and Appraising Classroom Test & Item
Analysis—ensuring a well-rounded evaluation. This distribution:
- Covers essential content: Each topic contributes specific aspects that are crucial for understanding educational assessment.
- Balances cognitive demands: By including items at different cognitive levels, the test measures not only recall but also the application and analysis of assessment principles.
- Facilitates formative use: The design supports the provision of constructive feedback, enabling students to identify strengths and areas for improvement.
This structured categorization ensures that the test paper not only evaluates lower-order
cognitive skills (such as recall and basic understanding) but also higher-order skills (such as
application and analysis). Such an approach is essential for a comprehensive assessment,
ensuring that students can demonstrate a deep and practical understanding of educational
assessment principles.
7. Test-Taker Instructions
The following instructions are provided to ensure clarity and consistency:
- This paper contains 20 questions. Answer all questions.
- Read each question carefully: choose the best answer from the four options provided.
- Mark only one answer per question: each question is followed by four answer choices (A, B, C, and D). Circle only one answer for each question.
- Time management: complete the test within 30 minutes.
- Review your answers: check your responses before submitting the test paper.
- Feedback: after submitting your test, you will be able to see which questions you answered incorrectly, along with the correct answers.
Conclusion
In summary, the test paper is meticulously designed to align with the course
objectives and the Table of Specifications. By employing well-constructed MCQs, the test
effectively measures a range of cognitive skills and ensures a fair, objective, and accessible
assessment process. The clear alignment with Bloom’s Taxonomy and the inclusion of
detailed instructions further supports its validity, reliability, and overall educational
effectiveness. This approach not only assesses student performance but also promotes
continuous learning and improvement, reflecting the principles outlined in the HPGD2303
textbook.
Section B: Item Details Table
The following table summarizes the key details and rationale for each test item:
Item 10 (Topic 4, Application)
How can a teacher reduce guesswork in MCQs?
A. By aligning each item with a specific learning outcome.
B. By using distractors that differ noticeably from the correct answer.
C. By limiting choices to only two alternatives.
D. By providing options that are similar in length and structure.
Rationale: Assesses how guesswork can be minimized in MCQs. The correct option (D) states that providing distractors similar in length and structure to the correct answer prevents students from using test-taking strategies rather than knowledge.

Item 13 (Topic 6, Application)
Which of the following is an example of an authentic assessment task?
A. A timed quiz that tests vocabulary and grammar.
B. A fill-in-the-blank test covering isolated facts.
C. A multiple-choice test on theoretical concepts only.
D. A project where students design a plan to address a local issue.
Rationale: Identifies an authentic assessment task. The correct option (D) presents a project-based assessment that applies knowledge to real-world problem-solving.

Item 16 (Topic 6, Analysis)
What challenge may arise in designing authentic assessments?
A. Ensuring tasks are closely linked to practical scenarios.
B. Avoiding tasks that are too detached from real-life applications.
C. Providing clear, measurable outcomes that may require extra grading time.
D. Reducing the need for extensive teacher feedback by oversimplifying tasks.
Rationale: Identifies a key challenge in authentic assessment. The correct option (C) points out that grading authentic assessments can be time-consuming due to the need for detailed evaluation and feedback.
Question 3: Item Analysis (Difficulty Index & Discrimination Index)
Topic    Items     No. of Items
1        1-5       5              4
4        6-11      6              4
6        12-16     5              4
9        17-20     4              0
Total              20             12
Topic 1: Roles of Assessment in Teaching & Learning (5 Items)

1. What is the primary purpose of assessment in teaching?
A. To assign grades for summative evaluation.
B. To evaluate learning and guide instruction.
C. To rank students based solely on performance data.
D. To record student progress as a historical account.

2. In formative assessment, which element is most critical for enhancing student learning?
A. Immediate and constructive feedback provided during lessons.
B. Final grading at the end of the term without review.
C. Standardized testing with delayed analysis of results.
D. Ranking students solely based on numerical scores.

3. A teacher uses regular quizzes and gives immediate feedback. This approach exemplifies which type of assessment?
A. Criterion-referenced assessment.
B. Norm-referenced assessment.
C. Formative assessment.
D. Summative assessment.

4. How is assessment data best used to improve instruction?
A. By ranking students solely on performance metrics.
B. By assigning final grades without further analysis.
C. By recording student performance for archival purposes.
D. By guiding targeted changes in teaching methods.

5. Which factor best differentiates formative from summative assessment?
A. The use of standardized tests with no revision.
B. The provision of immediate feedback during learning.
C. The focus on memorization of factual content.
D. The allocation of final grades without feedback.

Topic 4: Constructing Objective Test Items (6 Items)

6. What defines an objective test item in assessment?
A. A question with a clear, single correct answer and consistent options.
B. A question that permits multiple valid interpretations.
C. A prompt for open-ended responses.
D. A task requiring subjective judgment and varied answers.

7. Why is precise wording crucial when constructing MCQs?
A. It may allow ambiguous interpretations that mislead students.
B. It unnecessarily complicates the question without added clarity.
C. It introduces confusion through excessive technical language.
D. It reduces ambiguity and supports objective scoring.

8. How should distractors be constructed in an effective MCQ?
A. They should mirror the correct answer in both length and style.
B. They must be noticeably shorter than the correct answer.
C. They should be uniform in length and style to avoid unintended clues.
D. They should be significantly longer than the correct answer choice.

9. What improves the validity of an MCQ item?
A. Ambiguous phrasing that invites varied interpretations.
B. A lengthy question including extraneous details.
C. Clear, concise wording that aligns with the learning objective.
D. Excessive detail that obscures the main idea.

10. How can a teacher reduce guesswork in MCQs?
A. By aligning each item with a specific learning outcome.
B. By using distractors that differ noticeably from the correct answer.
C. By limiting choices to only two alternatives.
D. By providing options that are similar in length and structure.

11. Which pitfall is avoided in the revised MCQ design?
A. Including options that vary widely in phrasing and word count.
B. Making distractors exactly match the length of the correct answer.
C. Using distractors that are noticeably longer than the correct option.
D. Allowing distractors to vary widely in style and phrasing.

Topic 6: Authentic Assessment (5 Items)

12. Which statement best defines authentic assessment?
A. An evaluation focused solely on rote memorization.
B. An assessment that simulates real-life tasks and challenges.
C. A test that relies exclusively on theoretical questions.
D. An evaluation that neglects practical application.

13. Which of the following is an example of an authentic assessment task?
A. A timed quiz that tests vocabulary and grammar.
B. A fill-in-the-blank test covering isolated facts.
C. A multiple-choice test on theoretical concepts only.
D. A project where students design a plan to address a local issue.

14. Why is authentic assessment beneficial in modern education?
A. It connects theoretical knowledge with practical applications.
B. It focuses solely on memorization of key facts.
C. It minimizes the need for extensive student feedback.
D. It reduces complex subjects to basic recall tasks.

15. What promotes higher-order thinking in authentic assessment?
A. Encouraging rote repetition of isolated facts.
B. Focusing solely on memorizing definitions and formulas.
C. Requiring analysis and application of concepts in practical scenarios.
D. Limiting tasks to simple recall of previously taught material.

16. What challenge may arise in designing authentic assessments?
A. Ensuring tasks are closely linked to practical scenarios.
B. Avoiding tasks that are too detached from real-life applications.
C. Providing clear, measurable outcomes that may require extra grading time.
D. Reducing the need for extensive teacher feedback by oversimplifying tasks.

Topic 9: Appraising Classroom Test & Item Analysis (4 Items)

17. What is the primary purpose of item analysis in classroom tests?
A. To review the overall structure of the test paper.
B. To rank students solely by their numerical scores.
C. To assess the clarity of test items.
D. To evaluate the difficulty and discrimination of test items.

18. A high difficulty index in item analysis indicates that:
A. Most students answer the item incorrectly, implying it is difficult.
B. The item discriminates poorly between high and low performers.
C. The item has ambiguous wording.
D. Most students answer the item correctly, implying it is easy.
HPGD2303 Mind Test – Test Paper (January 2025 Semester), Paper 1
40 MINUTES
Topic    Items     No. of Items
1        1-5       5              4
4        6-11      6              5
6        12-16     5              4
9        17-20     4              2
Total              20             15
Test Administration Overview
For this analysis, the test paper (comprising 20 MCQs) was distributed as a document
to 30 participants. These included 26 classmates and 4 teacher education students who
possess foundational knowledge in educational assessment. Participants marked their answers
using a pen tool on their devices and returned the completed document electronically. I then
scored each test paper manually in a word-processing application. After scoring, I provided
each participant with a feedback document showing which items were answered incorrectly,
alongside the correct answers.
A higher value indicates that an item is easier (i.e., a large proportion of students answered
correctly). This formula is detailed in the HPGD2303 Educational Assessment textbook
(Open University Malaysia, 2021).
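A sketch of the difficulty-index formula consistent with the values in the Item Analysis Table below (assuming, as the table suggests, 10 candidates in each of the high- and low-scoring groups):

```latex
% Difficulty index p for an item, with H_c and L_c the numbers of correct
% responses in the high- and low-scoring groups (10 candidates each):
p = \frac{H_c + L_c}{N} \times 100\%, \qquad N = 20
% e.g. Item 1: p = (10 + 4)/20 \times 100\% = 70\%
```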
A higher value suggests that the item is effective in differentiating students who understand
the material from those who do not. This method of calculation is also referenced in the
HPGD2303 textbook (2021).
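Similarly, a sketch of the discrimination-index formula consistent with the table (the group size n = 10 is an assumption inferred from the data):

```latex
% Discrimination index D for an item:
D = \frac{H_c - L_c}{n}, \qquad n = 10
% e.g. Item 2: D = (10 - 3)/10 = 0.7
```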
Item Analysis Table

Item              1     2     3     4     5     6     7     8     9     10
Hc                10    10    7     9     9     10    10    7     9     8
Lc                4     3     2     4     3     4     4     3     4     3
Hc + Lc           14    13    9     13    12    14    14    10    13    11
Hc - Lc           6     7     5     5     6     6     6     4     5     5
Difficulty (%)    70    65    45    65    60    70    70    50    65    55
Discrimination    0.6   0.7   0.5   0.5   0.6   0.6   0.6   0.4   0.5   0.5

Item              11    12    13    14    15    16    17    18    19    20
Hc                8     10    10    10    10    8     7     8     6     6
Lc                4     6     6     6     6     3     4     2     4     4
Hc + Lc           12    16    16    16    16    11    11    10    10    10
Hc - Lc           4     4     4     4     4     5     3     6     2     2
Difficulty (%)    60    80    80    80    80    55    55    50    50    50
Discrimination    0.4   0.4   0.4   0.4   0.4   0.5   0.3   0.6   0.2   0.2
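The table's two indices can be reproduced with a short script. This is a sketch under the assumptions above (10 candidates per group; the function names are illustrative, not from the source):

```python
# Sketch of the item-analysis calculations used in the table above.
# Assumption: 10 candidates in the high group and 10 in the low group;
# hc and lc are correct-response counts per item in each group.

def difficulty_index(hc: int, lc: int, group_size: int = 10) -> float:
    """Percentage of all analysed candidates who answered correctly."""
    return (hc + lc) / (2 * group_size) * 100

def discrimination_index(hc: int, lc: int, group_size: int = 10) -> float:
    """Difference in correct-response rates between the two groups."""
    return (hc - lc) / group_size

# Item 1 from the table: Hc = 10, Lc = 4
print(difficulty_index(10, 4))      # 70.0
print(discrimination_index(10, 4))  # 0.6
```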
The difficulty index ranges from 45% to 80%, indicating that the test items
generally fall within an acceptable range. Items with a difficulty index of 70% or above (such
as Items 7, 12, 13, 14, and 15) were relatively easy, as most students answered them correctly.
In contrast, items with a difficulty index around 50% (such as Items 8, 18, 19, and 20) were
more challenging, ensuring that students could not answer them without a strong
understanding of the concepts.
The discrimination index ranges from 0.2 to 0.7. Items with a discrimination index of
0.4 or higher are effective in distinguishing high and low performers. For example, Item 2
(0.7) and Item 18 (0.6) effectively differentiated between students with a strong grasp of the
material and those with a weaker understanding. However, Items 19 and 20, each with a
discrimination index of 0.2, are borderline and may require revision to improve their
effectiveness.
Most items have moderate difficulty levels (between 45% and 80%), which suggests
that they are neither too hard nor too easy. The majority of items also have acceptable
discrimination indices (ranging from 0.4 to 0.7), indicating that they generally differentiate
well between students who understand the material and those who do not.
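One practical way to act on this analysis is to flag items whose discrimination index falls below a chosen cut-off. The sketch below uses the discrimination values from the Item Analysis Table with an illustrative cut-off of 0.3 (the cut-off is an assumption; the commentary treats 0.4 and above as clearly acceptable and 0.2 as borderline):

```python
# Discrimination indices per item, copied from the Item Analysis Table.
discrimination = {
    1: 0.6, 2: 0.7, 3: 0.5, 4: 0.5, 5: 0.6, 6: 0.6, 7: 0.6, 8: 0.4,
    9: 0.5, 10: 0.5, 11: 0.4, 12: 0.4, 13: 0.4, 14: 0.4, 15: 0.4,
    16: 0.5, 17: 0.3, 18: 0.6, 19: 0.2, 20: 0.2,
}

# Flag items below an illustrative 0.3 cut-off as candidates for revision.
flagged = [item for item, d in discrimination.items() if d < 0.3]
print(flagged)  # [19, 20]
```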
The discrimination indices mostly fall within the acceptable range, with most items at or above
0.4, confirming that the test effectively differentiates students based on their understanding.
Conclusion
The item analysis—using the Difficulty Index and Discrimination Index formulas from the
HPGD2303 textbook—demonstrates that the test paper is generally effective in assessing
student understanding of educational assessment principles. The majority of items show
acceptable difficulty and discrimination levels, confirming that the test is both appropriately
challenging and fair. Nonetheless, a few items require minor revisions to enhance their
discriminative power. This analysis provides a solid foundation for further refinement of the
test, ensuring that it continues to measure both fundamental and higher-order cognitive skills
in alignment with the course objectives.
Appendix
Topic    Items     No. of Items
1        1-5       5
4        6-11      6
6        12-16     5
9        17-20     4
Total              20
PART II: ONLINE CLASS PARTICIPATION
Reference: