Module in Ed 107-Assessment in Learning Ii (Chapter 3)

The document discusses competency assessment instruments for trainers, including how to develop an institutional competency evaluation tool and evidence plan to properly assess trainees' knowledge, skills, and attitudes according to competency standards. An effective evaluation tool must be reliable, valid, objective, discriminating, and easy to administer and score. The evidence plan serves as a guide for collecting evidence to determine if trainees have demonstrated critical performance criteria for a competency.

ASSESSMENT IN LEARNING II WITH FOCUS ON TRAINERS METHODOLOGY I & II 1

ED 107
ASSESSMENT IN LEARNING II
WITH FOCUS ON
TRAINERS METHODOLOGY I & II

BACHELOR OF TECHNOLOGY AND LIVELIHOOD EDUCATION madcervas



CHAPTER 3
Competency Assessment Instruments
LESSON 1
Institutional Competency Evaluation

Learning Outcomes:

At the end of the lesson, you should be able to:


 Determine the objectives of an institutional competency evaluation
 Identify the parts of an Institutional Competency Evaluation Tool

Introduction
Evaluation is a very significant element of the teaching-learning process. It is
done to verify the acquisition of the knowledge, skills and attitudes needed from
the training.
As a trainer, you must know how to test or verify that the assessment criteria
were addressed during the training.

Institutional Competency Evaluation


Institutional Competency Evaluation is the assessment of the knowledge, skills and
attitudes acquired from the training. In CBT (Competency Based Training),
evaluation is the systematic collection and analysis of data needed to make decisions
whether a trainee is competent or not yet competent.

The Institutional Competency Evaluation is administered by the trainer within the
training duration. Trainees should be evaluated after every competency. No trainee
should be allowed to transfer to another competency without having been assessed.

For the purpose of CBT, assessments are usually given for the following purposes:
1. To validate the current competencies of trainees
2. To measure how much trainees have learned in the training sessions given
3. To help diagnose trainee’s problems and guide future instruction
4. To decide whether trainees are competent or not

The Institutional Competency Evaluation Tool

The competency evaluation tool should be carefully developed so that it will be
able to assess the four dimensions of competency, namely:


1. Task Skills
2. Task Management Skills
3. Job Role and Environment Management Skill
4. Contingency Management Skills

An analysis of the Modules of Instruction or the Competency Standards is critical
in the preparation of the assessment tool. Performance criteria for the competency
are the main basis for the competency assessment. You should carefully examine
your competency standards so that these criteria are included as part of the
evidence to be gathered during assessment.

Characteristics of Good Evaluation Tool

1. Reliability
This refers to the consistency of scores obtained by the same person when
re-examined with the same test on different occasions. Your test is reliable if it
consistently measures what it is trying to measure.

Factors that may affect Reliability


a. Length of the test- the longer the test, the higher the reliability.
b. Difficulty of the test- the bigger the spread of the scores, the more reliable
the measured differences are likely to be. Items should be neither too easy nor
too difficult.
c. Objectivity- this is achieved if scores are independent of the subjective
judgment of individual examiners.

To increase the reliability of the written test, we do item analysis, that is,
analyzing the degree of difficulty and the index of discrimination of the test
items. Standard written test items should be neither too easy nor too difficult,
and they should discriminate those who learned from those who did not.

2. Validity
This is the degree to which the test actually measures what it purports to
measure. It provides a direct check on how well the test fulfils its functions.

Factors that influence the validity of test:


a. Appropriateness of test items;
b. Directions;
c. Reading vocabulary and sentence structures;
d. Difficulty of items;
e. Construction of test items- no ambiguous or leading items;
f. Length of the test- sufficient length;
g. Arrangement of items- from easy to difficult; and
h. Patterns of answers- no patterns

To ensure the validity of the evaluation tool, prepare an Evidence Plan based on CS
(Competency Standard). To increase the validity of the written test, you should
prepare a table of specification.


3. Objectivity
The test must be fair to all examinees.
4. Discrimination
It must distinguish the good examinees from the poor ones.
5. Ease of Administration and Scoring
The test must have the right length and level of sophistication to do the job.

Parts of the Competency Evaluation Tool


1. Evidence plan
2. Written test
3. Performance test
4. Questioning tool (with answers)

Use separate sheet attached for your answers.

Test Your Understanding:

1. What is institutional competency evaluation? Who administers the evaluation,
and when is it administered?

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 1 (Chapter 3)
Institutional Competency Evaluation
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________


LESSON 2
Evidence Plan

Learning Outcomes:

At the end of the lesson, you should be able to:


 Explain the purpose of preparing an evidence plan
 Determine the sources of the contents of the evidence plan
 Identify methods appropriate for evaluating a performance criterion
 Prepare the evidence plan

Introduction
One essential part of the Competency-Based Training Delivery is the institutional
assessment. Assessment is the process of collecting evidence and making judgments
on whether competency has been achieved. The purpose of assessment is to confirm
that an individual can perform to the standards expected in the workplace as
expressed in the relevant competency standards.

The Evidence Plan


In developing evidence-gathering tools for an institutional assessment, the first
stage is to prepare an evidence plan.

Evidence plans are designed to:


 Serve as a planning tool
 Support the assessment process
 Assist the collection of evidence
 Inform the learners what is expected of them before they begin the assessment
 Serve as a guide for the trainer in determining the method of assessment to be
used

In making an evidence plan, you should have the Competency Standard (CS) of the
chosen competency and the Evidence Plan Template.

Competency Standard : the Competency Standard of the Qualification
Unit of Competency  : the Unit of Competency to be assessed

Ways in which evidence will be collected (tick the column) -
Methods of Assessment:
    - Demonstration & Questioning
    - Observation & Questioning
    - Third Party Report
    - Portfolio
    - Written

Evidence Requirements (Evidence must show that the trainee.....):
    *
    *
    *
    *
Note: * Critical aspects of competency


Critical aspects of competency are the performance criteria that are listed in
the evidence guide of the Competency Standard (CS) as critical. These criteria
must be demonstrated by the trainee for him to be evaluated as competent. You
should prepare an institutional competency assessment tool that will show these
evidences.

Parts of the Evidence Plan


1. Competency Standard- this is the title of your qualification.
2. Unit of Competency- the institutional evaluation tool is packaged per
competency. The name of the competency is written in this portion.
3. Evidence Requirement- the criteria for judging the competency of the
trainee. These are written in the competency standards. Critical aspects of
competency should be marked with an asterisk (*). Refer to the CS for the
identification of the critical aspects of competency.
4. Methods of Assessment- the methods of collecting evidence for each
performance criterion. At least two methods of assessment should be chosen for
each criterion to allow for corroboration of evidence.

Knowledge, skills and attitude and the four dimensions of competency are to be
assessed. To do this, the following methods are recommended.

4.1 Written test- to test the acquisition of knowledge

4.2 Performance test- to test the demonstrated skills
i. Demonstration method- this is the method used when the
performance of a particular skill is to be assessed within the
workshop.

ii. Observation method- used when the assessment is done by
observing the trainee on the actual job site while the trainee is
doing his job.

iii. Portfolio evaluation- used when projects or outputs are
required to collect evidence of competency. In Institutional
Evaluation, we use the Performance Criteria Checklist to
evaluate the output/project.

4.3 Interview/questioning- this is to verify evidence which is not clearly
demonstrated during the performance test. This is also the part of the
competency evaluation where you can ask questions to verify Job Role and
Environment Management Skills and Contingency Management Skills.


Sample Evidence Plan


Sample Tasks Sheet

Sample Performance Criteria Checklist

(Excerpt: TESDA-CBLM - Planning Training Session)


Use separate sheet attached for your answers.

Activity: L2 (Chapter 3)

1. Task: Make an Evidence Plan: Given the sample guide, make an evidence plan
of your chosen Qualification Title (Dressmaking NC II, Cookery NC II, etc.).

 Search for the Training Regulations (TR) of the Qualification you chose.
This will serve as your guide in the formulation of your Evidence Plan.
 Given the sample template, make an Evidence Plan; see the sample for your
guidance.

Template for Evidence Plan

Competency Standard :
Unit of Competency  :

Ways in which evidence will be collected (tick the column):
    - Demonstration & Questioning
    - Observation & Questioning
    - Third Party Report
    - Portfolio
    - Written

Evidence must show that the trainee.....
    *
    *
    *
    *

Prepared by: Date:


Checked by: Date:

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 2 (Chapter 3)
Evidence Plan
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________


LESSON 3
Table of Specification

Learning Outcomes:

At the end of the lesson, you should be able to:


 Discuss the importance of preparing a table of specification
 Determine the parts of the table of specification
 Explain how the table of specification is prepared
 Prepare a table of specification

Introduction

The evidence plan is a plan for the institutional evaluation tool. After preparing
the evidence plan, we are now ready to develop the other parts of the evaluation
tool, such as the written test.

To ensure the validity of your test, you should prepare a table of specifications
so that every content area to be tested has a representative question.

Table of Specification
A table of specifications shows what will be tested (taught). For our purpose of
institutional evaluation, we shall prepare a table of specifications for our
written test. This will help us plan how many items we need to cover all the
contents or objectives to be assessed based on the evidence plan previously
prepared.

A table of specifications is a two-way table that matches the objectives or content
you have taught with the level at which you expect students to perform. It contains
an estimate of the percentage of the test to be allocated to each level at which it
is to be measured. In effect, we have established how much emphasis to give each
objective or topic.

Parts of the Table of Specification


1. Objectives/Content/Topic- these are the contents or objectives to be tested
2. Levels of learning- your questions shall be divided into the levels of learning:
knowledge, comprehension and application.
 Factual/knowledge- recognition and recall of facts
Example:
The figure 1 in the symbol E6013 signifies
a. Tensile strength
b. Welding position
c. Material thickness
d. Maximum weld length
 Comprehension- interpret, translate, summarizes or paraphrase given
information
Example:


The megger is used to


a. Measure the amount of illumination
b. Determine the speed of electric motor
c. Measure the resistance of a lightning cable
d. Test the insulation resistance of a circuit
 Application- uses information in a situation different from the original
learning context.
Example:
To measure the voltage of a circuit, you connect
a. A voltmeter across the line
b. An ammeter across the line
c. A voltmeter in series with the line
d. An ammeter in series with the line

3. Percentage/number of items

Objectives/Content/Topic                 Knowledge   Comprehension   Application   Total # of items / % of test
Learners training requirements                                                     20%
Session plan                                                                       20%
Assessment instruments (institutional)                                             20%
Basic instructional materials                                                      30%
Learning and teaching resources                                                    10%
TOTAL                                                                              100%

We also have to take into account the type of thinking skills we wish to assess.
Whether you use Bloom's Taxonomy or another structure, the levels of learning can
help you identify the types of questions (or other types of assessment) that are
appropriate. For ease of use we have used only three levels: knowledge,
comprehension and application, and labeled the columns accordingly. The important
thing is to use levels of thinking that are relevant for your students and have
been incorporated in your instruction. At this stage it can be helpful to mark an
"x" or make a check mark in the cells to show the levels at which each objective
will be measured, as shown in the example below.

Objectives/Content/Topic                 Knowledge   Comprehension   Application   Total # of items / % of test
Learners training requirements           x (10%)     x (5%)          x (5%)        20%
Session plan                             x (5%)      x (5%)          x (10%)       20%
Assessment instruments (institutional)               x (10%)         x (10%)       20%
Basic instructional materials            x (10%)     x (10%)         x (10%)       30%
Learning and teaching resources                      x (5%)          x (5%)        10%
TOTAL                                    25%         35%             40%           100%


At this point we recognize that 25% of our test is to be on knowledge, 35% on
comprehension, and 40% on application. This does not mean that we must have
exactly 25 knowledge questions; it does mean that the score on the test will
reflect comprehension and application more heavily, and knowledge to a lesser
degree.

It may be that at this point you want to compare the test(s) provided by the textbook
publisher with your completed table of specifications. If they match and you think the
questions are well written, you may decide to use the test (or parts of the test)
provided with the text. On the other hand, you may find that it will be necessary for
you to create a test to provide an accurate assessment of what the students in your
class have learned.

One question frequently asked is how many questions are needed to adequately
sample the content representing an objective or topic. Increasing the number of
questions increases the probability that we will have a good estimate of what the
learner knows and can do.

When translated into the number of items per topic, the Table of Specification for
a 40-item test may look like this:

                                              TEST ITEM DISTRIBUTION
Objectives/Content/Topic                 Knowledge   Comprehension   Application   Total # of items   Percentage (%)
Training requirements                    4           2               2             8                  20%
Session plan                             2           2               4             8                  20%
Assessment instruments (institutional)               4               4             8                  20%
Basic instructional materials            4           4               4             12                 30%
Learning and teaching resources                      2               2             4                  10%
TOTAL                                    10          14              16            40                 100%
Note: This is a sample. The number of items is not prescribed. The trainer should decide on the
number of items based on the contents of the competency.
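As a rough illustration of the arithmetic (the topic names and 40-item total are taken from the sample table above; the code itself is an added sketch, not part of the module), the percentage allocations can be translated into item counts like this:

```python
# Illustrative sketch: converting Table of Specification percentages into item
# counts for a 40-item test. Topic shares are copied from the sample table.
TOTAL_ITEMS = 40

topics = [
    ("Training requirements", 0.20),
    ("Session plan", 0.20),
    ("Assessment instruments (institutional)", 0.20),
    ("Basic instructional materials", 0.30),
    ("Learning and teaching resources", 0.10),
]

# Round each topic's share of the total test length to whole items.
allocation = {name: round(TOTAL_ITEMS * share) for name, share in topics}
for name, items in allocation.items():
    print(f"{name}: {items} items")

# The rounded counts should still sum to the full test length.
assert sum(allocation.values()) == TOTAL_ITEMS
```

With uneven percentages the rounded counts may not sum exactly to the test length, so the final assertion is a useful check before writing items.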

For purposes of validating the current competencies of the trainees or for
identifying mastered contents, item placement may be indicated in the Table of
Specification for easier analysis. At this point you also have to decide how many
questions are needed to measure learning, what types of questions will be asked,
and whether a written assessment is sufficient to measure the competency. In most
cases, for skills training, a performance evaluation with interview may be more
appropriate as an assessment instrument, but the effectiveness of a written
assessment may be harnessed through the ingenuity and skill of the trainer. If,
however, the trainer decides on a performance evaluation, it should be reflected
in the evidence plan.


Sample Table of Specification

Use separate sheet attached for your answers:

Activity: L3 (Chapter 3)

1. Given the qualification you chose in Lesson 2, prepare a table of
specification. See the sample as your guide.

Test Your Understanding: L3 (Chapter 3)

1. What is the importance of preparing a table of specification?

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 3 (Chapter 3)
Table of Specification
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________


LESSON 4
Written Test

Learning Outcomes:

At the end of the lesson, you should be able to:


 Explain the advantage of preparing reliable test items
 Determine the type of test appropriate for testing knowledge contents
 Enumerate guidelines in preparing a written test
 Prepare a written test

Introduction
Evaluation of competency should assess knowledge, skills and attitude. A written
test is a method of assessment which can measure the knowledge, skills and
attitude learned in a training program, but trainers sometimes fail to develop
questions that test the level of skills and attitude.

This lesson discusses some tips and guidelines in preparing the written test. The
written test that you will write after this lesson should follow the guidelines in
preparing a test item.

In developing test items, always consider the five (5) characteristics of a good
test: validity, reliability, objectivity, discrimination, and ease of
administration and scoring.

As in the construction of a workload and functional project in shop work, test
construction should follow the same steps. In the construction of a competency
assessment instrument, the following steps are recommended:

1. Examine the established Training Regulations and determine your objectives.
This will help in the analysis of the basic skills and knowledge requirements
of the trade.
2. Construct the table of specifications. This will be your blueprint in
constructing individual test items; it will serve as a guide in the preparation
of a set of competency assessment instruments for a certain trade.
3. Construct more test items than the number required for a set of Competency
Assessment Instruments. This will facilitate item banking and will give an
allowance for correction when the test items are deliberated, whereby some
items might be deleted.
4. Assemble the items for the test. After grouping the items by type, arrange
them so that related items are together. The reason for this is obvious: it
saves the examinee time as the test is taken, and it will be easier to point
out where the examinee failed. In assembling items for the test, the
specification table should be followed.
5. Write clear and concise directions for each type of question. The directions
should tell the examinee what to do, how to do it and where to place the


responses. They should also contain an example taken from the subject matter
being tested.
6. Study every aspect of the assembled test. After the test is assembled and the
directions are written, it is a good policy to lay it aside for several days,
then pick it up again and review each part critically. Consider each item from
the point of view of the workers who will take the competency assessment. Try
to determine which items are ambiguous. Check the grammar and be sure that the
words used will be understood by the workers who will take the competency
assessment.

The written test that we shall prepare as a part of the institutional assessment
will largely measure the acquisition of knowledge. Skills and attitude shall be
measured using a performance test with questioning.

Guidelines for Teacher-Made Tests as to Format

1. Include easiest items first.


2. Group smaller items together, e.g. matching, completion, etc.
3. Put all of an item on the same page. Avoid splitting a matching exercise or
response to a multiple-choice question.
4. Number continuously.
5. Write clear, precise directions.
6. For ease of correcting, place blanks for responses to one side of the paper, or
use a separate answer sheet.
7. Avoid patterned responses in true-false, multiple-choice, or matching
exercises.
8. Proofread the test carefully for clarity, errors, etc.
9. Make sure copies of the test are dark and legible.

Pointers in the formulation of test questions for a written test

1. Keep in mind that it is not possible to measure all outcomes of instruction
with one type of test.
2. Devise your items so that they require the trainee to actually apply things
learned rather than merely recalling or recognizing facts.
3. Make certain that the type of test items used for measuring each objective is
the one that will measure the objective.
4. Avoid "tricky" or catchy questions. Do not construct puzzling items in which
hidden meanings or subtle clues provide the correct answer.
5. Do not lift statements directly from the books and use them as test items.
6. Check to make sure that no item can be answered simply by referring to the
other items. Make each item independent of the answers to the others.
7. Do not include items for which the answer is obvious to a person who does
not know the subject matter.


8. Word the item in the simplest manner possible. Confine the items to the
vocabulary level of the examinee. State questions clearly and eliminate
ambiguous items.
9. Arrange the items so that responses will not form a particular pattern.

Guidelines for Constructing Effective Multiple Choice Items

1. Present a single definite concept in the stem.


2. Place all common wording in the stem.
3. Make the alternatives grammatically consistent with the stem and with each
other.
4. Avoid verbal association between the stem and the correct response
(grammatical clues).
5. Construct items with a single best answer.
6. Include four or five alternatives.
7. Make all choices plausible.
8. Arrange alternatives in a logical sequence.
9. Avoid using opposites or mutually exclusive alternatives.
10. Eliminate option length and specificity as clues to the correct response.
Make options of similar length.
11. Delete specific determiners from the alternatives.
12. Avoid using "all of the above" and "none of the above" unless these are used
in questions where "all of the above" and "none of the above" are not desirable
responses.
13. Avoid using opposites as possible answers.
14. Phrase stems positively unless emphasizing an exception. If desired response
is an exception to the question, underline except or not in the question.
15. Vary the position of the correct answer in a random manner.

Use separate sheet attached for your answers.

Activity: L4 (Chapter 3)

1. Using your prepared table of specification in lesson 3, construct a written


test.
Test Your Understanding: L4 (Chapter 3)

1. Why is there a need to prepare a table of specification before constructing a
written test?

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 4 (Chapter 3)
Written Test
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________


LESSON 5
Performance Test

Learning Outcomes:

At the end of the lesson, you should be able to:


 Define performance evaluation
 Determine the guidelines in formulating performance test
 Construct performance test

Introduction
Evaluation of competency covers knowledge, skills and attitudes. To assess
knowledge, we can use a written test as a method of assessment, but to effectively
assess the skills and attitudes acquired by the trainee in CBT, we should use a
performance evaluation which includes a demonstration of the skills and an
interview to follow up the demonstration.

In this lesson, the format and structure of the prescribed performance test shall
be discussed to help you develop your own instructions for demonstration.

Performance Evaluation

It is the formal determination of an individual's job-related competencies and
their outcomes.

Performance evaluation is accompanied by interview questions which are used during
the actual conduct of the test. This is to support the evidence gathered by the
facilitator/trainer.

GUIDELINES IN FORMULATING PERFORMANCE TEST

This is the practical portion of the competency assessment instrument. This part
measures the skills possessed by the examinee in relation to the occupation. It
consists of the General and Specific Instructions, the List of Materials,
Equipment/Tools, and the Marking Sheets.

A. GENERAL INSTRUCTIONS

This refers to the overall conduct of the test (before, during and after), which
concerns both the testing officer and the examinee. This part of the competency
assessment specifies the dos and don'ts inside the testing area.

 Performance or what must be done


 The conditions or what is given
 The standard of performance expected of the examinee


B. SPECIFIC INSTRUCTIONS

This provides the instructions which the examinee must follow in the performance
of the test.

C. LIST OF MATERIALS AND EQUIPMENT

This provides the listing of the materials, equipment and tools needed in the
performance of the skills test. It also contains the complete specifications of
each item in the listing.

Pointers to follow in the construction/formulation of a good Test of Skills:

1. The test coverage must be consistent with the job description and skills
requirements.
2. The test must not take more than 8 hours to complete.
3. The test statement must specify the exact time within which the examinee is
expected to finish the task and the tools/equipment that will be issued to the
examinee.
4. The work performance/specimen or whatever is being tested must be
observable and measurable.
5. The test should be feasible. Do not design tests which make use of rare or
overly expensive equipment.
6. Where applicable there must be a working drawing which is clear and
accurate.
7. The standard performance outcome, if possible, should be stated, such as
surface finish, clearance or tolerance, and the number of allowable errors.
8. Directions must be clear, simple, concise and accurate.

Sample Performance Test

SPECIFIC INSTRUCTION FOR THE CANDIDATE

QUALIFICATION              : CAD/CAM OPERATION NC II
Title of PBA               : Draw and fabricate Cylinder and Top Plate
Unit of Competency Covered : Create Drawing Using CAD Software

GENERAL INSTRUCTIONS - PLEASE READ CAREFULLY:
Given the necessary tools, materials and equipment, you are required to draw and
fabricate a cylinder and top plate in accordance with accepted
institutional/industry standards. (Allotted time: 4 hrs.)

SPECIFIC INSTRUCTIONS:
1. Gather instructions and relevant materials
2. Prepare drawings in accordance with existing standards
3. Set screen display areas and basic parameters
4. Create drawing
5. Modify reviewed CAD drawings
6. Save drawing files
7. Print drawings


List of Equipment, Tools and Materials

Equipment
QTY   Description

Supplies and Materials

Tools

Use separate sheet attached for your answers:

Activity: L5 (Chapter 3)

1. Construct performance test. (See sample guide in constructing performance


test).

Test Your Understanding: L5 (Chapter 3)

1. What is the significance of the performance test in a competency-based
training delivery?

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 5 (Chapter 3)
Performance Test
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________


LESSON 6
Questioning Tool

Learning Outcomes:

At the end of the lesson, you should be able to:


• Determine the purpose of the questioning tool
• Enumerate the types of questions included in the questioning tool
• Explain how corroboration of evidence will be achieved using the
questioning tool

Introduction

Corroboration of evidence should be achieved when gathering evidence of
competency. In case the evidence from the written test and the performance
test results is not enough to decide on the competency of a trainee, the
questioning tool should be used.

This lesson discusses the structure of the questioning tool so that it helps
the trainer gather evidence of knowledge, skills and attitude, and of the four
dimensions of competency needed for the competency being assessed.

Questioning Tool

The questioning tool is a must in an institutional competency evaluation tool
package. It is used to verify evidence that was not clearly demonstrated in
the other methods of assessment, such as the written test and the performance
test.

The questioning tool should be able to cover the four dimensions of
competency. To do this, your questioning tool should contain questions:

1. To follow up the demonstration of task skills and task management skills.

All possible questions should be written here. Although the trainer is not
required to ask questions about what was already observed in the demonstration
of skills, you should write all possible questions so that they are ready
for use.

2. To verify OHS practices.

Safety practices are a very important aspect of the demonstration. List down
questions on safety related to the competency being assessed; questions
should concentrate on safety practices for that competency.
3. To verify job role and environment management skills.

Questions that verify the worker's responsibility towards his customers,
co-employees, employer and the environment are very important, because
oftentimes this dimension of competency needs to be verified using these
questions; it is not demonstrated in most demonstration tests.

4. To gather evidence for contingency management skills.

Infrequent events may arise on the job that require the worker to adjust.
Construct contingency management questions to verify this dimension of
competency.

5. On knowledge of laws, rules and regulations.

Knowledge of laws, rules and regulations critical to the job should also be
verified. Prepare questions to gather evidence for the specific competency.

Questioning Tool Template

RATING SHEET FOR ORAL QUESTIONING

Questions to prove the candidate's underpinning knowledge    Satisfactory response
                                                                  Yes | No
  Extension/Reflection Questions
  Safety Questions
  Job Role and Environment Questions
  Contingency/Infrequent Events Questions
  Rules and Regulations Questions

The candidate's underpinning knowledge was:   Satisfactory / Not Satisfactory

Feedback to Candidate
General comments (Strengths/Improvements needed):

Candidate's signature: _____________________   Date: __________
Assessor's signature:  _____________________   Date: __________
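For trainers who record results electronically, the decision logic of the rating sheet above can be sketched in a short script. This is a hypothetical helper, not part of the TESDA template: the rule that the candidate is rated satisfactory only when every question receives a "Yes" is an assumption — apply your institution's actual decision rule.

```python
# Hypothetical tallying helper for an oral-questioning rating sheet.
# Assumption: underpinning knowledge is "Satisfactory" only if every
# question received a satisfactory (True) response.

def rate_candidate(responses):
    """responses: dict mapping question text -> True (satisfactory) / False.

    Returns the overall verdict and the list of questions that were
    answered unsatisfactorily (useful as feedback to the candidate).
    """
    unsatisfactory = [q for q, ok in responses.items() if not ok]
    verdict = "Satisfactory" if not unsatisfactory else "Not Satisfactory"
    return verdict, unsatisfactory

# Example sheet (illustrative questions only):
responses = {
    "Why must the workpiece be clamped before machining?": True,    # safety
    "What PPE is required when operating the machine?": True,       # safety
    "What would you do if the drawing file fails to print?": False, # contingency
}

verdict, followups = rate_candidate(responses)
print(verdict)    # prints "Not Satisfactory"
print(followups)  # the contingency question needs a follow-up
```

The returned list of unsatisfactory questions maps directly to the "General comments (Strengths/Improvements needed)" field of the rating sheet.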

Use separate sheet attached for your answers:

Activity: L6 (Chapter 3)

1. Given the template sample, construct a questioning tool.

References:

TESDA-Competency Based Learning Material (CBLM) - Planning Training Session


ACTIVITY SHEET
Lesson 6 (Chapter 3)
Questioning Tool
Name: ___________________________________________ Date: _____________
Year & Section: ____________________________________
