
Shara M. Baylin
Department of Mathematics Education
University of Science and Technology of Southern Philippines
shara.baylin@ustp.edu.ph
INTENDED LEARNING OUTCOME
At the end of Chapter 4, students are expected to:
• Cite evidence of validity and reliability in
teacher-made tests
VALIDITY vs RELIABILITY
• Two very important concepts in measurement
• Necessary to ensure correct measurement of
traits that are not directly observable
What is reliability?

• shows how consistently and accurately the tool
measures the intended learning outcomes
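One common way to quantify reliability as internal consistency is Cronbach's alpha, which compares item-level variance to total-score variance. The slides do not name a specific statistic, so this is only a hedged sketch; the quiz items and student scores below are entirely made up.

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one list of scores per item, aligned across the same students."""
    k = len(item_scores)
    item_vars = [statistics.pvariance(item) for item in item_scores]
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-student totals
    return k / (k - 1) * (1 - sum(item_vars) / statistics.pvariance(totals))

# Hypothetical scores of 5 students on a 3-item quiz (0-10 points per item)
items = [
    [8, 6, 9, 4, 7],  # item 1
    [7, 5, 9, 5, 6],  # item 2
    [9, 6, 8, 3, 7],  # item 3
]
print(round(cronbach_alpha(items), 2))  # 0.93
```

A value near 1 suggests the items measure the same trait consistently; values below roughly 0.7 are usually taken as a sign of low reliability.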
What is validity?

• shows how well the tool measures what it is
supposed to measure
Reliability vs Validity

[Figure: target diagrams contrasting a reliable test with a valid test]
1. CONTENT-RELATED EVIDENCE
2. CRITERION-RELATED EVIDENCE
3. CONSTRUCT-RELATED EVIDENCE
CONTENT-RELATED EVIDENCE
1. FACE-VALIDITY – refers to a test that appears
to adequately measure the learning
outcomes
CONTENT-RELATED EVIDENCE
2. INSTRUCTIONAL-VALIDITY – the extent to
which an assessment is systematically
sensitive to the nature of instruction offered.
HOW to IMPROVE TEST VALIDITY?
• Table of Specifications (ToS)
- a test blueprint that identifies the content
area and describes the learning outcomes at
each level of the cognitive domain
- a two-way dimensional grid
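The two-way grid can be sketched as a mapping from content areas to item counts per cognitive level. The content areas, cognitive levels, and item counts below are purely illustrative, not from the slides; a real ToS would also weight each area by instructional time.

```python
# Hypothetical Table of Specifications (ToS) for a 20-item test.
# Rows: content areas; columns: levels of the cognitive domain.
tos = {
    "Fractions":   {"Remembering": 2, "Understanding": 2, "Applying": 2},
    "Decimals":    {"Remembering": 2, "Understanding": 3, "Applying": 2},
    "Percentages": {"Remembering": 1, "Understanding": 3, "Applying": 3},
}

# The grid doubles as a checksum: item counts must add up to the test length.
total_items = sum(sum(levels.values()) for levels in tos.values())
print(total_items)  # 20
```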
1. CONTENT-RELATED EVIDENCE
2. CRITERION-RELATED EVIDENCE
- refers to the degree to which test scores agree
with an external criterion
- Criterion validity is achieved when assessments
are able to verify current performance (concurrent)
and predict future performance (predictive)
THREE TYPES of CRITERIA
1. Achievement test scores
2. Ratings, grades and other numerical judgments
made by the teacher
3. Career data
TWO TYPES of Criterion-related
Evidence
1. Concurrent Validity
- provides an estimate of a student’s current
performance in relation to a previously validated
or established measure.
TWO TYPES of Criterion-related
Evidence
2. Predictive Validity
- pertains to the power or usefulness of test
scores to predict future performance.
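Both concurrent and predictive validity are typically estimated as a correlation between test scores and the criterion measure; only the timing of the criterion differs. A minimal sketch using the Pearson correlation coefficient, with entirely hypothetical scores:

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two score lists."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y) * len(x))

# Hypothetical data: a teacher-made test vs. an established measure
# (concurrent) or a later performance measure (predictive).
new_test  = [72, 85, 90, 65, 78, 88]
criterion = [70, 82, 95, 60, 80, 85]

print(round(pearson(new_test, criterion), 2))  # 0.97
```

A coefficient close to 1 indicates strong agreement between the test and the criterion; values near 0 indicate little criterion-related evidence.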
1. CONTENT-RELATED EVIDENCE
2. CRITERION-RELATED EVIDENCE
3. CONSTRUCT-RELATED EVIDENCE
- an assessment of the quality of the instrument
used.
- Convergent and Divergent validation
VALIDITY OF ASSESSMENT METHODS
How do you develop performance assessments?
1. Define the Purpose
2. Choose the activity
3. Develop criteria for scoring
RECOMMENDATIONS IN CHOOSING
THE ACTIVITY
1. The selected performance should reflect a
valued activity.
2. The completion of performance assessments
should provide a valuable learning experience.
3. The statement of goals and objectives should
be clearly aligned with the measurable
outcomes of the performance activity.
RECOMMENDATIONS IN CHOOSING
THE ACTIVITY
4. The task should not examine extraneous or
unintended variables.
5. Performance assessments should be fair and
free from bias.
THREATS TO VALIDITY
1. Unclear test directions
2. Complicated vocabulary and sentence
structure
3. Ambiguous statements
4. Inadequate time limits
5. Inappropriate level of difficulty of test items
THREATS TO VALIDITY
6. Poorly constructed items
7. Inappropriate test items for outcomes being
measured
8. Tests that are too short
9. Improper arrangement of items
10. Identifiable pattern of answers
HOW TO ENHANCE VALIDITY
1. Ask others to judge the clarity of what you are
assessing.
2. Check to see if different ways of assessing the same
thing give the same result.
3. Sample a sufficient number of examples of what is
being assessed.
4. Prepare a detailed table of specifications.
5. Ask others to judge the match between the assessment
items and the objectives of the assessment.
HOW TO ENHANCE VALIDITY
6. Compare groups known to differ on what is being
assessed.
7. Compare scores taken before to those taken after
instruction.
8. Compare predicted consequences with actual
consequences.
9. Compare scores on similar, but different traits.
10. Provide adequate time to complete the
assessment.
HOW TO ENHANCE VALIDITY
11. Ensure appropriate vocabulary, sentence
structure and item difficulty.
12. Ask easy questions first.
13. Use different methods to assess the same
thing.
14. Use assessment results only for their intended purposes.
CONSTRUCTING ToS
Any questions?
END
