• Reliability
- refers to the consistency of scores obtained by the same person when re-examined with the same test on different occasions, or with different sets of equivalent items, or under other variable examining conditions.
KINDS OF RELIABILITY
1. Test-Retest
2. Parallel Forms
3. Inter-rater
4. Split-Half
TEST-RETEST
• Same test
• Same examinee
• Different times: there is a time gap between the two administrations
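Test-retest reliability is usually reported as the correlation between the two sets of scores. A minimal sketch in plain Python, using invented scores for six examinees tested twice:

```python
# Test-retest reliability: correlate the same examinees' scores
# from two administrations of the same test.
# The score lists below are made-up illustrative data.

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

first_admin  = [78, 85, 90, 62, 70, 88]   # scores at time 1
second_admin = [80, 83, 92, 65, 68, 90]   # same examinees, weeks later

r = pearson_r(first_admin, second_admin)
print(f"test-retest reliability: {r:.2f}")
```

A coefficient near 1.0 indicates that examinees kept roughly the same rank order across occasions.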
ALTERNATE FORM RELIABILITY
• Same examinee
• Same time
• But different forms: an alternate form of the test is used
• The two forms use different items; however, the rules used to select items of a particular difficulty level are the same
INTER-RATER RELIABILITY
• Degree of agreement between two or more raters scoring the same responses
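Inter-rater reliability is often quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch, using invented pass/fail judgments by two raters on ten essays:

```python
# Inter-rater reliability via Cohen's kappa: agreement between two
# raters on the same responses, corrected for chance agreement.
# The ratings below are invented illustrative data.
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1 = Counter(rater1)
    c2 = Counter(rater2)
    # chance agreement: product of each rater's category proportions
    expected = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    return (observed - expected) / (1 - expected)

r1 = ["pass", "pass", "fail", "pass", "fail",
      "pass", "pass", "fail", "pass", "pass"]
r2 = ["pass", "fail", "fail", "pass", "fail",
      "pass", "pass", "fail", "pass", "pass"]

print(f"Cohen's kappa: {cohens_kappa(r1, r2):.2f}")
```

Kappa is 1.0 for perfect agreement and 0 when agreement is no better than chance.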
TEST ADMINISTRATION
• Test-Retest Reliability
• Alternate Form Reliability

TEST ITEMS
• Split-Half

SCORING SYSTEM
• Inter-rater Reliability
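Split-half, the item-based approach above, correlates scores on one half of the items with scores on the other half, then steps the result up with the Spearman-Brown formula to estimate full-test reliability. A minimal sketch, using an invented matrix of right/wrong item responses:

```python
# Split-half reliability: correlate odd-item and even-item half-scores,
# then apply the Spearman-Brown correction to estimate the reliability
# of the full-length test. Responses (1 = correct) are invented.

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# rows = examinees, columns = items
responses = [
    [1, 1, 1, 0, 1, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 1, 0, 0],
]

odd_scores  = [sum(row[0::2]) for row in responses]  # items 1, 3, 5, 7
even_scores = [sum(row[1::2]) for row in responses]  # items 2, 4, 6, 8

half_r = pearson_r(odd_scores, even_scores)
full_r = (2 * half_r) / (1 + half_r)   # Spearman-Brown correction
print(f"half-test r = {half_r:.2f}, full-test estimate = {full_r:.2f}")
```

The correction is needed because the half-test correlation understates the reliability of the full-length test.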
KINDS OF VALIDITY
1. Content Validity
2. Criterion-Related Validity
3. Construct Validity
CONTENT VALIDITY
• Involves an attempt to assess the content of a test to ensure that it includes a representative sample of all the questions that could be asked
• Checked against the objectives and the Table of Specifications (TOS)
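One concrete way to check content validity is to compare the items actually written for each objective against the counts the TOS calls for. A minimal sketch, with invented objectives and item counts:

```python
# Content validity check against a Table of Specifications (TOS):
# compare how many items each objective actually received with the
# number the TOS calls for. Objectives and counts are invented.

tos_plan = {          # objective -> items required by the TOS
    "recall facts": 10,
    "apply concepts": 6,
    "analyze data": 4,
}
items_written = {     # objective -> items actually on the test
    "recall facts": 10,
    "apply concepts": 3,
    "analyze data": 4,
}

# shortfall per objective (0 means the TOS target is met)
gaps = {obj: req - items_written.get(obj, 0) for obj, req in tos_plan.items()}

for objective, shortfall in gaps.items():
    status = "OK" if shortfall == 0 else f"short by {shortfall}"
    print(f"{objective}: {items_written.get(objective, 0)}"
          f"/{tos_plan[objective]} items -> {status}")
```

Any nonzero shortfall flags an objective whose sample of questions is not representative.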
CRITERION-RELATED VALIDITY
• How well a test corresponds with a particular criterion, as shown by high correlations between the test and a well-defined criterion measure
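The "high correlation" in this definition is the validity coefficient: the correlation between test scores and the criterion measure. A minimal sketch, correlating hypothetical entrance-exam scores with later first-year GPA (a predictive criterion); all data are invented:

```python
# Criterion-related validity: correlate test scores with an external
# criterion measure. Here, invented entrance-exam scores are
# correlated with later first-year GPA.

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

exam_scores    = [55, 72, 60, 85, 78, 66, 90, 50]    # test scores
first_year_gpa = [2.1, 2.9, 2.4, 3.6, 3.1, 2.6, 3.8, 2.0]  # criterion

validity_coefficient = pearson_r(exam_scores, first_year_gpa)
print(f"validity coefficient: {validity_coefficient:.2f}")
```

Whether the criterion is measured at the same time or later is what separates the two types named below.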
• Concurrent Validity
• Predictive Validity
CONCURRENT VALIDITY
• How well test scores correspond with a criterion measure obtained at about the same time

FACTORS THAT CAN LOWER VALIDITY
- Unclear directions
- Difficult reading vocabulary and sentence structure
- Ambiguity in statements
- Inadequate time limits
- Inappropriate level of difficulty
- Poorly constructed test items
- Test items inappropriate for the outcomes being measured
- Tests that are too short
- Improper arrangement of items (complex to easy?)
- Identifiable patterns of answers
- Teaching
- Administration and scoring
- Nature of the criterion
REMEMBER…