
Basic Concepts

TEST

An instrument designed to measure any characteristic, quality, ability, knowledge or skill.

It comprises items in the area it is designed to measure.

MEASUREMENT

A process of quantifying the degree to which someone/something possesses a given trait, i.e., quality, characteristic, or feature.

ASSESSMENT

A process of gathering and organizing quantitative or qualitative data into an interpretable form to have a basis for judgement or decision-making.

It is a prerequisite to evaluation. It provides the information which enables evaluation to take place.

EVALUATION

A process of systematic interpretation, analysis, appraisal or judgement of the worth of organized data as a basis for decision-making.

It involves judgment about the desirability of changes in students.

TRADITIONAL ASSESSMENT

It refers to the use of pen-and-paper objective tests.

ALTERNATIVE ASSESSMENT

It refers to the use of methods other than pen-and-paper objective tests, which include performance tests, projects, portfolios, journals, and the like.

AUTHENTIC ASSESSMENT

It refers to the use of assessment methods that simulate true-to-life situations. These could be objective tests that reflect real-life situations or alternative methods that parallel what we experience in real life.

PURPOSES OF CLASSROOM ASSESSMENT

1. Assessment FOR Learning (before and during instruction)

   a. Placement
   - done prior to instruction
   - to assess the needs of the learners as a basis for planning relevant instruction

   b. Formative
   - done during instruction (e.g., a quiz)
   - to continuously monitor the students' level of attainment of the learning objectives

   c. Diagnostic
   - done before/during instruction
   - helps formulate a plan for detailed remedial instruction

2. Assessment OF Learning
   - This is done after instruction (summative assessment)
   - to determine what students know and can do and the level of their proficiency or competency

3. Assessment AS Learning
   - This is done for teachers to understand and perform well their role of assessing FOR and OF learning.

PRINCIPLES OF HIGH QUALITY CLASSROOM ASSESSMENT

Principle 1: Clarity and Appropriateness of Learning Targets

Learning targets should be clearly stated, specific, and centered on what is truly important.

Principle 2: Appropriateness of Methods

Learning targets are measured by appropriate assessment methods.

Learning Targets and their Appropriate Assessment Methods

Note: Higher numbers indicate better matches (5 = high, 1 = low).

Principle 3: Balance

A balanced assessment sets targets in all domains of learning or domains of intelligence.

A balanced assessment makes use of both traditional and alternative assessments.

Principle 4: Validity

It is the degree to which the assessment instrument measures what it intends to measure.

It also refers to the usefulness of the instrument for a given purpose.

It is the most important criterion of a good assessment instrument.

Ways in Establishing Validity

1. Face Validity - done by examining the physical appearance of the instrument to make it readable and understandable.

2. Content Validity - done through a careful and critical examination of the objectives of assessment so that they reflect the curricular objectives.

3. Criterion-related Validity - established statistically such that a set of scores revealed by the measuring instrument is correlated with the scores obtained in another external predictor or measure (a correlation sketch follows this list).

   2 Purposes of Criterion-related Validity:

   a. Concurrent Validity - describes the present status of the individual by correlating the sets of scores obtained from two measures given at a close interval.

   b. Predictive Validity - describes the future performance of an individual by correlating the sets of scores obtained from two measures given at a longer time interval.

4. Construct Validity - established statistically by comparing psychological traits or factors that theoretically influence scores in a test.

   a. Convergent Validity - established if the instrument correlates with another measure of a similar trait other than the one it is intended to measure.

   Ex: A critical thinking test may be correlated with a creative thinking test.

   b. Divergent Validity - established if the instrument describes only the intended trait and not other traits.

   Ex: A critical thinking test may not be correlated with a reading comprehension test.
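
Since both criterion-related and construct validity are established by correlating two sets of scores, a small Python sketch may help make the computation concrete. The function below is the standard Pearson correlation; the score lists are hypothetical values invented only for illustration and are not data from this reviewer.

    from math import sqrt

    def pearson_r(x, y):
        # Pearson correlation between two equal-length lists of scores.
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
        sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
        return cov / (sd_x * sd_y)

    # Hypothetical predictive-validity check: entrance test scores vs. grades earned later.
    entrance_test = [78, 85, 62, 90, 71, 88, 67, 74]
    later_grades  = [81, 89, 70, 93, 75, 86, 72, 78]

    print(round(pearson_r(entrance_test, later_grades), 2))  # a high r supports criterion-related validity

For concurrent validity the two measures would be administered at a close interval; for predictive validity, as in the sketch, the criterion scores are collected after a longer time interval.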

Principle 5: Reliability

It refers to the consistency of scores obtained by the same person when retested using the same or an equivalent instrument.

Principle 6: Fairness

A fair assessment provides all students with an equal opportunity to demonstrate achievement.

Principle 7: Practicality and Efficiency

When assessing learning, the information obtained should be worth the resources and time required to obtain it.

Principle 8: Continuity

Assessment takes place in all phases of instruction. It could be done before, during and after instruction.

Principle 9: Authenticity

Features:
• Meaningful performance task
• Clear standards and public criteria
• Quality products and performance
• Positive interaction between the assessee and assessor
• Emphasis on meta-cognition and self-evaluation
• Learning that transfers

Principle 10: Communication

• Assessment targets and standards should be communicated.
• Assessment results should be communicated to important users.
• Assessment results should be communicated to students through direct interaction or regular ongoing feedback on their progress.

Principle 11: Positive Consequences

• Assessment should have a positive consequence for students; that is, it should motivate them to learn.
• Assessment should have a positive consequence for teachers; that is, it should help them improve the effectiveness of their instruction.

Principle 12: Ethics

Teachers should free the students from the harmful consequences of misuse or overuse of various assessment procedures, such as embarrassing students and violating students' right to confidentiality.

Administrators and teachers should understand that it is inappropriate to use standardized student achievement to measure teaching effectiveness.

PERFORMANCE-BASED ASSESSMENT

It is a process of gathering information about students' learning through actual demonstration of essential and observable skills and creation of products that are grounded in real-world contexts and constraints.

METHODS OF PERFORMANCE-BASED ASSESSMENT

1. Written (open-ended)
   - a written prompt is provided
   - Format: essays, open-ended tests

2. Behavior-based
   - utilizes direct observation of behaviours in actual situations or simulated contexts

3. Interview-based
   - examinees respond in a one-to-one conference setting with the examiner to demonstrate mastery of the skills

4. Product-based
   - examinees create a work sample or a product utilizing the skills/abilities

5. Portfolio-based
   - collections of works that are systematically gathered to serve many purposes

Types of Performance-Based Assessment

a. Demonstration-type - a task that requires no product.

b. Creation-type - a task that requires tangible products.

7 Criteria in Selecting a Good Performance Assessment Task

1. GENERALIZABILITY - the likelihood that the student's performance on the task will generalize to comparable tasks.
2. AUTHENTICITY - the task is similar to what the students might encounter in the real world, as opposed to being encountered only in school.
3. MULTIPLE FOCI - the task measures multiple instructional outcomes.
4. TEACHABILITY - the task allows one to master the skill that one should be proficient in.
5. FEASIBILITY - the task is realistically implementable in relation to its cost, space, time and equipment requirements.
6. SCORABILITY - the task can be reliably and accurately evaluated.
7. FAIRNESS - the task is fair to all students regardless of their social status or gender.

PORTFOLIO ASSESSMENT

It is also an alternative to the pen-and-paper objective test. It is a purposeful, ongoing, dynamic and collaborative process of gathering multiple indicators of the learner's growth and development. It is also performance-based but more authentic than any other performance-based task.

PRINCIPLES UNDERLYING PORTFOLIO ASSESSMENT

1. CONTENT PRINCIPLE suggests that portfolios should reflect the subject matter that is important for the students to learn.
2. LEARNING PRINCIPLE suggests that portfolios should enable the students to become active and thoughtful learners.
3. EQUITY PRINCIPLE explains that portfolios should allow students to demonstrate their learning styles and multiple intelligences.

RUBRIC

It is a measuring instrument used in rating performance-based tasks. It is the "key to corrections" for assessment tasks designed to measure the attainment of learning competencies that require demonstration of skills or creation of products of learning.

SIMILARITY OF RUBRIC WITH OTHER SCORING INSTRUMENTS

A rubric is a modified checklist and rating scale.

1. Checklist
   - presents the observed characteristics of a desirable performance or product
   - the rater checks the trait/s that has/have been observed in one's performance or product

2. Rating Scale
   - measures the extent or degree to which a trait has been satisfied by one's work or performance
   - offers an overall description of the different levels of quality of a work or a performance
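
To make the contrast between the three scoring instruments concrete, here is a brief Python sketch; the traits, criteria, levels and scores are invented for illustration only. A checklist records whether a trait was observed, a rating scale records the degree to which it was satisfied, and an analytic rubric pairs each criterion with described levels of quality.

    # Checklist: each desirable trait is simply marked observed / not observed.
    checklist = {"states the problem": True, "labels the diagram": False, "cites sources": True}

    # Rating scale: each trait is judged by degree (1 = poor ... 5 = excellent).
    rating_scale = {"organization": 4, "accuracy": 5, "creativity": 3}

    # Analytic rubric: every criterion has described performance levels with point values.
    rubric = {
        "content":      {4: "complete and accurate", 2: "partly accurate", 1: "mostly inaccurate"},
        "organization": {4: "logical throughout",    2: "some lapses",     1: "hard to follow"},
    }

    # Scoring one product against the rubric: choose a level per criterion and total the points.
    chosen_levels = {"content": 4, "organization": 2}
    total = sum(chosen_levels[c] for c in rubric)
    print(total)  # 6 out of a possible 8
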
What Is Level of Measurement?

The relationship of the values that are assigned to the attributes of a variable.

Why Is Level of Measurement Important?

• It helps you decide what statistical analysis is appropriate on the values that were assigned.
• It helps you decide how to interpret the data from that variable.

Nominal Measurement

Keyword: Attributes are only named; this is the weakest level.

Ordinal Measurement

Keyword: Attributes can be ordered.

Ask: Is the distance from 0 to 1 the same as the distance from 3 to 4? At the ordinal level it need not be.

Interval Measurement

Keyword: Distance is meaningful.

The distance between attributes has meaning; for example, with temperature in Fahrenheit, the distance from 30 to 40 is the same as the distance from 70 to 80.

• Note that ratios don't make any sense: 80 degrees is not twice as hot as 40 degrees (although the attribute values are in a 2:1 ratio).

Ratio Measurement

Keyword: Has an absolute zero that is meaningful.
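
The practical consequence of the four levels is which summary statistics are meaningful, and a short Python sketch can illustrate it (the variables and values below are hypothetical): the mode works at every level, the median needs at least ordinal data, the mean needs at least interval data, and ratio statements ("twice as much") need the true zero of ratio data.

    from statistics import mode, median, mean

    blood_type   = ["A", "O", "O", "B", "A", "O"]   # nominal: only the mode is meaningful
    satisfaction = [1, 3, 2, 3, 4, 2]               # ordinal: mode and median are meaningful
    temp_f       = [30, 40, 70, 80]                 # interval: mean is fine, ratios are not
    weight_kg    = [50, 60, 75, 100]                # ratio: all of the above, plus ratios

    print(mode(blood_type))             # 'O'
    print(median(satisfaction))         # 2.5
    print(mean(temp_f))                 # 55.0 -- but 80 F is still not "twice as hot" as 40 F
    print(weight_kg[3] / weight_kg[0])  # 2.0  -- 100 kg really is twice 50 kg (true zero)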

Frequency Distribution

• A table that shows classes or intervals of data with a count of the number of entries in each class.

• The frequency, f, of a class is the number of data entries in the class.

Constructing a Frequency Distribution

1. Decide on the number of classes.
   - Usually between 5 and 20; otherwise, it may be difficult to detect any patterns.

2. Find the class width.
   - Determine the range of the data.
   - Divide the range by the number of classes.
   - Round up to the next convenient number.

3. Find the class limits.
   - You can use the minimum data entry as the lower limit of the first class.
   - Find the remaining lower limits (add the class width to the lower limit of the preceding class).
   - Find the upper limit of the first class. Remember that classes cannot overlap.
   - Find the remaining upper class limits.

4. Make a tally mark for each data entry in the row of the appropriate class.

5. Count the tally marks to find the total frequency f for each class.
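
The five steps above can be followed mechanically. Here is a short Python sketch that builds a frequency distribution for a small invented data set; the fifteen scores and the choice of 5 classes are assumptions made only for illustration.

    import math

    scores = [45, 52, 58, 61, 63, 67, 70, 72, 75, 78, 81, 84, 88, 90, 93]  # hypothetical data
    num_classes = 5                                    # step 1: decide on the number of classes

    data_range = max(scores) - min(scores)             # step 2: range = 93 - 45 = 48
    class_width = math.ceil(data_range / num_classes)  # 48 / 5 = 9.6, rounded up to 10

    lower = min(scores)                                # step 3: minimum entry starts the first class
    for i in range(num_classes):
        low = lower + i * class_width                  # remaining lower limits: add the class width
        high = low + class_width - 1                   # upper limits chosen so classes cannot overlap
        f = sum(1 for s in scores if low <= s <= high) # steps 4-5: tally and count entries per class
        print(f"{low}-{high}: {f}")

With these limits every one of the fifteen entries falls in exactly one class, which is what the non-overlap rule in step 3 is meant to guarantee.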
