
MODULE 7

EVALUATION MODELS IN EDUCATION


INTRODUCTION
Curricular evaluation is a systematic process of inquiry. The aim of the investigation is to determine whether the curriculum, as designed and implemented, has produced or is producing the intended and desired results. Evaluation models in education, as tools for decision making, go beyond measurement, for they require judgment on the part of the educator. Hence, the crucial role of evaluation in the success of any curricular reform effort is axiomatic and indisputable, and no reasonable evaluation can be made without appropriate measurements (Posner, 1995). Evaluation can be conducted as formative, summative, or both. Formative evaluation is a way to detect problems and weaknesses in components in order to revise them. In projects with sufficient time and funding, formative assessment is conducted prior to the implementation of the final program. In practice, many projects begin with the "best effort" and conduct a formative evaluation during implementation, correcting weaknesses and errors as the project unfolds. Summative evaluation is a process of final evaluation that asks whether the project or program met its goals. In both types, the media or instructional program can be evaluated, but summative evaluation typically concentrates on learner outcomes rather than only the program of instruction. Traditional tests and other evaluation methods commonly employed in classrooms are used in both instances, but specific kinds of evaluation can be used in formative evaluation. Records, observations, interviews, and other data permit the qualitative analysis of information for formative and summative evaluation.

General Objective

To know the different models of curriculum evaluation.

Specific Objectives. After reading this module, you should be able to:
1. Define the term, evaluation.
2. Know and understand the meaning of program or curricular evaluation.
3. Identify the different major evaluation models in education.
4. Know the characteristics of an evaluation instrument.
5. Answer the questions found at the last page of this module.

Models of Educational Evaluation


1. Scriven's Model
2. Stake's Congruency-Contingency Model
3. Stufflebeam's CIPP Evaluation Model
4. Tyler Evaluation Model
5. Provus Discrepancy Evaluation Model

Scriven's Model

Scriven's Model (1967) relates to issues such as entity, goals, or weightings. He defines curriculum evaluation as "gathering and combining performance data with a weighted set of goal scales to yield comparative or numerical ratings," and the justification of:
a. the data-gathering instruments
b. the weightings
c. the selection of goals

Stake's Congruency-Contingency Model


Robert Stake (1967) proposed the congruence-contingency model. Congruence, or matching, occurs between the intended and the observed data in terms of three major areas:
1. Antecedents
2. Transactions
3. Outcomes

The analysis is based on the matching of what has been planned with what has actually occurred. Stake defined an antecedent as any condition (student profiles, teacher profiles, performance records, etc.) that exists prior to teaching and learning and that may affect outcomes. Transactions refer to the interactions between and among students and teachers and the different dimensions of the learning environment. Outcomes are the products of instruction or the consequences of education. The desirable outcomes are the development of knowledge, skills, attitudes, values, and habits among the learners. In addition, other consequences, such as the impact of new programs on the perceptions and attitudes of teachers and administrators, are also analyzed. In systems-analysis vocabulary, the terms ANTECEDENTS, TRANSACTIONS, and OUTCOMES used by Stake in his model correspond to the words INPUTS, PROCESSES (or THROUGHPUTS), and OUTPUTS, respectively. The model is shown below:

Congruence-Contingency Matrix of the Stake Evaluation Model


Type of Contingency     Intended Data                      Observed Data     Type of Contingency

                        Antecedents ------ Congruence ---- Antecedents
     L                       |                                  |                  E
     O                    Logical                           Empirical              M
     G                  Contingency                        Contingency             P
     I                       |                                  |                  I
     C                  Transactions ----- Congruence ---- Transactions            R
     A                       |                                  |                  I
     L                    Logical                           Empirical              C
                        Contingency                        Contingency             A
                             |                                  |                  L
                        Outcomes --------- Congruence ---- Outcomes
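The congruence half of Stake's analysis, matching intended against observed data in each of the three areas, can be sketched in code. This is an illustrative sketch only; the field names and sample data are invented and are not part of Stake's model:

```python
# Sketch of Stake-style congruence analysis: for each of the three areas,
# intended entries are matched against observed entries.
# All sample data below are hypothetical.

AREAS = ("antecedents", "transactions", "outcomes")

def congruence(intended, observed):
    """Return, per area, which intended items were observed and which were not."""
    report = {}
    for area in AREAS:
        planned = set(intended.get(area, []))
        actual = set(observed.get(area, []))
        report[area] = {
            "congruent": sorted(planned & actual),     # planned and occurred
            "not_realized": sorted(planned - actual),  # planned but not observed
            "unplanned": sorted(actual - planned),     # occurred but not planned
        }
    return report

intended = {
    "antecedents": ["students read at grade level"],
    "transactions": ["daily guided reading"],
    "outcomes": ["improved comprehension"],
}
observed = {
    "antecedents": ["students read at grade level"],
    "transactions": ["weekly guided reading"],
    "outcomes": ["improved comprehension"],
}

result = congruence(intended, observed)
print(result["transactions"]["not_realized"])  # -> ['daily guided reading']
```

A real Stake analysis would of course rest on qualitative judgment rather than exact string matching; the sketch only shows the logical structure of the comparison.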

Stufflebeam's CIPP Evaluation Model

According to Stufflebeam (1982), evaluation is undertaken for the purpose of acquiring fundamental knowledge about the program, making decisions or judgments, and getting data or information as the basis of program planning and intervention. The Phi Delta Kappa National Study Committee on Evaluation, chaired by Daniel L. Stufflebeam, produced and disseminated a widely cited model of evaluation known as the CIPP (Context, Input, Process, Product) model. Comprehensive in nature, the model reveals types of evaluation, types of decision settings, types of decisions, and types of change. In shaping their model, Stufflebeam and his associates defined evaluation in the following way: "Evaluation is the process of delineating, obtaining, and providing useful information for judging decision alternatives." Stufflebeam clarified what was meant by each of the parts of the definition as follows:

1. Process. A particular, continuing and cyclical activity subsuming many methods and involving a number of steps or operations.
2. Delineating. Focusing the information requirements to be served by evaluation through such steps as specifying and explicating.
3. Obtaining. Making available through such processes as collecting, organizing, and analyzing, and through such formal means as statistics and measurement.
4. Providing. Fitting together into systems or subsystems that best serve the needs or purposes of the evaluation.
5. Useful. Appropriate to predetermined criteria evolved through the interaction of the evaluator and client.
6. Information. Descriptive or interpretive data about entities (tangible or intangible) and their relationships.
7. Judging. Assigning weights in accordance with a specified value framework, criteria derived therefrom, and information which relates the criteria to each entity being judged.
8. Decision Alternatives. A set of optional responses to a specified decision question.

The evaluation process, said Stufflebeam, includes the three main steps of delineating, obtaining, and providing. These steps provide the basis for a methodology of evaluation. In flow-chart form, the model consists of rectangles (with small loops attached), hexagons, ovals, a circle, a fancy E, and solid and broken lines with arrows, with three types of shading: crosshatched, the hexagons show types of decisions; hatched, the ovals, the circle, and the big E depict activities performed; and mottled, the rectangles stand for types of evaluation (Figure 5).

The four types of evaluation. The Phi Delta Kappa National Committee pointed to four types of evaluation: Context, Input, Process, and Product, hence the name of the model, CIPP.
Context evaluation is the most basic kind of evaluation. Its purpose is to provide a rationale for the determination of objectives. At this point in the model, curriculum planner-evaluators define the environment of the curriculum and determine unmet needs and the reasons why needs are not being met. Goals and objectives are specified on the basis of context evaluation.

Input evaluation is evaluation the purpose of which is "to provide information for determining how to utilize resources to achieve project objectives." The resources of the school and various designs for carrying out the curriculum are considered. At this stage, the planner-evaluators decide on the procedures to be used.

Process evaluation is the provision of periodic feedback while the curriculum is being implemented. It has three main objectives: the first is to detect or predict defects in the procedural design or its implementation during the implementation stages; the second is to provide information for programmed decisions; and the third is to maintain a record of the procedure as it occurs.

Product evaluation, the final type, has as its purpose "to measure and interpret attainments not only at the end of a project cycle, but as often as necessary during the project term." The general method of product evaluation includes devising operational definitions of objectives, measuring criteria associated with the objectives of the activity, comparing these measurements with predetermined absolute or relative standards, and making rational interpretations of the outcomes using the recorded context, input, and process information.

Four types of decisions. The hexagons represent four types of decisions: Planning, Structuring, Implementing, and Recycling. Note that planning decisions follow context evaluation; structuring decisions follow input evaluation; implementing decisions follow process evaluation; and recycling decisions follow product evaluation.

Three types of change. In these settings three types of change may result: neomobilistic, incremental, and homeostatic. Neomobilistic change occurs in a setting in which a large change is sought on the basis of low information; such changes are innovative solutions based on little evidence. Incremental changes are a series of small changes based on low information. Homeostatic change is small change based on high information; a fourth setting, large change based on high information, is so rare that it is not shown on the CIPP model.

The model plots the sequence of evaluation and decision making from context evaluation to recycling decisions. The committee touched up the model with small loops that look like bulbs on the evaluation blocks, to indicate that the general process of delineating, obtaining, and providing information is cyclical and applies to each type of evaluation. The ovals, the circle, and the E in the model represent types of activities, types of change, and adjustment as a result of the evaluations made and decisions taken. The CIPP model presents a comprehensive view of the evaluation process. Said the Phi Delta Kappa Committee: "To maximize the effectiveness and efficiency of evaluation, evaluation itself should be evaluated… the criteria for this include internal validity, external validity, pervasiveness, timeliness, and efficiency."
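The pairing of CIPP evaluation types with decision types described above can be summarized as a small lookup table. The sketch below is a hypothetical illustration of those pairings, not part of the CIPP model itself; the "asks" phrasings are paraphrases:

```python
# CIPP pairings as described in the text: each type of evaluation
# feeds a corresponding type of decision.
CIPP = {
    "context": {"asks": "What needs are unmet, and what objectives follow?",
                "feeds_decision": "planning"},
    "input":   {"asks": "How should resources be used to meet the objectives?",
                "feeds_decision": "structuring"},
    "process": {"asks": "Is the design being implemented as intended?",
                "feeds_decision": "implementing"},
    "product": {"asks": "Were the intended attainments reached?",
                "feeds_decision": "recycling"},
}

def decision_for(evaluation_type: str) -> str:
    """Return the decision type that follows a given CIPP evaluation type."""
    return CIPP[evaluation_type]["feeds_decision"]

print(decision_for("process"))  # -> implementing
```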

The Tyler Evaluation Model

No description of evaluation models would be complete without including the contributions of Ralph W. Tyler, who developed the "objectives-based evaluation model." The essence of the Tylerian model is that evaluation consists of measuring whether the objectives of an educational program, project, or product are accomplished. While the idea may seem simplistic today, Tyler changed the focus of evaluation fifty years ago from the measurement of the inputs into an educational program (e.g., counting how many books are in the school library and listing the degrees held by the teachers) to a comparison between a program's objectives as stated and what was actually achieved (e.g., what were the goals of the new reading curriculum, and how well do students actually read after completing this curriculum?).
Tyler proposed the following sequential steps in conducting evaluation:
1. Establish broad goals or objectives
2. Classify objectives
3. Define objectives in behavioral terms
4. Find situations in which achievement of objectives can be shown
5. Develop or select measurement techniques
6. Collect student performance data; and
7. Compare data with behaviorally stated objectives

Clearly, the focal point of the Tyler model is the objectives. Thus, the effectiveness of the approach depends on the establishment, classification, and definition of objectives, which become the basis of data collection and analysis. The model suggests a dynamic, cyclic process in which the data and information provide the necessary feedback to the evaluator on the need to refine or reformulate objectives. Modifications and adjustments of the objectives enable the system to function optimally. Below is an illustration of Tyler's model.

Tyler Seven Steps Evaluation Model

1 Establish Objectives -> 2 Classify Objectives -> 3 Define Objectives ->
4 Select Indicators -> 5 Develop Measurement Techniques ->
6 Collect Performance Data -> 7 Analyze Data -> Report
(the report feeds back to Step 1, making the process cyclic)
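The comparison at the heart of the Tyler model (step 7) can be sketched in code. The objectives, mastery thresholds, and performance figures below are invented for illustration and are not drawn from Tyler's work:

```python
# Tyler step 7, sketched: compare collected performance data with
# behaviorally stated objectives, each paired with a mastery threshold.
# All objectives and numbers are hypothetical examples.
objectives = {
    # objective -> minimum proportion of students expected to demonstrate it
    "reads a grade-level passage aloud": 0.80,
    "answers literal comprehension questions": 0.75,
}

performance = {
    # objective -> proportion of students who actually demonstrated it
    "reads a grade-level passage aloud": 0.84,
    "answers literal comprehension questions": 0.62,
}

def compare(objectives, performance):
    """Flag each objective as attained or as needing refinement."""
    return {
        obj: "attained"
        if performance.get(obj, 0.0) >= threshold
        else "refine objective or instruction"
        for obj, threshold in objectives.items()
    }

for obj, verdict in compare(objectives, performance).items():
    print(f"{obj}: {verdict}")
```

The feedback loop in the diagram corresponds to the "refine objective or instruction" branch: unattained objectives send the evaluator back to Step 1.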

The Provus Discrepancy Evaluation Model


The Discrepancy Evaluation Model (DEM), designed by Malcolm Provus and reflected in the CIPP model, is an effective way to evaluate academic programs. Provus defined evaluation as the process of agreeing upon program standards and using the discrepancy between outcomes and standards to evaluate the program. As such, the DEM can be used as a formative assessment to determine whether to revise or end a program.

The model is ideal for finding problems by means of discrepancy. It can also be used to design a program from inception to conclusion, leading to summative evaluations. In recent years there have been many approaches to program evaluation based on the discrepancy analysis of Malcolm Provus in the 1960's, but with his untimely death and his books out of print, the testament to this heritage lives on primarily through the work of Daniel Stufflebeam and his associates at The Evaluation Center. While the DEM is called an evaluation method, Scriven considers the term "evaluation" inappropriate and seems to prefer "monitoring" as a more apt description. Alter described discrepancy evaluation as follows: The Provus Discrepancy Model, designed by Malcolm Provus in 1969, is a well-tested and commonly accepted utilitarian model to use in evaluating academic programs. He defined evaluation as the process of agreeing upon program standards governing that aspect of the program, and using discrepancy information to identify weaknesses of the program. His stated purpose of evaluation is to determine whether to improve, maintain, or terminate a program… His model is primarily a problem-solving set of procedures that seeks to identify weaknesses (according to selected standards) and to take corrective actions, with termination as the option of last resort.

The DEM uses stages and content categories to permit comparisons. The stages are:

Stage 1. Program Definition. To assess program design by first defining the necessary inputs, processes, and outputs, and then by evaluating the comprehensiveness and internal consistency of the design.
Stage 2. Program Installation. To assess the degree of program installation against initial standards.
Stage 3. Program Process. To assess the relationship between the variables to be changed and the process used to effect the change.
Stage 4. Program Product. To assess the design of the program to see if it reached its goals (objectives, outcomes).
At each of the four stages, the defined standard is compared with actual program performance to determine whether any discrepancy exists. The use of discrepancy information always leads to one of four choices:
1. Proceed to the next stage of evaluation if no discrepancy exists.
2. If a discrepancy exists, recycle through the existing stage after there has been a change in either the program's standards or operations.
3. If number 2 cannot be accomplished, recycle back to Stage 1 (program definition) to redefine the program, then begin the discrepancy evaluation again at Stage 1.
4. If number 3 cannot be accomplished, terminate the program.
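The four choices above form a simple decision procedure, which can be sketched as follows. The function and argument names are invented for illustration; a real DEM evaluation would reach these judgments through deliberation, not boolean flags:

```python
# Sketch of the DEM four-way decision rule for one evaluation stage.
def dem_decision(discrepancy_exists, can_adjust_stage, can_redefine_program):
    """Return the next action for a stage of a Provus discrepancy evaluation."""
    if not discrepancy_exists:
        return "proceed to next stage"
    if can_adjust_stage:            # change this stage's standards or operations
        return "recycle through current stage"
    if can_redefine_program:        # fall back to Stage 1: program definition
        return "recycle back to Stage 1 and restart evaluation"
    return "terminate the program"  # option of last resort

print(dem_decision(True, False, True))  # -> recycle back to Stage 1 and restart evaluation
```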

The DEM is most effective under the following circumstances:
a. When the type of evaluation desired is formal, and the program is in the formative rather than the summative stages.
b. When evaluation is defined as continuous information management addressing program improvement and assessment, and where evaluation is a component of program development.
c. Where the purpose of evaluation is to improve, maintain, or terminate a program.
d. Where the key emphasis of evaluation is program definition and program installation.
e. Where the roles of the evaluator are those of facilitator, examiner of standards, observer of actual behaviors, and design expert.
f. When, at each stage of evaluation, program performance is compared with program objectives (standards) to determine discrepancies.
g. Where the program evaluation procedure is designed to identify weaknesses and to make determinations about correction or termination.
h. Where the theoretical construct is that all stages of programs continuously provide feedback to each other.
i. Where the criteria for judging programs include carefully evaluating whether: (a) the program meets established program criteria, (b) the actual course of action taken can be identified, and (c) a course of action can be taken to resolve all discrepancies.

The model suggests that in each of the five stages (as shown below), it is necessary to compare the performance with the identified standards. As is true of the other models under the scientific category, the application of the Provus evaluation paradigm relies to a great extent on the identification of appropriate performance standards. A clear set of standards enables the evaluator to make judgments about what is valuable and what is not. However, determining the criteria on which to build the standards for evaluation is often problematic. Oftentimes, viewpoints differ on what and whose values or standards are applicable and suitable in a given situation. The lack of agreement on the indicators of success sometimes discourages beginning and inexperienced evaluators from undertaking program analysis.

Comparison of Performance and Standard Based on the Provus Discrepancy Evaluation Model

STAGE   PERFORMANCE                                          STANDARD
I       Design (space, personnel, resources, materials)      Design criteria
II      Installation (actual operation)                      Installation fidelity
III     Processes (instruction, leadership)                  Process adjustment
IV      Products (based on stated objectives)                Product assessment
V       Cost (economic and socio-political implications)     Cost benefit

CONCLUSION
Many more models of evaluation could be added to the list provided. In fact, every situation where a decision is to be made or where a policy needs to be formulated may require an evaluation process. For this reason, it is important that the student understand the essential characteristics of a good evaluation system. In general, the steps involved in formulating a good evaluation model include:
1. Defining and identifying the objectives of the program or project to be evaluated. This serves as the starting point for the entire evaluation process.
2. Deciding on the parameters to be measured, given the objectives. This second step requires that the evaluator be conscious of the critical indicators for every objective stated for the program or project to be evaluated.
3. Deciding on the measurement procedure to be undertaken. In this stage, the evaluators decide on the appropriate measurement device that needs to be developed in order to measure the required parameters of the evaluation model.
4. Measuring the parameters of the evaluation model.
5. Validating the findings against the perceptions of experts, the people involved in the educational program, the public in general, and, more importantly, against the objectives set forth by the program or project.

Questions to Answer
1. What is evaluation? What is program evaluation?
2. Differentiate the two types of evaluation.
3. Name some other models of evaluation in education that you know.
4. Propose a title of a study related to educational program assessment and make a conceptual framework for it using one of the discussed evaluation models.
