
Week 7 Parts D, E and F

By Jamie Doiron

Thoughts on The User Friendly Handbook for Mixed Method Evaluation


Part D

Chapter 2 Reactions
Clear, easy-to-follow project description
Concise goals
Do the goals stop short?
What about improving the test scores of the elementary students enrolled in the classes? That would be hard to track, but it would demonstrate the program's effectiveness downstream.

Clear difference between formative and summative evaluation

Chapter 2 Realizations
An evaluation may contain both formative and summative components
The design does not have to be 100% complete in the RFP
Project goals and the project description are separate items

Chapter 6 Reaction
Many items were still undetermined before the RFP was accepted and the evaluation began:
Budget
Timeline
Resources
Evaluation plans

Excellent alignment between project goals, questions, and stakeholders
Alignment between questions, data sources, and collection methods

Chapter 6 Realizations
Both quantitative and qualitative data are acceptable for an evaluation
Probably best to have both instead of just one

Prioritization is required to achieve evaluation goals on schedule and within budget
A definitive schedule for data collection helps keep the evaluation on track

Thoughts on Chapter 5 of The Course Text


Part D

Chapter 5 Reaction
The distinction between research and evaluation models is confusing
The discrepancy model could be used for many things because all systems are interconnected: nothing exists in a vacuum
The goal-free model measures effectiveness against a program's customers or users
The transaction model provides lots of immediate feedback to the program being evaluated
The decision-making model is for analyzing

Chapter 5 Realizations
All evaluation models involve some sort of research
The discrepancy model works best when standards exist
To properly use the goal-free model, the evaluator must be truly objective
The transaction model requires a good relationship between the evaluator and program staff
The decision-making model is only for long-term programs or outcomes
The systems analysis model is the most familiar to me, as it is similar to what I did when working as a systems analyst in an IT department
The goal-based model could apply to many types of programs

Comparison of Chapter 5 and Chapters 2 & 6


Part D

Similarities
Chapters 2 and 6 of the internet readings seem to use the goal-free evaluation model
Both measure the program's effectiveness from the perspective of its users

All models discussed in all readings rely heavily on data
Data gathering is a large part of the evaluation process for all models
All readings discussed qualitative and quantitative data
All models require heavy stakeholder involvement

Discrepancies
The internet readings dealt with only one scenario and one model, which used both qualitative and quantitative data
The book compares and contrasts the models but does not recommend one over another
The scenario makes no mention of the model used, only the process by which data will be collected, analyzed, and used

Which Evaluation Model Best Suits My Evaluation Project?


Part E

My Evaluation Model
Discrepancy model:
Design
Installation
Process
Product
Cost-benefit analysis
Compare to performance standards

Reasoning
Evaluation question: Will Google Sites meet the needs of my school district and function as an LMS?
Compare Google Sites' capabilities to the district's needs, which function as "performance standards"
There is a cost involved
Input from many stakeholders is required
Determine the differences between what is and what should be

Custom Assignment Design


Part F

Assignment Description
Create a calendar for the entire evaluation that cross references the data collection timeline, stakeholders needed, type of data being collected, and collection methods.

My submission
Click the link below to be taken to an interactive Google Calendar. I added appointments that correspond to all necessary meetings. If you click an appointment and view its details, you can see the questions from the scenario that would be asked at that meeting. The evaluator could use this as a timeline and scheduling tool for interview and evaluation activities, as well as to keep stakeholders informed of the data to be gathered and the questions to be discussed at interviews. Invitations could also be sent to stakeholders' email addresses directly from the events. The hypothetical evaluation and its events begin in May 2013. Click here for calendar.
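The cross-referencing the assignment calls for (date, activity, stakeholders, data type, collection method) could also be sketched in a few lines of code. This is only an illustrative mock-up: every name, date, and method below is an invented placeholder, not drawn from the actual scenario or calendar.

```python
from datetime import date

# Hypothetical evaluation calendar entries. Each entry cross-references the
# four things the assignment asks for: timeline, stakeholders, data type,
# and collection method. All values are made-up examples.
events = [
    {"date": date(2013, 5, 6), "activity": "Kickoff interview",
     "stakeholders": ["District IT director"],
     "data_type": "qualitative", "method": "semi-structured interview"},
    {"date": date(2013, 5, 20), "activity": "Teacher survey opens",
     "stakeholders": ["Teachers"],
     "data_type": "quantitative", "method": "online survey"},
]

def agenda(events):
    """Return one printable line per event, sorted by date."""
    lines = []
    for e in sorted(events, key=lambda e: e["date"]):
        lines.append(f"{e['date']}: {e['activity']} | "
                     f"{', '.join(e['stakeholders'])} | "
                     f"{e['data_type']} via {e['method']}")
    return lines

for line in agenda(events):
    print(line)
```

A spreadsheet or the Google Calendar event-description field serves the same purpose; the point is simply that each scheduled activity carries its stakeholders, data type, and collection method alongside its date.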