Executive Summary:
Program evaluations, be they for funding or for general organizational improvement, have become an integral part of nonprofit and for-profit institutions alike. As such, it is necessary for everyone up and down the chain of command to be familiar with the assessment process in order to derive the greatest value from the effort. Everyone, from the intermittent volunteer to the Chief Executive, plays an important role in the day-to-day operations of an organization, and thus has the ability to contribute significantly to the success of individual initiatives as well as the overall wellbeing of the institution. This Program Evaluation Primer was written to provide a concise yet comprehensive overview of the evaluation process, in a way that is both easy to use and relevant to all stakeholders in a nonprofit environment.
The primer was written with three goals in mind: be concise, yet thorough enough to be effective; provide real-world examples that help the reader apply the steps in a meaningful way; and be understandable and applicable for all potential stakeholders in an organization, from volunteer to Board of Directors.
As such, this guide is broken down into a sequence of steps which, applied in order, form a basis by which any organization will be able to conduct a successful evaluation. To further the effectiveness of this guide, examples have been provided in each section in order to lend relevancy and applicability to nonprofits. By taking the examples provided and substituting an individual organization's specific information, any nonprofit will be able to apply the steps in an easy-to-use manner. In addition to the examples, a resource section is provided at the end of the document for further information on the evaluation process.
Contents
Executive Summary:
What is Program Evaluation?
Some myths regarding Evaluations:
Why is Program Evaluation important?
Options for Program Evaluations:
    Process Evaluations:
    Outcomes Evaluations:
Structure of Evaluations:
    Form Collaborative Relationships:
    Determine Program Components:
    Developing a Logic Model:
    Determining Evaluation questions:
    Methodology:
    Consider a Management Information System/Dashboard:
    Data Collection and Analysis:
    Writing the Report:
Additional Resources:
Works Cited
3. Thirdly, it is sometimes stated that evaluations are unnecessary and merely produce a plethora of inconsequential data. Conducted properly, evaluations are never unnecessary, and they rarely produce data that cannot help your organization move in a positive direction. Thomas Edison was once asked whether he ever got discouraged by his ten thousand attempts to produce a functional light bulb before he finally arrived at one that worked, to which he replied, "I didn't fail ten thousand times. I successfully eliminated, ten thousand times, materials and combinations which wouldn't work."
http://www.grantcraft.org/pdf_2012/guide_outcome.pdf
Process Evaluations:
Nonprofits are formed out of the desire of an individual or a group of individuals to satisfy some societal need that appears to be unmet. As such, it is important for the organizations attempting to satisfy these needs to ensure they are operating in a way that works toward that end. This is why it is important for nonprofit organizations to conduct what are called process evaluations. Process evaluations ask the question "what?" What are we trying to accomplish as an organization? Subsequently, are we implementing our plan as designed in order to accomplish this task? When initiating a new process evaluation, the following three questions should be kept in mind to help guide the assessment:

1. What is the intended purpose of our program?
2. Are we implementing the program as designed in order to accomplish our goals?
3. If there are any gaps between design and implementation, what are they?
Outcomes Evaluations:
Outcomes Evaluations, more commonly referred to as Outcomes Based Evaluations, are very similar to Process Evaluations in their general structure. Where they differ is in the main focus of the assessment. Whereas Process Evaluations concentrate on whether an organization is doing what it intended to do, Outcomes Evaluations focus on the overall effect of an organization's initiatives; or rather, whether an organization's actions are actually helping, and to what extent. If Process Evaluations ask and answer the question "what?", Outcomes Evaluations answer the question "so what?"
Structure of Evaluations:
Now that we have a basic understanding of what process and outcomes evaluations are and what to look for, let's look at the different stages of the assessment. Remember that despite attempting to answer different questions, the two evaluation processes are very similar in nature and thus can benefit from a mutual outline, such as the one listed below. Any potential differences in application will be noted in individual sections. The Georgia Department of Human Resources produced a step-by-step guide on process evaluations in 2002. As part of that guide, they listed a sequence of stages proposed for the implementation of an evaluation. They are as follows:
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report
http://health.state.ga.us/pdfs/ppe/Workbook%20for%20Designing%20a%20Process%20Evaluation.pdf
Given the general outline above, we will describe each individual step in detail, provide examples where needed, and expand on previous concepts and steps in order to provide a concise yet comprehensive overview of the process.
The following example of determining program components was taken from the Georgia Department of Human Resources manual on process evaluations, and provides an easy-to-understand and easy-to-visualize scenario.

EXAMPLE:
Who: Elementary School Students
What: Fire Safety Intervention
When: 2 Times Per Year
Where: In Students' Classroom
How: Group-Administered Intervention, Small-Group Practice
1. Instruct students what to do in case of fire (stop, drop and roll).
2. Educate students on calling 911 and have them practice on play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change the batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take it home to share with their parents. Request a parental signature to indicate compliance and target a 75% return rate.

An important thing to keep in mind is that gathered data is only as valuable as the level of participation it yields. The discipline of statistics teaches us that we can get an adequate indicator of a population by taking a smaller sample and conducting our analysis on that smaller group. However, this assumption holds only if we are able to gather all the necessary information through the active participation of the sub-group. As such, I'd suggest an addendum to the last point in the example above: place an incentive on the parental signature to encourage participation and thus ensure a greater chance of achieving the desired 75% return rate.
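To make the sampling point concrete, here is a rough sketch of a standard sample-size calculation: the normal-approximation formula with a finite-population correction. The 400-student program size is a hypothetical number for illustration, not a figure from this guide.

```python
import math

def sample_size(population: int, margin: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Minimum completed responses needed to estimate a proportion.

    Normal approximation n = z^2 * p * (1 - p) / margin^2, followed by
    a finite-population correction. p = 0.5 is the most conservative
    (largest sample) assumption; z = 1.96 gives roughly 95% confidence.
    """
    n = (z ** 2) * p * (1 - p) / (margin ** 2)
    n_adj = n / (1 + (n - 1) / population)  # finite-population correction
    return math.ceil(n_adj)

# Hypothetical program reaching 400 students: how many signed forms
# must actually come back to estimate a proportion within +/-5%?
needed = sample_size(400)
print(needed, "responses, about a", round(needed / 400 * 100), "% return rate")
```

A calculation like this shows why the return-rate target matters: if too few forms come back, the sample no longer supports conclusions about the whole group.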
http://health.state.ga.us/pdfs/ppe/Workbook%20for%20Designing%20a%20Process%20Evaluation.pdf
Developing a Logic Model:

Logic models are a list of the logical and sequential steps of your program, from idea conceptualization through implementation and into short-, medium-, and long-term goal formation. Most logic models consist of four distinct steps: inputs, activities, outputs, and outcomes. Many individuals get the "deer in the headlights" look at the first mention of logic models, but understood correctly, the concept is quite easy to apply and incredibly beneficial to understanding your program and how your intentions relate to your actions in the effort to attain your goals.

Most logic models are represented visually in order to organize the different elements and better convey the steps. The example below is a basic logic model meant to express the concept in a simplified way. Remember that these models are used to help you understand the sequence of steps in your program; they can be represented in an easily discernible way such as the one below, or could include more detail for a more thorough representation. Neither way is necessarily better than the other; the choice depends on what makes things easier for those involved in the initiative.
[Figure: basic logic model showing Inputs (food donations), Activities (backpack program), Outputs (# of children reached), and Outcomes, with an arrow from Outcomes back to Inputs]
You have probably gleaned from the previous example that this particular nonprofit organization has determined that there is an unmet need in its community with regard to the lack of available food for children on the weekends, and that this is having an adverse effect on childhood health. As a result, it has decided to implement a backpack program, whereby it provides bags of food for children to take home on the weekends, when government-funded school meal programs aren't available. From the example, it is easy to see what the inputs are (or rather, what they have to give): food donations. Likewise, it is also easy to see what the organization's activities are (or rather, how they go about giving what they have to those who need it): the backpack program. Where the largest amount of confusion with logic models comes from, however, is with the last two categories: outputs and outcomes.
One easy way to distinguish between the two categories is to think of them in the following terms: whereas outputs tell stakeholders "how much?", outcomes answer the question "so what?" The intention is to show that we are not simply intervening with a certain amount of effort and donations/services, but that the effort is also having the intended effect. Outputs, or again, what you produce from your programs, have long been an important way to assess your organization's productivity. However, in recent years, funders have been demanding that recipients of their philanthropic resources demonstrate that quality of result is being maintained along with quantity. Such a variable is what we call outcomes. To understand the importance of outcomes in addition to outputs, think of it this way: if you are producing a lot of tires, but they fall apart, what benefit is there in the end?

To ensure clarity on the differentiation of outputs and outcomes, let us revisit the previous example, but this time phrased as a narrative.

Example: An organization recognizes that the health of children in its community is significantly lower than the state average. It hypothesizes that the poor health is due to lack of access to healthy food on the weekends, as quick convenience meals are provided by overworked parents who lack the time to cook wholesome meals. As a result, the organization decides to start a backpack program whereby it provides a bag of healthy snack foods and healthy, quick alternative meals for the children to take home on the weekends. Within the first year, the organization is able to distribute over 10,000 meals and reach a total of 345 children. The following year, as the children go back to school and to the doctor for their annual physicals, the rates of the previously observed childhood illnesses have diminished drastically.
In this example, the number of meals/children served is a quantitative output measurement, whereas the effect, lower childhood illness, is the outcome resulting from the number of meals provided. A final thing to note with logic models is that they are not meant to be a static, linear process but rather, as indicated by the arrow going from Outcomes back to Inputs, a continual process to aid in program improvement. Once you have determined your outcomes and how they indicate the health of your program, the results should be used to readjust your inputs so as to make your initiative more effective.
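The four stages above can be sketched as a small data structure. This is purely an illustration using the backpack-program example from the text; the class and field names are this sketch's own, not part of any evaluation standard.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """Minimal logic model: inputs -> activities -> outputs -> outcomes."""
    inputs: list      # what you have to give (resources consumed)
    activities: list  # how you give it (what the program does)
    outputs: list     # "how much?" (quantitative production)
    outcomes: list    # "so what?" (the change the effort caused)

# The backpack-program example from the text.
backpack = LogicModel(
    inputs=["food donations"],
    activities=["weekend backpack program"],
    outputs=["10,000+ meals distributed", "345 children reached"],
    outcomes=["drastically reduced rates of childhood illness"],
)

# The model is a loop, not a line: outcomes feed back into next
# year's inputs as the program is readjusted.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {getattr(backpack, stage)}")
```

Writing the model down this explicitly, in whatever medium, forces the distinction the text emphasizes: the outputs list holds counts, while the outcomes list holds effects.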
3. Have we conducted our program in different locations in the past?
   a. If so, where?
   b. Are we still there?
   c. If not, why not?
   d. What was the overall experience in previous locations?

WHY:
1. Why have we decided to do this particular program?
2. Why is this more beneficial than another program we could offer?
3. Why are people participating?
4. Why aren't people participating?
In terms of Outcomes Evaluations: Given our previous discussion of logic models, it is probably most prudent to think of this step as a "working backward" approach. The first step in developing a series of questions is to determine what your intended impact will be. In keeping with the theme of previous examples, let's assume for a moment that your organization is attempting to ameliorate childhood illness due to hunger. As such, we've determined that our program will have been successful if yearly medical physical results are improved over the previous year: our outcome. Likewise, we're able to quantify this cause-and-effect relationship through the number of healthy meals we provide and the number of children we serve each year: our output. Therefore, the questions we would use to determine the data for our output, and thus discern the effect of our efforts, could be the following:
1. How many children did we serve?
2. How healthy were they when they started our program?
3. How long have they participated in our program?
4. What has been the change in health over the time they have been in our program?
5. What is the demographic breakdown of the children in our program?
   a. Gender?
   b. Age?
   c. Ethnicity?
   d. Location of their home?
   e. Family size?
   f. Demographics of their household?
   g. Family income?
Methodology:
Now that you've determined which questions need to be answered, you will need to decide on the best way to collect the data. Some data, such as family income, gender, and age, may be available through standard pre-participation paperwork. In this case, this easily available data (commonly referred to as "low-hanging fruit") can be quickly gathered and assessed with no further client involvement. However, for questions such as why people are participating, why people aren't participating, or whether specific times and locations are ideal for your clients, alternative means of data gathering need to be employed. Depending on the type of questions that need to be asked, the following list contains options for further data acquisition:

1. Surveys
2. Interviews
3. Post-tests
4. Behavioral observation
When using a series of questions to gather data, be it through surveys, interviews, or a guideline for observation, Ellen Taylor-Powell, Program Development and Evaluation Specialist for the University of Wisconsin Cooperative Extension service, recommended three main things to consider when determining questions for an evaluation in her paper "Questionnaire Design: Asking questions with a purpose." They are as follows:

Consider:
1. The particular people for whom the questionnaire is being designed;
2. The particular purpose of the questionnaire; and
3. How questions will be placed in relation to each other in the questionnaire.

(For further detail and a number of excellent examples on types of questions, please refer to "Questionnaire Design: Asking questions with a purpose.")
http://learningstore.uwex.edu/assets/pdfs/g3658-2.pdf
Consider a Management Information System/Dashboard:

Dashboards received their name due to the similarity they share with the instrument panel in a car. Just as an instrument panel shows only the relevant information, an MIS dashboard shows only the data that management, a board of directors, etc., need in order to make educated decisions. In keeping with the vehicle theme, most dashboards color-code the data like stoplights: green typically indicates that everything is good to go, yellow means slow down and start to take notice, and red indicates STOP, something might be wrong. In one example dashboard, a nonprofit has set a target of 90% attainment of GED certificates for those enrolled in its program. As of the current date, an 82% rate has been attained, which is within an acceptable, pre-determined range, as indicated by a green box. Though dashboards can contribute significantly to the well-being of an organization by aiding those in decision-making positions in their understanding of key indicators, three things should be kept in mind when determining whether to implement such a system.
http://www.blueavocado.org/node/398
Consider the following acronym: T.E.A.

1. Time: Do you have the time to dedicate? Even though a lot of your data could be collected through standard forms, it still needs to be entered into a system.
2. Effort: As the old adage says, "garbage in, garbage out." Do you have the right data to make it worth your effort?
3. Affordability: Although there are a few free dashboard systems out there, the ones with the most time-saving, friendly interfaces, which provide relevant information and require little learning, are more costly.
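The stoplight logic described in the dashboard discussion can be sketched as a small function. The band widths below are illustrative assumptions; each organization would set its own acceptable ranges.

```python
def status(actual: float, target: float, band: float = 0.15) -> str:
    """Map a metric against its target to a stoplight color.

    green:  at least (1 - band) of target, "good to go"
    yellow: at least (1 - 2 * band) of target, "start to take notice"
    red:    anything lower, "something might be wrong"
    The band widths here are illustrative, not a standard.
    """
    ratio = actual / target
    if ratio >= 1 - band:
        return "green"
    if ratio >= 1 - 2 * band:
        return "yellow"
    return "red"

# The GED example from the text: 82% attainment against a 90% target
# falls inside the acceptable range, so the box shows green.
print(status(0.82, 0.90))
```

Centralizing the thresholds in one place like this is what makes a dashboard trustworthy: every metric is judged by the same pre-determined rules rather than ad hoc impressions.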
Answers:

Employee 1 response: Provide job skills training to the abused to enable them to move out of an abusive relationship.

Employee 2 response: Partner with a substance abuse program, since most abusive relationships stem from dependency on drugs or alcohol.

Client 1 response: Provide the couples a counselor.

Client 2 response: Provide legal representation.

Manager response: There is a high recidivism rate amongst abuse victims; it takes an average of seven interventions before someone leaves an abusive relationship. Therefore, we should continue counseling services outside of stays in the shelter, to attempt to mitigate future relapses in behavior.
In the case of qualitative answers, it is usually helpful to organize like responses into groups. For instance, in the example above, you could probably group the Employee 2, Client 1, and Manager responses into a larger set entitled "Additional Counseling."

Too often, organizations feel the need to provide some sort of numerical justification of their efforts. The important thing to remember is this: not all evaluation results need to have a quantitative aspect to them. What you are really trying to determine is, "What do I need to know in order to convince myself and others that we are making a meaningful impact?" It doesn't always need to be about numbers, but you need to be able to give good justification for your reasoning. Additionally, not all stakeholders, be they funders, clients, or employees, are looking for the same information. People respond to the same information in different ways. Whereas some people like to be presented with concrete numbers, others enjoy hearing first-hand, first-person accounts in the form of stories or narratives. Therefore, it's best to have stories behind your numbers, and numbers behind your stories.
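Grouping like responses can be sketched as simple keyword coding. The theme names and keywords below are illustrative guesses for the shelter example, not a formal coding scheme; in practice a human reviewer assigns the themes.

```python
# Responses paraphrased from the shelter example in the text.
responses = {
    "Employee 1": "Provide job skills training to the abused.",
    "Employee 2": "Partner with a substance abuse program.",
    "Client 1": "Provide the couples a counselor.",
    "Client 2": "Provide legal representation.",
    "Manager": "Continue counseling services outside of shelter stays.",
}

# Theme names and keywords are this sketch's own guesses.
themes = {
    "Additional Counseling": ["counsel", "substance abuse"],
    "Legal Aid": ["legal"],
    "Job Training": ["job skills"],
}

def code_response(text: str) -> list:
    """Return every theme whose keywords appear in the response."""
    lower = text.lower()
    return [theme for theme, keywords in themes.items()
            if any(k in lower for k in keywords)]

grouped = {who: code_response(answer) for who, answer in responses.items()}
for who, tags in grouped.items():
    print(who, "->", tags or ["Uncoded"])
```

Even this crude sketch reproduces the grouping suggested in the text: Employee 2, Client 1, and the Manager all land in "Additional Counseling," which is the signal a report writer would summarize.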
http://managementhelp.org/evaluation/program-evaluation-guide.htm
i) Problem Statement (in the case of nonprofits, a description of the community need that is being met by the product/service/program)
ii) Overall Goal(s) of Product/Service/Program
iii) Outcomes (or client/customer impacts) and Performance Measures (that can be measured as indicators toward the outcomes)
iv) Activities/Technologies of the Product/Service/Program (general description of how the product/service/program is developed and delivered)
v) Staffing (description of the number of personnel and roles in the organization that are relevant to developing and delivering the product/service/program)
6) Overall Evaluation Goals (e.g., what questions are being answered by the evaluation)
7) Methodology
   a) Types of data/information that were collected
   b) How data/information were collected (what instruments were used, etc.)
   c) How data/information were analyzed
   d) Limitations of the evaluation (e.g., cautions about findings/conclusions and how to use them)
8) Interpretations and Conclusions (from analysis of the data/information)
9) Recommendations (regarding the decisions that must be made about the product/service/program)

Appendices (content depends on the goals of the evaluation report), e.g.:
a) Instruments used to collect data/information
b) Data, e.g., in tabular format
c) Testimonials and comments made by users of the product/service/program
d) Case studies of users of the product/service/program
e) Any related literature
Additional Resources:
For further information/reading/examples on evaluations, please refer to the following list of resources:
Performance Evaluation Overview, Delaware Department of Education (February 2012)
http://www.doe.k12.de.us/rttt/files/PerfEvaluationOverview.pdf

5 Tips (and Lots of Tools) to Become an Evaluation-Savvy Nonprofit Leader, NGen
http://www.independentsector.org/blog/post.cfm/5-tips-and-lots-of-tools-to-become-anevaluation-savvy-nonprofit-leader

Using Dashboards in Training Evaluation: Predictive Evaluation Model, Dave Basarab Consulting
http://www.davebasarab.com/blog/dashboard/using-dashboards-in-training-evaluationpredictive-evaluation-model/

Evaluation Strategies for Human Services Programs, The Urban Institute
https://www.bja.gov/evaluation/guide/documents/evaluation_strategies.html

Rigorous Program Evaluations on a Budget: How Low-Cost Randomized Controlled Trials Are Possible in Many Areas of Social Policy, Coalition for Evidence-Based Policy
http://coalition4evidence.org/wp-content/uploads/Rigorous-Program-Evaluations-on-a-BudgetMarch-2012.pdf

Basic Guide to Program Evaluation (Including Outcomes Evaluation), Carter McNamara, MBA, PhD, Authenticity Consulting, LLC
http://managementhelp.org/evaluation/program-evaluation-guide.htm

Workbook for Designing a Process Evaluation, Georgia Department of Human Resources, Division of Public Health; Melanie J. Bliss, M.A., and James G. Emshoff, PhD
http://health.state.ga.us/pdfs/ppe/Workbook%20for%20Designing%20a%20Process%20Evaluation.pdf

The State of Nonprofit Data, NTEN
http://www.nten.org/research/2012-state-of-data
Making Measures Work for You: Outcomes and Evaluation, GrantCraft
http://www.grantcraft.org/pdf_2012/guide_outcome.pdf

Questionnaire Design: Asking questions with a purpose, Ellen Taylor-Powell
http://learningstore.uwex.edu/assets/pdfs/g3658-2.pdf

Finally! Outcome Measurement Strategies Anyone Can Understand, Laurel A. Molloy, MPA

The Many Faces of Nonprofit Accountability, Alnoor Ebrahim, Harvard Business School
http://www.hbs.edu/faculty/Publication%20Files/10-069.pdf

Building a Common Outcome Framework to Measure Nonprofit Performance, The Urban Institute
http://www.urban.org/publications/411404.html

Where Are You on Your Journey from Good to Great?, Jim Collins
http://www.jimcollins.com/tools/diagnostic-tool.pdf

Outcome Based Evaluation, Janet Boguch, MA, Seattle University Institute of Public Service
http://www.seattleu.edu/WorkArea/DownloadAsset.aspx?id=17716
Works Cited