
2.0 Evaluation: When The Community Speaks

“How do we really know that this young man is being served, feels hopeful, has a future?”
Problem: Lots of Talk About Metrics & Evaluation
… but the community is not involved, and therefore passive

 • Data gathered is inconsequential or missing entirely
 • The process is disconnected from projects and intimidating
 • Results are delayed, expensive, and top-down
 • No accountability, transparency, or connection to change

HARD QUESTION: “I can rate and provide feedback on a restaurant, but why can’t I rate or give feedback on the impact of projects in my community?”

ANSWER: It’s an issue of control; it’s also a significant threat to an expensive status quo.

Assessment: In, Through, and Out
… the evidence about sticky learning is clear:

 • Learners need to care about the goals
 • The goals need to be clear: what does success look like?
 • Feedback must be built in from the very beginning
 • If learners gain the skills to assess their own work, they’ll achieve
 • If I am part of the solution, then I can solve the problems

HARD QUESTION: “You can teach me how to fish, but shouldn’t I know if the water is polluted?”

ANSWER: If people can’t measure it, then it’s absolutely unsustainable. It simply evaporates.

It’s Not About Supply, But Demand
2.0, transparent, accessible, accountable, practical

 • 2.0: “My voice, my needs, and my vote – count”
 • Transparent: “I need to trust the data, that NGO, this process”
 • Accessible: “I can learn to gather, organize, & categorize data”
 • Accountable: “I must be invested in results”
 • Practical: “We can turn data into deliverables”

HARD QUESTION: “Do people really know what they need? After all, isn’t that what we’re here for … to help?”

ANSWER: Remember – brains are evenly distributed throughout the world. Tap the local community. The supply approach has not worked.

OK. “Amazon” Meets Social Impacts
… yet it’s far more than stars and comments and blogs

 • The community learns why data matters
 • The community gathers data from multiple sources
 • The community learns how to sort it, keep it secure, and verify it
 • The community establishes a baseline and monitors progress
 • The community reports from cell phones to the web (a minimal sketch follows this slide)
 • We can discuss and make change – together

HARD QUESTION: “I get the concept, but there are so many variables and obstacles. How will you overcome them?”

ANSWER: Start small. Be reasonable. Narrow the focus. Share best practices. Ask for help.

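A minimal sketch, in Python, of the workflow on this slide (gather, verify, set a baseline, monitor, report). It is illustrative only: the Rating fields, the 1–5 star scale, and the baseline comparison are assumptions made for this example, not TWB’s actual data model.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Rating:
        project: str    # named community project (hypothetical example)
        score: int      # 1-5 stars, as on a retail or restaurant site
        comment: str
        verified: bool  # has another community member confirmed the report?

    def project_report(ratings, baseline):
        """Summarize verified community ratings against an agreed baseline."""
        verified_scores = [r.score for r in ratings if r.verified]
        if not verified_scores:
            return {"status": "no verified data yet"}
        average = round(mean(verified_scores), 1)
        return {
            "verified_ratings": len(verified_scores),
            "average_score": average,
            "change_from_baseline": round(average - baseline, 1),
        }

    # Example: two verified reports, one unverified, baseline agreed at 3.0
    reports = [
        Rating("well repair", 4, "water flows daily", True),
        Rating("well repair", 5, "pump fixed quickly", True),
        Rating("well repair", 2, "not yet confirmed", False),
    ]
    print(project_report(reports, baseline=3.0))
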
Everyone is Afraid of This. Why?
… because it holds us all accountable and it’s scary

 • It’s easier to count #’s “served” than impacts “achieved”
 • It’s easier to provide “solutions” rather than ask questions
 • It may expose over-spending and under-producing
 • It’s a thorny problem, subject to criticism and ridicule
 • Because it’s easier to follow the money than to track the impacts

HARD QUESTION: “What if, after all my hard work as a funder or grantee, the community gives me low marks?”

ANSWER: By being open to scrutiny, you build trust. The community will help you be successful.

Why Does TWB Want To Do This?
… because this is a huge gap and a moral obligation

 • We have to know that what we do – matters
 • If one can’t measure it, it’s not worth doing
 • Even if our initial ratings are low – the community will help
 • We can revolutionize the field, demonstrate impacts, lower costs
 • With a high standard, we stimulate global, personal philanthropy
 • We want to show proof that education is the key to development

HARD QUESTION: “Are you sure you want to disrupt the field and expose your weaknesses? This is suicidal!?”

ANSWER: We want to show how education is the key to development. We’ll be the guinea pig. If one cannot prove program quality, then it’s all smoke and mirrors.

What We’re Working On
… building a model, thinking it through, consulting villages

 • May 2009: introduce a basic “report to web” function (sketched below)
 • Synthesize research on community evaluation and participatory “best practices”
 • Teach selected communities about evaluation (Nigeria, Mexico, China)
 • Work with Bureaus and communities to design metrics and evaluation up front

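As a rough illustration of what a basic “report to web” function could look like (the message format and every field name here are assumptions, not the planned design), a short structured text message from a cell phone might be parsed into a record the community dashboard can display:

    def parse_sms_report(text):
        """Parse 'project; metric; value; reporter' into a dashboard record."""
        project, metric, value, reporter = [part.strip() for part in text.split(";")]
        return {
            "project": project,
            "metric": metric,    # e.g. attendance, achievement, behavior
            "value": int(value),
            "reporter": reporter,
            "verified": False,   # verification happens later, in the community
        }

    # Example message a trained community reporter might send:
    print(parse_sms_report("teacher training; attendance; 42; A. Reporter"))
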

TODAY’S REPORT (May 13, 2009)

16,370 more graduates of the Certificate of Teaching Mastery

AT A GLANCE
 7,110 Men | 74% online
 9,260 Women | 83% online

Community metrics: achievement, attendance, behavior
Community Data Collection Methods
Feedback by Course (stars and comments)
Overall Satisfaction: 4.7/5
Information Verified | Not Verified
Full Report
