Terms of Reference - Electoral Expert

PROJECT FINAL EVALUATION

Project Title: Strengthening of the Democratic Process in Egypt
Post Title: Short-term Electoral Expert
Application Deadline: 10 April 2016

I. Background and Context:
The UNDP electoral assistance project Strengthening of the Democratic Process in Egypt was signed in November 2011 between the High Elections Committee, the Ministry of Foreign Affairs and UNDP and was designed to support national efforts to enhance the credibility and sustainability of electoral institutions and processes, with a particular emphasis on capacity development and women's empowerment. The project focused on four components: (i) strengthening public outreach and voter information; (ii) upgrading the technical and operational capacity of national electoral authorities; (iii) incorporating lessons learned and best practices in subsequent electoral processes; and (iv) increasing access of women and rural dwellers to their citizenship rights. The project, which was completed in December 2015, had a total budget of approximately US$20,668,187, supported by 12 Development Partners, namely Austria, Australia, Belgium, Germany, Ireland, Japan, the Netherlands, Norway, Romania, Sweden, Switzerland and UNDP.
Within the first component, strengthening public outreach and voter information, the project supported SIS/HEC/PEC in a) the implementation of public outreach campaigns; b) the dissemination of nationally devised gender-sensitive voter information products; c) the implementation of a media relations strategy through the organization of media trainings and workshops; and d) the production of a media-elections handbook to enhance the media's understanding of the electoral cycle. Furthermore, during 2013-2015, the project supported SIS/HEC in the production of a gender-sensitive civic and voter education programme for broadcast, TV and print through the Voter Education Initiative, focusing on rural women and youth.
Within the second component, upgrading the technical and operational capacity of national electoral authorities, the project provided technical assistance to SIS/HEC/PEC/MOI by a) developing and producing training materials, such as illustrated procedural manuals and simulation videos for judges, and arranging an orientation on electoral procedures; b) upgrading IT infrastructure and procuring various electoral materials and equipment, including a data centre to host the electoral register and all election-related information; c) availing opportunities for electoral authorities to participate in international fora to share comparative knowledge and experiences; and d) organizing six workshops and international conferences in Cairo on different electoral topics.
Within component 3, incorporating lessons learned and best practices of the 2011 elections for subsequent electoral processes, the project helped organize lessons-learned workshops after various electoral events so that future electoral processes could benefit from knowledge sharing and experience. Furthermore, an Arabic lexicon of electoral terminology was developed, published and disseminated in eight countries in order to increase the understanding of different electoral terms within the Arab region.
Within component 4, increasing access of women and rural dwellers to their citizenship rights in the 2011 and future elections, the project contributed to the Women Citizenship Initiative, led by UN Women, which aims to facilitate the ongoing national registration process for the issuance of national ID cards for women over a period of three years. As of November 2015, the National Council for Women (NCW) became the lead national partner for the initiative. Nonetheless, MPMAR has been responsible for issuing 181,868 ID cards to date. The Ministry has developed different implementation models to speed up the process of ID card issuance and to maximize efficiency. The initiative is supported by a strong gender information campaign on TV, radio, social media and printed materials.
Throughout its implementation, the project worked with various counterparts, including the High Electoral Commission (HEC), the Presidential Electoral Commission (PEC), the Ministry of Communication and Information Technologies (MCIT), the Ministry of Foreign Affairs (MoFA), the Ministry of Interior (MoI), the Ministry of State for Administrative Development (MSAD) (currently the Ministry of Planning, Monitoring and Administrative Reform (MPMAR)), the State Information Service (SIS), the National Council for Women (NCW), the Social Fund for Development (SFD), the Information and Decision Support Centre (IDSC), UN Women and the International Foundation for Electoral Systems (IFES).
II. Evaluation purpose:
As stipulated in the signed project document, the project is expected to conduct a project evaluation to
analyse the achievements of the project against its original objectives while providing project partners
with an independent review of project outputs. The evaluation will review technical and managerial
aspects and consider issues of effectiveness, efficiency, relevance, impact and sustainability. The
evaluation will identify factors that have facilitated and/or impeded the achievement of objectives and
should result in recommendations and lessons learned that could benefit the various stakeholders. The
evaluation will specifically consider aspects of monitoring and evaluation; capacity building in electoral
administration; and gender mainstreaming.

III. Evaluation scope and objectives:

- Assess the status of project results and how they have been achieved, with an assessment of UNDP's contribution and approach
- Identify factors that have facilitated and/or impeded the achievement of objectives
- Assess the relevance, efficiency, effectiveness, and sustainability of the project
- Assess the impact of external and internal factors on the effectiveness of the project
- Assess the efficiency and the adequacy of the management arrangements of the project
- Evaluate the project strategy in enhancing national capacity in electoral administration
- Identify lessons learned and good practices with regard to project implementation and partnerships
- Assess the Monitoring and Evaluation framework used by the project

The evaluation will cover the duration of the project, from November 2011 until project completion in December 2015, and will also take into account those activities that were carried out before the signing of the project document.
IV. Evaluation Criteria:
The project will be evaluated against the following criteria:
- Relevance: The evaluation will assess the degree to which the project remained relevant to the context in which it was implemented and in accordance with the priorities of the Government of Egypt, the mandate of UN electoral assistance and UNDP goals in Democratic Governance.
- Efficiency/management: The evaluation will assess the outputs realized in relation to the inputs, timeframe, funds and expertise provided, looking for example at whether the management structure was appropriate.
- Effectiveness: The evaluation will assess the extent to which the project outputs, outcomes and results have been achieved.
- Impact and sustainability: The evaluation team should assess the lasting change brought about by the project.
- Gender sensitivity: The evaluation will assess to what degree the project was sensitive to the inclusion of gender.

V. Evaluation questions
Evaluation questions set the parameters of the evaluation and, when answered, give users of the evaluation the information they seek in order to make decisions, take action or add to knowledge. The evaluation should address at least the following questions:
- Were stated outputs achieved?
- What progress toward the outputs has been made?
- What factors have contributed to achieving or not achieving intended results?
- Were the actions to achieve the outputs and outcomes effective and efficient? Has the project been effective in providing support to the electoral authorities in Egypt in effectively conducting elections in the areas stated in the project?
- Did the project make the best use of its resources to achieve its results? Has the project been efficient in implementing its activities?
- What factors contributed to effectiveness or ineffectiveness?
- Has the project partnership strategy been appropriate and effective? To what extent has the project been able to build and promote its partnerships with other relevant stakeholders for greater results?
- What efforts have been made to encourage sustainability of the project results?
- What are the lessons learned of this project for stakeholders in and outside Egypt?

VI. Methodology
1. Preparation of Inception Report: The methodology that will be used by the evaluators should be presented in detail in the inception brief and the final report. The methodology must be agreed upon between UNDP, the evaluators and project partners prior to the start of the evaluation.
2. Data Collection:
a. Documentation review: This would include a) the Project Document, with the description of the project, its intended outcome and outputs, the baseline for the outcome/outputs and the indicators used; b) Project Progress Reports; c) Project Technical Reports, including training materials, publications, etc.
b. Key participant interviews with MOFA, UNDP, key focal points in HEC, PEC, SIS, MOI and MPMAR, Development Partners, the Chief Electoral Advisor and the Project Coordinator (both via Skype), former project staff, UN Women, and other partners such as the Al Ahram Center for Strategic Studies and international partners such as IFES.
3. Preparation of Draft Final Report: incorporating comments and inputs from various stakeholders and partners.
4. Preparation of Final Report.


VII. Composition of the Evaluation Team

The final evaluation will be carried out by a team of three national consultants: a Monitoring and Evaluation Expert, an Electoral Expert and a Gender Expert to evaluate the gender dimensions of the project. None of the three consultants should have participated in the project preparation and/or implementation, and none should have any conflict of interest with project-related activities. The estimated duration of the evaluation is two months, with a level of effort of 25 working days.
The Evaluation Team will be responsible for producing the following deliverables:
1. Inception Reports: each of the evaluation experts will be responsible for preparing an inception
report for their respective focus area
2. Draft Evaluation Reports: each of the evaluation experts will be responsible for preparing a
draft report for their respective focus area for discussion with MOFA and UNDP
3. Final Evaluation Reports (See Evaluation Report Template in Annex 3): each of the evaluation
experts will be responsible for submitting a final report for their respective focus area
4. Brief Executive Summary for each of the reports

VIII. Time-frame for the evaluation process

The evaluation will take place over a period of 25 days. The tentative schedule follows:
Planned Activities                                                    Tentative Days

Desk review, briefing meetings with MOFA and UNDP, and
preparation of the evaluation design and inception report             3 days

Finalization and presentation of the Inception Report to
MOFA and UNDP                                                         2 days

Meetings with national partners, Development Partners,
project team, etc.                                                    15 days

Preparation of Draft Report and presentation of draft
findings to the Reference Group, MOFA and UNDP for feedback           3 days

Finalization and submission of Report                                 2 days

Total                                                                 25 days

To facilitate the evaluation process, MOFA and UNDP will assist with the organization of meetings
with the relevant electoral authorities, development partners, institutions and key stakeholders.
MOFA and UNDP will be responsible for preparing and coordinating the full agenda of the final
evaluation in consultation with the relevant national and international stakeholders. It is anticipated
that the mission will meet the following key partners:
- HEC Chair and spokesperson
- Former Chair of PEC
- Representatives of relevant ministries, including the Ministry of State for Administrative Development, the Ministry of Interior, the State Information Service and the Information and Decision Support Centre
- MOFA
- Representatives of think tanks and research centres, namely the Al Ahram Center for Political and Strategic Studies and Baseera
- Representatives of Development Partners
- Representative of UN Women
- UNDP

IX. Roles and Responsibilities of the Electoral Expert

- Prepare an inception brief identifying the electoral aspects of project formulation and implementation to be addressed and the proposed methodology of data collection and analysis
- Assess the effectiveness of the project in enhancing electoral institutions and processes during project design and implementation
- Analyse project achievements and conduct an impact evaluation
- Prepare a separate Electoral report to be integrated into the final evaluation report and presentation

X. Deliverables of the Electoral Expert:


1. Inception Brief: outlines the main evaluation issues to be addressed, including the relevant evaluation questions and the proposed and final methodology agreed upon before the evaluation begins. The Electoral Expert will prepare an inception brief on the electoral dimensions of the project. The brief will be presented to MOFA and UNDP in a joint meeting with the M&E and Gender Experts.
2. Draft Evaluation Report and Presentation of Findings: the draft report covering the electoral component of the project will be prepared and presented by the Electoral Expert. The report will be shared with both MOFA and UNDP, following which a meeting will be held to discuss it, along with the reports prepared by the M&E and Gender Experts, to ensure that the evaluation meets MOFA and UNDP expectations as stipulated in the Evaluation Terms of Reference.
3. Final Evaluation Report, including Executive Summary

XI. Qualifications:
The Electoral Expert will possess the following qualifications and competencies:
- Advanced university degree in Public Policy and Administration, Law, International Law, Economics, Development Management or a related discipline, with five years' experience
- Recognized experience in implementing and evaluating development projects and results-based management
- Advanced knowledge of and extensive experience in electoral issues, preferably in developing countries
- Familiarity with recent UNDP Monitoring and Evaluation Policy would be an asset
- Experience with multilaterally or bilaterally supported projects
- Fluency in English and strong technical writing and analytical skills

XII. Evaluation of Applicants
Individual consultants will be evaluated based on a cumulative analysis taking into consideration the combination of the applicant's qualifications and financial proposal. The award of the contract should be made to the individual consultant whose offer has been evaluated and determined as:

- Responsive/compliant/acceptable, and
- Having received the highest score out of a pre-determined set of weighted technical and financial criteria specific to the solicitation.

Only the highest-ranked candidates found qualified for the job will be considered for the Financial Evaluation.
Technical Criteria: 70% of total evaluation

Criterion                                                       Weight
Relevant professional experience                                30
Proposed methodology, its appropriateness to the
assignment, and timeliness of the implementation plan           50
Previous working experience on similar assignments              20
Total                                                           100

Financial Criteria: 30% of total evaluation

To be computed as a ratio of the proposal's offer to the lowest price among the proposals received by UNDP.
Applicants receiving a score of less than 70% will be technically disqualified.
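For illustration only, the sketch below shows one way the cumulative weighted score described in this section could be computed. It assumes the conventional reading of the financial ratio (lowest offer divided by the applicant's offer, so the lowest-priced proposal scores 100) and uses the 30/50/20 technical weights and the 70% disqualification threshold stated above; all function and variable names are hypothetical, not part of this ToR.

```python
# Illustrative sketch of a cumulative 70/30 weighted evaluation (hypothetical names).
# Assumes: technical criteria weighted 30/50/20 (out of 100), a 70% technical
# pass mark, and the financial score computed as lowest offer / applicant's offer.

TECH_WEIGHT = 0.70     # technical criteria: 70% of total evaluation
FIN_WEIGHT = 0.30      # financial criteria: 30% of total evaluation
TECH_PASS_MARK = 70.0  # applicants scoring below 70% are technically disqualified


def technical_score(experience: float, methodology: float, similar_work: float) -> float:
    """Weight the three criteria scores (each out of 100) by 30/50/20 per the ToR."""
    return 0.30 * experience + 0.50 * methodology + 0.20 * similar_work


def combined_score(tech: float, offer: float, lowest_offer: float) -> float | None:
    """Return the cumulative score out of 100, or None if technically disqualified."""
    if tech < TECH_PASS_MARK:
        return None  # not considered for the financial evaluation
    financial = 100.0 * lowest_offer / offer  # lowest-priced offer scores 100
    return TECH_WEIGHT * tech + FIN_WEIGHT * financial


# Example: criteria scores of 80, 75 and 90 give a technical score of 79.5;
# an offer of 12,000 against a lowest offer of 10,000 yields about 80.65 overall.
tech = technical_score(80, 75, 90)
print(combined_score(tech, offer=12_000, lowest_offer=10_000))
```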
XIII. Terms of Payment

- 30% upon submission of the Inception Report on the Electoral component
- 30% upon submission of the Draft Evaluation Report on the Electoral component
- 40% upon submission of the Final Report and Executive Summary on the Electoral component

XIV. Evaluation ethics
All UNDP Programme and project evaluations are to be conducted in accordance with the principles
outlined in the UNEG Ethical Guidelines for Evaluation and the UNEG Code of Conduct for
Evaluation in the UN System. Both documents can be found at the following link:
http://www.uneval.org/search/index.jsp?q=ethical+guidelines
Evaluations of UNDP-supported activities need to be independent, impartial and rigorous. Each
evaluation should clearly contribute to learning and accountability. Hence, evaluators must have
personal and professional integrity and be guided by propriety in the conduct of their business.

Evaluators:

- Must present information that is complete and fair in its assessment of strengths and weaknesses so that decisions or actions taken are well founded.
- Must disclose the full set of evaluation findings along with information on their limitations, and have this accessible to all affected by the evaluation with expressed legal rights to receive results.
- Should protect the anonymity and confidentiality of individual informants. They should provide maximum notice, minimize demands on time, and respect people's right not to engage. Evaluators must respect people's right to provide information in confidence, and must ensure that sensitive information cannot be traced to its source. Evaluators are not expected to evaluate individuals, and must balance an evaluation of management functions with this general principle.
- Should consult with other relevant oversight entities when there is any doubt about if and how issues should be reported.
- Should be sensitive to beliefs, manners and customs and act with integrity and honesty in their relations with all stakeholders. In line with the UN Universal Declaration of Human Rights, evaluators must be sensitive to and address issues of discrimination and gender equality. They should avoid offending the dignity and self-respect of those persons with whom they come in contact in the course of the evaluation. Knowing that an evaluation might negatively affect the interests of some stakeholders, evaluators should conduct the evaluation and communicate its purpose and results in a way that clearly respects the stakeholders' dignity and self-worth.
- Are responsible for their performance and their product(s). They are responsible for the clear, accurate and fair written and/or oral presentation of study limitations, findings and recommendations.
- Should reflect sound accounting procedures and be prudent in using the resources of the evaluation.

The Evaluators must sign an Evaluation Staff Agreement form at the start of their contract (see
Annex 2).
XV. Implementation Arrangements
The principal responsibility for managing this evaluation lies with MOFA and UNDP. The report will
be cleared by MOFA and UNDP.
The time frame above does not include two weeks of unpaid time, during which MOFA and UNDP Egypt will analyse the draft report, provide comments to the evaluators and share the draft report with different stakeholders.
Interested applicants are invited to send a CV, P11 form and cover letter (specifying the title of the consultancy); a proposed methodology outlining how the Consultant will execute the assignment; a written sample of prior evaluation and/or assessment work; and a detailed financial proposal to the following e-mail: alya.hegazy@undp.org
Vacancy Notice issued on: 27 March 2016
Deadline for application is: 10 April 2016

ANNEX 1
Documents to be consulted: a list of important documents and webpages that the evaluator(s) should read at the outset of the evaluation and before finalizing the evaluation design and the inception report.

- Project Document and any revisions
- Websites:
  o www.undp.org.eg
  o https://www.elections.eg/
- UNDAF and UNDP CPD/CPAP
- UNDP M&E Yellow Handbook
- Project operational guidelines, manuals and systems
- Quarterly Progress Reports and detailed activity progress reports
- Project Annual Reports
- Minutes of Board meetings and other project management meetings
- Presentations and other inputs to Board meetings and project management meetings
- Combined Delivery Report
- Atlas Reports (such as the AWP and Project Budget Balance report)
- Project Implementation Reviews
- UNDP User Guide (relevant sections)

Other reference documents produced by the project:
- Citizenship Initiative project document
- Mapping study of UN Women
- Concept notes on the various round tables and sub-regional conferences
- Reports on study tours and lessons-learnt exercises of HEC and PEC
- Reports of the internal UNDP meetings with the UNDP electoral assistance projects in Libya and Tunisia
- Action plan

ANNEX 2
United Nations Evaluation Group Code of Conduct for Evaluation in the UN System

Evaluation Consultant Agreement Form

To be signed by all consultants as individuals (not by or on behalf of a consultancy company) before a contract can be issued.

Agreement to abide by the Code of Conduct for Evaluation in the UN System

Name of Consultant: __________________________________________________________________

Name of Consultancy Organization (where relevant): ________________________________________

I confirm that I have received and understood and will abide by the United Nations Code of Conduct for Evaluation.

Signed at (place) on (date)

Signature: __________________________________________________________________

ANNEX 3
EVALUATION REPORT TEMPLATE AND QUALITY STANDARDS
This evaluation report template is intended to serve as a guide for preparing meaningful, useful and credible evaluation reports that meet quality standards. It does not prescribe a definitive section-by-section format that all evaluation reports should follow. Rather, it suggests the content that should be included in a quality evaluation report. The descriptions that follow are derived from the UNEG Standards for Evaluation in the UN System and Ethical Standards for Evaluations.
The evaluation report should be complete and logically organized. It should be written clearly and be understandable to the intended audience. In a country context, the report should be translated into local languages whenever possible (see Chapter 8 for more information). The report should also include the following:
Title and opening pages: must provide the following basic information:

- Name of the evaluation intervention
- Time-frame of the evaluation and date of the report
- Countries of the evaluation intervention
- Names and organizations of evaluators
- Name of the organization commissioning the evaluation
- Acknowledgements

Table of contents: must always include boxes, figures, tables and annexes with page references.
List of acronyms and abbreviations
Executive summary: A stand-alone section of two to three pages that should:

- Briefly describe the intervention of the evaluation (the project(s), programme(s), policies or other intervention) that was evaluated.
- Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses.
- Describe key aspects of the evaluation approach and methods.
- Summarize principal findings, conclusions, and recommendations.

Introduction: Should:

- Explain why the evaluation was conducted (the purpose), why the intervention is being evaluated at this point in time, and why it addressed the questions it did.
- Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why, and how they are expected to use the evaluation results.
- Identify the intervention of the evaluation (the project(s), programme(s), policies, or other intervention; see the upcoming section on the intervention).
- Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report's intended users.

Description of the intervention: provides the basis for report users to understand the logic and assess the merits of the evaluation methodology, and to understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. The description should:

- Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.
- Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.
- Link the intervention to national priorities, UNDAF priorities, corporate multi-year funding frameworks or strategic plan goals, or other programme- or country-specific plans and goals.
- Identify the phase in the implementation of the intervention and any significant changes (e.g., plans, strategies, logical frameworks) that have occurred over time, and explain the implications of those changes for the evaluation.
- Identify and describe the key partners involved in the implementation and their roles.
- Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.
- Indicate the total resources, including human resources and budgets.
- Describe the context of the social, political, economic and institutional factors, and the geographical landscape, within which the intervention operates, and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.
- Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation scope and objectives: The report should provide a clear explanation of the evaluation's scope, primary objectives and main questions.

- Evaluation scope: the report should define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.
- Evaluation objectives: the report should spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.
- Evaluation criteria: the report should define the evaluation criteria or performance standards used, and explain the rationale for selecting the particular criteria used in the evaluation.
- Evaluation questions: evaluation questions define the information that the evaluation will generate. The report should detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

Evaluation approach and methods: The evaluation report should describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description should help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description of the methodology should include discussion of each of the following:

- Data sources: the sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions.
- Sample and sampling frame: if a sample was used, the sample size and characteristics; the sample selection criteria (e.g., single women, under 45); the process for selecting the sample (e.g., random, purposive); if applicable, how comparison and treatment groups were assigned; and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results.
- Data collection procedures and instruments: methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source, and evidence of their reliability and validity.
- Performance standards: the standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).
- Stakeholder participation: stakeholders' participation in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.
- Ethical considerations: the measures taken to protect the rights and confidentiality of informants (see UNEG Ethical Guidelines for Evaluators for more information).
- Background information on evaluators: the composition of the evaluation team, the background and skills of team members, and the appropriateness of the technical skill mix, gender balance and geographical representation for the evaluation.
- Major limitations of the methodology: major limitations of the methodology should be identified and openly discussed as to their implications for the evaluation, as well as steps taken to mitigate those limitations.

Data analysis: The report should describe the procedures used to analyse the data collected to answer the evaluation questions. It should detail the various steps and stages of analysis that were carried out, including the steps to confirm the accuracy of data and the results. The report should also discuss the appropriateness of the analyses to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.

Findings and conclusions: The report should present the evaluation findings based on the analysis, and conclusions drawn from the findings.

- Findings: should be presented as statements of fact that are based on analysis of the data. They should be structured around the evaluation questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results should be explained, as well as factors affecting the achievement of intended results. Assumptions or risks in the project or programme design that subsequently affected implementation should be discussed.
- Conclusions: should be comprehensive and balanced, and highlight the strengths, weaknesses and outcomes of the intervention. They should be well substantiated by the evidence and logically connected to evaluation findings. They should respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making of intended users.

Recommendations: The report should provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. They should address the sustainability of the initiative and comment on the adequacy of the project exit strategy, if applicable. Recommendations should also provide specific advice for future or similar projects or programming.

Lessons learnt: As appropriate, the report should include discussion of lessons learnt from the evaluation, that is, new knowledge gained from the particular circumstance (intervention, context, outcomes, even evaluation methods) that is applicable to a similar context. Lessons should be concise and based on specific evidence presented in the report.
Report annexes: Suggested annexes should include the following to provide the report user with supplemental background and methodological details that enhance the credibility of the report:

- ToR for the evaluation
- Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.), as appropriate
- List of individuals or groups interviewed or consulted and sites visited
- List of supporting documents reviewed
- Project or programme results map or results framework
- Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators
- Short biographies of the evaluators and justification of team composition
- Code of conduct signed by evaluators
