Evaluability Assessment Consultant - Addressing Learning Losses

This tender is managed via UNDP Quantum, the new supplier portal system. If you are interested in submitting a bid for this tender, you must subscribe by following the instructions in the user guide. If you have not yet registered a profile with this system, you can do so by following the link for Supplier Registration.

If you already have a supplier profile, please access the negotiation via the quicklink, or log in to the Supplier Portal and search for the negotiation using the reference number UNDP-PHL-00071,1, following the instructions in the user guide.

Introduction

I. Project Title

 

Using Strategic Monitoring and Evaluation to Accelerate the Achievement of National Development Goals (Strategic M&E Project)

 

II. Project Description (Background and Rationale)

 

In 2017, the National Economic and Development Authority (NEDA) and the United Nations Development Programme (UNDP) partnered to strengthen the conduct of evaluations in support of the National Evaluation Policy Framework (NEPF). The Strategic Monitoring and Evaluation (M&E) Project, financed by NEDA and implemented with full UNDP country office support, aims to build the capacity of NEDA and select government agencies to conduct evaluations of priority programs, in turn supporting the achievement of national development goals outlined in the Philippine Development Plan (PDP) as well as the UN Sustainable Development Goals (SDGs).

 

In December 2022, NEDA completed the Philippine Development Plan (PDP) 2023-2028, which is anchored on the 8-point socioeconomic agenda of the current administration. In parallel with the PDP formulation, the NEDA Investment Programming Group (IPG) drafted a six-year evaluation agenda (SYEA), which helps foster an evaluation culture in the public sector in support of the objectives of the National Evaluation Policy Framework (NEPF). Specifically, the SYEA is designed to help enhance the implementation, updating, and assessment of the PDP by providing the necessary evidence on the progress, completion, outcomes, and impacts of government policies, programs, and projects in pursuit of national development goals. While a NEDA-IPG initiative, the implementation of the SYEA will involve various evaluation stakeholders (e.g., government agencies, development partners, and academia), either as partners in conducting key activities or as users of the evaluation/knowledge products. The expectation is that the SYEA will form the core of NEDA’s own evaluation roadmap and a component of the broader, whole-of-government National Evaluation Agenda to be determined jointly by NEDA, the Department of Budget and Management (DBM), and the Office of the President-Presidential Management Staff (OP-PMS), as mandated under the NEPF.

 

NEDA, through the Project, is commissioning an evaluability assessment study to provide NEDA with a decision guide for determining and prioritizing relevant government interventions in this sector that are ripe and ready for an evaluation of their direct, attributable effects – intended or unintended – on the attainment of the results identified in the PDP[1]. The focus is on the government interventions directly identified as contributing to the specific socioeconomic agenda, to assess the current state of plans to evaluate their impact, and to scope the available literature on the evidence related to the interventions’ impact. Ultimately, the exercise is intended to inform the midterm updating of the current PDP and the formulation of the successor Plan.

 

In light of the above, the Project requires the services of a National Evaluation Consultant with experience in conducting evaluability assessments within the relevant thematic area and target agenda, who shall assess relevant government programs and interventions addressing learning losses.

 

 

III. Scope of Work

 

Under the overall guidance of the NEDA Undersecretary for Investment Programming, the Monitoring and Evaluation Staff (MES) Director, and the Team Leader of the UNDP Institutions and Partnerships Programme, and reporting directly and regularly to the Chief Evaluation Officer of the Central Evaluation Unit (CEU) and the Strategic M&E Project Officer, the National Evaluation Consultant shall be responsible for addressing the following specific objectives of the evaluability assessment study:

 

a. establish an inventory of government interventions[2] considered under, and contributing to, the results of the specific agenda as well as the results identified in the PDP Results Matrices;

b. assess the evaluability (for an impact evaluation) of the interventions that contribute to the agenda of addressing learning losses;

c. recommend the best impact evaluation approach and evaluation questions for interventions ready for impact evaluation; otherwise, determine the appropriate evaluation approach to be pursued for the intervention at given periods; and

d. describe the extant literature on the evidence[3] related to the interventions’ impact.

 

In line with the above, the Consultant shall be guided by the following research questions:

 

1. What are the ongoing and to-be-implemented[4] interventions (programs and projects) of the government in response to, or under, the agenda?[5]

   i. Are the identified interventions evaluable through impact evaluation?

   ii. What is the recommended prioritization order of the interventions per evaluation approach and, theoretically, per year?

 

2. What is the existing local and national literature on the impacts of the projects under the agenda (or similar interventions)?

   i. What are the current or existing methods/designs used in conducting impact evaluations of the identified interventions? If none, what are the appropriate methods/designs to be used based on emerging global trends?

 

3. What are the possible evaluation questions for the conduct of an impact evaluation of each identified intervention?

‘Evaluability’ is defined as the extent to which an activity or project can be evaluated in a reliable and credible fashion[6]. The evaluability assessment shall therefore examine whether the identified interventions under the agenda are evaluable – specifically, the study seeks to assess each intervention’s readiness to be subjected to an impact evaluation. Impact evaluations are defined as “empirical studies that quantify the causal effects of interventions on outcomes of interest”[7] and “are a particular type of evaluation that seeks to answer a specific cause-and-effect question: What is the impact (or causal effect) of a program on an outcome of interest?”[8]

 

Further, the consultant is also expected to recommend a prioritization of interventions to evaluate, justified according to their relevance or potential contribution to the socioeconomic agenda and their readiness to be evaluated in the short term, particularly in time for the mid-term review of the PDP in 2025. The prioritization will also be assessed on a per-year basis, from 2023 to 2026, to account for the differing maturity of the interventions and in anticipation of the possibility that evaluations may be implemented on a rolling basis.

 

Notwithstanding broader definitions of government interventions and evaluations, it is emphasized that the consultant shall identify only projects and programs, and shall assess the interventions’ evaluability through impact evaluation. Note as well that the scoping of extant literature shall be largely exploratory and does not seek to examine the quality of the studies nor synthesize their results and findings around the topics under the agenda.

 

There is no set scientific protocol to be followed for this study, and the consultant is expected to accomplish the questionnaires and forms that will be provided (Annex 2) as outputs of the study. The consultant is given the liberty to use both consultative and desk-based data collection methods to accomplish the forms but will be expected to describe the methods and data sources to confirm the veracity of the information reported.

 

The relevant forms and questionnaires that the consultant is expected to accomplish will be provided as a packet upon commencement of the activity. The consultant will also be provided a template summary sheet of the interventions from Step 1 to check the completeness of the reports and forms to be accomplished for each intervention.

Table 1. Detailed steps to conduct the study

 

Step 1
Description: As the first stage of data collection, meant to establish the pool of possible programs the consultant will work with, the consultant will enumerate all government interventions considered to be under, and/or contributing to, the agenda. The consultant may confirm these data through consultative activities with relevant stakeholders and is expected to reflect the data sources in the provided form.
Expected Output: Inventory of Interventions
Output Description: A relatively simple listing of the projects that have been confirmed to be under the agenda, together with their basic information.

Step 2
Description: From the enumerated interventions, the consultant will confirm and assess whether these interventions indeed contribute to the intended results of the agenda by assessing the relevant project documents (theory of change, logical framework, etc.) vis-à-vis the PDP results. This step narrows down the interventions the consultant will work with next, as only the interventions assessed to be contributory will be kept and processed in the next step.
Expected Output: Accomplished Contribution Assessment Form/Matrix of Theorized Contribution
Output Description: This matrix/form is a simplified assessment of an intervention’s results logic vis-à-vis the results in the PDP Results Matrices under the agenda. The goal is to confirm that the goals or objectives of the intervention are aligned with those of the agenda.

Step 3
Description: Only the interventions assessed as contributory to the agenda/PDP will be carried forward. The consultant is expected to collect data about each intervention to answer the provided questionnaire, which will assess whether the intervention is ready to be evaluated, specifically through an impact evaluation. The consultant may conduct further consultative activities with stakeholders or do desk work to collect the needed data.
Expected Output: Accomplished Evaluability Assessment Questionnaire
Output Description: This questionnaire is a modified version of the Evaluability Assessment Checklist and Evaluability Plan templates included in the National Evaluation Policy Framework Guidelines[9]. The evaluability assessment will focus on whether the intervention is ready for an impact evaluation.

Step 4
Description: For the interventions determined to be evaluable through impact evaluation, the consultant will accomplish an indicative evaluation plan for each applicable intervention. For the interventions deemed unready for impact evaluation in the previous step, the consultant will recommend the type of evaluation that the intervention may pursue instead, taking into account the differing timelines vis-à-vis the mid-term review of the PDP. The recommendations may also identify the relevant documents or other factors that need to be in place to make the intervention viable for impact evaluation.
Expected Output: Indicative Evaluation Plan; Accomplished Evaluation Recommendation Form
Output Description: The Evaluation Plan is a modified version of the Evaluation Plan template also included in the NEPF Guidelines[10]. Generally, the consultant will recommend the research questions, the design of the evaluation, and other details to be considered by NEDA when planning the conduct of the actual impact evaluation. The Evaluation Recommendation Form is meant to focus the consultant’s recommendation on the way forward for the intervention vis-à-vis the appropriate and feasible evaluation to pursue given its maturity and readiness, as well as the factors that stakeholders need to provide or procure to make an impact evaluation possible.

 

A summary of the process may be visualized in the figure found in the evaluability assessment design in Annex 1.

 

 

IV. Expected Output and Deliverables

 

The Consultant is expected to accomplish the following:

 

Output: Inception Report
Description: Includes interpretation of tasks, description of approach and methodology, study protocol, actors to be involved, key issues to be addressed, limits and possible constraints, detailed work plan, and data protection measures.
Estimated Duration to Complete: 7 days
Target Due Date: 14 April 2023

Output: First Progress Report
Description: To be submitted within 2 months after acceptance of the Inception Report. A template to report progress will be provided. This shall include the Inventory of Interventions (template will be provided).
Estimated Duration to Complete: 14 days
Target Due Date: 14 June 2023

Output: Second Progress Report
Description: To be submitted within 1 month after the First Progress Report. A template to report progress will be provided. This shall include the accomplished Contribution Assessment Form/Matrix of Theorized Contribution (template will be provided).
Estimated Duration to Complete: 14 days
Target Due Date: 14 July 2023

Output: Final Output Dossier
Description: The final dossier shall include all outputs identified from the four (4) steps – accomplished Inventory of Interventions, Contribution Assessment Form/Matrix of Theorized Contribution, Evaluability Assessment Questionnaire, Indicative Evaluation Plan, and accomplished Evaluation Recommendation Form (as applicable) – and shall be accompanied by an executive summary.

A draft dossier and its initial findings shall be submitted and presented to relevant stakeholders, NEDA-CEU and/or NEDA-IPG (if a learning session is scheduled in time), for comments and vetting prior to finalization of the report. The presentation may be done at most twice: once for NEDA and once for other stakeholders.

The final output shall be presented at NEDA-IPG learning events (such as the M&E Forum). Should none be scheduled within the contract period, the consultant shall commit to presenting the study at a minimum of one learning event, even beyond the contract period, when scheduled by NEDA and/or UNDP.
Estimated Duration to Complete: 35 days
Target Due Date: 08 September 2023

Total Estimated Duration: 70 days

Approvals Required (designated persons to review and accept the outputs): NEDA IPG, MES Director; Team Leader, Institutions and Partnerships Programme; and Strategic M&E Project Officer, in consultation with the NEDA Central Evaluation Unit.

 

V. Institutional Arrangements

 

1. The Consultant shall be under the overall guidance of the NEDA Undersecretary for Investment Programming, the Monitoring and Evaluation Staff (MES) Director, and the Team Leader, UNDP Institutions and Partnerships Programme, and shall report directly and regularly to the Chief Evaluation Officer of the Central Evaluation Unit (CEU) and the Strategic M&E Project Officer, to whom all outputs shall be submitted and through whom all communications related to the conduct of the study shall be coursed.

Amendment: correction (scoring)
