These Terms of Reference (ToR) outline the scope of work and working arrangements for an external evaluation team to conduct the end-term evaluation (ETE) for the Break Free! programme.
Background
Break Free! is a joint lobby and advocacy programme of Plan International Netherlands, SRHR Africa Trust (SAT) and the Forum for African Women Educationalists (FAWE), in collaboration with technical partners KIT Institute (KIT) and Rozaria Memorial Trust (RMT). The Break Free! consortium envisages a society in which adolescents can exercise their right to live free from teenage pregnancy (TP) and child marriage (CM), supported by civil society.
The programme is funded by and in partnership with the Dutch Ministry of Foreign Affairs (MoFA) under the Strengthening Civil Society for SRHR partnership fund, with a budget of 25 million euros for the five-year period from January 2021 to December 2025. A baseline study was conducted in 2021 to serve as a benchmark for tracking the programme's progress and achievements. A mid-term review (MTR) was conducted in 2023, tracking progress and identifying lessons. In 2025, an external end-term evaluation will be conducted.
The Break Free! programme
The strategic objective of the Break Free! consortium is that adolescents exercise their right to live free from teenage pregnancy (TP) and child marriage (CM) supported by civil society. The Break Free! programme strengthens civil society organizations, youth-led groups and networks to lobby and advocate for improved legislation and policy implementation to increase youths’ agency and to promote social norm change in favour of the prevention of CM/TP. Three pathways of change have been identified leading towards three outcomes and the strategic programme objective:
To achieve these outcomes, seven key strategies have been prioritized:
Break Free! is implemented in Burkina Faso, Ethiopia, Kenya, Mali, Malawi, Mozambique, Niger, Sudan, Zambia, as well as at Pan-African regional level, targeting institutions and other stakeholders operating at this broader level.
The Break Free! Mid-Term Review report describes how trends in the context have intensified dramatically. Intersecting political, climate and economic crises reinforce each other in many of the programme implementation countries. The effect on target groups is detrimental: the vulnerability of women and girls to harmful practices increases, negatively affecting target groups' safety, livelihoods and access to essential services. Significant negative changes have occurred in Sudan, Mali, Niger, Burkina Faso and Ethiopia. The conflict in Sudan resulted in the relocation of the programme to implementation areas in Kassala and White Nile States. In Sudan, Ethiopia, Niger, Burkina Faso and Mali, the programme strategies were adapted to the local humanitarian context and now include internally displaced persons (IDPs) among the target groups.
Scope of the evaluation
The evaluation will cover the period of January 2021 – December 2025. It shall provide a representative conclusion on the whole programme, including achievements to date on the programme indicators for all nine countries and the Pan-African regional component. The evaluation shall be guided by the evaluation guidelines of the Policy and Operations Evaluation Department of the MoFA (IOB) and the 17 IOB evaluation criteria.
Due to the significant contextual changes in several countries, it is within the scope of this evaluation to explore the pathways of change over time within a country, taking the contextual changes into account, and to examine how country programmes adapted to those changes. To zoom in on specific programme components, an in-depth evaluation can be proposed for a selected number of countries. The consultant shall select the countries and components in consultation with the consortium. The main criteria for selection are: 1) thematic coverage of the Break Free! outcome areas; 2) geographical spread within countries and regional representation; 3) key developments in the context in which the programme operates, including the space for civil society.
The response to evaluation question 5 should present a cohesive conclusion drawing together the findings for the other evaluation questions.
Purpose and objectives of the end-term evaluation
The Break Free! programme formally ends in December 2025 and per the requirements of the Dutch Ministry of Foreign Affairs, the Break Free! consortium commissions an independent end evaluation by an external party in the final year of implementation of the programme, covering the full programme period of January 2021 until December 2025.
The objective of the end-term evaluation is to fulfil the accountability requirements of the MoFA and to serve a learning purpose, with special attention to the following:
Evaluation questions
In response to the questions for each of the outcome areas (see questions 1.1-1.3 below), the consultant is to validate assumptions and pay particular attention to the following:
a. To what extent were results achieved per indicator?
b. How were results achieved and how can these be explained and/or attributed to programme activities?
c. Comparing the planned final outcomes versus achieved outcomes: did change occur in the way we expected?
d. What unintended and unexpected results (positive or negative) stemmed from programme implementation?
e. How did the programme adjust its strategies and outcomes in response to contextual challenges (e.g. humanitarian crisis) and how did adaptations contribute to the programme outcomes?
1.1 Outcome area 1: How did the programme support adolescent girls at risk of CM/TP/FGM-C to access safe and quality education, and what outcomes were achieved?
1.2 Outcome area 2: How did the programme support the development, resourcing and implementation of laws and policies that respond to adolescent girls’ needs; and what outcomes were achieved?
1.3 Outcome area 3: How did the programme support improved access for adolescents to quality SRHR education, information and services; and what outcomes were achieved?
2. What strategies have been applied to promote sustainability of results beyond 2025?
2.1 What are the enablers and disablers that contributed to sustainability in: youth voice and agency; (youth) organizations' capacity; linking and networking/network building of youth (organizations); and an enabling environment?
3. How well did the outcome areas work together and contribute to the programme goal(s)?
3.1 Did change occur in line with expectations and assumptions in the programme ToC? If not, what might be alternative pathways and assumptions?
3.2 To what extent did the Break Free! consortium work in coherence and how did this contribute to reaching the programme objectives and intended results?
3.3 In what ways did Break Free! align with national policies and priorities in the implementation countries, and work in complementarity with other (local) SRHR programmes, the MoFA and its embassies, and the MoFA SCS policy framework?
4. To what extent was the internal and external collaboration of Break Free! equitable?
4.1 To what extent was the external collaboration of Break Free! equitable?
4.2 To what extent did the Break Free! consortium contribute to southern leadership and equitable internal relationships and decision making?
5. What are lessons learned in the Break Free! programme?
5.1 How did the consortium incorporate findings of the mid-term review?
5.2 What are the lessons learned from programme strategies and implementation?
5.3 What are lessons learned from the partnership collaboration?
5.4 What good practices can be identified?
5.5 What recommendations can be made for future programming?
Quality criteria and approach of the evaluation
The evaluation process shall be guided by the evaluation guidelines of the Policy and Operations Evaluation Department of the MoFA (IOB) and the 17 IOB evaluation criteria. Furthermore, the conduct of the evaluation (methodology, data collection and analysis) and the corresponding products will need to abide by the IOB evaluation quality criteria.
The criteria are organized into three phases: Phase I – Terms of Reference, Phase II – Elaborated methodology, and Phase III – Draft and final report. The consultant is requested to pay particular attention to the criteria in Phase II and Phase III. More details on the criteria and how they are assessed can be found here: Kwaliteitscriteria voor evaluaties | Richtlijn | Directie Internationaal Onderzoek en Beleidsevaluatie (IOB) (iob-evaluatie.nl)
Apart from the IOB evaluation quality criteria, the evaluation also has to meet the requirements of coherence[1] and effectiveness[2] from the OECD DAC Evaluation Criteria.
More details on the criteria can be found here: Evaluation Criteria | OECD.
Finally, the ETE provides progress information on indicators within the results areas of the overall Break Free! Theory of Change (ToC). The evaluation will assess the ToC to determine its relevance, coherence and validity, identifying which pathways worked as expected, where assumptions held true, and where course corrections were needed. Several indicators of the Break Free! programme are linked with the basket indicators of MoFA's Strengthening Civil Society grant framework. More details about the indicator framework are available here: https://helpdesk-opendata-minbuza.nl/guidelines-for-partnerships-strengthening-civil-society/.
The full overview of the Break Free! indicators is included in Annex 1, including disaggregation and reference to the MoFA basket indicators.
Methodology
The consultant is requested to propose an appropriate mix of quantitative and qualitative methods that meets the quality requirements of the above-mentioned criteria set out by the OECD and IOB. The methodology needs to be robust: it should establish causality between the Break Free! intervention activities and the Break Free! results, and the extent to which these results (and outcomes) can convincingly be linked to the programme. It should also be appropriate for making a judgement on the effectiveness of the programme and for explaining the underlying generative causal mechanisms and how context influenced the results. We would like to know what works, for whom and under what circumstances. The preferred methods are youth-friendly and participatory and include marginalised groups. Methodologies can be fine-tuned during the inception phase.
The evaluation should employ qualitative methods that are appropriate for validating results, establishing the causal chain between the intervention activities and the results, and identifying unintended results. Such methods may include contribution analysis, process tracing or general elimination methodology. The evaluation should also include a mapping of funding streams, e.g., the budget received by the consortium and the budget spent on implementation (including coordination and organisation costs) and on the outcome areas.
Further, since one of the components of the programme is network building and capacity building of local CSOs and youth groups, the consultant is invited to propose a method able to capture the achievements of this component. This may be a measurement exploring the capacity of CSOs involved in the programme and active at national and sub-national level. To capture the interconnectedness of the CSO network, the measurement may involve a form of social network analysis, e.g., event-based network analysis (an illustrative sketch follows below).
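By way of illustration only, and not as a prescribed method, the sketch below shows one way such an event-based network analysis could be operationalised: co-participation records are arranged as a bipartite CSO-event graph and projected onto the CSO side, after which simple interconnectedness indicators are computed. The example uses Python with the networkx library; all CSO and event names, and the choice of indicators, are hypothetical assumptions for illustration.

    # Illustrative sketch only: event-based network analysis of CSO interconnectedness.
    # All CSO and event names below are hypothetical placeholders.
    import networkx as nx
    from networkx.algorithms import bipartite

    # Hypothetical co-participation records, e.g. derived from advocacy-tracking
    # or outcome-harvesting data: (organisation, joint event) pairs.
    event_participation = [
        ("CSO_A", "event_1"), ("CSO_B", "event_1"),
        ("CSO_B", "event_2"), ("CSO_C", "event_2"), ("CSO_D", "event_2"),
        ("CSO_A", "event_3"), ("CSO_D", "event_3"),
    ]

    # Build a bipartite CSO-event graph and project it onto the CSO side:
    # two CSOs become linked when they participated in the same event.
    B = nx.Graph()
    csos = {cso for cso, _ in event_participation}
    events = {event for _, event in event_participation}
    B.add_nodes_from(csos, bipartite=0)
    B.add_nodes_from(events, bipartite=1)
    B.add_edges_from(event_participation)
    cso_network = bipartite.weighted_projected_graph(B, csos)

    # Simple interconnectedness indicators that could be compared across
    # data collection rounds (e.g. baseline versus end-term).
    print("Network density:", round(nx.density(cso_network), 2))
    print("Degree centrality:", nx.degree_centrality(cso_network))

If an approach along these lines were adopted, comparable indicators computed from baseline and end-term data could help describe changes in the density and centralisation of the CSO network alongside the qualitative findings.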
The evaluation team can make use of existing (quantitative and qualitative) data, including baseline reports, M&E data, midline reports, operational research studies, IATI publications, and annual plans and reports at both country and consortium level. The programme includes outcome harvesting and advocacy tracking as regular monitoring tools. The baseline and mid-term evaluations employed questionnaires, focus-group discussions, interviews, a partnership assessment survey, CSO capacity assessments, checklists for child protection in schools and a youth engagement survey. This documentation and these tools will be available to the consultant to build on previous experience and to ensure consistency across evaluations.
The available programme information should be used as input for the desk study and can be used to triangulate and validate findings of the evaluation team through primary data collection from internal and external sources.
Sample
The sampling strategy should consider the most marginalised target groups, youths and women, and be appropriate to the local context. The quantitative and qualitative methods should complement each other. Finally, the sampling strategy should adhere to the IOB evaluation criteria.
Analysis, feedback and sense-making
The analysis and sense-making of the information will happen after the data collection. The consultant is expected to analyse the collected data and bring together results and learning from each method. It is also expected that the results from different sources will be triangulated[3] and form a coherent evaluation on programme level by the end of the assignment. The sense-making will help to develop a coherent evaluation and may involve (online) workshops/webinars with programme staff and/or external stakeholders to improve shared understanding, sharpen findings and validate results.
The draft and final reports must undergo a structured feedback process. The consultant will present findings to the consortium and stakeholders, allowing for iterative revisions based on feedback. The consultant is responsible for consolidating feedback into the final report and ensuring the quality of deliverables in line with IOB and OECD-DAC criteria.
Reporting
The consultant is to develop a synthesis report containing a coherent evaluation of the programme (approximately 50 pages) and additional country-specific chapters or reports (about 10 pages per country). We expect the report outline to be included in the inception report.
Research ethics and safeguarding
The Break Free! consortium is committed to ensuring that the rights of those participating in data collection or analysis are respected and protected, and to act in accordance with Plan International’s Child and Youth Safeguarding Policy as well as the UNEG Code of Conduct for Evaluation.
The evaluation plan has to be approved by a formal ethics board. This may be an academic body with which the consultant is affiliated, a country-level (national) ethics committee or Plan International's Ethics Review Team.
Ethical and child protection issues need to be taken into consideration by the researcher when carrying out the evaluation. A meeting will take place with country consortium members to familiarise the consultancy team with the organizational values, the applicable child safeguarding policy and the behaviour expected of the team before, during and after the fieldwork.
All applicants should include details in their proposal on how they will ensure ethics and child protection in the data collection process. Specifically, the consultant(s) shall explain how appropriate, safe, non-discriminatory participation of all stakeholders will be ensured and how special attention will be paid to the needs of children and other vulnerable groups. The safety of evaluation respondents also includes ensuring that respondents will not face backlash or risk in their community due to their participation.
The consultant(s) shall also explain how confidentiality and anonymity of participants will be guaranteed. The data management of the participants’ personal information should adhere to the GDPR. Data collection instruments should include measures to ensure informed consent, confidentiality, and anonymity for all participants, with extra care taken when working with children, adolescents, and vulnerable groups.
The consultant(s) will have to provide a police certificate of good conduct and sign Plan International's Child and Youth Safeguarding Policy before commencement of the assignment.
In conflict-affected regions, the evaluation team must implement conflict-sensitive protocols to ensure the safety and well-being of participants and field staff. This includes contingency plans for fieldwork disruptions, remote data collection strategies, and psychosocial support where needed.
Users of the evaluation
The principal users of the end evaluation will be the consortium members and the (technical and implementation) partners of Break Free!, both at global level and in the programme implementation countries, as well as the MoFA. Evaluation findings will be used for accountability and learning purposes and to identify successes, failures and lessons learned: what approaches work and why or why not. Further, the evaluation findings will be shared with the programme target groups in an easily accessible format.
The consultant will prepare a dissemination plan in collaboration with the consortium, identifying key stakeholders, communication channels, and events for sharing findings.
The evaluation report may be shared with the other SRHR strategic partnerships funded by MoFA, the Girls Not Brides Network, and key stakeholders in the Break Free! countries, including embassies. The final evaluation report will be published on the International Aid Transparency Initiative (IATI) platform and on the penholder's website. Where possible, evaluation findings will be shared through regional and international conferences and other learning events.
Deliverables
1. Evaluation inception report
A concise inception report should be prepared by the evaluators before starting the evaluation. This report details the evaluators' understanding of what is being evaluated and why, showing how each evaluation question will be answered. The inception report should elaborate on the research design, highlighting why the (combination of) method(s) and data sources have been selected and how they are expected to contribute validly and reliably to answering the research questions.
The inception report should include an evaluation matrix, proposed data collection tools, an explanation of the analysis methods to be applied, a proposed schedule of tasks and activities, a proposal for ethics approval, a proposed report outline, and the deliverables.
2. Draft evaluation report
An English[4] synthesis report of approximately 50 pages, excluding annexes, possible case studies and the executive summary.
Additionally, summary country chapters or reports with a maximum of 10 pages for each Break Free! country. The country reports for the Sahel countries should also be made available in French.
The draft and final reports must undergo a structured feedback process. The consultant team will present findings to the consortium and stakeholders, allowing for iterative revisions based on feedback. The consultant is responsible for consolidating feedback into the final report and ensuring the quality of deliverables in line with IOB and OECD-DAC criteria.
3. Final evaluation report
An English synthesis report of approximately 50 pages and summary country chapters or reports of a maximum of 10 pages for each Break Free! country, excluding annexes, possible case studies and the executive summary. The report will follow the agreed outline and be delivered in a timely manner.
Reports should also include or be accompanied with summary versions in accessible language for non-technical audiences including youth and community members.
Roles and Responsibilities
Consortium Desk
PMEL working group
BF! Programme Committee
BF! Board of Directors
BF! Country consortia
Evaluation/ Consultant Team
Timeline
Evaluation team requirements
The evaluation team should be multidisciplinary, including expertise in SRHR, gender analysis, youth engagement, social network analysis and conflict-sensitive programming. At least one member of the team must have extensive experience in participatory research methods. To ensure contextual relevance, adherence to ethical standards and deeper insight into on-the-ground realities, the team must include local evaluators or consultants from the programme countries. The consulting team should reflect gender, age and cultural diversity, with clear roles for experts in youth engagement (to foster meaningful participation), gender-responsive education, youth advocacy and the local policy context. Given the focus on youth empowerment, young evaluators or youth-led organizations can be included or partnered with, in line with the principles of the programme.
The evaluator(s) are expected to meet the following qualifications:
Please note: (Police) Certificates of Good Conduct of consultants and/or staff involved in the evaluation will have to be submitted before signing the consultancy agreement.
Budget
Proposals with a value of between € 160,000 and € 190,000 are welcome.
The consultant's proposal should include a detailed breakdown including the number of working days, consultant fees, level of effort, travel costs, per diem, VAT/taxes, etc. Payments will be based on deliverables as per the schedule above, including adherence to the IOB criteria. All cost proposals should be made in euros.
Payment of the contractor will be agreed and take place upon approval of the deliverables, for instance:
Proposal assessment criteria
Proposals will be considered on the following basis:
Technical Criteria
Quality of the Team
Methodology & Approach
Commercial Criteria
The reference document for the call for proposal can be found here.
[1] The compatibility of the intervention with other interventions in a country, sector or institution. The extent to which other interventions (particularly policies) support or undermine the intervention, and vice versa. Includes internal coherence and external coherence: internal coherence addresses the synergies and interlinkages between the intervention and other interventions carried out by the same institution/government, as well as the consistency of the intervention with the relevant international norms and standards to which that institution/government adheres. External coherence considers the consistency of the intervention with other actors' interventions in the same context.
[2] The extent to which the intervention achieved, or is expected to achieve, its objectives, and its results, including any differential results across groups. Analysis of effectiveness involves taking account of the relative importance of the objectives or results.
[3] Triangulation is a strategy to enhance the validity and reliability of the findings by cross-verifying information from different perspectives, in order to provide a more comprehensive and accurate understanding of what is being evaluated.
[4] The raw data sets must be made available to the client upon request
Proposals should address: understanding of the TOR and methodology, proposed team structure, workplan and timeline, budget breakdown, risk management plan, and references. Proposals should include at least one example of a previous evaluation report conducted by the team or lead consultant. The evaluation team must declare any actual or potential conflicts of interest, including relationships with project stakeholders or funders.
Proposals should be submitted to Agnes Neray ([email protected]) by Thursday 23 January 2025.