
  • Closed Compact Report: Honduras Compact
  • May 2020

Lessons Learned

MCC’s five-year compact timeframe places a premium on establishing implementation structures before implementation begins: Honduras underwent a national political transition shortly after the compact was signed, which delayed the setup of key structures for program implementation, such as fully staffing MCA-H, detailed project planning, and establishing the MCA-H board of directors. All of this resulted in a slow start to the compact. Learning from this experience, MCC and partner countries have since allowed more time between compact signing and entry into force so that key implementation structures can be established, preserving a full five-year compact implementation period.

The FTDA evaluation contributed to MCC learning in the agriculture sector, as detailed in the “Impact Evaluations of Agriculture Projects” (2012) publication in MCC’s Principles into Practice series. MCC also notes the following lessons related to evaluations for agriculture projects from Honduras FTDA:

  • Integrate implementers and evaluators early. MCC brought in the independent evaluator after key program design and implementation actions had already been taken, which limited the feasibility of a rigorous impact evaluation. Involving evaluators in the early stages of the program could have helped mitigate some of the evaluability challenges that subsequently emerged.
  • Clearly define program participants. For any intervention, MCC and country counterparts must work toward clearly defined program participants and, where necessary, eligibility criteria. In Honduras, broad selection criteria from MCA-H were mixed with more specific selection criteria applied by the implementer to target farmers in the field. Although somewhat complementary, these two sets of criteria made it difficult to replicate farmer selection for the purposes of a rigorous impact evaluation.
  • Align incentives. It is almost impossible to have a successful evaluation if program implementers and evaluators are not working in lockstep. This requires not only early integration but also aligned incentives between the two: the implementing entity must clearly understand and commit to cooperating with the evaluator, and vice versa. In Honduras, the implementer was contracted two years before the evaluator, so the implementer’s contract did not include specific responsibilities for collaborating with the evaluator. In addition, the implementer was committed to delivering training to 6,000 farmers and increasing average income by $2,000. The implementer was therefore incentivized to find successful program participants, who were selected in part on difficult-to-replicate criteria, which did not align with the evaluation design.
The Transportation evaluation contributed significantly to MCC learning in the transport sector, as detailed in the “Lessons from MCC’s Investments in Roads” (2017) publication in MCC’s Principles into Practice series. MCC noted the following lessons from the Honduras Transportation Project:
  • Set realistic time horizons. Delays are inevitable in large infrastructure projects. From the beginning, implementers and evaluators should build into the evaluation design actions for mitigating implementation-related risks to the evaluation. In Honduras, given the implementation delays and an inflexible evaluation schedule, the exposure period (the time between implementation and evaluation) to the improved road network was in some cases only 5-6 months,[[Average exposure periods for evaluations in the roads sector across MCC’s portfolio are 50-57 months.]] and for some sections of highway CA-5 rehabilitation was not complete at the time of end-line data collection. This is a limited exposure period when decision makers are interested in longer-term outcomes, such as changes in prices and income.
  • Understand your target beneficiary population. For this evaluation, the target population for the household survey was all households in Honduras at the beginning and end of the project, and the evaluator used a sample frame constructed from the most recent national census. Because it focused on the broader Honduran population, the evaluation measures average effects across the country; it is reasonable to expect that some specific groups, particularly households closer to the upgraded roads and businesses that rely on highway CA-5, benefited more from the investments.
It is worth noting that these evaluations were designed and implemented during MCC’s first few years. The lessons learned from them not only informed MCC program operations but directly contributed to the establishment of the MCC Evaluation Management and Review Process in 2013. Through this rigorous internal quality assurance process, MCC is able to more closely coordinate evaluation design and implementation with the program logic, ensure internal and external stakeholders are aligned on research questions and methods, and maximize learning through appropriate dissemination platforms and events for both MCC staff and country partners. These lessons were documented in “Learning from Evaluations at the Millennium Challenge Corporation” by Sturdy, Aquino, and Molyneaux (2014).