Learning by doing: How to successfully apply the Collaborating, Learning, and Adapting (CLA) approach in your programs

The Collaborating, Learning, and Adapting (CLA) approach has gained popularity among funders and practitioners in recent years for its instrumental role in enabling effective use of data for adaptive management. Briefly, the CLA approach is the ongoing process of formative evaluation for the purpose of learning and adapting a program to better fit the needs and realities of the context and beneficiaries. It differs from program monitoring and performance management in that the data are used to make both improvements to program delivery and adjustments to program activities.

In this post, I describe my recent experience applying the CLA approach to the seven-month, USAID-funded Mahay Mamaky Teny (MMT) or “I know how to read” program in Madagascar. I explain what data collection tools we used and their purpose, explain what made the CLA approach so valuable to ensuring the success and buy-in of the program, and share some tips for how you can successfully employ the CLA approach in your programs.

The goal of the MMT program was to provide the Ministry of National Education with key knowledge, tools and expertise to be capable of sustainably developing and improving a set of core inputs into an effective early grade reading instructional approach. The CLA approach was a perfect fit to accompany the Ministry to make data-driven decisions for ongoing development and finalization of a reading program, training approach, and community communication strategy.

The tools we used included:

| Tool | Purpose of data | With whom? |
| --- | --- | --- |
| Fidelity of implementation classroom observation tool | To explain student outcome data on the Early Grade Reading Assessment (EGRA) and to track the degree to which teachers respected the lessons as designed | Each Grade 1 teacher in pilot schools |
| Teacher and student questionnaire | To obtain feedback on the teacher guide and student book from their users | Each Grade 1 teacher in pilot schools and a group of Grade 1 students per classroom observed |
| School director questionnaire | To collect feedback from the school director on his/her responsibilities as a pedagogical mentor and on parent sensitization | Each school director from each pilot school |
| Pedagogical district leader questionnaire | To collect feedback from the pedagogical district leader on his/her responsibilities as a support person to the school director in his/her role of pedagogical mentor | Each pedagogical district leader from each pilot zone |
| Focus group with parents of Grade 1 students | To collect feedback from parents on their participation in sensitization sessions, to gauge their involvement in their child's education, and to gather ideas for how to engage them more effectively | A group of parents of Grade 1 students per school |

These tools directly aligned with the program inputs, that is, the teaching and learning materials, training content, and communication strategy. They were administered twice during the four-month pilot period. We analyzed the data and shared the findings the week following each data collection period so they could be used to immediately adapt program implementation.

What was it about using the CLA approach that was so valuable?
Though the ultimate and intended objective of CLA is to use data to improve programs as you go, there were also unintended yet positive benefits of using this approach.

  1. Ground-truthing is everything. Most governments don’t want programs imported from elsewhere. They want something that works for their context. In Madagascar, ongoing data collection to inform and improve the program ensured that all of its key aspects were ground-truthed. This ground-truthing contributed to the Ministry’s and other education partners’ rapid buy-in of the MMT program.
  2. Using data depoliticizes decision making. Decision making is often highly politicized and can unintentionally lead to decisions based on perception rather than reality. When real-time data are available, decisions can be data-driven. Aware of this, the government of Madagascar requested ongoing piloting of programs before choosing which one to adopt. MMT jumped on the opportunity and made sure to continuously communicate the data findings as they were collected.
  3. When you are evaluating, you are learning. Because CLA relies on data collection directly tied to the program design, the personal experience of collecting data helps enumerators understand the ‘why’ behind the program. The Ministry enumerators, a team of evaluation, training, and materials development staff, directly observed and talked with program beneficiaries. This firsthand experience helped them better understand the logic and coherence of the program. In turn, this understanding helped them make appropriate program improvements and justify the program design to policy decision makers.
How can you design a CLA approach to maximize these benefits?
  1. Involve the Ministry and other education stakeholders in data collection. For the reasons cited above, when stakeholders are involved in talking with and learning from direct beneficiaries, you are much more likely to get decision makers on board with an appropriate and ground-truthed program.
  2. Make sure your tools align with your program inputs. It is essential your CLA tools are connected to your program activities. This is because CLA tools are not diagnostic tools but tools to track and assess whether your activities are on the right track.
  3. Turn data analysis around quickly. The whole point of CLA is to make quick yet data-driven decisions about improving your program and your program delivery. Once data collection is done, your analysis should be turned around within two weeks to a month, depending on your program timeline.
  4. Share the lessons learned and recommendations for adaptation. Too often, data flow is slow. This slowness is often due to the perceived need for a perfect analysis and report, which hinders your ability to quickly and effectively respond to your beneficiaries’ needs and feedback. As soon as data analysis is ready, share it! Keeping stakeholders abreast of program progress and giving them the opportunity to provide input on how to address any identified issues allows you to make relevant adjustments. This iterative process will build stakeholder confidence and buy-in for your program.

For more information about the CLA approach, see resources here and here.


Photo caption: Classroom observation using the tablet-based tool
Photo credit: Nathalie Louge/FHI 360
