Programme Pillar · Lesotho & Africa

Research, Monitoring, Evaluation, and Learning: How Serious Organisations Stay Honest

CDSJ treats this pillar as a wider institutional commitment to evidence, reflection, accountability, and informed action — not a technical back-office function that exists only to satisfy reporting requirements.

Applied Research · Programme Monitoring · Evaluation · Adaptive Learning · Evidence Translation · Knowledge Management · Decision Support
Our Approach

Making Evidence Useful, Learning Deliberate, Decisions Grounded

Too much development work still produces information without producing enough understanding. Data is collected, indicators are reported, activities are counted — yet organisations still struggle to answer the most important questions: What is changing in people's lives? Why are some interventions gaining traction while others stall? Where is progress fragile?

This pillar brings together applied research, programme monitoring, evaluation, adaptive learning, evidence translation, knowledge management, and decision support — because better programmes are not built only by good intentions, but by organisations that pay attention, learn deliberately, and adjust when reality doesn't match assumption.

"CDSJ uses RMEL not only to look back, but also to improve what happens next — paying attention to whose perspective is missing, which assumptions need testing, and how evidence can strengthen both accountability and adaptive management."
Programme Ambition

Building a Stronger Culture of Inquiry, Reflection, and Informed Action

CDSJ's ambition is not simply to produce more reports. It is to build a stronger culture of inquiry and reflection in which the organisation is better able to understand context, track progress, assess quality, explain results, identify gaps, and act on what it learns.

Technical

Rigorous evidence that is actually useful

Research, monitoring, and evaluation that are rigorous without becoming detached — connecting data to judgement, learning, and programme improvement.

Adaptive

Learning built into implementation, not reserved for the end

Making learning operational rather than ceremonial — so that programmes become more responsive, more context-aware, and more honest about what needs to change.

Participatory

Community voice as part of the evidence base

Ensuring that evidence includes the experience and feedback of the people most affected — because some of the most important knowledge comes from how programmes are lived.

What This Pillar Covers

Seven Workstreams for Evidence and Learning

Seven interconnected workstreams, spanning monitoring design through to knowledge management and institutional memory.

1

Monitoring Systems That Track More Than Activity

Strengthening routine monitoring frameworks, indicators, data flows, and progress tracking that help organisations see patterns early, identify bottlenecks, and connect implementation to outcomes more intelligently.

2

Applied Research and Contextual Analysis

Undertaking assessments, studies, diagnostics, and thematic analysis that explain what is happening and why — grounding strategy in serious contextual understanding rather than assumption.

3

Adaptive Learning and Reflective Practice

Embedding reflective review processes, pause-and-learn moments, and implementation learning loops that allow teams and partners to test assumptions and adjust delivery with discipline.

4

Evaluation of Outcomes, Quality, and Sustainability

Conducting baseline and endline studies, mid-term and final evaluations, and quality assessments that ask not only whether something happened, but whether it mattered and is likely to endure.

5

Translating Evidence into Advocacy and Decision-Making

Strengthening the movement from evidence generation to evidence use — through learning briefs, synthesis products, decision notes, and strategy support that turn data into action.

6

Participatory Evidence and Community Feedback

Strengthening community feedback systems, participatory reviews, and accountability-oriented evidence practices that make community experience more visible in institutional judgement.

7

Institutional Memory, Knowledge Products, and Continuous Learning Culture

Building institutional memory through knowledge products, learning archives, thematic syntheses, and cross-pillar evidence systems — making learning cumulative so each piece of work strengthens the organisation's wider credibility and long-term effectiveness.

Impact

Results We Aim to Contribute To

Through this pillar, CDSJ builds stronger evidence systems, improves programme quality, and supports more informed decision-making across all areas of work.

Stronger monitoring and evidence systems that connect data to programme improvement and accountability

Better-informed strategic decisions grounded in contextual analysis and applied research

More adaptive programmes that adjust to real conditions rather than persisting with weak assumptions

Stronger evaluation culture that asks whether interventions mattered and are likely to hold

More effective use of evidence for advocacy, strategy, and institutional positioning

Deeper institutional memory and a stronger culture of continuous learning across CDSJ and its partners

Partner With CDSJ on Research, MEL and Learning

Whether you are a funder seeking stronger programme evidence, an organisation needing MEL support, or a research partner — CDSJ offers rigorous, community-grounded, and practically useful evidence capabilities across Lesotho.