Expectation vs. Reality: data collection with young people
Funders often expect that young people will fill in surveys honestly and independently, producing neat insights about programme impact. The reality is that data collection takes time, relationships, and trust. Staff often need to guide young people through forms carefully, supporting them without influencing their answers.
Similarly, ‘validated’ tools that promise robust measurement are not always the right fit. Many are too long, jargon-heavy, or insensitive to young people’s lived experiences. In open access provision, there may not be a clear start or end point against which to measure change. At its worst, inappropriate use of tools can create safeguarding risks. But even when used correctly, validated tools have the potential to alienate young people and distort their voices.
Engagement costs: building capacity for meaningful data
Good data doesn’t come from extractive surveys alone. Creative and participatory methods, from reflective activities to peer evaluation, help make data collection more meaningful, telling us the ‘why’ as well as the ‘what’. These approaches also make the experience less of a chore and more of a learning process - but they require investment in staff training and capacity.
The hidden cost here is the time and resource needed to upskill practitioners, administer the data collection, create space for reflection, and design flexible approaches that work in real-life contexts.
Frameworks: when theory doesn’t match reality
Evaluation frameworks are meant to provide clarity, with Theories of Change mapping a logical chain from activities to outcomes. In reality, youth work rarely unfolds in straight lines, and outcomes and impacts are messy, shaped by the unique needs of each young person.
Too often, frameworks reflect funder priorities rather than the realities on the ground.
“Youth workers can feel caught between multiple agendas, with young people’s needs sidelined in favour of what is measurable”
Rigid frameworks risk stifling innovation and overlooking unintended outcomes.
Flexibility and trust are essential. Frameworks should capture the richness of youth work and its impact, not force it into boxes that don’t fit.
The real cost of MEL
One of the clearest messages from the session was about the true financial cost of monitoring and evaluation. While many funders allocate 10% of budgets to MEL, research suggests the real figure is closer to 20%. That includes not just data collection, but the analysis, reflection, and learning time needed to make evaluation meaningful.
Without this investment, organisations face survey fatigue, limited insight, and missed opportunities to adapt and improve. Frontline practitioners bear the brunt, shouldering administrative burdens that eat into time with young people.
Learning gaps: closing the feedback loop
Perhaps the biggest hidden cost of MEL is when the learning never comes back to those who need it most. Delivery organisations often don’t see the full analysis or understand how findings are used. Funded programmes may end before insights can be embedded. Staff turnover and shifting contexts further erode opportunities to apply learning.
"To change this, funders and organisations need to prioritise the feedback loop: sharing findings, reflecting together, and co-owning the next steps. Without this, evaluation risks becoming a tick-box exercise rather than a tool for genuine improvement".
Key takeaways
We closed the session with a set of practical takeaways for funders and practitioners alike:
- Respect practitioners as experts: involve them, and young people, in shaping MEL.
- Be flexible: adapt frameworks and tools to reflect real contexts and consider using grantees’ tools to measure your outcomes.
- Close the loop: ensure findings are shared and applied, not just reported.
- Invest in capacity, think ‘donor+’: build skills, time, and infrastructure for meaningful evaluation - approach MEL not just as accountability, but as shared learning.
Ultimately, the hidden cost of MEL is more than financial. It’s about relationships, trust, and the opportunity costs when programmes don’t work as intended. If we want evaluation to strengthen rather than strain youth work, we need to recognise and resource it properly, ensuring that everyone - funders, practitioners, and young people - benefits from the process.
If you would like to learn more, contact Maya Reggev, London Youth’s Learning & Impact Lead [maya.reggev@londonyouth.org] or Sally Baxter, Mary’s Youth Club’s CEO and Youth Development Manager [sally.baxter@marys.org.uk]
About London Youth
London Youth is a charity on a mission to champion and strengthen London’s youth organisations so young people have the opportunities and skills they need to succeed. They do this with and through their members – a network of 600 youth organisations – and at their two outdoor learning centres, Hindleap and Woodrow.
About Mary’s Youth Club
Mary’s Youth Club is based in Islington and aims to create a safe, welcoming community where young people are respected, their voices are valued, and their uniqueness is celebrated. A space where joy thrives, fun is abundant, and every young person feels empowered to belong and grow. They also deliver training and CPD for youth workers.