Bringing together co-design and evaluation: Creating monitoring frameworks that won’t sit on the shelf
Monitoring frameworks are essential tools for tracking and understanding the impact of initiatives, but too often, they become static documents that fail to reflect the realities of those implementing them. In a recent collaboration with the Accident Compensation Corporation (ACC), Dovetail and ThinkPlace set out to change this by co-designing a performance monitoring framework that would be both rigorous and practical for the NGOs delivering workplace injury prevention initiatives.
As part of ACC’s five-year programme to create safer workplaces, grant funding has been provided to a range of organisations working to improve workplace health and safety. These organisations operate across diverse sectors, from mental health and wellbeing to good work design. Recognising the varied needs and capacities of these NGOs, we embarked on an eight-month co-design process to create a framework that was both flexible and robust.
Process: Co-designing with NGOs to understand their needs
Our process was built on the intersection of monitoring and evaluation with co-design principles, ensuring that NGOs were central to the development of the framework. The key phases included:
Project initiation: Meeting NGOs face-to-face to establish relationships and understand their work.
Needs assessment: Conducting online hui with NGOs to understand their current monitoring capabilities and challenges. Using online whiteboarding tools, we identified key needs, including the desire for simple, flexible, and meaningful performance monitoring.
Framework co-design: Facilitating iterative co-design workshops with NGOs to develop a shared Theory of Change and a set of relevant indicators.
Framework delivery and support: Developing a suite of resources, including guidance materials, templates, and indicator tables, to ensure NGOs could confidently implement the framework.
Socialisation and sensemaking: Engaging ACC and NGOs in testing and refining the framework to ensure it met both practical needs and strategic reporting requirements.
Throughout the process, we balanced structure with adaptability, ensuring that the framework was not just a compliance tool but a meaningful way for NGOs to track and communicate their impact.
“As well as instructions and resources to support implementation, we wanted to make sure that NGOs felt empowered and confident to collect their own data to support monitoring their initiatives against the co-designed framework going forward. Nobody wanted this to be a framework that would sit on a shelf collecting dust!” – Avara Moody, Senior Designer at ThinkPlace
Key lessons for future work in this space
Through this process, we identified several critical lessons for designing monitoring frameworks that are both effective and sustainable:
Bringing co-design and evaluation together requires balancing structure and flexibility
One of the biggest challenges—and opportunities—of this project was balancing the structured approach that evaluation typically requires with the adaptability needed in co-design. Evaluators often rely on systematic analysis and predefined frameworks, while co-design processes prioritise iteration, feedback loops, and responsiveness to participants’ needs. Successfully marrying these two approaches required a high level of trust, open communication, and a willingness to adapt. Future projects seeking to integrate these methodologies should be prepared to embrace a level of fluidity while still maintaining a clear line of sight to overall monitoring goals.
A monitoring framework is never finished: be prepared to keep iterating
For monitoring frameworks to be useful, they must evolve. NGOs and commissioners should work flexibly to keep frameworks relevant, realistic, and aligned with the changing nature of their work. This means regularly revisiting and refining indicators, methods, and reporting expectations. Throughout this process we recommended striving for progress, not perfection. In this project it was unlikely that everyone would be able to measure everything perfectly from day one (no matter how badly they wanted to!). Monitoring is a journey, not a destination, and you shouldn’t let perfection get in the way of progress. Making space for organisations to continue learning, growing, and measuring is key.
“We are thrilled to share the success of our collaboration with Dovetail and ThinkPlace in developing a robust performance monitoring framework. The key to our success was the seamless integration of strong relationships between ACC and our partner NGOs, the technical expertise of our evaluators, and the iterative, need-focused co-design approach. As we continue to implement and refine this framework, we are excited to see it come to life and deliver its full value.” – Diego Rodríguez, Injury Prevention Leader
Evaluation is about learning, not just accountability
A monitoring framework should not just be a compliance exercise; it should also support continuous improvement. One of the strengths of this co-design process was that it helped NGOs clarify what success looked like for them and how they could measure progress toward it. Crucially, ACC created a space where NGOs felt free to contribute and learn without feeling they were being assessed against their current monitoring efforts. Future initiatives should ensure that monitoring is framed as a learning tool rather than just a reporting requirement. Encouraging a culture of curiosity and reflection can help organisations see monitoring as a valuable part of their work rather than an external obligation.
“From our perspective, the trust was really evident between ACC and the NGOs. Early on, ACC said they would like to be present in sessions with NGOs, and we had wondered at the time whether their presence might make NGOs feel pressured to answer questions in certain ways or present their work in certain ways. But when we asked NGOs if they would mind ACC being in the room, they had absolutely no concerns. ACC attended those sessions, and this actually created this feeling that we were all on the same team, learning and developing together to create something that would work for everyone.” – Amanda Hunter, Director at Dovetail
This project demonstrated that co-designing performance monitoring frameworks with those who will use them results in tools that are not only methodologically sound but also practical and meaningful. By embedding flexibility, fostering trust, and planning for long-term use, we can create monitoring frameworks that actively support learning and improvement rather than gathering dust on a shelf.
We’re excited to continue sharing insights from this mahi and to see how this approach can be applied in other contexts. If you’re interested in learning more about our work or discussing how co-design can strengthen performance monitoring in your organisation, get in touch!