
Making our learning systems concrete

In a previous post, we outlined the general thinking and framework behind our Learning Plan. Now, we want to share a bit of what that framework means in practice. This happens through what we call our learning systems: how we enable learning on a set of topics, by a group of actors, through the use of specific practices.

We have six systems in place, each working in quite different ways to promote learning for ourselves and others working on open contracting. Those six systems are:

  1. Country tracking of open contracting forms a foundation for many of the other learning systems, as our learning is grounded in having an accurate understanding of the global state of open contracting. We (with inputs from partners) periodically update our global landscape to measure the extent to which contracting is open in various countries, whether it’s established as a positive norm, and the strength of implementation and reform efforts by in-country champions and others. The data resulting from this tracking is shared through our open contracting map.
  1. Showcase & Learning reviews aim to track the progress of our Showcase & Learning projects in ways that foster learning among participants and capture learning to share with others. Each of these  projects has a monitoring, evaluation, and learning consultant who tailors our general indicator framework—especially the indicators for understanding how open contracting is and what benefits it has—to the particular concerns and needs of stakeholders in that context, collecting data through assessments, surveys, and other tools. That data feeds into multi-stakeholder reflections: critical convenings that occur throughout the project, typically at kick-off, at one or more mid-points, and after the close. Reviews aim to capture a project narrative, identify trouble spots, and surface insights.
  3. Open Contracting Data Standard (OCDS) development supports the evolution and expansion of the OCDS. Through this learning system, the OCDS team (composed of OCP staff and external consultants) brings qualitative and quantitative data together in reviewing the data standard itself (including revisions to documentation, upgrades to the standard, and governance of the standard) as well as our support to OCDS implementers and data users. These reviews occur at an annual meeting, as well as through periodic reports that the team generates throughout the year on specific projects and discussion issues (e.g. sectoral applications of OCDS).
  4. Advocacy reviews tackle one of the most difficult topics for learning, monitoring, and evaluation. The uncertainty and opportunism that characterize advocacy call for multiple perspectives—staff, partners, advisory board members, external observers, and others—to triangulate, understand, and reflect on the impact of our advocacy work. These multi-stakeholder reflections occur twice a year, drawing on recent activity to inform future advocacy, communications, and other engagement.
  5. Research projects are targeted to advance or investigate specific questions that are important to our strategy. They serve a range of functions; for example, they may contribute to advocacy by identifying and highlighting evidence of impact, strengthen implementation by surfacing tools for practitioners, or further organizational goals. The research agenda evolves over time in response to gaps and needs.
  6. Our organizational and strategy reviews bring staff together to capture tacit knowledge and reflect on all aspects of our work. They create space for reflection and conversation among staff, at various intervals and levels of depth. For example, our briefest reviews take less than half an hour: after-action debriefs capture lessons in a simple plus-minus-delta format following major events or meetings, and monthly rapid reflections bring the full team together to update our key organizational indicators. Longer but less frequent sessions include quarterly reviews (typically three hours), which involve more detailed discussion of key challenges and potential changes in direction, and day-long annual reviews, which create space for deeper re-assessment of our overall strategy.

Our learning plan describes each of these six learning systems in further detail, linking each one more explicitly to the topics, actors, and practices involved. The plan also includes a list of sample indicators for each of the topics, which are tailored for each project where they’re relevant.

We crafted this plan over the course of several team sessions and drafts, with invaluable feedback from board members and other advisors. We’ve shared it through these blog posts in the hopes of promoting better alignment with our partners, getting feedback for future iterations (this will always be a living document), and helping inspire other organizations on their learning journeys. Hopefully there are elements here that are relevant and useful to your own work—and if so, please let us know.

Want to help us go deeper on learning for open contracting? We’re hiring for a Learning Director to make this plan a reality.
