Evaluation and Learning

Real-Time Evaluation Practice: Preschool

Since 2004, our Preschool subprogram has been engaged with the Harvard Family Research Project (HFRP) in real-time evaluation (RTE).

Initially, the HFRP’s approach represented a new way of doing evaluation at the Foundation, with an emphasis on continuous (or real-time) feedback and learning. Because the Preschool subprogram’s strategy relied on advocacy and policy change—for which there were practically no established evaluation methods—the RTE also required methodological innovation.

The goals of the Preschool subprogram RTE were to:

  • Provide feedback on short-term and intermediate strategy outcomes
  • Provide ongoing feedback to inform strategy modifications and mid-course corrections
  • Facilitate grantee reporting in a way that maximizes its value to the evaluation, minimizes grantee burden, and encourages grantees to collect information that is useful for their own purposes

Achieving these goals required an innovative evaluation approach that emphasized collaboration, continuous feedback, and learning. The benefit of this approach is that the evaluation has offered both real-time learning for multiple audiences (staff, grantees, key stakeholders, Trustees) and data for course correction and program improvement.

For example, a 2005 bellwether survey conducted by the HFRP indicated that while the preschool issue was gaining traction among several key constituencies, it had yet to gain relevance among business leaders and Latino opinion leaders. With this knowledge, Preschool program staff were able to focus grantmaking in ways that built awareness and engagement among these two constituencies over the subsequent three years. In 2008, follow-up bellwether interviews directly informed the team’s consideration of whether to revise its goal of achieving universal preschool by 2013. (See Preschool for California’s Children.)

Additional practices the Preschool team uses to keep the evaluation real-time and responsive to strategic program needs include:

  • A dynamic logic model that is regularly revisited to determine whether it still accurately reflects the strategy’s ongoing direction
  • Development of data collection tools to meet emerging data needs
  • Transparency with Foundation staff and grantees, including regular evaluation briefs and briefings with staff, and presentations of interim findings at annual grantee meetings
  • Critical indicators that are tracked regularly and updated annually
  • A willingness to be flexible and adjust the evaluation plan as the grantmaking strategy and the Foundation’s learning needs evolve

For example, for 2011, Preschool program staff and the evaluation team have set aside a portion of the evaluation plan for “rapid response” data collection. The HFRP reviews the evaluation methods with the Preschool team annually, and together they decide whether a targeted piece of data collection could meet emerging learning needs and help inform program strategy.