Evaluation Academy Impact and Process Evaluation, 2025

Blumenau, Jack (2026). Evaluation Academy Impact and Process Evaluation, 2025. [Data Collection]. Colchester, Essex: UK Data Service. DOI: 10.5255/UKDA-SN-858158

Data description (abstract)

This archive contains the data and code required to replicate the results in the Cabinet Office’s “Evaluation Academy 2025: Impact and Process Evaluation Final Report”.

The Evaluation Academy is a capability-building course, developed by the Evaluation Task Force (ETF) in the Cabinet Office, which uses a train-the-trainer model to improve civil service analysts' ability to teach others about evaluation methods.

Academy participants received free, in-person training delivered over five days across a two-week period. The course material was split across 10 modules: Introduction to evaluation; Theories of change; Scoping an evaluation; Process evaluations; Experimental designs; Quasi-experimental methods; Theory-based evaluations; Value for money evaluations; Planning and managing an evaluation; Communicating evidence.

The Evaluation Academy aims to ensure that participants can deliver and explain the content of the course to a high standard. The goal is for participants both to understand and deliver the theoretical material on evaluation, and to be able to show others how to apply that content in real-life evaluation scenarios.

The ETF conducted an impact evaluation of the first stage of the Evaluation Academy’s train-the-trainer model in July 2025. The impact evaluation employed a randomised controlled trial (RCT) which used a two-arm, individual-level waitlist design to investigate the impact of the Evaluation Academy on two primary outcomes – participants’ confidence in delivering evaluation training and the size of participants’ evaluation network – and one secondary outcome – participants’ knowledge of evaluation methods and processes.
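The headline analysis for a two-arm, individual-level RCT of this kind is a comparison of mean outcomes between the treatment and waitlist-control groups. The sketch below is purely illustrative (simulated data, hypothetical variable names and effect size) and is not the archive's actual analysis code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated data for a two-arm, individual-level waitlist RCT:
# a 5-point confidence outcome and randomised treatment assignment.
n = 200
treated = rng.integers(0, 2, size=n)   # 1 = Academy cohort, 0 = waitlist control
outcome = 3.0 + 1.0 * treated + rng.normal(0.0, 0.5, size=n)

# Difference-in-means estimate of the average treatment effect (ATE)
ate = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# Welch standard error and a 95% confidence interval
var_t = outcome[treated == 1].var(ddof=1) / (treated == 1).sum()
var_c = outcome[treated == 0].var(ddof=1) / (treated == 0).sum()
se = np.sqrt(var_t + var_c)
ci = (ate - 1.96 * se, ate + 1.96 * se)

print(f"ATE = {ate:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
```

Under random assignment, the difference in means is an unbiased estimator of the average treatment effect; the confidence interval here conveys the statistical uncertainty referred to in the results below.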

The impact evaluation demonstrated large and statistically significant effects of the Evaluation Academy on both primary outcomes. Compared to those in the control group, participants reported substantially higher levels of confidence in delivering evaluation training (an average increase of approximately one point on a 5-point scale) as well as substantial increases in the size of their cross-government evaluation networks (an average increase of 8 people).

By contrast, the impact evaluation found more limited evidence of impact on participants’ knowledge of evaluation methods and processes. Although treatment group participants performed better on average than the control group on a post-intervention set of evaluation knowledge questions, this difference was not statistically significant. The absence of large knowledge effects reflects the fact that civil servants participating in the first stage of the train-the-trainer model had reasonably high levels of pre-existing evaluation knowledge.

Data creators:
Name: Blumenau, Jack
Affiliation: University College London
ORCID: https://orcid.org/0000-0002-0536-1564
Sponsors: NA
Topic classification: Education
Keywords: EDUCATION, EVALUATION, TEACHER TRAINING, DATA
Date published: 20 Mar 2026 12:58
Last modified: 20 Mar 2026 12:58

Available files: Data
