November 22, 2021

Democratizing the Development of Evidence

Authors

Cara Jackson, Abt Global

“Democratizing evidence” means enabling a variety of stakeholders to participate in the production of evidence. The result should be an evidence base that is more relevant to stakeholders’ concerns. Greater transparency, inclusivity, and responsiveness to stakeholder concerns in the evidence development process may increase the subject community’s willingness to act on findings, which in turn should create incentives for research leaders to build democratized evidence development into their decision-making processes.

This approach speaks directly to current research needs in the education sector. In this essay, Abt’s Cara Jackson establishes a framework for broader community involvement as a means of increasing the relevance and usefulness of evidence developed to support the needs of communities served by educational institutions.



Children's absenteeism from pre-K to kindergarten: A focus on children receiving child care subsidies

This article reveals that absenteeism is high among children receiving child care subsidies during pre-K, and that pre-K absenteeism predicts kindergarten absenteeism.


Laying a Solid Foundation for the Next Generation of Evaluation Capacity Building: Findings from an Integrative Review

A study co-authored by Abt’s Sebastian Lemire captured the state of the art in evaluation capacity building research, synthesizing 20 years of trends and findings.


Selecting Districts and Schools for Impact Studies in Education: A Simulation Study of Different Strategies

This simulation study explores whether formal sampling strategies for selecting districts and schools improve the generalizability of impact evidence from experimental studies. Specifically, the simulation evaluates a hypothetical intervention targeting K–5 schools. The authors constructed a national target population of schools from the Common Core of Data and generated simulated impacts of the intervention for the entire population. From this population, they selected a sample of districts and schools, simulated district and school decisions about whether to participate, and simulated replacing districts and schools that declined to participate. They then calculated the average school-level impact for the resulting sample of schools and compared it to the average impact for the target population. The simulation repeats this procedure many times, each time selecting a different sample.

The selection strategies the authors tested include: (1) a stylized approach that recruits districts and schools in order from largest to smallest; (2) random selection with probabilities proportional to district size, as used in some surveys; and (3) balanced selection, which prioritizes the most typical districts and schools based on their characteristics. The authors tested all combinations of these three approaches for both districts and schools.

The study finds that random selection of districts, paired with either balanced or random selection of schools, produced samples with the most consistently strong generalizability. The study also explores recruiting burden, the selection of replacement districts under random selection, and the sensitivity of the findings to simulation parameters.
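To make the procedure concrete, below is a minimal Python sketch of one such simulation loop, using strategy (2), random selection of districts with probabilities proportional to size. Everything in it is an illustrative assumption rather than the authors' actual code: the toy population, district sizes, impact distribution, and the 50 percent participation rate are fabricated for demonstration, whereas the study builds its population from the Common Core of Data and also models replacement of decliners, which this sketch omits.

import numpy as np

rng = np.random.default_rng(0)

# Toy population: number of schools per district and a true school-level
# impact per district. (Illustrative assumptions, not the study's data.)
n_districts = 500
district_size = rng.integers(1, 50, size=n_districts)
true_impact = rng.normal(loc=0.10, scale=0.05, size=n_districts)

# Benchmark: the average impact across the full target population.
population_avg = np.average(true_impact, weights=district_size)

def pps_sample(sizes, n_sample, rng):
    """Random selection with probabilities proportional to size,
    drawn without replacement (strategy 2 in the study)."""
    probs = sizes / sizes.sum()
    return rng.choice(len(sizes), size=n_sample, replace=False, p=probs)

n_reps, n_sample = 1000, 30
errors = []
for _ in range(n_reps):
    chosen = pps_sample(district_size, n_sample, rng)
    # Simulate participation decisions; 50% agreement rate assumed here.
    agrees = rng.random(len(chosen)) < 0.5
    sample = chosen[agrees]
    if sample.size == 0:
        continue
    # Compare the sample's average impact to the population benchmark.
    sample_avg = np.average(true_impact[sample], weights=district_size[sample])
    errors.append(sample_avg - population_avg)

# Bias and spread of the sample estimate summarize generalizability.
print(f"mean error: {np.mean(errors):+.4f}, SD: {np.std(errors):.4f}")

Repeating the loop with the other strategies (largest-first recruitment, or balanced selection on district characteristics) and comparing the error distributions is the essence of the study's design.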


Estimating the Impact of Emergency Assistance on Educational Progress for Low-Income Adults: Experimental and Nonexperimental Evidence

Three nonexperimental approaches returned poor estimates of the impact of emergency assistance on educational progress for job trainees.
