Back to Bayes-ics: Improving universal screening decisions by quantifying uncertainty

Abstract

Universal screeners of academic skills in schools are intended to predict the probability of academic risk in an efficient and economical manner. Recently developed methods for calculating post-test risk probabilities (Klingbeil et al., 2019, 2021) are simple and efficient, and they improve data-based decision-making practices in schools. However, these methods do not leverage the full advantages of Bayesian statistical inference, thereby limiting the quantification of uncertainty in the calculation of posterior probabilities of risk. This can produce overly deterministic data-based decisions. Bayesian ordinal regression models (BORMs) are a fully Bayesian extension of existing posterior probability calculations, and they offer multiple potential advantages for enhancing universal screening practices in schools. Through simulations and an applied example using real screening data, we examine key issues surrounding BORMs in screening, including potential strengths (e.g., multilevel modeling) and barriers to practice (e.g., difficulty of interpretation and implementation). We discuss how BORMs can further advance both research and practice of data-based decision making in universal screening in schools.
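To make the contrast concrete, the sketch below (not the authors' method; all values are hypothetical) compares a plug-in post-test probability of risk computed with Bayes' theorem against a simple Monte Carlo treatment that propagates uncertainty in the screener's sensitivity, specificity, and the local base rate, yielding an interval rather than a single deterministic value.

```python
# Minimal sketch: point-estimate post-test risk vs. uncertainty propagation.
# All screener characteristics and Beta parameters below are hypothetical.

import numpy as np

rng = np.random.default_rng(2024)

# Hypothetical screener characteristics and local base rate of risk.
sensitivity = 0.80
specificity = 0.85
base_rate = 0.20


def post_test_risk(sens, spec, prior):
    """Probability of true risk given a positive screen (Bayes' theorem)."""
    return (sens * prior) / (sens * prior + (1 - spec) * (1 - prior))


# 1) Deterministic point estimate (a plug-in calculation).
print(f"Point estimate: {post_test_risk(sensitivity, specificity, base_rate):.3f}")

# 2) Propagate uncertainty: treat sensitivity, specificity, and base rate as
#    Beta-distributed (parameters chosen only for illustration).
n = 10_000
sens_draws = rng.beta(80, 20, n)   # centered near 0.80, with uncertainty
spec_draws = rng.beta(85, 15, n)   # centered near 0.85, with uncertainty
prior_draws = rng.beta(20, 80, n)  # centered near 0.20, with uncertainty

draws = post_test_risk(sens_draws, spec_draws, prior_draws)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"Posterior risk: mean {draws.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

The point of the interval is that two students with the same screening result can warrant different levels of decision confidence depending on how well the screener's operating characteristics are known, which is the kind of uncertainty a fully Bayesian model such as a BORM carries through to the decision stage.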

Publication
EdArXiv preprint
Garret Hall
Assistant Professor

I research children’s development of academic and behavioral skills, the contexts that shape that development, and the quantitative methods used to examine these areas.