Journal of Research on Educational Effectiveness
This paper complements existing power analysis tools by offering tools to compute minimum detectable effect sizes (MDES) for existing studies and to estimate minimum required sample sizes (MRSS) for studies under design. The tools that accompany this paper support estimates of MDES or MRSS for 21 different study designs: 14 random assignment designs (6 in which individuals are randomly assigned to a treatment or control condition and 8 in which clusters of individuals are randomly assigned to condition, with models differing by whether the sample was blocked prior to random assignment and by whether the analytic models assume constant, fixed, or random effects across blocks or assignment clusters) and 7 quasi-experimental designs (an interrupted time series design and 6 regression discontinuity designs that vary by whether the sample was blocked prior to randomization, whether individuals or clusters of individuals are assigned to treatment or control condition, and whether the analytic models assume fixed or random effects).
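To illustrate the two quantities the abstract describes, the sketch below computes them for the simplest of the 21 designs: individuals randomly assigned to treatment or control, analyzed with a two-tailed test. It uses the standard large-sample MDES formula (multiplier times the standard error of the standardized impact estimate); the function names, default parameters, and the normal approximation to the multiplier are assumptions for illustration, not the PowerUp! implementation itself.

```python
from math import ceil, sqrt
from statistics import NormalDist  # stdlib, Python 3.8+

def mdes_individual(n, p_treat=0.5, r2=0.0, alpha=0.05, power=0.80):
    """Approximate MDES for a simple individual random assignment design.

    n       -- total sample size
    p_treat -- proportion assigned to treatment
    r2      -- proportion of outcome variance explained by covariates
    Uses a two-tailed test and a large-sample normal approximation
    to the multiplier (z for alpha/2 plus z for the desired power).
    """
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)
    return multiplier * sqrt((1 - r2) / (p_treat * (1 - p_treat) * n))

def mrss_individual(mdes, p_treat=0.5, r2=0.0, alpha=0.05, power=0.80):
    """Smallest total sample size whose MDES falls at or below `mdes`."""
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)
    return ceil(multiplier ** 2 * (1 - r2)
                / (p_treat * (1 - p_treat) * mdes ** 2))

print(round(mdes_individual(n=400), 3))  # MDES with 400 individuals -> 0.28
print(mrss_individual(mdes=0.25))        # sample needed for d = 0.25 -> 503
```

The two functions are inverses of one another: fixing the sample yields the smallest detectable effect, while fixing the target effect yields the smallest adequate sample. The blocked and clustered designs in the paper extend this same logic with additional variance components and degrees-of-freedom adjustments.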
The Version of Record of this manuscript has been published and is available in the Journal of Research on Educational Effectiveness, 2013, http://www.tandfonline.com/10.1080/19345747.2012.673143.
sample design, power analysis, minimum detectable effect size (MDES), minimum required sample size (MRSS), multilevel experimental and quasi-experimental designs
Maynard, R. A., & Dong, N. (2013). PowerUp!: A tool for calculating minimum detectable effect sizes and minimum required sample sizes for experimental and quasi-experimental design studies. Journal of Research on Educational Effectiveness, 6(1), 24–67. http://dx.doi.org/10.1080/19345747.2012.673143
Date Posted: 20 May 2015
This document has been peer reviewed.