GSE Publications

Document Type

Journal Article

Date of this Version

2013

Publication Source

Journal of Research on Educational Effectiveness

Volume

6

Issue

1

Start Page

24

Last Page

67

DOI

10.1080/19345747.2012.673143

Abstract

This paper complements existing power analysis tools by offering tools to compute minimum detectable effect sizes (MDES) for existing studies and to estimate minimum required sample sizes (MRSS) for studies under design. The tools that accompany this paper support estimates of MDES or MRSS for 21 different study designs: 14 random assignment designs (6 in which individuals are randomly assigned to treatment or control condition and 8 in which clusters of individuals are randomly assigned to condition, with models differing by whether the sample was blocked prior to random assignment and by whether the analytic models assume constant, fixed, or random effects across blocks or assignment clusters) and 7 quasi-experimental designs (an interrupted time series design and 6 regression discontinuity designs that vary by whether the sample was blocked prior to randomization, whether individuals or clusters of individuals are assigned to treatment or control condition, and whether the analytic models assume fixed or random effects).
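To illustrate the kind of computation the paper's tools perform, the sketch below estimates MDES for the simplest case, individual random assignment with a two-tailed test. The formula shown is the standard one from the power analysis literature; the function name, parameters, and the use of a large-sample normal approximation for the multiplier (rather than the t-distribution and degrees-of-freedom corrections the paper's tools employ) are assumptions for illustration, not the paper's actual implementation.

```python
from statistics import NormalDist

def mdes_individual(n, p_treat=0.5, alpha=0.05, power=0.80, r2=0.0):
    """Approximate MDES for simple individual random assignment.

    Hypothetical sketch of the standard formula
        MDES = M * sqrt((1 - R^2) / (P * (1 - P) * n)),
    where n is total sample size, P the proportion assigned to treatment,
    R^2 the variance explained by covariates, and the multiplier
    M = z_{1-alpha/2} + z_{power} is a large-sample normal approximation
    (the paper's tools use t-distributions with degrees-of-freedom
    adjustments, omitted here for brevity).
    """
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)
    return multiplier * ((1 - r2) / (p_treat * (1 - p_treat) * n)) ** 0.5

# A sample of 100 individuals, split evenly, with no covariates:
print(round(mdes_individual(n=100), 3))
```

Larger samples, a balanced split, and stronger covariates all shrink the MDES, which is why the paper's tools can also be inverted to find the minimum required sample size for a target effect size.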

Copyright/Permission Statement

The Version of Record of this manuscript has been published and is available in the Journal of Research on Educational Effectiveness, 2013, http://www.tandfonline.com/10.1080/19345747.2012.673143.

Keywords

sample design, power analysis, minimum detectable effect size (MDES), minimum required sample size (MRSS), multilevel experimental and quasi-experimental designs


Date Posted: 20 May 2015

This document has been peer reviewed.