Terracotta is designed to lower the technical and methodological barriers to conducting education research. These barriers are well-known in the education research community. I've started collecting quotes about the difficulty of running field experiments in education. Once Terracotta enlightens education research methods, perhaps we'll all look back on these quotes as evidence of a dark age, when progress was impeded by the absence of technological innovation. If you have a quote you'd like to share for this list, send it to me and I'll include it here with appreciation!
Randomized experiments of interventions applying to entire classrooms can be extremely difficult and expensive to do.
Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21. https://doi.org/10.3102/0013189X031007015
Although random trials remain the gold standard for assessing intervention effectiveness, it is often impractical and sometimes unethical to conduct such trials in the everyday contexts of postsecondary institution operations.
Borden, V. M. H., & Hosch, B. J. (2018). Institutional research and themes, North America. In Encyclopedia of International Higher Education Systems and Institutions (pp. 1–10). https://doi.org/10.1007/978-94-017-9553-1_586-2
In addition to feasibility considerations ... In education studies, variables can rarely be controlled tightly and blinding of subjects and study personnel may be unethical or impossible.
Sullivan, G. M. (2011). Getting off the “gold standard”: Randomized controlled trials and education research. Journal of Graduate Medical Education, 3(3), 285–289. https://doi.org/10.4300/JGME-D-11-00147.1
The requisite resources are generally far in excess of what most educational researchers could hope to amass in the absence of considerable extramural funding. Consequently, researchers elect to conduct more manageable, less ambitious, and typically, less carefully-controlled classroom-based investigations.
Levin, J. R. (2005). Randomized classroom trials on trial. In G. D. Phye, D. H. Robinson, & J. R. Levin (Eds.), Empirical Methods for Evaluating Educational Interventions (pp. 3–27). Academic Press.
By the time an experiment is designed, implemented, and evaluated, it is often true that the policy debate has moved ahead and the results are no longer of direct policy interest.
Schanzenbach, D. W. (2012). Limitations of experiments in education research. Education Finance and Policy, 7(2), 219–232. https://doi.org/10.1162/EDFP_a_00063
Another possibility [that would explain the rarity of experimental research] is that the decline is related to researchers’ perceptions of the rigorous methodological standards, challenging practical constraints, and needed resources associated with conducting scientifically credible educational intervention research (e.g., Levin, 2005; Mosteller & Boruch, 2002). Although Pressley and Harris (1994) and Levin (1994) argued for “better” intervention studies, the perceived obstacles and costs may dissuade investigators from conducting such research.
Hsieh, P., Acee, T., Chung, W.-H., Hsieh, Y.-P., Kim, H., Thomas, G. D., You, J.-i., Levin, J. R., & Robinson, D. H. (2005). Is educational intervention research on the decline? Journal of Educational Psychology, 97(4), 523–529. https://doi.org/10.1037/0022-0663.97.4.523