Science is dangerous; we have to keep it most carefully chained and muzzled.

Mustapha Mond, The Controller
Aldous Huxley (1932)

In Huxley’s Brave New World, in Orwell’s 1984, in Bradbury’s Fahrenheit 451, and even in Plato’s Allegory of the Cave, stable dystopias have two common features: a lack of freedom to explore, and an ignorance of one’s own imprisonment.  People are kept within a restricted range of movement spanning minimal degrees of freedom, but at the same time they are complacent.  By virtue of the minor flexibilities afforded within their boundaries, people are misled into believing they are free, even when they are restrained.

Fictional renderings of dystopian societies may seem far-fetched, particularly in the context of learning technologies, but they nevertheless provide a critical lens through which to view our present-day nonfictional world.  For example, this lens has been used to consider whether we are currently living in a technological dystopia (Kolitz, 2020; Morozov, 2012; Winner, 1997).  As with science fiction, a benchmark commonly applied in these considerations is whether technology enables freedom to openly explore and improve, or instead whether it (intentionally or unintentionally) stifles this freedom.  In this regard, an area of emerging relevance is learning technology.

Amidst the current proliferation of online learning technologies, learning scientists are grappling with the challenges of translating science into practice at scale.  The central problem goes something like this: any single learning tool or platform brims with assumptions.  It is designed in a particular way, to enable a particular kind of interaction, for a particular kind of student, to benefit a particular kind of outcome, to be measured in a particular kind of way.  Studies conducted within these platforms are influenced and constrained by these particularities, and in turn, the idiosyncrasies of a learning platform limit the flexibility and generalizability of research.  Even if it were possible for any researcher to experimentally manipulate any element within the ASSISTments platform, for example, the resulting inferences would still be specific to that platform, or at best, to intelligent tutoring systems.  Estimates of the effectiveness of intelligent tutoring systems are further limited by the local relevance of the outcome measures under analysis and their implementations within the local learning environment (Kulik & Fletcher, 2016), variables not manipulable within the tools themselves.  Moreover, if a brilliant inventor stumbled upon a novel instructional system that promised improvements over intelligent tutoring systems, the research community's reliance on existing platforms as principal tools for conducting experiments would no longer enable progress; this reliance would itself be a barrier to progress.

No platform developer or learning technology startup sees itself as Controller Mond, intentionally stifling science and innovation.  On the contrary, we have gravitated toward the education sector out of an authentic interest in helping people, improving equity, and facilitating social progress.  It is, without a doubt, a good thing that these platforms are developing new degrees of freedom to allow researchers to engineer and explore innovations within them.  But if the only space to innovate is within walled gardens, we will be ever limited in our ability to understand our potential, and to effect change.

That is why we need Terracotta.  We need to make a research platform compatible with education settings, rather than focusing solely on making educational platforms compatible with research.  While platforms and support tools unquestionably should be empowered to conduct rigorous experimental research on what works, they cannot be the only game in town.  Accordingly, the primary goal of Terracotta is to democratize experimental learning research, and thus to advance the generalizability and translatability of research findings.


Huxley, A. (1932). Brave New World. Chatto & Windus.

Kolitz, D. (2020, August 24). Are We Already Living in a Tech Dystopia? Gizmodo.

Kulik, J. A., & Fletcher, J. D. (2016). Effectiveness of Intelligent Tutoring Systems: A Meta-Analytic Review. Review of Educational Research, 86(1), 42–78.

Morozov, E. (2012). The Net Delusion: The Dark Side of Internet Freedom. PublicAffairs.

Winner, L. (1997). Technology Today: Utopia or Dystopia? Social Research, 64(3), 989–1017.