Case studies

Have you ever wondered…?

Case study

Understanding how well your new assignment idea works

Let’s say you’re a teacher, and you’re wondering if a restudy assignment will improve students’ performance on the midterm exam. In Terracotta, you can build the experiment and link it to the midterm exam outcome.
Case study

Investigating how well a department- or district-wide intervention works

District administrators have been talking about recent research on mindset interventions in STEM courses and want to see how such an intervention works in your context. Rather than relying on anecdotal evidence, they decide to create the experiment in Terracotta, link it to outcomes, and contribute to the broader conversation.
Case study

Testing your hypothesis about an educational method

As a researcher, you’re always on the lookout for a tool that can help you access high-quality data, but you’re well aware of the difficulties of conducting responsible experiments in classroom contexts. With Terracotta, you can build the experiment, integrate informed consent into the process, run the experiment, and access your deidentified data when it’s done, with students who opted out automatically excluded from the data export.
Case study

ManyClasses

This study sought to determine the effect of an instructional practice (in this case, prequestions) when its implementation varies across classes. To do so, the research team used Terracotta to run a multi-site, within-subject randomized controlled experiment across 26 diverse classes, ranging from 6th grade to college seniors, examining the generalizable effect of prequestions on student learning from online media. Terracotta made this otherwise unwieldy study efficient and expanded the scope of data collection to include raw clickstream interactions with learning materials, item-level assessment responses, and more. Terracotta also standardized the data, deidentified it, and automatically removed nonconsenting participants, making it possible to post all the raw data publicly, without manipulation, in a common, well-documented format.
Case study

Retrieval vs. Restudy

Researchers used Terracotta to manipulate online review assignments so that consenting students alternated, on a weekly basis, between taking multiple-choice quizzes (retrieval practice) and reading the answers to those quizzes (restudy). Students' performance on subsequent exams was significantly better for items that had appeared in retrieval practice review assignments. All materials, data, and analyses are publicly available at https://osf.io/yrbhe/, and the full study is available at https://link.springer.com/article/10.3758/s13428-023-02164-8.
Case study

Virtual Exchange

A business professor at a US university wanted to test whether a virtual exchange experience affected students’ intercultural communication and critical thinking skills. To do so, she and a colleague from a university in Ecuador conducted a common activity with their students over two Zoom sessions. The professor used Terracotta to administer surveys before and after the exchange. She found that her students at the US university did improve their intercultural communication and critical thinking skills through the virtual exchange program.

Terracotta helps you build the experiment and gives you the data you need when it's complete.