University of Canterbury Case Study

How Innovative Learning Analytics Allowed Researchers to Optimize Chemistry Education

The University of Canterbury used Stemble's innovative learning analytics to learn how different chemistry instruction models affected student performance.


Using Stemble's learning analytics, the University of Canterbury studied how different instruction models affected how well students learned chemistry concepts. The analytics allowed the research team to identify which instruction methods performed best and to optimize chemistry teaching objectively.

About the University of Canterbury

The University of Canterbury, Christchurch, New Zealand, is a public research university founded in 1873. Dr. Owen J. Curnow is a professor of chemistry working in organometallic chemistry and chemistry education. In addition to teaching general chemistry, Dr. Curnow works on developing new, practical methods of teaching chemistry, particularly those leveraging technology.

Bringing Stemble's learning analytics capabilities into the classroom instantly puts powerful teaching and learning tools in the hands of educators.

The Challenges

Measuring the effectiveness of teaching methods is notoriously difficult. It’s challenging to isolate the variables, such as what a student may have already known before instruction began.

For example, how do you know if a student really understands a concept before or after an intervention? Incorrect answers on tests don’t always indicate a lack of understanding. Conversely, correct answers don’t always correlate to sound comprehension either.

Even if you were to test students before and after teaching them something, it’s still hard to attribute their performance to your teaching. What if they performed well because they read the book or were good at searching for information online? You could survey students, but survey data is subjective and prone to bias. Getting data — especially accurate data — is tricky at best.

Therefore, researching the efficacy of innovative teaching methods is a great challenge.

Stemble’s Solution

Stemble’s experience as a data-driven company helped Dr. Curnow identify measurable differences in the performance of students who experienced different teaching methods.

The Stemble platform tracks student performance and behavior constantly. In fact, Stemble’s chemistry experts use that data to map performance onto the platform’s native curriculum.

The result is that educators can observe and monitor student performance within a particular subject area. Stemble’s previous internal studies show that student performance data can provide two key metrics.

  1. Initial error fraction: The “initial error fraction” describes what percentage of your students don’t solve the problem on their first try.
  2. Learning rate: The “learning rate” indicates how quickly those students catch on: the fraction of the class that can be expected to answer the problem correctly after a given number of attempts.

The Stemble research team knew that all of the homework tasks ever assigned — many hundreds of them — follow the same general trend where more and more students catch on over time.

But those two key metrics — the initial error fraction and learning rate — are unique for every task.

Some tasks are easy: most students answer correctly on the first try (a low initial error fraction), and everyone else catches on quickly (a high learning rate). Other tasks are difficult: few students get them right immediately, and the rest take a long time to catch on. Every other combination appears in the data as well.

These key metrics provide a definitive way to measure difficulty and learning rates in chemistry, based on objective, repeatable data in a readily available tool for educators and researchers.
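To make the two metrics concrete, here is a minimal sketch of how they could be computed from per-student attempt counts. This is our illustration, not Stemble's implementation: the function names and the geometric (constant per-attempt rate) learning-curve model are assumptions.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskMetrics:
    initial_error_fraction: float  # share of the class wrong on the first attempt
    learning_rate: float           # per-attempt rate at which remaining students catch on

def task_metrics(attempts_to_solve: list[int]) -> TaskMetrics:
    """attempts_to_solve[i] = attempts student i needed before answering correctly."""
    n = len(attempts_to_solve)
    missed_first = [a for a in attempts_to_solve if a > 1]
    e0 = len(missed_first) / n
    # Model the extra attempts as geometric: the maximum-likelihood estimate of
    # the per-attempt catch-on probability is 1 / (mean number of extra attempts).
    rate = 1.0 / mean(a - 1 for a in missed_first) if missed_first else 1.0
    return TaskMetrics(e0, rate)

def expected_correct_fraction(m: TaskMetrics, n_attempts: int) -> float:
    """Fraction of the class expected to be correct within n_attempts tries."""
    return 1.0 - m.initial_error_fraction * (1.0 - m.learning_rate) ** (n_attempts - 1)
```

Under this sketch, an easy task has a low `initial_error_fraction`, and a fast-learning class has a `learning_rate` close to 1; the geometric model is just one simple way to capture the "more and more students catch on over time" trend described above.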

Because Stemble had this data template, Dr. Curnow could have different cohorts of students complete the same chemistry problem sets. The research team taught each cohort using a different instruction model.

It was possible to find measurable differences in student performance by objectively comparing the student cohorts’ initial error fractions and learning rates.
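The comparison itself can be sketched in a few lines. The cohort data below is invented for illustration, and `initial_error_fraction` and `correct_by_attempt` are our own helpers, not a Stemble API:

```python
def initial_error_fraction(attempts: list[int]) -> float:
    """Share of a cohort that did not solve the task on the first attempt."""
    return sum(1 for a in attempts if a > 1) / len(attempts)

def correct_by_attempt(attempts: list[int], n: int) -> float:
    """Observed fraction of a cohort that solved the task within n attempts."""
    return sum(1 for a in attempts if a <= n) / len(attempts)

# Attempts needed per student on one task, for two cohorts taught differently
# (made-up data for illustration).
cohort_a = [1, 1, 2, 1, 3, 1]
cohort_b = [2, 3, 1, 2, 4, 2]

# In this invented data, cohort A starts stronger on the task...
assert initial_error_fraction(cohort_a) < initial_error_fraction(cohort_b)
# ...and a larger share of it is correct within two attempts.
assert correct_by_attempt(cohort_a, 2) > correct_by_attempt(cohort_b, 2)
```

Running the same comparison across many tasks is what lets differences between instruction models show up as systematic shifts in these two metrics rather than as one-off anecdotes.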

"We are very happy with the Stemble app for our chemistry education research. The app allowed us to investigate student performance in a systematic way, and the staff were very helpful." -- Dr. Owen Curnow, University of Canterbury

The Benefits of Learning Analytics

Stemble’s analytics technology made this study possible; previously, there was no practical way to gather accurate data. A “paper” assessment can’t capture the temporal data of a student retrying questions until they reach a solution, and measuring the number of attempts a student needs to solve a problem is essential for an accurate comparison.

Other digital chemistry tools on the market provide only overall grade data: once a student gets the right answer, they receive credit for the problem, and that’s that. Stemble’s attempt-level performance data enables objective comparisons for educational methodology research.

Dr. Curnow’s research team has a great teaching and learning tool, and they have a research tool that they can use to improve their practice. What is especially exciting is that Stemble can provide definitive, objective data in an area for which data gathering is notoriously difficult.


The Outcome


Stemble has made it possible for Dr. Curnow to study the efficacy of his methods in a realistic, objective way. And because the analysis is built into Stemble’s reporting, it requires almost no extra effort.

Like Dr. Curnow, chemistry teachers are scientists at heart — that’s why they went into the profession in the first place. However, these scientists are disincentivized to “experiment” in ways that could improve their students’ outcomes. Applying the scientific method to classroom questions is difficult because chemistry professors often don’t have a social psychology or education degree. Yet any chemistry teacher can tell you about something they’d like to test out in the classroom. They know chemistry and how to design a study, but putting that study into practice is difficult, so many resort to little more than anecdotal evidence and “gut feelings.”

Bringing learning analytics capabilities into the classroom instantly puts powerful teaching and learning tools in the hands of educators. With this game-changing capability, they can now objectively test what works and what doesn’t.

Dr. Curnow and other researchers are now free to explore many other applications of research to optimize chemistry education. Their efforts can explain how different cohorts of students perform under different conditions. Researchers can systematically test the effect of different instructors, textbooks, choices for scope and sequence of curriculum, or any other educational variables.

As this technology becomes more widely available, the global scientific community stands to benefit, but none so much as tomorrow’s students.



Stemble is a data-driven teaching and learning platform built by and for chemistry instructors and students. Using the Stemble platform, you can easily deploy homework assignments and virtual labs and collect detailed analytics on student performance.

Our proprietary analytical models allow you to assess learning rates among your students and diagnose problems early. You can also experiment with different instructional models or strategies and easily measure their efficacy.

To learn more about how to implement Stemble at your institution, book a demo today.
