Is all evidence created equal? Tips for teacher educators to find reliable evidence

Date published 31 July 2024

Designing effective professional development for teachers is a challenging task. Luckily, teacher educators don’t have to rely exclusively on trial and error.

Instead, they can stand on the shoulders of giants and rely on vast amounts of existing research to make evidence-based design decisions. But is all evidence equally reliable? It’s something that we in the Research team at Ambition have had to consider in the wake of our latest research on advance organisers.

Many teachers are likely familiar with the concept of advance organisers and might even have used them in their classrooms. Advance organisers are tools such as texts or diagrams that give the learner a structure for incoming information. They are thought to support the learning of new information by guiding the learner’s attention, providing a scaffold and activating relevant prior knowledge.

Numerous studies suggest that advance organisers can help with the learning of new information. In our latest research project, we tested whether advance organisers also support learning in online professional development. Despite the large literature on advance organisers, we found that designing a suitable advance organiser was surprisingly difficult, mainly because existing research rarely provides examples!

Our research found that the advance organiser we designed did not help teachers learn new information – contrary to our expectations and to what the literature suggested.

The replication crisis

Why did we fail to replicate an effect that seems to have been replicated many times before? One possible reason is that a lot of work on advance organisers was conducted before the replication crisis of the early 2010s.

The replication crisis revealed that many studies in psychology and the social sciences could not be replicated; that is, their results could not be reproduced when the studies were repeated. The crisis unearthed several questionable research practices that have likely contributed to this lack of reproducibility. Questionable research practices include:

  • p-hacking: manipulating data until statistically significant results are obtained
  • selective reporting: only publishing favourable outcomes
  • HARKing: hypothesising after the results are known

Questionable research practices in action

What do these questionable research practices look like in action? Let’s imagine a scenario where a researcher wants to determine whether reading an advance organiser before some new information improves learning compared to reading a text that isn’t an advance organiser.

The researcher would engage in p-hacking if they kept running the same analysis on subsets of the data, either by removing participants from the dataset for dubious reasons or by focusing on subgroups such as participants identifying as female, until they finally observed the results they were expecting.

Selective reporting would occur if the researcher then reported only the one analysis that produced significant results, and not all the other, non-significant analyses conducted while p-hacking.

Let’s imagine that the analysis revealed that advance organisers are effective, but only in participants identifying as female. HARKing would occur if the researcher then wrote up the paper claiming that it had always been their hypothesis that advance organisers would only be effective in participants identifying as female, when in reality they only developed the hypothesis after analysing the data.

Such practices can lead to the publication of false-positive results (that is, the study reports advance organisers to be effective when they actually are not) or exaggerated effect sizes (that is, the study reports advance organisers to be more effective than they actually are).
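
It is easy to demonstrate how this distorts published results. Below is a minimal, hypothetical simulation in Python (a sketch written for this blog, not code from our study or from any paper discussed here). Both ‘conditions’ are drawn from the same distribution, so the advance organiser has no effect by construction and every significant result is a false positive. An honest, pre-specified test comes out significant about 5% of the time, as designed; a researcher who hunts through arbitrary subgroups finds a ‘significant’ result several times as often.

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(1)  # reproducible illustration

def two_sample_p(a, b):
    """Approximate two-sided p-value for a difference in means (z-test)."""
    se = (stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5
    z = (mean(a) - mean(b)) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

N_STUDIES = 500    # simulated studies, each with NO true effect
N_PER_GROUP = 40   # participants per condition
N_SUBGROUPS = 10   # arbitrary subgroup splits the "researcher" tries

honest_hits = 0
hacked_hits = 0

for _ in range(N_STUDIES):
    # Both conditions come from the same distribution, so any
    # significant result below is a false positive by construction.
    organiser = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    control = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]

    # Honest analysis: one pre-specified test on the full sample.
    if two_sample_p(organiser, control) < 0.05:
        honest_hits += 1

    # p-hacked analysis: keep testing arbitrary subgroups (standing in
    # for "female participants only" and similar post-hoc splits)
    # until one of them comes out significant.
    for _ in range(N_SUBGROUPS):
        sub_o = random.sample(organiser, N_PER_GROUP // 2)
        sub_c = random.sample(control, N_PER_GROUP // 2)
        if two_sample_p(sub_o, sub_c) < 0.05:
            hacked_hits += 1
            break

print(f"False-positive rate, one pre-specified test: {honest_hits / N_STUDIES:.0%}")
print(f"False-positive rate after up to {N_SUBGROUPS} subgroup tests: "
      f"{hacked_hits / N_STUDIES:.0%}")
```

The same selection mechanism also explains exaggerated effect sizes: only the most extreme estimates cross the significance threshold, so the effects that get reported look larger than the effects that actually exist.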

Designing professional development on shaky foundations

When teacher educators rely on such evidence, they risk making design choices on potentially shaky foundations.

Another significant challenge is the inaccessibility of the materials (for example, the advance organiser itself) and methods used in older studies. This lack of transparency poses a substantial barrier to using the evidence to guide design decisions. Without access to the original materials, any benefits may get lost in translation.

We experienced the severity of this issue first-hand when conducting this research project. While various researchers have provided definitions of what advance organisers are, the advance organisers used in previous studies were often not included in the supplementary materials. Without examples to draw on, our team had to rely on well-informed guesses when designing the advance organiser used in this study.

Another notable shift in research methodology has been the transition from paper-and-pencil methods to online platforms. Studies conducted in traditional, controlled environments may not replicate when conducted online due to factors such as differences in participant engagement, environmental control and even the nature of stimulus presentation. This shift further complicates the use of older literature, as the context and medium of the original research may not align with modern methods.

Tips for finding high-quality evidence

How can teacher educators navigate these challenges when designing professional development? We recommend evaluating the quality of available evidence using the following criteria:

  • Did the researchers preregister their hypothesis and analysis? This increases the probability that the results are reliable. Researchers will often state this in the methods section of their paper. Preregistration combats questionable research practices because researchers have to clearly state how they will analyse the data to evaluate their hypothesis before seeing the data. Research from the last five years is more likely to have done this than older research. Some journals use 'preregistered' badges to clearly label this good practice.
  • Do they have open materials? For example, if the study includes or links to the exact material used, this makes it much clearer as to how teacher educators might design their own without the benefits being lost in translation. Some journals use 'open materials' badges to clearly label this good practice.
  • Are the effects well-replicated? If an effect has been found consistently across various contexts and ideally by different groups of researchers, teacher educators can be more confident that the effect actually exists and is not a false-positive effect.
  • Have the effects been found in contexts similar to where teacher educators would like to apply them? For example, have the effects of advance organisers only been found in classrooms but not in professional development contexts? Are advance organisers only effective in traditional learning environments, or does research suggest that they also work in online learning? The more similar the circumstances, the higher the likelihood that the effects will replicate in the contexts that teacher educators are working in.

Read the research

Dr Stefanie Meliss
Senior Research Scientist

Stefanie is a research and data scientist at Ambition Institute with a background in cognitive neuroscience. She is interested in teacher professional development, online learning and educational data mining.
