EEF Gathering and Interpreting Data Summary
Often, the decision to act begins with an instinct, a feeling or a hunch. Existing beliefs about problems in school can be powerful and useful, but they can also reflect biases (which we all have). We need to check and challenge our initial thinking until we are confident that the identified problem is both important and real, i.e. a priority. Such confidence relies on two factors:

a. Gathering relevant and rigorous data

b. Generating plausible and credible interpretations of that data

We sometimes use data that we have to hand rather than what we need. Examine information from a range of sources to build a rich picture of the issue, recognising the strengths and weaknesses of different sources. Find the quiet trends in the data. Go beyond the headlines and explore the variation. Ask yourself, ‘What cause of a problem does the data represent?’, ‘What are the trends in the data over time?’, ‘What are the underlying issues?’.
Remember that any data you use are simply representations of the effects of a problem—one of the “multiple inadequate glances” that you can take at the perceived issue. Be careful not to mistake the cause(s) of a problem with the outcome of a problem. For example, low attainment at Key Stage 2 will be an outcome of underlying issues (see the figure in section 4).

To generate evidence and insights on the problem we have to interpret data and use judgement, and that begins by questioning the quality of your data.

[Diagram: Initial knowledge and beliefs → Relevant and rigorous data + Plausible and credible interpretation → Confidence that the issue is a priority]

National test data
Pros: Generally reliable; overview of achievement; gives comparative data; no increased workload.
Cons: Overall scores can mislead interpretations of specific problems (question-level analysis can help).
Using well: Use overall scores across year groups and over several academic years to provide reliable trend data.

Internal test data
Pros: Tailor tests to needs; can use existing tests; cheap and efficient.
Cons: Often not as reliable as external tests; internal test data cannot be compared to national norms; additional workload.
Using well: Use to provide fine-grained insights on an issue, alongside larger grain-size data (e.g. KS2 Maths attainment).

Lesson observations
Pros: Gives a holistic view of the teacher’s actions and students’ learning responses; actionable conclusions.
Cons: Potentially unreliable; may not represent normal practice; presence of observer can bias practice.
Using well: Use to observe the perceived issue in context, and gain a richer picture of how students and teachers experience the issue.

OfSTED data
Pros: Comparability to a national standard; external perspective.
Cons: Potentially unreliable; high stakes can drive unhelpful actions; presence of observer can bias practice.
Using well: Consider perceived issues raised on inspection in relation to your own school improvement priorities.

Surveys/interviews
Pros: Gathers perceptions; opens lines of communication; tailor surveys to needs.
Cons: Low response rates and pressure to respond mean data can be unreliable.
Using well: Use to understand the perceptions of a problem in context, and gather suggestions for future actions.
To generate evidence of a problem we have to provide credible and plausible interpretations of the data—this requires triangulating data from different sources and using judgement to draw accurate conclusions.

Here are some things to bear in mind:

• Describe how each piece of data provides evidence for the problem, e.g. behavioural issues, captured through lesson observations, suggest that pupils A and B are struggling to access the curriculum. Identify for whom the problem exists, when it happens and how it manifests.

• Avoid fitting the data to your preconceptions—while you and the data may end up in agreement, this is not automatically the case. Set aside preconceptions of problems and solutions and let the data reveal the nature of the issue.

• Create a strong argument that is credible and acceptable (it will never be definitive) rather than compelling. Rather than trying to convince yourself and your colleagues that you are right, focus on demonstrating an issue with evidence.

• Share your interpretation with people who might disagree with you, to test your thinking and identify weaknesses in it. Encourage them to challenge any assumptions and see if they can disprove the existence of the problem.

Example of data interpretation

[Figure: Four data sources pointing to ‘Poor pupil attainment in KS2 Reading’ — lesson observations indicate variable approaches to teaching phonics; internal test data indicates limited background knowledge; learning walks indicate low levels of reading fluency; pupil interviews reveal weak motivation for reading.]

There are always weaknesses in the data schools use—everything from the wording of questions, to how tired the person marking test papers is, can affect the robustness of the information. This is something we need to accept and respond to constructively by interrogating data for its quality. Ask yourself:

• Are your biases, and those of colleagues, skewing your interpretations of the data?

• Are there significant gaps in your data? If so, are you filling these gaps with your own assumptions and generalisations?

• Is the most relevant and rigorous data—that which is most fit-for-purpose—being prioritised, while data of less relevance and rigour is treated with greater caution?

Source of weakness: Bias in the generation of the data
How to identify the issue: Be clear on what the data represent and don’t represent, and how they were generated, e.g. internal test scores may be biased if the tests are set and marked by a teacher in a department under pressure to show pupils making quick progress.

Source of weakness: Data isn’t valid
How to identify the issue: Be clear that your choice of assessment is actually measuring what you set out to measure. Sometimes we overreach in our claims about what an assessment is telling us, e.g. a survey on reading for pleasure, or motivation to read, often relates to how well pupils can read, but such a survey doesn’t offer an accurate assessment of reading ability.

Source of weakness: Data isn’t reliable
How to identify the issue: Be clear whether your data source is fair and consistent. A reliable source of data usually follows processes that increase accuracy and consistency, such as question trialling, marking moderation and triangulation of different data sources, e.g. lesson observations conducted by different school leaders could, without consistently applied processes, produce very different—and so non-comparable—insights.

Source of weakness: Data isn’t manageable
How to identify the issue: Be clear that the process of gathering valid and reliable data can increase workload. Weigh up the value of gathering robust data against the opportunity costs of doing so, e.g. a survey of staff on a whole-school change can offer us only limited insights, but it involves less workload than interviewing all staff.
This resource supports the Putting Evidence to Work: A School’s Guide to Implementation guidance report.