Lying with statistics

This week’s* SOCRMx task:

“… locate an example of lying with statistics—whether inadvertent or not—in the wild, online…One place to look for potential examples of ‘lying with statistics’ is in news reports about economics, science, health, education, politics, or another field where numerical data get reported. Other possibilities are advertisements, government or company reports, marketing information, and even popular memes. Your example might be a table, graph, infographic, or short written passage.

Write a blog post about it, with appropriate attribution to the original source via a link, and tell us how you think it qualifies as an example of statistics used misleadingly. Then tell us how it might be fixed.”

A few years ago, on the day I was due to deliver a full-day course on the benefits of using IT in education, this was splashed all over the news:

This wasn’t a good start to the day. You can view one of the media reports here.

The headlines were a response to this global OECD** report which claimed: 

The report was based on a comparative analysis of per capita spending on ICT for students against test results, including PISA tests in reading, maths and science, and ‘digital skills’ tests.

There are significant issues with this study and another paper could be written to deconstruct it. Miles Berry has gone some way towards doing that here. However, as the focus this week has been on statistics, let’s take a look at just one of the graphs which the report presents.***

On first view, this graph appears to show a clear correlation between internet use and student performance: the more time students spend browsing the web, the more their reading performance declines. Look at the axis labels, however, and the picture is less straightforward. The y-axis actually represents the annualised change in student performance in PISA reading tests from 2000 to 2012. The accompanying notes don’t clarify how this ‘annualised change’ is calculated, stating only, and vaguely, that ‘it is calculated taking into account all of a country’s and economy’s participation in PISA’ (one plausible reading is sketched after the list below). The x-axis is equally unclear in defining its terms: it shows the percentage of students who use computers at school to browse the internet for schoolwork ‘at least once a week’, with no detail about the duration of that browsing, either per session or as a cumulative total.

So this graph is not a ‘snapshot’ of the impact of ICT on performance. Rather, performance in these tests has slowly deteriorated in almost all countries over time. Instead of assessing the multitude of factors which might have affected test performance (including questioning the relevance and appropriateness of the tests themselves), the research has focused on cross-correlating ICT use with that decline, and has found and presented data which can tell that story. Some other observations are worth making:

  • The PISA tests are developed by the OECD, which also produced the report. It is in the OECD’s interests to defend its examinations and to offer a rationale for why performance might be declining in its assessments.
  • The report does not cross-correlate this data with in-country performance data based on internal examination systems.
  • A number of countries whose use of ICT within education could, comparatively, be described as ‘mature’ are not represented in the analysis. These include Canada, the US and the UK.
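As promised above, here is a minimal sketch of one plausible reading of ‘annualised change’, assuming it means something like the slope of a least-squares fit of a country’s mean PISA reading score against assessment year. That is my assumption, not the OECD’s documented method, and the country names, scores and ICT-use figures below are invented purely for illustration.

```python
# A minimal sketch (not the OECD's actual method) of one plausible reading of
# 'annualised change': the slope of a simple linear regression of a country's
# mean PISA reading score on assessment year. All figures are invented.
import numpy as np

# Hypothetical mean reading scores by assessment year for two made-up countries.
years = np.array([2000, 2003, 2006, 2009, 2012])
scores = {
    "Country A": np.array([505, 501, 498, 496, 493]),  # gradual decline
    "Country B": np.array([520, 519, 517, 514, 510]),  # gradual decline
}

# Hypothetical share of students browsing the internet for schoolwork weekly.
ict_use = {"Country A": 0.70, "Country B": 0.40}

def annualised_change(years, scores):
    """Slope (score points per year) from an ordinary least-squares fit."""
    slope, _intercept = np.polyfit(years, scores, deg=1)
    return slope

for country, s in scores.items():
    print(country, round(annualised_change(years, s), 2), ict_use[country])

# Because reading performance drifted downwards almost everywhere over this
# period, plotting these negative slopes against any country-level variable
# that also grew over the period (ICT use, smartphone ownership, ...) will
# tend to produce a downward-sloping scatter, without saying anything about
# causation.
```

Even under this charitable reading, the point stands: when nearly every country’s slope is negative, plotting those slopes against any variable that has grown over the same period will produce a downward-sloping scatter. That is a story about two trends coinciding, not about one causing the other.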

*well, last week’s: I’ve been away so I’m catching up!

** Organisation for Economic Co-operation and Development

***I could have selected any of a number of graphs in the report showing a depressing downward trend in student performance in line with the amount of time spent browsing the internet and/or the number of computers in schools.
