Attention grabbing ‘research’ – the clickbait survey

If you follow higher education news, you'll see a procession of stories about aspects of student behaviour. Students are an interesting group and universities are fascinating, but these stories generally share a common theme: a commercial body has commissioned some 'research', mostly in the form of a survey. Students turn out to be worried about their insurance, their bank account, their gym fees, or their consumption of some form of food or beverage. Helpfully, the 'research' commissioned by the company highlights this, and the company can explain how it can help by providing insurance or banking, or money-off vouchers, or even advice. Often this 'research' produces a ranking, or at least a 'top 10'.

As an example, Lloyds Bank produces a Student Life Survey which contains 'key facts' about universities in a top 30 drawn from the 89 universities they have data on. Except they won't reveal any of their base data, much of which comes from other surveys, their methodology for preparing a 'top 30', or even who the other 59 universities are (I have asked). The survey document does helpfully include a disclaimer at the bottom:

“This report is prepared from information that we believe is collated with care, however, it is only intended to highlight issues and it is not intended to be comprehensive”

Which? University grabbed some headlines on 28 April 2016 with a press release entitled 'Three in 10 university applicants regret A-level choices'. Now, I'm sanguine about Which? moving into higher education, where it has taken a very hard line on information provision to students. Students should get information which enables them to make appropriate choices. The English have a system of education which is quite unusual in its specialisation, so choices really matter (although a system that aims towards an academic/vocational divide at 16 must be problematic).

Which? University refer to 'research' which they undertook by asking applicants questions about their A-level choices. They used Youthsight, a company with a contract under which users of UCAS can opt in to receive surveys. The press release tells us:

Youthsight interviewed 1,020 adults aged 19 and under, who had applied to university, online between 12 – 15 February 2016. Data was weighted to be representative of gender and school type.

- three in 10 (28%) university applicants wished they had chosen different A-level subjects;
- four in 10 (41%) wished they had thought more about what subjects might help them get into university;
- around half (53%) felt suitably informed about how their A-levels could affect their choice of university or course;
- three in 10 applicants (30%) told us that the information and advice they received on which A-levels to take failed to take into account how it may affect their degree and university choices;
- less than half (41%) of those we surveyed were aware that many universities have a list of A-level subjects they view less favourably;
- a fifth (18%) said different A-level subjects would have been better suited to the degree they were applying for.

Unfortunately, there is no further report where we can check the exact questions the applicants were asked (were they asked about 'regretting' their choices?). I did ask Which?, but they just sent me a copy of the press release. Clearly there is an important issue about the advice given to students about pathways to degree courses. The applicants were asked these questions at a point when the bulk of them were receiving offers from universities. Were any questions asked about other qualifications? Do we know if the applicants had changed their minds about degrees since they were 16? What was the real data – was it 3 in 10 or 28%?

My concern is that Which? University have adopted the commercial press-release model for their 'research' on student choices. I think this is inappropriate for an association which is campaigning for better information. It appears that the purpose of this 'research' is to drive applicants to the Which? University website, where applicants can enter permutations of A-levels and see what other students have gone on to do at university. But what if this is picked up as a policy issue? We know that survey data like this often gets cited uncritically.

Contrast this survey with the Student Academic Experience Survey, also run by Youthsight, for HEPI. This is still cited as a key piece of evidence for the need for the TEF, but at least HEPI are happy to provide a proper report and full data tables. I'm not saying that companies can't produce cheerful press releases about students, but journalists should decline to publish stories about them unless everyone can look at the data behind them.

