Alternative Challenger Provider Institutions

Is the change of Government terminology towards different types of higher education institution just semantics, or is there a change in policy there too?

The coalition government was keen on ‘alternative providers’, but clearly had not thought through the implications of opening up the designation of courses, which allowed students to obtain £6,000 fee loans plus maintenance loans and grants. We still do not know the full extent to which UK and EU students went to fast-growing providers offering quickly approved HND/C courses. Helpfully, BIS published some research on the same day as it introduced the Higher Education and Research Bill. Many of the findings are averaged across the whole alternative provider sector, which the authors counted as a very diverse range of 732 providers. The large, high-profile, for-profit providers form only a very small number of those providers, although they account for a larger proportion of the student numbers.

We certainly saw ‘flux’ in the number of students at providers. Two, LSBF and St Patrick’s, grew very quickly and are now considerably smaller than at their apogee. Their parent company, Global University Systems, has decided to withdraw LSBF from designated courses altogether. St Patrick’s has published a copy of its Academic Management Review by BTEC, which notes that its student numbers are now around 2,000, having been ‘about 4000’.

BIS came under considerable pressure to create a regulatory framework retrospectively, and one of the key tests for the White Paper is how the new Office for Students will be able to monitor providers. One aspect is their approval, and the government’s desire to see them gain degree awarding powers and university title much more quickly – an area about which I have already noted some concerns.

What is clear is that BIS has changed the language it is using and the examples it is giving. In place of ‘alternative providers’ we have ‘challenger institutions’. The same reliance on the notion of disruptive innovation is present, but the examples are no longer for-profit colleges in London (funded by venture capital) but not-for-profit colleges in the shires (recipients of public funding). The press release announcing the ‘new universities’ gave two examples:

It could benefit institutions such as Hereford’s proposed New Model in Technology and Engineering and could have benefited University Campus Suffolk, which has now been given the green light from the government to apply for University Title, 9 years after admitting its first higher education student.

Both these examples owe much more to the previous government’s university centre initiative than they do to the ‘alternative provider’ model. Suffolk, whose award of university title was announced the very next day, is a long-term collaborative development very much in the mode of twentieth-century new university development. Hereford is also a regeneration model, albeit one attempting to form a distinctive curriculum, but one whose advisory council heaves with heavyweight higher education figures.


University Campus Suffolk

In the debate on the Queen’s Speech on 25 May, Jo Johnson singled out these examples again, prompted by the Conservative MPs for these areas, and tied the new university programme to ‘cold spots’.

I happily join my hon. Friend in congratulating the new University of Suffolk. It is terrific that one of four counties in this country that did not have a full university now has one. There are three other counties and we hope to encourage new institutions of similar quality to the University of Suffolk to come to the higher education cold spots that we have inherited.

However hard you try, you would be very hard pressed to identify the places where alternative providers have clustered in London as ‘cold spots’.

Before 2010, BIS had a programme of planning for new university centres, complete with a bidding process. Mostly these are delivered by a further education college with a link to a university, but sometimes they are outposts of a university. They continue to fill ‘cold spots’, and can be found in Croydon, Grimsby, Milton Keynes, Peterborough, Shrewsbury, etc. Have these really replaced the ‘alternative providers’ as the object of BIS’s affection? Are these the ‘challenger institutions’ that will drive up quality? Or will it be Google or Facebook, the giant corporations that the Government first briefed the media were its models, but whose participation cannot be confirmed?

I think it is more than just semantics that has seen us move from ‘alternative providers’ to ‘challenger institutions’, and hopefully a move towards a more cautious approach to the quality of these institutions.


The A Priori University

The HE White Paper Success as a Knowledge Economy looks to competition to improve quality in universities. BIS is to be commended for going through the Green/White Paper process (as contrasted with, say, its colleagues in DfE), but that has left many wondering where the evidence is. Many of the propositions in the White Paper appear to arrive without any evidence, especially regarding lacklustre or lamentable teaching.

There is one area where this eschewing of the need for evidence is taken to a new level. The Green Paper posed a conundrum of how new ‘providers’ could have the freedom to offer their own higher education courses more quickly. Generally, British higher education has adopted an apprenticeship model, and this is enshrined in the current process for acquiring degree awarding powers and university title. You need to show that you can do this, normally over at least several cohorts. As an example, the new University of Suffolk has been running since 2007 and gained its degree awarding powers on the basis of a long track record, having been supported by the universities of Essex and East Anglia.

However, this is not fast enough, it seems. Challengers are supposed to be distinctive, yet new places have to teach courses that lead to other universities’ degrees, and those universities have all sorts of rules and regulations that the new places must follow. There is an expectation that degrees at partners will be equivalent.

So, the White Paper introduces the notion of probationary degree awarding powers:

It will be possible for high quality providers to enter the sector on the basis of their potential (subject to rigorous quality controls) and gain probationary foundation or taught DAPs as soon as the OfS is satisfied that the conditions of being an Approved provider have been provisionally met. They can then offer their own degrees while building up a 3 year track record for full DAPs. This is a significant improvement on the current system in which DAPs take at least 6 years to gain (p29)

Here we have the rigorous quality control of a new provider a priori – a judgement will be made before there is any experience of the provider to assess. The current rules are clearly far too stuck in the realms of the a posteriori – checking that a provider is doing what it says it will do, looking to see that it can maintain standards and so on (not through metrics, but by looking at external examiners’ reports). A key concern about new ‘providers’ is that they expand too quickly, seeking ‘marginal’ students who might then struggle. Weirdly, there is a section later in the White Paper where the normal government-speak slips:

For too long we have been overly tolerant of the fact that some providers have significantly and materially higher drop-out rates than others with very similar intakes in terms of demographics and prior attainment. This applies equally at both the high tariff and low tariff ends of the sector. Such variability is not simply a statistic, nor even simply a squandering of taxpayers’ money. It is worse: it represents thousands of life opportunities wasted, of young dreams unfulfilled, all because of teaching that was not as good as it should have been, or because students were recruited who were not capable of benefitting from higher education (p46)

How will the new OfS assess providers in advance? Clearly this is some way off, but it does seem a strange prospect. Given the strictures of the CMA guidance, surely no application could be accepted from a student before the OfS had approved the probationary degree awarding powers? The provider would need a fully worked-up proposal at least a year in advance of starting; it’s hard to see how this would help. It isn’t hard to obtain a set of procedures for running a university – there are QA consultants out there – but you can just as easily assemble a set from university webpages. What’s important is how they’ll actually run for you. BIS highlighted the project at Hereford as a potential beneficiary of these faster powers – but that is on a three-year schedule to open as it prepares a new type of curriculum and recruits staff (and Hereford has been given a substantial subsidy by the Treasury).

We don’t have a full public analysis of the boom and bust of alternative providers in the period 2010–2015. What is clear is that the providers who appeared with off-the-shelf quality assurance systems and taught off-the-shelf HND/C courses are over-represented among the ‘cause for concern’ reports and the removals of course designation. Surely the long-term future of the UK’s HE system, and in particular its ‘controlled reputational range’, is best served by an apprenticeship model rather than a probation model? And if we are to launch challengers, why not assess them on what they do, rather than on what they say they’ll do?

Obfuscation: BIS in Sheffield

Parliament has a role in holding Government to account. Ministers should be able to explain their decisions, so I was interested in the Minister’s answers to a series of questions posed by Paul Blomfield MP in a backbench business debate on 9 May 2016. The issue concerns the closure of the BIS office in Sheffield, where a large proportion of the work is connected with higher education policy. Although the closure was announced as a decision, it had emerged that the BIS Board had suspended the decision for two weeks.

Although there was some point scoring in the debate, Blomfield was calm in his introduction and posed four questions to the minister.  He introduced them:

Today’s debate came about because our key questions were not answered by the permanent secretary. Now is the Minister’s opportunity, so I want to conclude by asking four questions, to which Members and the hard-working staff of the BIS office in Sheffield have been seeking answers since January. I gave the Department advance sight of the questions last Wednesday to allow for full consideration and comprehensive answers.

Remember, he had sent these in advance. Anna Soubry, answering for BIS, noted:

I will deal with the points he made, but in the time allowed to me I will not be able to answer them all in the sort of length that I would like.

[The way that the debate works means that these questions and answers were not posed and answered like this – but it helps to show the linkages]

PB: First, in reaching the decision to close the Sheffield office, what assessment has been made of the additional costs of moving the posts to London? That is the core question that we have been asking all along.

AS: He asked what assessment had been made of the cost of replacing jobs and moving them to London. A full assessment has not yet been made, but, as he will know from the evidence of the permanent secretary, the total over time for the Sheffield office was thought to be some £14 million. As I have said, however, this is not just about costs. …

PB: Secondly, what assessment of the decision has been made against the Government objectives of moving out of expensive Whitehall accommodation, diversifying the civil service, and not locating head office functions in the capital?

AS:  As for the assessment of the cost of replacing Sheffield jobs in London, the final decision has not been taken, and until it has been and we know all its ramifications it will not be possible to give that assessment.

PB: Thirdly, what assessment has been made of the impression created by the decision to move to London the functions of an office of the Department responsible for the northern powerhouse?

AS: The hon. Gentleman and other hon. Members asked about the northern powerhouse, but I do not need to be told what a great and wonderful city Sheffield is. [Followed by some comments about her childhood near Sheffield and HS2]

PB: Fourthly, aside from the proposals to centralise policy functions in London, what consideration has been given to the other options for achieving the “BIS 2020” objectives?

AS: The final question from the hon. Member for Sheffield Central concerned what other options there are apart from the proposal. Full consultation has taken place with unions and staff, and several alternative proposals have been received. The BIS executive board will take full account of those when reaching its decision on the proposal, and I hope that goes some way to answering his question.

Now, we are all well aware of the ‘rough and tumble’ of politics that allows questions to ministers to go unanswered, but these do seem to be particularly fine exemplars of how Parliament struggles here. Because the final decision has not been made, no assessment of the costs has been made. You could ask how the final decision could be made without an assessment of the costs, but hopefully the NAO will shed some light on that.

Attention-grabbing ‘research’ – the clickbait survey

If you look out for higher education stories, you’ll see a procession of pieces about aspects of student behaviour. Students are an interesting group and universities are fascinating, but generally these stories have a common theme. They tend to relate to some aspect of student life that has prompted a commercial body to commission ‘research’, mostly in the form of a survey. Students turn out to be worried about their insurance, their bank account, their gym fees or their consumption of some form of food or beverage. Helpfully, the ‘research’ commissioned by the company highlights this, and the company can explain how it can help by providing insurance or banking, or money-off vouchers, or even advice. Often this ‘research’ produces a ranking, or at least a ‘top 10’.

As an example, Lloyds Bank produces a Student Life Survey which contains ‘key facts’ about universities in a top 30 of the 89 universities that it has data on. Except that it won’t reveal any of its base data, much of which comes from other surveys, its methodology for preparing a ‘top 30’, or even who the other 59 universities are (I have asked). The survey document does helpfully include a disclaimer at the bottom:

“This report is prepared from information that we believe is collated with care, however, it is only intended to highlight issues and it is not intended to be comprehensive”

Which? University grabbed some headlines on 28 April 2016 with a press release entitled ‘Three in 10 university applicants regret A-level choices’. Now, I’m sanguine about Which? moving into higher education, where it has taken a very hard line on information provision to students. Students should get information which enables them to make appropriate choices. The English have a system of education which is quite unusual in its specialisation, so choices really matter (although a system that aims towards an academic/vocational divide at 16 must be problematic).

Which? University refer to ‘research’ which they have undertaken by asking applicants questions about their A-level choices. They have used Youthsight, a company that has a contract by which users of UCAS can opt in to receive surveys. The press release tells us:

Youthsight interviewed 1,020 adults aged 19 and under, who had applied to university, online between 12 – 15 February 2016. Data was weighted to be representative of gender and school type.

three in 10 (28%) university applicants wished they had chosen different A-level subjects;
four in 10 (41%) wished they had thought more about what subjects might help them get into university;
around half (53%) felt suitably informed about how their A-levels could affect their choice of university or course;
three in 10 applicants (30%) told us that the information and advice they received on which A-levels to take failed to take into account how it may affect their degree and university choices;
less than half (41%) of those we surveyed were aware that many universities have a list of A-level subjects they view less favourably;
a fifth (18%) said different A-level subjects would have been better suited to the degree they were applying for.

Unfortunately, there is no fuller report where we can check the exact questions the applicants were asked (were they asked about ‘regretting’ their choices?). I did ask Which?, but they just sent me a copy of the press release. Clearly there is an important issue about the advice given to students about pathways to degree courses. The applicants were asked these questions at a point when the bulk of them were receiving offers from universities. Were any questions asked about other qualifications? Do we know if the applicants had changed their minds about degrees since they were 16? What was the real data – was it 3 in 10 or 28%?
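As a back-of-envelope sketch in Python (the respondent count here is my own inference from the published sample size, not anything Which? has released), ‘28%’ and ‘three in 10’ can both describe the same underlying figure once rounded, which is exactly why the base data matters:

```python
# Hypothetical reconstruction: infer how many of the 1,020 respondents a
# "three in 10 (28%)" headline could represent. These counts are
# back-calculated for illustration, not published by Which?.

SAMPLE_SIZE = 1020          # sample size stated in the press release
reported_pct = 28           # percentage given alongside "three in 10"

respondents = round(SAMPLE_SIZE * reported_pct / 100)  # about 286 people
exact_pct = 100 * respondents / SAMPLE_SIZE            # about 28.04%
in_ten = round(exact_pct / 10)                         # rounds up to 3 "in 10"

print(f"~{respondents} of {SAMPLE_SIZE} respondents = {exact_pct:.2f}%, "
      f"headlined as '{in_ten} in 10'")
```

Run the same arithmetic on ‘a fifth (18%)’ and the rounding is even more generous, which is why the press release alone cannot answer the question.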

My concern is that Which? University have adopted the commercial press-release model for their ‘research’ on student choices. I think this is inappropriate for an association which is campaigning for better information. It appears that the purpose of this ‘research’ is to drive applicants to the Which? University website, where they can enter permutations of A-levels and see what other students have gone on to do at university. But what if this is picked up as a policy issue? We know that survey data like this often gets cited uncritically.

Contrast this survey with the Student Academic Experience Survey, also run by Youthsight for HEPI. This is still cited as a key piece of evidence for the need for the TEF, but at least HEPI are happy to provide a proper report and full data tables. I’m not saying that companies can’t produce cheerful press releases about students, but journalists should decline to publish stories about them unless everyone can look at the data behind them.