The UK government’s austerity policies are soon expected to deliver swingeing cuts in some departments. In the teeth of those prospects, British researchers and an influential parliamentary science and technology committee have lobbied hard to make the case that even a flat research budget, after five continuous years of the same, would be a betrayal of the country’s needs.

How successful they have been, and how worthy of exception the government considers them to be, will become clear only when UK spending plans are announced on 25 November. On page 144, we explore the worrisome prospects.

As if that wasn’t enough, last week the government began a consultation on its proposed restructuring of the way it administers higher-education funding (see go.nature.com/c97sww). It announced that it wants to abolish the Higher Education Funding Council for England (HEFCE), the body that distributes £1.6 billion (US$2.4 billion) of ‘quality-related’ research money to universities. Those core funds would still be administered separately from the more-responsive funding by the UK research councils, probably by a single research-funding organization that would be responsible for both areas.

The case for those particular changes has not been adequately made, but other aspects of the proposals have virtues. The government is tackling two scandals in the UK higher-education system: its relative neglect of quality standards in teaching, and its inadequacies in contributing to social mobility.

Another positive feature is that the government supports the continuation of the Research Excellence Framework (REF) for assessing research quality and impact, despite the proposed abolition of HEFCE, which successfully implemented that process. It is worth taking stock of the REF — not least because its results strengthened the case for research investment by government.

True, many academics hated the REF, which required them to submit large quantities of information to justify their funding. But an inspection of the REF’s outcomes, and of the retrospective reviews of the process by international members of the REF assessment panels (see go.nature.com/q919oe), suggests that it has many strengths.

Take the database of nearly 7,000 case studies of the societal impacts of academic research (see http://impact.ref.ac.uk/casestudies). The diversity of the impacts in terms of (for example) health, sustainability, education and economic growth — in the United Kingdom and beyond — is remarkable and inspiring.

And there is no reason to suppose that identifying these outcomes is the equivalent of the impacts tail wagging the research dog. In the REF, the assessment panels gave the societal case studies a mere 20% weighting, whereas academic performance had a 65% weighting. No one could sensibly maintain that the outcomes are necessarily predictable and should be required as a basis for funding in future. What does make sense is that the research community can help to ensure a maximal return on taxpayers’ money by becoming aware of impacts pathways, and by broadening its outlook on its roles. As the case studies show, this can happen even in areas of research that are unapologetically fundamental.

A study by the independent consultants RAND Europe estimates that the REF impacts-submissions process cost universities about £55 million (see go.nature.com/dzwbjn). That may seem to justify the concerns of academics and politicians about the burden. But set against the £1.6-billion budget that it relates to, one might even describe this 3.4% overhead as a bargain — especially given that the assessment system may become more efficient, and given the virtues of encouraging such impacts.

Mindful of the burdens, the government is evidently tempted to seek a cheaper, metrics-based method of assessing both academic performance and societal impacts. As is made clear by the REF panels, by a RAND analysis of the impacts evaluation (go.nature.com/yysa6m) and by an independent assessment of metrics in research (go.nature.com/rfrgql), this temptation should be resisted. Insightful review of both types of output is the only way to do justice to them.

The impacts case studies provide welcome ammunition for the case that research in all disciplines deserves support, and some government departments have been deploying them in that spirit. Readers who care about UK higher education should submit their own responses to the proposals before 15 January 2016, at go.nature.com/l3rrtx.