One hundred and fifty years ago this week, on 10 April 1861, the Massachusetts Institute of Technology (MIT) received its charter. Although hardly the oldest institution of higher learning in the Anglo-American world — Harvard University was already well into its third century by then, and the British universities of Cambridge and Oxford were each more than six centuries old — MIT quickly became a trendsetter. Founder William Barton Rogers built a curriculum around the school's motto Mens et manus: mind and hand. He and his faculty members incorporated laboratory instruction into the most elementary undergraduate courses and fostered close ties between basic science and the practical arts — pedagogical innovations that quickly inspired many imitators.

Perhaps the most influential of MIT's many innovations lay not in curricula or textbooks but in patronage. Time and again over its history, the institute has experimented with new ways to fund its research and teaching. While preparing a book on MIT, Becoming MIT: Moments of Decision (MIT Press, 2010), I learned just how vigorously the funding pendulum has swung between government and private funds. Every few decades, a decision inspired impassioned charges about whose money seemed appropriate or tainted, eliciting much hand-wringing from faculty members, administrators and alumni about what consequences might befall the scholarly community should the wrong choice be made. Each new scheme unleashed a battle for the soul of MIT. Yet proposals that had struck observers as bizarre or brash on first hearing were quickly absorbed into daily operations, and promptly emulated elsewhere.

Protests against military research on MIT's campus flared on Alumni Day, 16 June 1969. Credit: MIT MUSEUM

Amid today's economic uncertainty, universities around the world again face difficult questions about how to fund their operations. MIT's experiences throw these struggles into sharper relief, showing that today's bandits were yesterday's heroes.

Early trade-offs

Just two days after MIT's charter was signed, mortar rounds began to fall on Fort Sumter near Charleston, South Carolina: the US Civil War had begun. Although it hardly seemed a propitious start for the young institute, the outbreak of war bought Rogers time to continue searching for funds.

Fifteen months into the fighting, President Abraham Lincoln signed into law the Morrill Act. The law allowed individual states to sell federal land and use the profit to fund colleges that focused on applied or practical topics, such as agriculture or engineering. 'Land-grant colleges' quickly sprouted across the United States, nearly all of them public institutions. Rogers convinced the state legislature of Massachusetts to donate a handsome portion of its land-grant funds to the fledgling MIT — a private institution — and, in exchange, he promised to offer military instruction to all its students. The infusion of government cash convinced private donors that MIT was worth the investment. From the start, MIT thus functioned as a financial oddity: a private university buoyed by public funds.

By the end of the First World War, having fended off several merger attempts from nearby Harvard — which struck faculty members and alumni as hostile-takeover bids — MIT found its budget strained to the limit. In 1919, its president Richard Maclaurin launched a new campaign known as the Tech Plan. Until that time, MIT, like almost all other US universities, had relied on student tuition, private philanthropy and occasional grants from local industries to fund research.

Unlike those earlier efforts, the Tech Plan rebuilt MIT's entire operation around corporate patronage. It created a centralized Division of Industrial Cooperation and Research — the forerunner of today's ubiquitous technology-transfer offices — to facilitate corporate-funded research projects on campus, open the institute's libraries to industrial sponsors and share alumni records with corporate recruiters. Nothing like this had been attempted before in US higher education. MIT's Tech Plan immediately attracted hundreds of companies and generated hundreds of thousands of dollars (several million dollars in today's currency). From the start, it also courted controversy.

A decade after the Tech Plan was established, well over one-third of faculty members were conducting work for a corporate sponsor. What sort of work? Faculty members found it difficult to say, because many of their arrangements forbade publication of results without the sponsor's approval. Sentiment on campus for the Tech Plan further soured after the stock-market crash of 1929 and the onset of the Depression. Annual budgets for departments such as electrical engineering plummeted by 60% in just four years. A growing chorus concluded that MIT had been short-sighted to rely so heavily on corporate patronage which, after all, could be as fickle as the latest business cycle. Critics went further: overreliance on industrial funding was corrupting. In pursuit of quick money, the critics charged, MIT had auctioned off its intellectual autonomy.

Government's turn

Few alternative funding models seemed obvious. The Morrill Act notwithstanding, many fiscally conservative administrators at private universities across the country — from MIT to Stanford University in California — believed that the federal government had no business meddling in local affairs such as higher education. Self-made entrepreneurs, industrialists and philanthropists were one thing; federal bureaucrats were quite another. Their peers at public universities, who relied on state legislatures for funding, largely agreed. Yet the stark economic realities of the 1930s forced MIT vice-president Vannevar Bush to reconsider federal patronage. No industrial partners could rival the research budgets of projects such as the Tennessee Valley Authority, a sprawling, government-owned company founded in 1933 to investigate everything from agricultural productivity to hydroelectric power (see E. Rauchway Nature 457, 959–960; 2009).

Bush's solution was to rely on contracts with the federal government. Both this funding source and the legal arrangement — contracts rather than grants or donations — were new. To keep up the appearance of fair-market transactions between autonomous agents, Bush insisted that MIT, desperate for cash though it was, hammer out contracts at the negotiating table like any other private enterprise rather than seem to beg for handouts. Bush took this new approach with him to the brand-new National Defense Research Committee in June 1940, and its successor, the Office of Scientific Research and Development (OSRD), in 1941. On Bush's watch, the OSRD awarded thousands of research contracts to universities across the United States, for everything from radar to the atomic bomb.

More of those wartime research contracts flowed to MIT than to any other university. By 1945, MIT had secured defence-related research contracts worth three times more than those of stalwart industrial contractors Western Electric (AT&T), General Electric, RCA, DuPont and Westinghouse combined. More than 90% of MIT's annual operating budget derived from federal research contracts. In short order, MIT's model became the basic template for research universities across the United States. Other institutions (most famously Stanford) studied MIT's transformation and sought to replicate it. After the tremendous upheavals of wartime, few questioned whether the federal government was an appropriate source of funding for basic research and university education. The sheer scale of funding, which continued to rise after the onset of the cold war, quickly cemented the new norm.

During the 1950s and 1960s, federal patronage drove the fastest expansion of higher education in American history (if not the world). The new contracts, largely from military and defence-related agencies, underwrote massive new equipment on campus such as nuclear reactors and electronic computers, opening up unparalleled opportunities for faculty members and students. Few questioned the relationship too sharply until the escalation of fighting in the Vietnam War in the late 1960s. Only then did a critical mass of campus voices reconsider whether the Pentagon should have a role in education. Government money once again seemed 'dirty'.

The stage was set for yet another funding model. Not long after the campus protests had faded, molecular biologists began to worry about the potential dangers of recombinant DNA research, which combines DNA sequences that don't occur together in nature. Cambridge, Massachusetts, emerged as one of the first cities in the United States to forge its own rules and procedures to allow such research. In short order, the area near MIT's campus became known as 'gene town', an incubator for private biotechnology companies, many of which enjoyed close ties to MIT faculty members and students.

The revolving door that has since existed between MIT life scientists, their students, corporate boards of directors and venture capitalists has surely been a great boon for research. But who benefits? Many critics fear that modern non-disclosure agreements are just as stifling as the corporate censorship rules of the Tech Plan or the defence department's classification codes. Other concerns loom as well. How much does MIT benefit when faculty members split their time between campus responsibilities and spin-off companies? Is private investment any more reliable or morally pure than public investment from the federal government?

Since the 1980s, MIT has followed a hybrid funding scheme, with clear roots in its earlier experiments. Now, about half of its annual budget comes from non-military branches of the federal government (the largest share from the Department of Health and Human Services); one-quarter from private industries and foundations; one-sixth from the military; and the remaining few per cent from state, local and foreign governments.

Even money from private foundations comes with baggage. The latest sparkling new building on MIT's campus — the David H. Koch Institute for Integrative Cancer Research, which officially opened last month — exemplifies the tensions. Billionaire businessman David Koch, an MIT alumnus and cancer survivor, generously funded the new building; he has donated comparable amounts to refurbish medical centres, museums and theatres across the country. Alongside his philanthropic giving, he has also funded conservative political groups associated with the 'Tea Party', although he denies any direct connection with that movement. Deserved or not, his name has become polarizing in today's political climate. At the building's grand opening, Koch declared that cancer is “absolutely non-partisan”. True enough. But patronage, unlike the disease, is all about political choices and intellectual trade-offs.

Since MIT's founding, government sources and industrial sponsors have traded places several times, each held up alternately as saviour or poisoned fruit. As the institute's history has shown, no one model holds a monopoly on virtue. All patronage involves a delicate balance between opening up new opportunities and mortgaging intellectual autonomy. Rather than focus on the source of cash — public or private — we must remember to scrutinize the inevitable strings attached.