Since March 1st, when the 2013 budget sequestration began, many political pundits have decried the “inevitable” and “disastrous” effects of cuts on one special interest group or another. Recent commentary on marginal cuts to the National Institutes of Health (NIH) has followed suit, with some even going so far as to call this the beginning of a “dark age” for science.
Such commentary typically features stories of scientists—each pursuing promising research leads at prominent universities—who have been reduced to practically begging the NIH for what they believe to be increasingly sparse funds. Yet the current cuts have merely reduced the NIH budget from $30.8 billion to $29.1 billion.
Implicit in many of these hyperbolic arguments against cuts is the notion that it’s the role of the government to fund scientific research. These commentators suppose that state coercion is necessary to achieve scientific progress and that private individuals should not make decisions about how best to manage their own wealth. They prefer, instead, that the government take whatever wealth it deems necessary for scientific progress.
But the only proper role of government where science is concerned is to protect individual rights in the free market, allowing innovation to continue unimpeded.
When the government coercively funds scientific research, it demands that individuals in the free market suspend their judgment about which research to invest in, to the extent that they are taxed. This means that for every percentage point of an individual’s income the state takes to fund “scientific investment” in programs like the NIH, that potential investor may no longer use his own mind to determine how that money should be spent.
But it’s precisely because private individuals are the proper owners and managers of their own wealth that they tend to spend it more efficiently than government does. Individuals have to work hard for their paychecks, putting time, energy, and their best efforts into earning them. By contrast, politicians and bureaucrats have nothing personal invested in a government budget, having never worked to earn it or even to persuade others to give it voluntarily. They don’t face the kind of consequences that private individuals do if their investment fails, since the blank check drawn on taxpayers’ pockets will always be there to fund the next social experiment.
After the 2008 financial crisis, for example, the American Recovery and Reinvestment Act was touted as a plan that would invest in the technology of tomorrow. Over $500 million in taxpayer money was loaned to Solyndra, despite multiple and ultimately prophetic warnings that it was not a promising investment.
Government money also endangers innovation because it comes, as it always does, with political strings attached. A comparative study by MIT economist Pierre Azoulay of two groups of scientists, one funded by the private non-profit Howard Hughes Medical Institute (HHMI) and the other consisting of NIH MERIT award winners, found that the HHMI-funded researchers were more successful in their respective fields and pursued more promising projects than their NIH-funded counterparts. Federal granting agencies tend to target research projects most likely to achieve specific government-oriented goals, and they severely limit creativity and flexibility by demanding the achievement of those goals rather than allowing the discovery process to guide subsequent research.
The heavy-handed effect of government on innovation is painfully obvious in the applied science of medicine. Medicare and Medicaid have created a web of red tape that doctors must navigate simply to continue practicing at a profit. According to a 2011 survey of U.S. and Canadian doctors and administrators, administrative costs were a far greater burden on the resources of American physicians than on those of their Canadian counterparts. The effects of the new Affordable Care Act on innovation have likewise been felt palpably in the medical device industry, where a new excise tax on device sales has discouraged venture capital investment and slowed the pace of innovation to a trickle.
What evidence is there that the state’s will is a suitable replacement for the discipline of the market? Many will cite the development of the Internet, the crown jewel of “government research.” The Pentagon’s Advanced Research Projects Agency Network (ARPANET) is often cited as the program that jump-started the Internet in the name of national defense. But Robert Taylor, who directed ARPA’s Information Processing Techniques Office in the 1960s, clarifies that it did not create the Internet: “The creation of the Arpanet was not motivated by considerations of war. The Arpanet was not an Internet. An Internet is a connection between two or more computer networks.”
Though the Arpanet was one of the world’s first basic computer networks, it was at Xerox’s Palo Alto Research Center (PARC), in 1970s Silicon Valley, that the first structured network of such connections was established through the development of Ethernet technology. Because ARPA had not developed the technology quickly enough, Xerox proceeded to do so on its own.
So the men who can truly be credited with developing the Internet into a globally accessible commercial network are thinkers such as Vint Cerf, a former ARPA program manager who took his skills to the market, where he played an instrumental role in the development of the first commercial email system; Tim Berners-Lee, who invented the World Wide Web; and Steve Jobs, who put the ideas developed at Xerox to commercial use in many of Apple’s first marketed computers. Productivity and innovation tend to flow from the minds of self-interested individuals pursuing profit, not from government orders.
The broader story of scientific progress from the Industrial Revolution to the present day provides an overwhelming case for a laissez-faire approach to scientific research and technological development. Clinical biochemist Dr. Terence Kealey notes that the massive boom in Cold War science funding does not appear to have improved the United States’ rate of economic growth in any drastic way relative to the other most prosperous industrialized nations. In fact, despite exponential increases in government spending on research over the past few decades, findings published by the Organisation for Economic Co-operation and Development (OECD) indicate that between 1971 and 1998, publicly funded research and development had virtually no impact on growth rates.
This much should be obvious, though. The vast majority of paradigm-shifting innovators over the past two centuries were not the products of government research. Henry Ford did not revolutionize the automobile industry with the Model T and the assembly line by the grace of government subsidies. Alexander Fleming did not need to beg for government funds in order to observe the growth of a Penicillium mold in his laboratory. Alexander Graham Bell’s curiosity, experimentation, and eventual invention of the telephone were not inspired by government edict.
Most innovation today flows from research funded by private investment. Often this investment comes from businessmen running large-scale commercial enterprises, seeking to manage their resources as efficiently as possible. One of the most important components of a successful business, especially one specializing in technology, is a research and development team that can outdo competitors. Consider Steve Jobs once again. When the idea for an MP3 player with its own music-management software and digital storefront (iTunes) was proposed at Apple, Jobs dedicated 100% of his energy to its development. Some 200 Apple employees, 80 engineers among them, and untold funds were funneled into the development of the device that would revolutionize the entire music and entertainment industry. Countless technological developments since then, from touch screens to razor-thin devices, have been made in attempts to compete with the iPod and its Apple-born progeny.
Private interests also have a long history of funding groundbreaking, farsighted scientific research. After Fleming’s initial observation of Penicillium, British researchers Howard Florey and Ernst Chain became interested in the practical applications of his discovery. When the British government’s Medical Research Council rejected their grant application, they turned to the Rockefeller Foundation. John D. Rockefeller had established the foundation, in part, to bring to light exactly the kinds of game-changing innovations that Florey and Chain would make. In 1945, after receiving a five-year grant from Rockefeller’s foundation for their research, Florey and Chain shared the Nobel Prize with Fleming for the development of medical penicillin.
Whether through directly commercial research or philanthropy funded by commerce, science is advanced best when individuals must use their minds to choose where to put their money, in hopes of funding the next big idea. Government coercion is conspicuously absent from this equation.
The expansive story of scientific progress over the past two centuries involves many compelling examples of private innovation. But that story is largely untold, because it unfolds all around us, every single day. It is the story of every individual’s pursuit of his creative passions, curiosities, and interests. And in order for this story to continue, each individual’s mind, and the wealth that it produces, must be his to use.
Scientific progress, then, is much more likely to flourish in a system devoid of state coercion, that is, in a laissez-faire economy like the one that prevailed during the immense advances of the Industrial Revolution in America. In a system that permits private individuals to make, keep, and spend their wealth in the manner they see fit, the state is properly barred from inhibiting individuals from the peaceful pursuit of their own lives. No one can force the minds of private market participants to innovate, and no one should try to.
J.A. Windham is a first-year law student at the University of North Carolina, Chapel Hill.