Yet in the eighteenth century, in the face of burgeoning world trade and bustling commerce in Europe and North America, Scotland's intellectual giant David Hume outlined the clear link between economic development and political systems. His colleague Adam Smith, who happened to publish The Wealth of Nations in 1776, suggested in no uncertain terms that state action had real implications for commerce and human welfare. These insights were imported to our shores by founders who presciently believed that the system they designed would bring economic expansion with it. This view was based on an innate sense that individual freedom found authentic expression in commerce: innovation and rising living standards would result from a freer political system. In turn, various economic interests competing in a market, James Madison's famous "factions," would help guarantee political freedom.
By the time Alexis de Tocqueville arrived in the 1830s to chronicle the American experience, an early form of entrepreneurial capitalism was already in bloom. He observed that among our common characteristics was a fascination with money and the making of it. Imagine coming from France, where one's occupation and social order were largely ordained by birth (the French Revolution notwithstanding), and looking at the unbounded freedom Americans had to make their own futures. This freedom, left in the hands of individual citizens, was the overarching theme of de Tocqueville's narrative, accurately recorded but perhaps not fully understood.
Early America was alive with a conversation about how knowledge could be shaped into science and turned to the expansion of human welfare: a drive for usefulness that, while owing some inheritance to Britain, found a uniquely American expression. Without a culture of the state directing life, the field was open to anyone who sought to bring his wits to bear on solving problems that could improve the human condition. Thus, there was never any question about the democratization of science and know-how in the new nation. The haughty redoubts of European academies, guarding old knowledge and monitoring the new, never truly emerged in early America, in part because they offended the notion that anyone was free to contribute.
So, more than any other nation, America produced an iconic citizen: the innovator. And so powerful was this image that it became recognized as part of our national character. Famed historian Frederick Jackson Turner's "frontier hypothesis" argued that all Americans were pioneers. Some of us explored the earth, a number made discoveries in science, and still others "fiddled," creating inventions like the telegraph. Among us were engineers who developed automobiles, radios and rockets that could carry humans into space, and entrepreneurs who could turn the discoveries of their countrymen into new companies that often became breakthroughs themselves.
TODAY, THIS national self-identity as world-beating innovators feels threatened; the United States, we are told, has reached its peak and is now tracing the apparently inexorable trajectory of decline followed by great powers of the past. The evidence, seemingly, is abundant. China's new alabaster cities gleam, India manages the back offices of many American businesses and Iran flaunts its newfound nuclear capabilities. As the story goes, this all came about because of a slow-footed America that failed to understand how circumstances were changing and failed to keep pace with the frontier. One usual suspect is the inability of our schools to educate our young properly. More Chinese children take the SAT in English on certain Saturdays than American children do. Our universities no longer produce sufficient numbers of engineers. Indian graduates from Bangalore alone could fill all the slots in American graduate programs for computer science. We hear that the Indian Institutes of Technology are better than MIT. And on top of all this, we are told (with a contradictory tone of betrayal) that many of the scientists and engineers who threaten our technological prowess learned their innovative skills in the American academy.
This composite narrative of American decline recurs periodically and is clearly at full tide once again. The assumption is that American leadership is being displaced. It is unclear, however, that this is either a valid conclusion or an inevitable one. In fact, China's and India's entrance into the modern world economy was assisted by American entrepreneurial capitalism. The appearance of its permutations in Beijing and New Delhi has resulted in a stunningly unprecedented reduction in global poverty. Moreover, even with all the growth in China and India, both countries' per capita levels of wealth remain fractions of America's. And much of the economic performance in those nations is related to the continuous and, until recently, accelerating expansion of the American economy.
Innovation and its consequences must be seen as part of a tremendously complex system. Its very nature defies any simple description: "innovation" is really an epiphenomenon of a vast number of factors, including human talent, training, extant technology, economic freedom, individual imagination, shared social norms, family composition, role modeling, financial systems, long-term personal expectations, government rules (e.g., patents and bankruptcy), international markets, social infrastructure, and combinations of these and other forces that we cannot begin to conceptualize.
AMERICA'S POSITION at the global frontier of innovation has been challenged once before: when, in 1957, the Soviets launched Sputnik, the first earth-orbiting satellite. Immediately, the United States set out to close the technology gap that had suddenly appeared. The Second World War had been won largely on the basis of extraordinary American production capacity and the innovation that made it possible, a victory from which the United States emerged as the global economic power. Now such an ability seemed to be a competency of our enemy as well; worse, the Soviets seemed to be beating us at our own game.
Overnight, the United States committed itself to regaining the lead not only in engineering but also in research and development. In 1958, the new Advanced Research Projects Agency (ARPA, later DARPA) was established as an entity to encourage and fund new discoveries that might be vital to recapturing our edge over the Soviets. The warnings and prescriptions advanced by engineer Vannevar Bush in his 1945 report, "Science, the Endless Frontier" (which formed the basis for the creation of the National Science Foundation in 1950), emerged as the focal point of a national effort to establish an American edge in innovation.
The push to regain leadership in science required moving well beyond the normal precincts governed by the Pentagon. President John F. Kennedy's goal to send a man to the moon, while no doubt meant as a direct response to the Sputnik scare, served also to release huge investments by government and corporations into the vast array of technologies needed to support the lunar mission. Telecommunications, computers and materials science were but a few of the areas where we developed new knowledge, and new industries, overnight.
Soon after, in the 1970s, the federal government declared "war" on cancer. The National Institutes of Health (NIH) rapidly became the quarterback and funder of what might be truly seen as the beginning of large-scale research in medicine. The expansion of medical knowledge, much of it dependent on the technologies developed in defense research and the lunar-mission programs, produced nothing short of a revolution in health care between 1960 and 1980. It created our capacity to speak of "evidence-based" medicine for the first time.
GIVEN THAT until World War II government had played a very small role in steering research or directing business, Washington was in entirely new terrain when it turned toward a national effort to "win the space race." What is often lost in the telling is that our success was achieved not only because of a massive influx of government money but also because the new ideas and technology that emerged reflected an extraordinary innovation in managing an unprecedented private-public partnership. In retrospect, government's management of the space race was itself an innovation, one that might be called "light touch" public administration. It provided funding and left an assembly of private and public actors to coordinate among themselves the most efficient way toward a solution.
But American economist Mancur Olson realized that even as we were achieving these space-race milestones, our democracy was operating differently. Science and social-welfare funding redefined the federal government as a dispensing agent whose disbursement decisions could be profitably influenced by private actors. Olson saw the sclerosis that would emerge in innovation because of expanding government funding. In time, the entities populating this model of innovation (public funding, private contracting) would become protected government suppliers. The capacity for competition in ideas would narrow as various groups became rent seekers, and the real-dollar value of what they produced would decline as their protected status grew.
Most disturbingly, the narrative that increased government funding was the only model for American innovation gradually hardened into accepted fact. In the 1970s and early 1980s, when the country confronted inflation, sluggish economic growth, faltering domestic corporations and rising global competition, the impulse was to ratchet up public research expenditures.1 The same impulse arises today: increase federal research funding and we will see a corresponding increase in innovation.
YET THIS prescription does not reflect reality. Research-based innovation has evolved along a certain developmental arc. In the seventeenth and eighteenth centuries, metalworkers, watchmakers and other technologists led innovation; technology informed science. By the early 1900s this relationship was turned on its head as scientific research became increasingly important. This link was formalized in the creation of corporate research and, later, the university lab. This model, in which innovation is more deliberately pursued and, indeed, routinized, dominated most of the twentieth century.
But in the last twenty years, and increasingly so today, new and young companies, led by entrepreneurs, are essential to innovation in the United States, both in performing research and development and in commercializing breakthroughs. Over the past several years, in fact, research in small companies (with fewer than five hundred employees) has grown more rapidly than in large companies.2
It should be no surprise, then, that we have approached a point of declining marginal returns to federally funded research. The chart below illustrates what we might think of as the general law of federally funded research: increasing R&D funding is accompanied over time by a falling marginal return of innovation, whether measured by outputs such as patents and licenses or by more traditional measures like publications and citations. In most of the past decade, federally funded academic research has outpaced that financed by industry at a rate of twelve to one. This increase has not been accompanied by what might be expected in terms of returns. Patents and licenses by universities have risen, but the value they generate in dollars and impact has not kept pace; more does not necessarily mean better. Academic publication output, too, has lagged research spending, as has the number of "highly cited" articles.3
OUR PERSPECTIVE on innovation has continued to be oriented toward the "big science" model of the first three-quarters of the twentieth century rather than toward the all-important entrepreneurs of the present day. And because the government has focused on professor-scientists, universities have been given an extraordinary institutional advantage. Additionally, the funding has been connected to single individuals who became known as "principal investigators" (PIs). This approach did everything to enforce and solidify a model that is actually quite at odds with traditional models of how science moves forward. Great scientists are indeed critical; they are the people we know by reputation. But, increasingly, organizing modern science around the star-scientist model is seen to have distinct limitations, many of which operate against achieving further breakthroughs.
Investigation commonly involves m