For many of us, early morning rituals include checking email and reading the news online. The headlines we see these days often trumpet dire concerns about government spending and the burden of an ever-growing national debt. In light of these concerns, and in an environment where headlines encourage short-term fear rather than long-term thinking, it may puzzle the average taxpayer to see that President Obama's newly released FY2013 budget appropriations include modest increases in long-term federal research funding for information technology (IT) via the National Science Foundation and other agencies. As taxpayers consider government spending in the midst of a vitriolic presidential election campaign season, it is natural for them to wonder whether such increases are the right thing to do. The answer, quite simply, is a definitive yes. Long-term IT research has an incredible track record as a major economic driver. We would not so easily be checking email or reading news online were it not for our nation's investment in long-term science and technology research, out of which the internet and other information technologies were born over the past 50 years. Attacks on science and technology research funding are myopic and would hamper our nation's competitiveness and economic prosperity.
Last month, Commerce Secretary John Bryson explained, "Our ability to innovate as a nation will determine what kind of economy — what kind of country — our children and grandchildren will inherit." He was speaking at the release of a new report, entitled "The Competitiveness and Innovative Capacity of the United States," which cites federal support for basic research as one of three pillars on which the American high-tech innovation juggernaut of the 20th century was built. Google's web search techniques, for example, grew out of federally funded academic research begun before Google's founders left Stanford to form a company. In our home computers, many layers of software and hardware cooperate to access the riches of the internet, and these, too, have their roots in long-term basic research conducted decades ago. But that pillar of support is weakening. As a fraction of GDP, the overall federal investment in research and development (R&D) has been flat or declining for more than a decade. The federal government's entire annual investment in basic computing research, for example, is roughly $4 billion. To put that number in perspective, this annual investment, less than the price of a single recent cost overrun on a nuclear submarine project, fuels advances in the IT sector that create millions of high-quality jobs and support economic growth in other areas as well.
In times of tough budget choices, it is easy to look at the computing industry and assume that its vibrant entrepreneurial culture will be enough to invent what needs inventing, without any longer-term government support. After all, with high-tech companies appearing to be so prosperous, why should the government need to fund this boom at all? That approach is, however, short-sighted. The computing industry cannot sufficiently fund long-term research: pressures to maintain quarterly profits and growth reduce the likelihood of thinking on a ten-year timeframe and beyond, as does the culture of Silicon Valley, which has a notoriously short attention span. Privately held technology companies cannot always prioritize long-term thinking in a fast-changing competitive landscape. Furthermore, companies are likely to fund the trajectories that best suit their individual strengths, rather than funding new or risky trajectories that may not be natural for any existing corporation but that can open up new markets and create high-quality jobs.
The high-tech vibrancy we enjoy today stems in large part from government investments made decades ago. Shortchanging basic research funding today will have repercussions that become visible only in years to come. After all, our high-tech sector operates in an environment that has become highly competitive on the world stage. The U.S. used to be the clear leader in all aspects of information technology, from the underlying semiconductor technology used to make electronic circuits, to the highest levels of application and systems software. While the U.S. Bureau of Labor Statistics 2010-11 Occupational Outlook Handbook forecast job growth of 20 percent or more in a range of computing-related professions, our leadership role is weakening.
China, for example, now pours significant government investment into information technology and is beginning to reap the rewards. The "Top 500" list of the world's fastest supercomputers (top500.org) is often considered a bellwether of computing prowess because it succinctly summarizes who is building and using the world's most powerful, large-scale computers. As of this writing, the top two computers on the list are in Japan and China, and China now owns 75 of the 500 fastest computers in the world. Furthermore, in September 2011, China deployed a supercomputer that, for the first time, relied on its own domestically produced microprocessor chips rather than on those made by U.S.-based vendors such as Intel and AMD. Such efforts rest heavily on Chinese government funding and support. The increasing internationalization of IT represents a trend away from U.S. dominance of an industry that has been a major job producer in our economy.
While flashy stories about college dropouts turned high-tech millionaires may pique our interest, they usually tell only half the story. It is tried-and-true government investment in science and technology research that has truly changed the way we live. The truth is, long-term basic science is an unparalleled catalyst of future economic growth and well-being. Cutting off its funding would be like suspending contributions to a personal retirement account: it may ease month-to-month budget concerns now, but you will pay dearly for it later.
Margaret Martonosi is a professor of computer science at Princeton University.