The diffusion of IT into all aspects of manufacturing and services is making terms like "high tech" and "low tech" obsolete. It's time for new language to describe a rapidly evolving economy.
Last week Mark Muro of the Brookings Institution’s Metropolitan Policy Program and I released a new report that highlights the need for the policy community to update our mental models of industry and the economy. Entitled “America’s Advanced Industries: New Trends,” the report, like its predecessor, advances a new way of seeing the economy and what matters in it.
All of us, including people in the manufacturing world, tend to fall back on concepts like “high-tech” and “low-tech,” even though these 20th-century distinctions no longer fit the 21st-century economy very well. While the term “advanced industries” is hardly definitive, it sidesteps increasingly outmoded categories and, one hopes, will inform thinking that improves policymakers’ ability to foster shared prosperity.
In this respect, to speak of America’s “advanced industries” is to frame the economy in a way that supplies a much-needed new mental model.
To be sure, our mental models change more slowly than the world we live in. That’s because such models help us make sense of complex patterns. However, even the best mental models tend to get locked in as we teach them in our schools, use them to gather data, and employ them in our discourse. Even if our navigation falters and our influence wavers as the world changes, we hold on to our established models because we fear that we will fall back into confusion without them. You can’t beat something with nothing.
The concepts of “high-tech” and “low-tech” are aspects of an exemplary mental model that captured important developments in the 20th century economy. That century was marked by the systematic use of science for the first time to advance industry. Recognition of this extraordinary phenomenon became widespread after World War II. Industries that drew heavily on science, such as electronics and pharmaceuticals, were prized for their contributions to national goals, like national security and public health, as well as their rapid growth. Governments began keeping R&D statistics around 1950, and economists came to rely on the ratio of R&D spending to sales to identify industries with these uniquely valuable characteristics. Those with a high ratio were “high-tech,” and everything else was “low-tech.”
However, that binary distinction applies less and less well. The most important trend of the 21st century economy is not the development of high-tech in its own right but rather the application of information technology to all industries, including those that were previously “low-tech.” Software is “eating the world,” some say. Our factories and our cars are getting smarter, just like our phones. Meanwhile, it is not just the “high-tech” development of new products and industries based on R&D that matters but also the diffusion of IT into the rest of the economy. Automaking and machinery making are now almost as “high-tech” as computer systems design.
Hence the need for new language, new models, and new categories, such as the concept of “advanced industries” as opposed to “high-tech” and “low-tech” ones. “Advanced industries” differ from “high-tech” ones. According to Brookings, advanced activities are defined not only by R&D spending but also by the disproportionate employment of STEM workers. It is these workers who apply IT to economic activities that have not used it effectively in the past. The recent advanced industries trends report captures this phenomenon by showing the dominance of digital services like computer systems design in recent job growth. To be sure, some of the workers in digital services industries, especially in metros like San Francisco and Boston, are designing new products and fit comfortably into our 20th-century mental model of “high-tech” and “low-tech.” But many more, in places like Jacksonville and Spokane, are engaged in critical, high-value work that is not “high-tech” in the conventional view but is certainly not low-tech either, as it involves digitizing office processes, insurance forms, and, eventually, pretty much everything.
“High tech” and “low tech” aren’t the only ideas that need to be refreshed. “Manufacturing” and “services,” as demonstrated by the Production in the Innovation Economy report from the Massachusetts Institute of Technology, are no longer distinct. Our idea of the “supply chain” likewise needs to be reinvented, as Mercedes Delgado and Karen Mills have shown.
Mental models will always lag behind the real world. As the great philosopher Joni Mitchell put it, “you don’t know what you’ve got till it’s gone.” But we must try harder to close the gap. If policy analysts remain hostage to legacy mental models, we make it more likely that our society and economy will remain hostage to old ideas, with consequences that will help no one.
David Hart directs the Center for Science and Technology Policy at George Mason University. Hart is a non-resident senior fellow at the Metropolitan Policy Program at Brookings and was a co-author of the Brookings report, “America’s Advanced Industries: New Trends.”