Editor’s Note: Our health care debate, Robert Graboyes says, is trapped. Caught in the back-and-forth over insurance coverage, both the proponents and opponents of the Affordable Care Act are missing the point. To Graboyes, a senior research fellow at the Mercatus Center, distribution of health care is not the problem. It’s the creation of better health care that will save more lives and cut costs.
And while his thesis resonates with the laissez-faire, pro-market attitude most often heard on the political right, he implicates both sides in holding America’s health care hostage in his recently published paper “Fortress and Frontier in American Health Care.” America’s health care has been allowed to languish, denied the opportunities to take the risks — what Graboyes and his colleague call “permissionless innovation” — that have allowed the information technology industry to flourish, and with it, all Americans. To get health care caught up to IT, he argues, we should rethink federal grants for innovation (they’re often counterproductive, he thinks) and decentralize regulatory institutions.
But Graboyes’ assessment of how and why IT has made such strides isn’t universally accepted. “We need to rebalance the story we tell about who the innovators really are,” says Mariana Mazzucato. Author of “The Entrepreneurial State,” Mazzucato wrote on Making Sen$e last year that Apple didn’t build your iPhone; the government — and your taxes — did. “How is technology fostered?” she asked. Far from the “obscure figures in garages” Graboyes credits with IT success, “more often than not, down through history,” Mazzucato argued, innovation has stemmed from “government investment, from the roads of ancient Rome to the Internet of modern America.”
Graboyes’ paper, out a year after Mazzucato’s essay, sees consumers and producers taking the risks, leading the way toward a changing health care industry. Graboyes teaches at the medical centers at Virginia Commonwealth University and the University of Virginia. He produces a twice-a-month health care policy newsletter and is on Twitter at @Robert_Graboyes. Read more from him below.
— Simone Pathe, Making Sen$e Editor
Health care is the surliest corner of American politics. For decades, a bitterly partisan debate — dueling monologues, really — has hung like smog over public discourse. Facts, misconceptions, half-truths and non sequiturs have congealed into conflicting sets of pre-packaged talking points.
Nearly five years after President Barack Obama signed the Affordable Care Act (ACA) into law, the rancor and accusations still swirl, turbulent but immovable, like the Great Red Spot of Jupiter. Maddeningly, the perpetual storm barely touches the great issues that actually determine our health and what we spend to acquire that health.
This is a pity, but the upside is that it suggests an enormous opportunity to shift the debate, dissipate reflexive partisanship, and in doing so, save lives, ease suffering and cut costs.
In short, since World War II, the health care debate has focused almost exclusively on coverage — the number of people with insurance cards. Quality of care and improvements in health have been afterthoughts.
The two sides bicker over how to distribute today’s health care rather than empowering others to create tomorrow’s health care. In focusing exclusively on coverage, both sides do much to suppress technological and managerial innovation — delaying the arrival of better, less expensive care.
All of this can change in a flash, if only we decide that it should. This requires us to examine health care from a vantage point well outside the pro- and anti-ACA talking points.
Why has health care been so much less innovative than information technology over the past 25 years? My study draws answers from diverse quarters, citing, among others, the high-tech weaponry designed by actress Hedy Lamarr, the 1910 destruction of African-American medical education, and a carpenter and a puppeteer who in 2013 collaborated to slash the costs of functional prosthetic hands by 99.9 percent.
Beneath those questions lies a pattern that reveals a philosophical demarcation I label “Fortress versus Frontier” (defined shortly). For decades, the political left and right ensured that information technology was free to innovate on the Frontier while health care was restrained within the confines of the Fortress.
In those industries, two roads diverged, and that has made all the difference. But the roads are converging once again. Innovative technologies are soon likely to render medicine as unrecognizable to us as today’s Internet and smartphones would have been as recently as, say, 25 years ago.
If we welcome and prepare for the new medical technologies, America can retain (and in certain sectors reclaim) its preeminence in health care innovation. If we resist, America’s position in medical innovation may come to resemble the 1940s steel industry — a powerful force barricading itself against competitors, followed by decline and rust, its first mover’s advantage lost.
Ebola is part of the same story. While many seek individuals to blame for the chaotic response, I would argue that the accusatory headlines are what happens when 20th century institutions, designed for a world of scarce data, confront a tsunami of 21st century data flows. I personally witnessed the new century’s arrival a bit early — in 1993, when a truck contaminated with rodent droppings rolled in from the Navajo Reservation to the Federal Reserve Bank of Richmond. More on that later.
Since World War II, the health care debate has been a struggle of left versus right. The left has tended to favor federal solutions, plus increased public provision of care and coverage. The right has favored state-level solutions, plus private provision of care and coverage. The left employs more pro-regulation rhetoric, while the right professes to favor freer markets. (The latter distinction is largely illusory. Both sides favor powerful regulation and merely disagree on who regulates what and through which means.) A more meaningful distinction is between worldviews we can call the Fortress and the Frontier.
The Fortress has two goals. The first is to imagine every terrible thing that might happen to someone receiving health care and then to focus public policy on preventing any of those things from happening. The second is to protect health care insiders — doctors, hospitals, insurers, drug and device manufacturers — from outside competitors who might threaten their turf.
The Frontier focuses on innovation. It understands that we cannot obtain great quality improvements and massively reduced costs without allowing consumers and producers to take calculated risks. The Frontier also understands that innovation requires constant, uninvited input from unknown dreamers. The IT revolution did not emerge from credentialed insiders anointed by public officials or titans of industry. It came from obscure figures in garages who were allowed to challenge and defeat multinationals in skyscrapers.
From his garage, Steve Jobs challenged IBM. Jeff Bezos sold books from his garage and his business — Amazon.com — changed the way the world purchases just about everything. Smartphones did not originate with telephone industry giants like Western Electric; the BlackBerry came from a couple of engineering students with some venture capital.
In the same years that IT exploded, changing how billions of people live their daily lives, health care was painfully slow to innovate.
Step into a time machine and travel back to 1989. Gather a group of people and tell them of the advances that medical science will make over the next 25 years — statins, new vaccines, face transplants, and so forth. The audience will be pleased and gratified by the news, but there is little that will shock them.
Now tell them the following story (from my paper):
While camping high in the Rockies, Efram signed and deposited his paycheck in his bank account. Then he purchased The Complete Works of Shakespeare and read Macbeth. A bit later, on YouTube, he watched the Beatles sing “Yellow Submarine.” Using Google Translate, he converted the lyrics into Hindi and then Skyped his friend Arjun, who is working at McMurdo Station, Antarctica. Efram sang his translation to Arjun, who grimaced, but then commented on the beauty of the towering mountain behind Efram. After hanging up, Arjun emailed a restaurant in Denver (a city he has never visited), and an hour later a drone delivered Indian food to Efram’s campsite—all paid for with bitcoins. While eating his tikka masala, Efram toured McMurdo Station via Street View and asked Siri for the current temperature there. “Brrrr. It’s 10 degrees below zero Fahrenheit, Efram,” she answered. Then he accessed Netflix and watched Seven Samurai before dozing off to a selection of Malian jazz, courtesy of iTunes Radio. The entire cost of this sequence of events was $34.77 — $0.99 for the Kindle edition of Shakespeare, $2.00 for the film, $26.78 for the food, and $5.00 for the drone delivery service. And the whole set of interactions required only Efram’s iPad and Arjun’s cell phone — the two devices together costing less than $1,000.
Now, your audience will assume you are lying or delusional. And yet to our 2014 eyes, every step of this story is mundane and familiar — except for the drone delivering dinner to the mountain. And drone deliveries are perfectly feasible; it’s just that drones are the only part of the story that remains in the Fortress.
A common response to this comparison is that health care and information technology are different. Health care involves life and death, pain and suffering. Medical costs can bankrupt a family. Computers and cell phones, on the other hand, are just harmless machines on your desk or in your pocket.
But that distinction vanishes upon further inspection. The Internet and cell phones pose enormous threats to our finances, our happiness, our lives. These omnipresent devices have generated new and devastating forms of financial fraud, identity theft, predation, bullying, loss of privacy, destruction of reputations. Cell phones were central to Improvised Explosive Devices in Iraq and to the terrorists of 9/11. And since hospital telemetry, airliner controls, and fire and burglar alarms use the same technologies, system failures could theoretically mean large-scale injury and loss of life. Yet, few of us wish to return to the landlines and encyclopedia-weighted shelves of 1989.
In less than a generation, home computers and smartphones went from dreams to toys of the rich to everyday possessions of Third World village children. America’s population now has near-universal access to information technology that is blindingly more powerful than anything the Central Intelligence Agency owned a generation ago. To get there, we accepted costs and risks, and we allowed serendipitous genius to arise in the unlikeliest of places.
We didn’t need vicious Congressional debates, invasive mandates, or a massive bureaucracy to oversee the diffusion of the new technologies. No one had to beg innovators to innovate. And innovators didn’t have to beg bureaucrats for permission to create.
Just the opposite. The IT revolution really began around 1989, when the federal government stepped away from ARPAnet — the embryonic Internet — and announced that the world was free to use it with few restrictions and little oversight.
One of the stunning questions for future political scientists to ponder is, “Why did we develop a bipartisan consensus that health care should be locked in the Fortress while information technology would be free to roam the Frontier?”
The federal and state regulatory infrastructure that governs health care was constructed in the 20th century for a world in which data was scarce. In the 21st century, we are overwhelmed with data. As a result, smart, well-intentioned, competent individuals in federal and state agencies cannot accommodate the floods of data that roll over them during a time of crisis. There’s simply too much data flowing through the narrow channels of antiquated institutions.
In other words, when information technology moved onto the Frontier, the resulting revolution unleashed more data than the 20th century health care institutions, still firmly in the Fortress, could handle.
If this is correct, it means we cannot simply look for better people to run the CDC or the FDA when accusations of Ebola mismanagement roar. At the same time, it does not imply that we must abandon the idea of a CDC or FDA. It does suggest that we rethink the structure of our regulatory institutions.
I personally witnessed an incident that I like to think of as the moment of impact between 20th century medicine and 21st century information technology.
In the spring of 1993, the Navajo and neighboring tribes in the Four Corners Region suffered an outbreak of hantavirus — a pulmonary illness spread via deer mouse droppings. Twenty-four people fell ill, and 12 died.
Months later, a truckload of antique legal documents from the Navajo reservation arrived at the Federal Reserve Bank of Richmond, to be stored in the bank’s vault until a federal court case some months later.
An alert loading dock employee noted the shipment’s origin, saw some droppings, and remembered the news accounts. A phone call quickly alerted the bank’s physician, Dr. Victor Brugh, to the possibility of a deadly virus circulating through the 24-story tower’s ventilation system.
Brugh emailed the CDC. Questions flew, others joined the conversation, and in an hour or two, the CDC determined there was no danger to the thousands of employees.
It is virtually impossible for us today to internalize how remarkable this conventional-sounding story was on that day. The bank had only recently installed an email system, and the technology still seemed other-worldly. Brugh shook his head in disbelief at how rapidly he and others had resolved the potential crisis. Just a short time earlier, he told me, the incident would have meant hours or days of back-and-forth phone calls, busy signals, and missed connections.
Perhaps two years later, the Internet arrived at the Fed, and in a few more years, search engines and social media brought a heretofore unfathomable quantity and quality of data within instantaneous reach.
But paradoxically, the very technologies that so accelerated Brugh’s answer that day would soon clog other answers in previously unimagined ways. The Richmond Fed’s hantavirus crisis was resolved before more than a handful of people knew of the threat. In 2014, the single case of Ebola in Dallas set websites and email systems aflame worldwide in real time.
Fingers pointed at the official who allowed the Dallas nurse to board an airliner. One of my first thoughts was, “How flooded was that person’s computer with queries and information?” Perhaps the official erred—I don’t know—or perhaps it was Frontier data colliding with a Fortress institution.
For a visual analogy, imagine an early 20th century market town, with multiple roads feeding all traffic through the center of town. This works quite well as long as the horse-drawn wagons and slow-moving automobiles are few and far between. But deep into the Automobile Age, this once well-designed traffic pattern clogs into maddening jams. And when an emergency occurs, traffic grinds to a standstill.
With the market town, the answer is to reconfigure the hub-and-spoke roads into a more flexible grid of alternative nodes and routes, with commerce dispersed across the landscape. We will ultimately have to restructure the CDC and FDA and other institutions in parallel fashion. The European Union, for example, assigns drug approvals to multiple private agencies — sort of like competing Underwriters Laboratories. The Internet itself is a remarkable story of autonomous, decentralized regulation; in the 1990s, many feared that online commerce would be impossible on an unregulated Internet. Safety mechanisms grew organically, and just a few years later, some of us rarely set foot inside a store.
Health care is about to change, and none of us — perhaps least of all the insiders and experts — can say how. Medicine today is where information technology was in 1989. And almost no one in 1989 could have envisioned the Efram and Arjun story.
We can speculate on a few of the megatrends. Transplantable organs made of your own cells. Drugs tailored to your body’s specific DNA. Nanobots inside your body, repairing the genes that harm your health and threaten your life. Wireless telemetry monitoring your health, with artificial intelligence systems mining the data to warn you of trouble before any doctor would ever catch it.
The question is how we unleash these innovations. In information technology, the story over the last 25 years was largely what my colleague Adam Thierer calls “permissionless innovation.” A rogue’s gallery of tinkerers and dreamers thought up new hardware and software and ways to use both. The vast majority of ideas vanished without a trace. A few turned people you had never heard of into billionaires and changed our lives. And some old, esteemed businesses vanished or withered.
Apple CEO Tim Cook (reportedly quoting Steve Jobs) said, “Our whole role in life is to give you something you didn’t know you wanted. And then once you get it, you can’t imagine your life without it.”
Are we willing to move American health care to the Frontier so it can lead the way toward these new technologies? Or will we remain in the Fortress and let other countries seize the high ground? We have time to choose — but not much. Recently, the founders of Google, Sergey Brin and Larry Page, lamented the stifling burden of regulation for entrepreneurs in health care, and I suspect that the characteristics of the Fortress underlie their concerns.
One traditional Fortress approach — federal grants for innovation — is likely to push innovation in less-than-ideal directions. In the early 1970s, an innovation grant would probably have gone to IBM to build a bigger mainframe, not to Steve Jobs to produce a tiny desktop computer. And enough innovation grants to IBM might have made it impossible for Steve Jobs to compete.
Shifting health care to the Frontier opens the possibility of real progress — of better health for more people at lower cost, year after year. This approach also offers alternatives to the all-encompassing ACA and its earth-shattering repeal-and-replace alternatives. My paper suggests a few dozen small initiatives to begin the transition. There are hundreds more waiting just behind. If we so choose.
Robert Graboyes is a senior research fellow with the Mercatus Center at George Mason University, where he focuses on technological innovation in health care. He authored “Fortress and Frontier in American Health Care,” teaches health economics at Virginia Commonwealth University and is a recipient of the Bastiat Prize for Journalism.