Roozbeh Ghaffari won’t let me take a picture of the thin, rectangular piece of silicone he just handed me. It’s about the size of two postage stamps, and, as you’d expect, it feels rubbery and folds freely. Thin, shiny wires show through the transparent packaging. They slither through the silicone like exuberant dragon dancers in a Chinese New Year parade. As I thumb the device, I worry I’m going to kink one of the delicate-looking metallic threads. Don’t worry, Ghaffari assures me. “You can bend it in half.”
The device is a sensor, and it represents some of the core technology of MC10, a startup that makes flexible electronics. Ghaffari, cofounder and director of advanced technology at the company, isn’t at liberty to tell me what, exactly, it senses. It could be temperature, muscle activity, or heart rate.
The sensor’s counterpart is another rectangle of silicone. This one encases more traditional semiconductor chips, each about half the size of your pinky nail. Rather than being soldered to a brittle green board that’s etched with interconnects, the chips are linked by what appear to be the same wavy, bendable wires. It’s not as flexible as the passive sensor because of the chips, but it’s still supple enough to bend around my finger. It’s the brains of the system, Ghaffari tells me. It receives data from the sensor and then processes, stores, and passes on that information.
Ghaffari and I are sitting inside a brick-walled, light-filled conference room at MC10’s headquarters in Cambridge, Massachusetts. A pleasant breeze whispers through the open windows, which look down on the packed parking lot one story below. Outside the conference room, the open office is similarly stuffed. People are buzzing about, stepping over and past other researchers and programmers hunched over their crowded desks. MC10 only makes one product that you can buy right now—a thin cap worn by football players under their helmets to alert them to potentially dangerous blows—but you get the sense that’s about to change.
Wearable sensors, like the kind that MC10 and other companies make, will take computing to its next frontier—our bodies. Until recently, big-name technology companies were content to fight for space on our desks, our laps, or in our pockets. But as each of those becomes increasingly saturated, they’ve started to turn their attention to our wrists, fingers, and faces. The technology is ripe and some of the apps have already been written. All that’s needed is a reason to buy.
Companies like Apple, Samsung, and Google are clearly hoping that health monitoring is what turns the wearable market into the next billion-dollar opportunity. It’s a road that’s been partially paved by Fitbit and Jawbone, two companies that make simple fitness trackers. They hang off our wrists or around our necks, recording things like footsteps and heart rate. The big players in mobile are muscling in on the market, having recently announced apps, prototypes, or both.
But each of those forays is a tentative toe in the water. The devices available now, and to be announced in the coming months, are just the tip of the iceberg. They’re simple and unsophisticated, like early cell phones. In the coming years, wearable health sensors will grow more capable, and they’ll likely become integrated into our daily lives. Yet the challenges they face are far more complex than those faced by other revolutionary devices like the smartphone, and the regulatory hurdles are far higher. That means the golden age of wearable health sensors isn’t upon us, but it will be soon. Here’s a look at how we got here—and where we’re going.
No Longer a Novelty
In the late 1990s, John Rogers, now a materials scientist at the University of Illinois, was playing around with new ways of making electronics from unusual materials at Bell Labs. He and his colleagues were working on circuits made from organic materials printed on bendable sheets of plastic. One of their projects involved making flexible displays that could curl like paper. While the work was “exploratory” with no defined product, Rogers says, “we thought that was a cool vision for a class of consumer electronics device.”
Flexible displays didn’t move much beyond concept phones and prototypes, but Rogers remained captivated by the idea of flexible electronics. In 2003, he left Bell Labs for Illinois and started his own research lab. “When I finished up in Bell Labs and made the transition back to academics, I decided that kind of form factor was interesting, but maybe something beyond flexible would be even more compelling,” Rogers says. What really interested him, he adds, was “going from flexible—things that bend like paper or plastic—to something that could not only bend but also stretch like a rubber band.”
Rogers also wanted to ditch the polymer semiconductors that drove the backplane of Bell Labs’ bendable displays. Their performance was lackluster. Instead, he wanted to make flexible materials that “could potentially support very sophisticated function in electronics—not just an active-matrix backplane, but maybe a real radio or microprocessor,” he recalls.
Rogers’s lab zeroed in on silicon, a well-understood semiconductor that he knew would offer the performance he desired. But traditional silicon doesn’t bend easily. So Rogers’s lab layered single-crystal silicon just a few hundred nanometers thick onto a rubbery substrate. It was thin enough that it wouldn’t break when bent. The next step—stretchability—required some clever engineering. Rather than alter the silicon substantially, they pulled the rubbery substrate taut before affixing the silicon; releasing the tension caused the silicon to collapse like an accordion, but not break. The result was a device that was both flexible and stretchable, yet still retained silicon’s computational potential. But what, exactly, it would be used for, no one really knew at first.
“What I think qualitatively changed for us is that, as we began to give seminars on our work at various universities, various conferences, I began to notice a lot of interest from the medical community,” Rogers says. Rather than just building supple gadgets, he and his lab started to think about how computers could interface with the human body. “From that point, the research took on a different tone.”
After that, things started to fall into place quickly, and within five years of moving to the University of Illinois, Rogers was ready to test some of his discoveries in the real world. He asked George Whitesides, his postdoc advisor at Harvard with experience founding companies, for some introductions. Shortly thereafter, MC10 was born.
Quick to Market
The path MC10 has taken is emblematic of the industry as a whole. Their first product is a device called the Checklight, a skullcap worn under the helmet of athletes in contact sports such as football. A light that sits at the nape of the neck glows red when a potentially harmful blow strikes the player’s head. It’s relatively simple, and, more important, it’s not regulated by the FDA. That meant MC10 could get it to market quickly while they worked to get their other devices approved.
Many of the other devices sold today are also unregulated by the FDA. That means their makers can’t make any specific medical claims about what the devices detect. For example, the red light on the Checklight doesn’t necessarily mean a player has a concussion, just that they should probably take it easy and maybe see a doctor. Other sensors like the Fitbit or older devices like Polar heart monitors also operate in this unregulated space. They give people raw numbers like heart rate, blood oxygen level, or steps taken and leave the medical conclusions to the user.
“That’s a limited set of data,” says Ida Sim, a professor of medicine at the University of California, San Francisco. “That data is really being used for wellness and fitness, which really doesn’t address the bulk of the market, the bulk of people. The value of that data for clinical care is not that high.”
Many companies remain hesitant to draw medical conclusions, though, because it means going through the FDA approval process, just like any other new medical device. Depending on the claims being made, that process can take anywhere from two to ten years or more.
Given the potential in healthcare, though, companies making wearable sensors will probably be pushed in that direction, says Daniel Oliver, a Blavatnik Fellow at Harvard University who is also working on wearable technology to monitor head impacts. “Eventually people will get tired of these if there’s not concrete conclusions being drawn from whatever sensor you’re wearing.”
Sophisticated sensors like the type I saw at MC10 may be several years from the market, but many scientists are already using off-the-shelf components to monitor our bodies. Harrison Hall is one of them. A PhD student at Dartmouth College, he’s working on a body-scale sensor network to detect and measure epileptic seizures. With enough data, he hopes that we’ll be able to better understand the different types of seizures, perhaps even to the point of predicting an impending episode.
Decades of research have shown that certain kinds of seizures cause a drop in blood oxygen levels, and Hall hopes to build on that work by taking more continuous measurements across a broader population. Typically, he says, blood oxygen levels are spot checked. “There’s no continuity, and they don’t really take into account what was immediately happening before or post. You lose a lot of information there.” Currently, Hall is testing various sensors, including accelerometers and pulse oximeters, both of which are inexpensive and readily available. Accelerometers would help detect the onset of a seizure, and the pulse oximeters would measure how the person’s oxygen levels change before, during, and after the event.
As his data set grows, Hall hopes to apply machine learning algorithms that will train themselves to pick up on differences between seizures. They could help make some useful generalizations about epilepsy in general. But even if that’s not possible—there are many different forms of the neurological disorder—the algorithms can still draw conclusions about an individual. It could allow for treatments that are tailored more carefully to a person’s specific form of epilepsy.
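The core idea behind a system like Hall’s can be sketched in a few lines of code. The following is a hypothetical illustration, not Hall’s actual software: the function, its thresholds, and the exponential-baseline trick are all invented for the sake of showing how accelerometer spikes and blood-oxygen dips might be combined into a single flag.

```python
# Hypothetical sketch: flag possible seizure windows by pairing sudden
# accelerometer activity with a concurrent drop in blood oxygen (SpO2).
# All thresholds here are invented for illustration, not clinical values.

def flag_seizure_windows(accel_mag, spo2, accel_thresh=2.5, spo2_drop=4.0):
    """Return indices of samples where high movement coincides with a
    fall in SpO2 relative to a slowly tracked personal baseline."""
    flagged = []
    baseline = spo2[0]
    for i, (a, s) in enumerate(zip(accel_mag, spo2)):
        # Exponential moving average tracks the person's normal level.
        baseline = 0.99 * baseline + 0.01 * s
        if a > accel_thresh and (baseline - s) > spo2_drop:
            flagged.append(i)
    return flagged
```

In a real system, a learned model would replace these fixed thresholds, which is exactly where the machine-learning step comes in.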
Hall’s monitor is still years away from widespread availability, but others are already making use of sensors that most of us carry every day. Anmol Madan became interested in what our smartphones can say about us when he was a graduate student at MIT’s Media Lab. “There’s about 5 billion phones on the planet,” Madan says. “It turns out your phone is the ultimate wearable because people are always carrying them, charging them, uploading the data, and all the other things we expect people to do with wearable devices and sensors.”
That got Madan thinking. For many of us, phones are a portal into our world. They see who we interact with and how. They know when we wake up in the morning and when we go to bed at night. And because they’re in our pockets or purses for so much of the day, they can tell how often we move and where we go. That data can paint an incredibly intimate portrait of our lives and, by extension, our well-being.
Madan began playing with different models of human interaction, and he soon realized that with the right observations, he could tell if a person with a history of depression was suffering from an episode. They tended not to communicate with friends and family as frequently, nor did they leave the house as often or move about their home as much.
So Madan devised software that gathers messages, phone records, GPS locations—even accelerometer data—and runs it through a machine-learning model to determine when a person is symptomatic. The software runs in the background on someone’s phone and sends data off to a server where the algorithms reside. If the algorithms suspect a person is suffering a depressive episode, his company, Ginger.io, fires off a notification to a specified person, whether that be a nurse, friend, or family member. The hope is that if those people can intervene at the right time, Madan says, it could prevent the episode from worsening.
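The gist of that approach, drops in communication and movement measured against a person’s own baseline, can be mocked up in a short sketch. This is a toy illustration of the idea, not Ginger.io’s code: the feature names, the 30 percent threshold, and the simple averaging are all assumptions standing in for the company’s actual learned models.

```python
# Hypothetical sketch of the idea behind depression-episode detection:
# compare a user's recent communication and mobility to their own
# baseline and flag a sustained drop. Names and thresholds are invented.

def check_for_episode(daily_features, recent_days=7, drop_frac=0.3):
    """daily_features: list of per-day dicts with 'calls', 'texts',
    'km_moved', oldest first. Returns True only if every tracked
    signal in the recent window fell more than drop_frac below the
    user's own baseline average."""
    baseline = daily_features[:-recent_days]
    recent = daily_features[-recent_days:]

    def mean(days, key):
        return sum(d[key] for d in days) / len(days)

    for key in ("calls", "texts", "km_moved"):
        base = mean(baseline, key)
        if base == 0 or mean(recent, key) >= (1 - drop_frac) * base:
            return False  # this signal has not dropped enough
    return True  # all signals down: notify the designated contact
```

Requiring every signal to drop, rather than any one, is a deliberately conservative choice in this sketch; a production system would weigh the signals with a trained model instead.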
Up in the Air
Wearable sensors and the services they power are arriving at a critical time in healthcare, especially in the United States, where the Affordable Care Act is changing how doctors are being reimbursed. The idea is that doctors shouldn’t get paid based on how many visits they squeeze in or how many tests they run, but how well their patients do. “Part of the reason why mobile is interesting is because it intersects the healthcare system at a time of rapid changes and fundamental changes in reimbursements,” says Sim, the UCSF doctor. “Everything is up in the air right now.”
Wearable technology could facilitate that transition, allowing patient outcomes to be tracked over time, efficiently and without frequent, costly follow-up visits with a doctor or nurse. That could help rein in healthcare costs, or at least blunt their seemingly inexorable rise.
Others are hoping that wearables could help us manage chronic illnesses that are prevalent in an aging population. “Patients are spending less and less time with doctors. Diseases that we cannot manage are usually diseases that are chronic and slowly or rapidly degenerative, but diseases that are changing. They’re not static,” says Vicki Sato, a professor at the Harvard Business School and advisor to Ginger.io. “As the medical need continues to increase in areas like that, but our patient-physician interface time decreases—we’re going to have to fill that gap somehow or quality of care will certainly diminish.”
Whether wearables succeed in making us healthier—and keeping us that way—could depend on how the data is used. Today, information generated by wearable sensors is treated as a competitive advantage by companies. “I really think data is the valuable commodity for a company like Fitbit,” says Oliver, the Blavatnik Fellow. Sensor companies retain massive data sets so they can refine their product and algorithms. Sharing that data with other companies could cost them their competitive advantage. “I can’t ever really imagine them just opening their data up,” Oliver says.
But such proprietary approaches could limit how extensively wearables impact healthcare, Sim says. Just look at electronic health records. “In electronic health records, it’s very siloed,” she says. Dominant players hold data very close to their chests. “You can’t share data. We spend billions of dollars trying to get data and share data. It’s just the wrong approach. And yet mobile is new. There’s no legacy systems, no dominant players, even now.”
That’s why Sim helped found Open mHealth, a set of open standards that allows any doctor, patient, or researcher to read data from any device. The goal, she says, is to “break down silos so that data can flow much more freely across different apps and different solutions.” (Patients, she emphasizes, still have ultimate control over which data is shared with whom.) Sim likens it to TCP/IP, the standards that govern how data flows across the internet and that have enabled its exponential growth over the last several decades.
Market forces outside companies’ control may force some degree of standardization between devices and services. Health and wellness programs used to issue specific fitness trackers to participants to ensure consistent data, says Greg Norman, a senior research scientist at American Specialty Health, a wellness program company. “Now there’s this whole movement called ‘bring your own device,’ ” he says, which greatly complicates matters. Not only is the data not shareable, it’s not always consistent from one device to another. “You hope that eventually there will be some standards and metrics.”
The tension between proprietary control and open sharing may ultimately dictate the role wearables play in our health and well-being. Closed systems may help drive development early on, but once the market matures, flexibility may win out. “The healthcare industry can’t ignore it, this idea of owning your data, having access to your data, determining who sees your data,” Norman says. “At some point, there has to be consolidation and agreement.” By then, wearables may be so pervasive and inconspicuous that we won’t even notice we’re wearing them.