Tech + Engineering

Augmenting Social Cues for the Disabled

For the first time in 12 years, Bryan Duarte is sensing a smile. The 30-year-old software engineering student became blind at age 18 after a motorcycle accident. He spent years after the accident learning to navigate the world in darkness and figuring out how to communicate in a culture that largely relies on visual cues.

“I was very good at reading people’s body language, their nonverbal language,” Duarte says. “To not have access to that information now, it makes things pretty difficult in social situations… without those social cues, you don’t know if someone is looking at you, if you have their attention. You don’t know if you maybe offended them. You don’t know if they’re smiling, if they’re engaged.”

Now, for the first time since the accident, Duarte is getting those nonverbal cues. In Arizona State University’s Center for Cognitive Ubiquitous Computing (CUbiC), Duarte is sitting in what appears to be a typical black desk chair. It has a USB port built into the back where users can plug in a computer with a webcam. Sitting across from Duarte, with a webcam aimed at his face, is Troy McDaniel, CUbiC’s associate director. During conversation, a built-in facial recognition system reads whether McDaniel looks happy, surprised, or neutral and communicates that information back to Duarte through specific vibration patterns emitted from the chair’s lining.

[Image: Arizona State University’s “Haptic Chair” communicates social cues to the user through vibration patterns emitted from the chair’s lining.]

These patterns are designed to mimic the way our mouths convey glee, boredom, or despair. When McDaniel adopts a neutral gaze—straight-lipped, no smile or frown—Duarte feels a strip of pancake motors in the chair vibrate in a straight line, starting from the right of his lower back and moving to the left. When McDaniel looks happy, Duarte feels a U-shaped vibration that mimics the way lips curve upwards in a smile. This demonstration of ASU’s “Haptic Chair” is the first time in more than a decade that Duarte has been able to get a sense of how his conversation partner is reacting without directly asking. During the 45-minute demonstration that I watch via Skype, Duarte excitedly tells me when he can feel McDaniel’s body language shift.
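
CUbiC hasn’t published the chair’s software, but the core idea, translating a detected facial expression into a spatial vibration pattern, can be sketched in a few lines. Everything in the sketch below (the motor grid, the pattern shapes, the function names) is an assumption for illustration, not the team’s implementation.

```python
# Minimal sketch: map a detected expression to a vibration pattern on a
# hypothetical grid of pancake motors in the chair back. Layout, shapes,
# and timing are illustrative assumptions, not ASU's actual design.
import time

# Hypothetical 4x4 motor grid, indexed (row, column) with row 0 at the bottom
# and column 3 at the user's right side.
NEUTRAL_PATTERN = [(0, 3), (0, 2), (0, 1), (0, 0)]   # straight line, right to left
HAPPY_PATTERN = [(1, 3), (0, 2), (0, 1), (1, 0)]     # shallow U, like an upturned mouth
SURPRISED_PATTERN = [(1, 1), (1, 2), (0, 2), (0, 1)] # placeholder "O" shape

PATTERNS = {
    "neutral": NEUTRAL_PATTERN,
    "happy": HAPPY_PATTERN,
    "surprised": SURPRISED_PATTERN,
}

def pulse_motor(row: int, col: int, duration: float = 0.15) -> None:
    """Stand-in for the call that drives a single vibration motor."""
    print(f"vibrate motor ({row}, {col}) for {duration:.2f}s")
    time.sleep(duration)

def play_expression(label: str) -> None:
    """Play the vibration pattern for a detected facial expression."""
    for row, col in PATTERNS.get(label, NEUTRAL_PATTERN):
        pulse_motor(row, col)

# In the real system, the label would come from a webcam-driven facial
# recognition model watching the conversation partner's face.
play_expression("happy")
```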

“He’s smiling right now. I can tell you that,” Duarte says. “When he was talking, I was getting kind of a mix between smiles and surprised looks, which is good because he might not have been exactly smiling, he might not have been exactly surprised, but to know he had emotion going on throughout that conversation is in a lot of ways the important aspect… As a blind guy, it’s really cool to be able to sit across from somebody and know even when they’re quiet what they’re doing.”

Sensing Social Cues

The Haptic Chair is currently a proof of concept and is a long way from being ready for the commercial market. As of now, the chair’s facial recognition system doesn’t work when speakers wear glasses, and it can’t identify negative emotions like sadness and fear, which generally aren’t as visually distinguishable as happiness or surprise. McDaniel’s team is planning a series of user studies where they hope to tweak the system to better accommodate blind users.

When—and if—the Haptic Chair reaches the commercial market, it may be the only product of its kind. While a wide range of assistive technologies help the visually impaired accomplish specific tasks, “most of the focus has always been how do we help people navigate? How do we help people to read?” says Sethuraman Panchanathan, director of ASU’s Center for Cognitive Ubiquitous Computing. “There hasn’t been as much focus on social interactions.”

And as a growing body of research suggests that social health is directly linked to physical health, that’s a major problem. A meta-analysis of more than 70 independent studies published last year in Perspectives on Psychological Science showed that social isolation is directly linked with higher mortality rates and is even riskier than obesity. Separate research published in PLOS Medicine analyzed 148 studies that tracked the social lives of participants for an average of seven and a half years. It found that participants with strong social relationships had a 50% higher chance of survival over that time period compared to those with weak social connections. More recent research links loneliness with a 29% higher risk of heart disease and a 32% higher risk of stroke compared with being well-connected. In fact, long-term loneliness can have such devastating consequences on the body that some researchers characterize it as a public health concern.

But solving social interaction issues is a huge challenge. Developing technologies that work in real time requires high processing speeds, not to mention a design that can deliver feedback without disrupting the flow of conversation or burdening the user with clunky machinery to carry around. Because the goal is assistive devices that work as quickly and seamlessly as the human brain, some of the most accessible technologies, like the smartphones and laptops many disabled people already own, aren’t always ideal: using them draws attention away from the conversation.

Samson Cheung, director of the Multimedia Information Analysis Lab at the University of Kentucky in Lexington, is currently developing assistive social technologies for people with autism using Google Glass. One of Cheung’s autism-related apps aims to help users maintain eye contact—a common problem experienced by those with mild to severe autism. When the user’s face is roughly in line with the face of someone looking back, a yellow smiley face appears on top of the Google Glass camera feed, allowing the user to better align their head with their conversation partner and “basically giving visual feedback to the user that OK, you’re doing great,” Cheung says. Should the user’s head turn away from the conversation partner, a sad face appears, letting the user know to make adjustments. Cheung has also created a similar app that monitors whether a user’s voice volume is appropriate relative to the ambient noise level of a room, and he envisions both apps being used not just in everyday social interactions but also in professional settings like a job interview.
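
Cheung’s apps aren’t open source, but the per-frame decision he describes, “is the wearer roughly facing their partner?”, is simple enough to sketch. The snippet below is a minimal illustration that assumes OpenCV’s stock face detector and a made-up centering tolerance; the actual Google Glass app may work quite differently.

```python
# Rough sketch of an eye-contact alignment check run on each camera frame.
# Thresholds, feedback labels, and the use of OpenCV's stock Haar-cascade
# face detector are assumptions for illustration only.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def alignment_feedback(frame, tolerance: float = 0.15) -> str:
    """Return 'smiley' if a detected face sits near the center of the frame,
    i.e. the wearer's head is roughly pointed at their partner."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "sad"  # no face in view: prompt the wearer to adjust
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detected face
    frame_w = frame.shape[1]
    face_center_x = (x + w / 2) / frame_w
    # "Roughly in line" = face center falls within a band around the middle.
    return "smiley" if abs(face_center_x - 0.5) < tolerance else "sad"

# Example: check a single frame from a laptop webcam
# (Google Glass would supply its own camera frames).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(alignment_feedback(frame))
cap.release()
```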

Despite Google halting production of the original Google Glass last year, the technology is still a favorite among researchers developing ways to help autistic people better understand social cues in real time. SayWAT, a program created by researchers at the University of California, Irvine, and the Ensenada Center for Scientific Research and Higher Education in Mexico, gives autistic users live feedback on the tone and rhythm of their voice, while Stanford’s Autism Glass Project has developed a wearable behavioral aid that helps autistic children read facial expressions and maintain eye contact. The project is currently recruiting families with autistic children ages 6 to 16 for a clinical study.

The problem with Google Glass (and many other types of wearable technologies) is that providing real-time feedback takes power—lots of power. Cheung’s apps, for example, drain Google Glass’s battery in about 30 minutes, making them impossible to use throughout longer social events without a supplemental power supply.

Luckily, increasing social interaction for many disabled people doesn’t require technologies as pricey or as complicated as Google Glass or a Haptic Chair. A small 3D printer has radically changed the social lives of some residents at The Boston Home, a long-term care facility in Massachusetts that serves adults with progressive neurological disorders. Purchased nearly three years ago, the $3,500 printer is mainly used to create objects like customized joysticks, control knobs, and cup holders that can be mounted onto a wheelchair—items that aren’t always covered by insurance companies and may be difficult to find on the commercial market.

Don Fredette, the adaptive equipment specialist who runs The Boston Home’s 3D printer, says that the printer’s rapid prototyping capabilities allow him to better keep up with the needs of residents whose ability levels may change from one day to the next. Last year, Fredette designed a tool that connects a watercolor paintbrush to a mouth stick, allowing the user to adjust the brush to sit closer to or farther from their face and paint without using their hands. The tool has allowed at least one resident to attend art classes.

“This is not complicated,” says Fredette, who largely taught himself both printing and design. “This isn’t a prosthetic. It isn’t high-tech… but it’s making a huge difference for the person.”

Farzin, a 53-year-old man with advanced multiple sclerosis, says that the 3D-printed cup, notepad, and remote-control holders Fredette created and mounted onto his wheelchair have improved his social life “big time” by decreasing his reliance on nursing staff throughout the day.

“Basically, I don’t need anybody to help me all the time,” Farzin says. “They dress me and put me in the chair in the morning, and they don’t see me until lunch time. And then they don’t see me until after dinner when I go to bed.”

Bridging the Gap

Kelly Buckland, executive director of the National Council on Independent Living, says that social interaction is directly tied to broader issues facing the disabled community, such as lack of employment options, transportation, and affordable housing.

Currently, only about 17% of Americans with disabilities hold jobs, reports the Bureau of Labor Statistics, versus nearly 65% of those without disabilities, and federal disability benefits frequently can’t pay the bills either. The average annual income for disabled Americans drawing Supplemental Security Income benefits is less than $9,000 a year—more than $2,800 below the 2016 poverty line, according to a study by the Consortium for Citizens with Disabilities Housing Task Force and the affordable housing nonprofit Technical Assistance Collaborative. That amount frequently doesn’t even cover rent: the average monthly rent for a modest one-bedroom apartment exceeds the entire average SSI check by 4%, effectively blocking disabled people who rely on federal assistance from renting independently in sizable swaths of the country. The study found that in four states and Washington, D.C., $9,000 won’t buy a year of rent in any apartment in any housing market.

Technologies that make a dent in any of these major structural problems will inherently reduce social isolation, Buckland says, and will “make people with disabilities much more able to live independently.”

One way to solve some social interaction issues is to ensure that devices developed for the mass market are accessible to disabled populations, says Gregor Wolbring, associate professor in community rehabilitation and disability studies at the University of Calgary. That means incorporating people of varying ability levels into the design process, from early product conception all the way through user testing. But making that shift requires a cultural change in the design and development process, one that would create extra work for developers.

“We need a way that [disabled] people are automatically part of the discourse, and it’s very hard to do,” says Wolbring. “There are all kinds of issues but I think we have to start with that.”

Bryan Duarte isn’t waiting around for the design process to change. Duarte is currently working with Arizona State’s Center for Cognitive Ubiquitous Computing to build a wearable version of the Haptic Chair’s feedback system that will provide information on nonverbal cues in real time. One basic goal, he says, is to create a device that lets users know if their conversation partner is still present or if they’ve nonverbally ended the interaction and left.

“They walk away and you’re still there talking,” he says. “That happens all the time to blind people.”

Hopefully not for long. Projects like the Haptic Chair may be signs that the landscape of technology accessibility is shifting to include the social and emotional needs of the disabled as well as their physical and logistical needs. That’s not a radically new concept, Duarte says, “but I think it’s starting to get more attention.”