A 10-minute, face-to-face conversation can reduce transphobia, according to researchers at Stanford University and the University of California, Berkeley. The study provides hard validation for canvassing procedures developed by the Los Angeles LGBT Center, which crafted these methods by conducting more than 13,000 door-to-door interviews over the last half century. One in 10 voters shifted their attitudes toward the transgender community after these deep, one-on-one conversations, and those new feelings held firm for up to three months.
The results are reported today in the journal Science, and if this sounds familiar, don’t worry. You’re not experiencing déjà vu. Last year, Science became embroiled in controversy when a similar study from a completely separate group was retracted over allegedly falsified data. One more twist: the authors of today’s study are the same researchers who poked holes in the earlier report.
This turn of events raises a series of questions: Why should people believe this new study? Does face-to-face conversation actually alter views, on discrimination or otherwise? What do retractions mean for sociology research, if not science as a whole?
The LaCour dilemma
The saga started in December 2014, when Michael LaCour, then a graduate student at UCLA, and Donald Green, a political science professor at Columbia University, published a study in Science that tested whether the Los Angeles LGBT Center’s canvassing techniques could shift opinions on gay marriage. LaCour and Green reported yes: canvassers significantly changed views in a panel of 952 California voters by using “active processing.” To implement this persuasion technique, a canvasser encourages the voter to take on the perspective of the gay community, in this case by recounting real people’s stories of prejudice against same-sex marriage.
The study reported that voter opinions shifted whether the canvasser was homosexual or heterosexual, but the attitudes remained in place after nine months only if the canvasser was gay. The study received press from outlets like This American Life, The Washington Post, BuzzFeed and The New York Times.
At the time, two other researchers, political economist David Broockman and political scientist Joshua Kalla, were launching a similar study on reducing transphobia using the Los Angeles LGBT Center’s practices, and they noticed a discrepancy.
These experiments typically start with a big universe of voters you hope to recruit. Most decline, but LaCour and Green reported a strikingly high “yes” rate: 12 percent of the voters they approached agreed to their surveys.
“Our rate was much lower … 2.7 percent. The difference wasn’t worrisome, but it’s what caused us to call their survey vendor,” Broockman, who works at Stanford University, told NewsHour. The outside survey distributor cited by LaCour and Green said it hadn’t been involved in the study, which set off the alarm.
Soon after, Broockman, Kalla and Yale University biostatistics professor Peter Aronow released a report citing several irregularities, and Green ultimately stated that LaCour had faked the research. LaCour denied these claims but said he lacked proof because the original data had been destroyed. Science retracted LaCour and Green’s study on May 28, 2015.
Should we trust this new study on transphobia?
Despite the controversy, Broockman and Kalla carried on with their experiment, which targeted voters in Miami. According to their new report, they chose South Florida because of a unique opening:
In December 2014, the Miami-Dade County Commission passed an ordinance protecting transgender people from discrimination in housing, employment and public accommodations. Fearing a backlash that might increase transphobia, volunteers and staff from the Los Angeles LGBT Center and SAVE (a South Florida LGBT organization) went door to door to have conversations with Miami-Dade voters.
So last June, 56 canvassers spoke with 501 voters who had been randomly split into two groups. One set heard a well-rehearsed, 10-minute speech about the pitfalls of transgender discrimination, while the control group heard a speech about the merits of recycling.
Respondents filled out surveys via email three days, three weeks, six weeks and three months after being canvassed. These questionnaires gauged their opinions on matters such as whether transgender people should teach in schools or have access to bathrooms that match their gender identity.
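To make the design concrete, here is a minimal sketch of how a randomized split like the one described above can be done. The function name, seed and voter IDs are invented for illustration; this is not the researchers' actual procedure or data.

```python
import random

# Hypothetical sketch: shuffle the recruited voters and split them
# roughly in half, one group for the transgender-rights conversation
# (treatment) and one for the recycling conversation (placebo).

def assign_groups(voters, seed=2015):
    """Randomly partition voters into treatment and placebo groups."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(voters)
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

treatment, placebo = assign_groups(range(501))
print(len(treatment), len(placebo))  # 250 251
```

Because assignment is random, any later difference in attitudes between the two groups can be attributed to the conversation itself rather than to pre-existing differences among the voters.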
In this conversation, recorded in March 2016, a Leadership LAB volunteer speaks with a voter in Los Angeles about including transgender people in non-discrimination laws using the same approach that was studied by Broockman and Kalla in Miami in 2015. This is an edited version of a longer conversation.
The Los Angeles LGBT Center’s approach held up, increasing positivity toward the transgender community in the treatment group but not in the placebo group.
Using a widely accepted metric called the “feeling thermometer,” Broockman and Kalla found that attitudes toward the transgender community rose by 10 points on this scale. That boost is larger than the average increase in American attitudes toward gay men and lesbians measured in national feeling-thermometer surveys, which rose only 8.5 points between 1998 and 2012.
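The arithmetic behind an effect like this can be sketched simply. The feeling thermometer asks respondents to rate a group from 0 (very cold) to 100 (very warm), and the canvassing effect is the difference in average score change between the treatment and placebo groups. The numbers below are invented for illustration, not the study's data.

```python
# Hypothetical illustration of a difference-in-means estimate on
# feeling-thermometer data (all numbers made up).

def mean(xs):
    return sum(xs) / len(xs)

def thermometer_effect(treatment_changes, placebo_changes):
    """Difference in mean score change (post - pre) between groups."""
    return mean(treatment_changes) - mean(placebo_changes)

# Made-up score changes for a handful of respondents:
treatment = [12, 8, 15, 5, 10]   # canvassed on transgender rights
placebo = [1, -2, 3, 0, -2]      # canvassed on recycling

print(thermometer_effect(treatment, placebo))  # 10.0 with these numbers
```

Comparing against the placebo group, rather than just before-and-after scores, filters out attitude drift that would have happened regardless of the conversation.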
The positive trend held regardless of political affiliation (Democrat or Republican) and regardless of whether a voter held a more or less supportive view of the transgender community before meeting the canvasser. Transgender and non-transgender canvassers were equally effective at changing opinions.
“Our ability to change voters’ hearts and minds has been measured, this time for real,” Dave Fleischer of the Los Angeles LGBT Center said in a statement.
The funders of the study, according to Science, asked University of California, Berkeley political scientist Gabriel Lenz to play watchdog and make sure the data were actually collected.
“The data are solid and the analysis convincing,” Lenz told Science.
Elizabeth Paluck, a psychologist at Princeton University who was not involved with the research, agreed. She told NPR that “They were very transparent about all the statistics. It was a really ingenious test of the change. If the change was at all fragile, we should have seen people change their minds back.”
Does deep canvassing actually garner votes?
Broockman and Kalla tested the resilience of these newly acquired attitudes by having canvassers show anti-transgender political attack ads to voters six weeks into the project. At first, both treatment and placebo groups became less supportive of Miami’s anti-discrimination ordinance, but in the end, shifts toward the acceptance of transgender rights withstood the test. The effect of attack ads wore away over time, and the groups returned to their pre-ad states.
This finding suggests the millions of dollars spent on attack ads each election cycle may fail to hit the mark, but this study doesn’t specifically confirm whether deep canvassing can turn heads at the ballot box.
“I think that’s one clear next step for us, but one of the reasons that we got interested in this field is for over a decade, there have been a lot of studies on voter mobilization that find high quality face-to-face conversation is the most effective way to turn people out to vote,” Broockman said.
He cites a recent example from Harvard Business School professor Vincent Pons. While serving as a campaign field director for current French president François Hollande, Pons conducted a countrywide experiment in which he randomly assigned canvassers to visit some precincts but not others. The face-to-face discussions elevated Hollande’s vote share by five percentage points and accounted for one-fifth of his margin of victory in the second round of voting.
“The treatment isn’t the same, but in the broadest sense, it shows you that this kind of high-quality canvassing activity can lead to pretty impressive effects,” Broockman said.
Did the LaCour and Green article leave a black mark on the field of sociology?
Broockman argues no:
All in all, this whole experience has shown that science works. Mistakes or fraud, whether honest or dishonest, aren’t unique to academia. What is unique is that we have norms and institutions, like data transparency and the norms of open criticism, that allow various issues in published and unpublished research to be discovered and discussed. I think that the fact that this all happened is a really good sign of the ability of science to correct mistakes.
Meanwhile, Broockman and Kalla’s methodology and results provide a compass for organizations fighting discrimination, showing what works with voters.
“The bottom line is that we have new insight into how to reduce prejudice against transgender people,” Fleischer said. “Considering the recent loss at the ballot box in Houston, the new anti-LGBT legislation in North Carolina and the threat of future anti-LGBT ballot measures and bills, this study has real practical importance. We in the LGBT community can put ourselves in a better position to win if we start having deep-canvass conversations now, well in advance of a flash point.”
“These conversations are a real game-changer for us here in Florida,” said Tony Lima, executive director of SAVE. “Because of these conversations and their impact, we’re getting closer to being the first state in the South to pass statewide protections for LGBT people.”