BOSTON -- The rise of fake news has dominated the world of politics since the last U.S. election cycle. But fake news is not at all new in the world of science, notes University of Wisconsin-Madison Life Sciences Communication Professor Dominique Brossard.
"Fake news about science has always existed," she says. "What has changed now is social media and the potential to disseminate this kind of news much faster among social networks."
Addressing scientists today (Feb. 18, 2017) at the annual meeting of the American Association for the Advancement of Science, Brossard discussed the fake news phenomenon in the context of science and online social networks like Facebook and Twitter. She joined moderator Seth Borenstein of the Associated Press and speakers Julie Coiro of the University of Rhode Island and Dan Kahan of Yale Law School.
Fake news, Brossard says, is fabricated from false information and shared as real news with the goal of influencing people. However, "in the context of science, I think this is much murkier and unclear."
She recalls an unpublished study she conducted as a graduate student at Cornell University, in which she examined how the supermarket tabloid Weekly World News covered science. The black-and-white magazine reported on "strange news," like 30-pound newborns, giant insects and alien abductions. Most of it was made up. But some stories, Brossard says, were based on odd-but-true science. It was a way of enticing readers who were not always certain what was real and what was not.
"We've always had things that can be called inaccurate," she says. "The problem in the science realm is deciding where is the line between bad science reporting and fake news."
For instance, is a news story saying caffeine might cure cancer, based on a study of just 10 people, fake news, or is the study simply poorly reported?
Unlike other kinds of fake news, inaccurate science news often spreads through social networks because it offers hope, Brossard says. People will share stories that fit what they want to believe, like the idea that a new treatment might cure a loved one's Alzheimer's disease.
"Journalists are not all well-trained to assess the validity of a study," she says. "They are trying to find the human interest and the hope -- a headline like: 'New study brings hope to families with Alzheimer's.'"
Efforts like those of Facebook, which added an option to report fake news, are not going to solve the problem for science, Brossard says. "It may not be a fake story but just bad reporting. Maybe it's not a great scientific study, although I bet if you read the study they mention the limitations."
So, what is the answer?
Brossard offers three paths toward better science communication and less inaccuracy in science news.
"As scientists, we need to actually know what we're doing with respect to communicating science and break the echo chambers as much as we can," she says, explaining that social science research shows simply offering "more facts" to people will not change minds. In fact, it can cause people to double down on their beliefs. Rather, she says, scientists need to find common ground with others, including nonscientists.
As part of this, she suggests scientists need to take responsibility for communicating science by being willing to talk to and work with journalists, to help explain and contextualize their work.
"We need to train scientists themselves to talk about their results and scientists need to be out there," she says. "If we don't, the reporter is going to call someone else. It's our responsibility to make sure fake news or bad reporting is not disseminating."
Second, agencies and institutions must do a better job of what Brossard calls "quality or brand control." She uses Coca-Cola as an example. The company monitors news around the world, flags any media in which it is mentioned, follows related conversations on social media and launches damage control whenever necessary. Institutions and agencies should do the same with their science and act when studies are misinterpreted, she says, though there is currently no systematic way to do so.
Third, Google and other search engines should remove retracted studies from search results, Brossard says. For instance, Andrew Wakefield's falsified and discredited 1998 study fraudulently linking autism and vaccines is still available online, though it is marked as retracted. That does not always matter to a mother or father concerned about the health of their child.
"If I tell you that 87 percent of scientists believe there is no link because the evidence shows that, but then there is this one study, many parents will say: 'I'm not going to take the risk. I'm going to believe that one,'" Brossard says. "It's not that people don't trust science, it's that they are going to use science that fits their beliefs."
Efforts like Retraction Watch, the blog run by medical writer and journalism instructor Ivan Oransky that roots out retractions and cases of fraud in scientific publications, have been instrumental in bringing attention to inaccurate or false studies. Still, Brossard says, bad studies can resurface, and Retraction Watch, which now reports between 500 and 600 retractions a year, can't catch everything.
"Social media has played a big role," Brossard says. "It's a way for people that share a set of beliefs to be assured they're not alone."
Which is why, she says, it's important to get science news right from the start.
"There is not a clear dichotomy between fake news and real news," she says. "Scientists should engage in communicating their work and realize it's not 'us versus them, the public.' They need to be aware of the consequences of what they say and take into account what we know about science communication. They shouldn't shy away."
###
Kelly April Tyrrell, kelly.tyrrell@wisc.edu, 608-262-9772