Tag Archives: disinformation

Within a couple of days, President Trump has posts removed from both Facebook and Twitter

Twitter deleted one of the president’s posts and temporarily blocked his account for publishing the personal details of a journalist, while Facebook deleted one of Trump’s posts for falsely saying flu is more lethal than COVID-19.

It’s been a bizarre week even by Trump standards. After the announcement that both the President and the First Lady tested positive for coronavirus, we witnessed a series of conflicting and confusing accounts, followed by a small parade and a controversial discharge from Walter Reed.

But even this wasn’t the end of it. A barrage of all-caps semi-cryptic tweets followed from the President, including one call to “REPEAL SECTION 230!!!” — a law that protects social media companies like Twitter and Facebook from being liable for the content posted on their platforms.

https://twitter.com/realdonaldtrump/status/1313511340124917760

But what triggered a response from social media companies was a false tweet from the President, who claimed that the US had “learned to live with” flu season, “just like we are learning to live with Covid, in most populations far less lethal!!!”

So far, COVID-19 has killed more people than the past five flu seasons combined, despite unprecedented protection measures that have drastically limited the spread of the virus — and still, there is no end in sight to the pandemic without a vaccine. We don’t know exactly how lethal the novel coronavirus is, but it is much more dangerous than the flu. According to Johns Hopkins University, it could be about 10 times more lethal, and it is also more contagious, so it’s hard to say just how much worse it is — but by no means is it less dangerous.

Twitter hid the message and posted a warning about “spreading misleading and potentially harmful information,” while Facebook removed the post altogether. Facebook spokesperson Andy Stone also confirmed that the company removed the post for COVID-19 misinformation.

In a separate incident, the president posted a tweet quoting a New York Post article that praised him as an “invincible hero”. The tweet also included the journalist’s personal email address, which prompted Twitter to delete the post, as sharing other people’s private information is prohibited on the platform.

Social media companies have been criticized for taking insufficient action to tackle misinformation, which is doubly concerning as one study has found the President to be the largest source of coronavirus misinformation.

Facebook did announce more stringent measures. The social network announced it will ban all accounts, pages, and groups that represent or promote QAnon. QAnon is a sprawling, far-right movement with a history of scandalous conspiracy theories not based on actual facts. For instance, QAnon members believe that a cabal of Satan-worshiping pedophiles running a global child sex-trafficking ring is plotting against President Donald Trump, and that Trump is planning a day of reckoning. Trump has previously praised the group, saying that they “like me very much” and that they “love America.” Meanwhile, the FBI has deemed QAnon a potential domestic terrorist threat.

‘Pre-bunking’ is an effective tool against fake news, browser game shows

Bad News comes bearing good news. The game about propaganda and disinformation, that is.

Bad News Screenshot.

Image credits DROG.

An online game that puts players in the role of propaganda producers can help them spot disinformation in real life, a new study reports. The game, christened Bad News, was effective in increasing players’ “psychological resistance” to fake news.

‘Alternative truth’

“Research suggests that fake news spreads faster and deeper than the truth, so combating disinformation after-the-fact can be like fighting a losing battle,” said Dr. Sander van der Linden, Director of the Cambridge Social Decision-Making Lab.

Researchers at the University of Cambridge helped develop and launch the browser-based video game back in February of 2018, in collaboration with Dutch media collective DROG and design agency Gusmanson. Since then, thousands of people have played the game — which takes about fifteen minutes from start to finish — with many, yours truly included, submitting their data to be used for this study.

In Bad News, your job is to sow anger and fear by creatively tweaking news and manipulating social media. Throughout the game, you’ll find yourself calling on Twitter bots, photoshopping ‘evidence’, and churning out conspiracy theories to attract followers. It’s quite a good game, and a pretty eye-opening one at that, because you have to walk a very thin line. On the one hand, you want as many people as possible to start following and believing you; on the other hand, you need to rein yourself in somewhat to protect your “credibility score”.

What the team wanted to determine is whether the game can help people spot fake news and disinformation in real life. The results suggest it can.

“We wanted to see if we could preemptively debunk, or ‘pre-bunk’, fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived,” Dr. van der Linden explains. “This is a version of what psychologists call ‘inoculation theory’, with our game working like a psychological vaccination.”

Players were asked to rate the reliability of a series of headlines and tweets before and after gameplay. Participants were randomly shown a mixture of real news items (which served as a control) and fake ones (the “treatment”). The team reports that participants were 21% less likely to perceive the fake news headlines as reliable after playing the game, while Bad News had no impact on how they ranked real news in terms of reliability.

There are six “badges” people can earn in the game for the six most common strategies used by fake news producers today: impersonation; conspiracy; polarisation; discrediting sources; trolling; emotionally provocative content. In-game questions measuring the game’s impact were issued for four of these badges (to limit bandwidth usage). From pre- to post-gameplay, the results show that Bad News:

  • Reduced perceived reliability of the fake headlines and tweets by 24% for the disinformation tactic of “impersonation” — i.e. the mimicking of trusted personalities on social media.
  • Reduced perceived reliability of “polarisation” — i.e. the use of highly-polarizing, emotionally-provocative headlines — by about 10%.
  • Reduced perceived reliability of “discrediting” — i.e. attacking a legitimate source with accusations of bias — by 19%.
  • Reduced perceived reliability of “conspiracy” — i.e. the spreading of false narratives blaming secretive groups for world events — by 20%.

Those who were most susceptible to fake news headlines at the outset of the study benefited most from this “inoculation”, the team adds.
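The reported figures are relative drops in average perceived-reliability ratings between the pre- and post-gameplay surveys. As a rough illustration of how such a percentage could be computed — using invented ratings on a hypothetical 1–7 scale, not the study’s actual data:

```python
# Illustrative sketch (invented numbers, NOT the study's data): computing a
# relative reduction in mean perceived reliability, as reported for Bad News.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical reliability ratings of fake headlines by the same participants,
# before and after playing (1 = not at all reliable, 7 = very reliable).
pre_ratings  = [5.0, 4.5, 5.5, 4.0, 5.0, 4.5]
post_ratings = [4.0, 3.5, 4.5, 3.0, 4.0, 3.5]

pre_mean, post_mean = mean(pre_ratings), mean(post_ratings)
reduction = (pre_mean - post_mean) / pre_mean * 100  # percent drop vs. baseline

print(f"Mean perceived reliability: {pre_mean:.2f} -> {post_mean:.2f}")
print(f"Relative reduction: {reduction:.1f}%")
```

With these made-up numbers, the average rating falls from 4.75 to 3.75, a relative reduction of about 21% — the same arithmetic, in principle, behind the per-tactic figures above.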

“We find that just fifteen minutes of gameplay has a moderate effect, but a practically meaningful one when scaled across thousands of people worldwide, if we think in terms of building societal resistance to fake news,” adds van der Linden.

“We are shifting the target from ideas to tactics,” says Jon Roozenbeek, study co-author also from Cambridge University. “By doing this, we are hoping to create what you might call a general ‘vaccine’ against fake news, rather than trying to counter each specific conspiracy or falsehood.”

The team worked with the UK Foreign Office to translate the game into nine different languages including German, Serbian, Polish, and Greek. They’ve also developed a “junior version” of the game aimed at children aged 8-10 which is available in ten different languages so far. The goal is to “develop a simple and engaging way to establish media literacy at a relatively early age”, Roozenbeek explains, and then see how long the effects last.

Still, so far, the data isn’t conclusive. The major limitation of this dataset is that it used a self-selecting sample, namely those who came across the game online and opted to play. As such, the results are skewed toward younger, male, liberal, and more educated demographics. Even with this limitation, the team says controlling for various characteristics showed that the game was almost equally effective across age, education, gender, and political persuasion. Part of that comes down to the fact that Bad News has an ideological balance built in, the team explains: players can choose to create fake news from both the left and the right of the political spectrum.

“Our platform offers early evidence of a way to start building blanket protection against deception, by training people to be more attuned to the techniques that underpin most fake news,” Roozenbeek concludes.

You can try the game out here.

The paper “Fake news game confers psychological resistance against online misinformation” has been published in the journal Nature.

Want to combat scientific disinformation? Here’s how

If you’re reading this, there’s a very good chance you tried to combat some science misinformation at some point. Whether it’s an antivaxxer friend, that climate change-denying uncle, or just some internet comment, disinformation has become so pervasive that it’s impossible to avoid — and if you’ve tried to talk them out of it, you know just how insanely difficult it can get.

Now, in a new study, researchers describe some evidence-based strategies to combat misinformation.

Transparency is a must, researchers say.

Misinformation is hardly a new thing. It has, for instance, artificially generated backlash against climate change science — ironically, just as the scientific community was reaching a consensus on the issue.

“Nowhere has the impact of scientific misinformation been more profound than on the issue of climate change in the United States,” researchers write in the study.

That might seem like a contradiction, but it is actually the result of a carefully planned strategy. An organized network, funded by organizations with a lot of money invested in the fossil fuel industry, devised a campaign to slow down the transition to a low-carbon economy, especially by eroding public confidence in climate change. The result was a large-scale misinformation campaign which was wildly successful, says Justin Farrell, of the Yale School of Forestry & Environmental Studies (F&ES).

In a new paper, Farrell and colleagues shed new light on this misinformation, and describe four evidence-based strategies to combat it:

  • Public inoculation: A growing body of research shows that our individual perceptions are strongly influenced by our culture — the set of pre-existing ideologies and values we hold. However, there is mounting evidence that we can use this to “inoculate” against misinformation. Inoculation essentially means exposing people to refuted versions of misleading arguments before they encounter them in the wild — fittingly, like a “vaccine”. This strategy can be very effective if deployed quickly, before misinformation spreads, and if more attention is paid to the sources of misinformation.
  • Legal strategies: It’s well-known that fossil fuel companies such as ExxonMobil have systematically downplayed the risks for their products. Industry leaders knowingly misled the public. As a response, several cities and states have sued these companies — while this is a lengthy process, it can help shed new light on how these companies lied to the public and to political leaders.
  • Political mechanisms: Like public opinion, political opinion has also been swayed — but exactly how this happens remains difficult to assess. For instance, the researchers identify a case in which the energy company Entergy Corporation paid actors to pose as grassroots supporters of a controversial power plant in New Orleans — yet this received little media attention, and it’s unclear how politicians were influenced by the manipulation campaign. Placing cases like this under the spotlight, as well as discussing political candidates’ views on climate change more openly, could help encourage the election of responsible leaders. Shedding more light on politicians’ financial connections to fossil fuel companies is also needed — which leads us to the last point.
  • Financial transparency: “Follow the money” is generally a pretty good plan. The number of campaigns promoting science misinformation — funded through donor-directed foundations that shield the contributors’ identity from the public — has grown dramatically in the past few years, topping $100 million. This is done precisely to make it difficult to learn who is behind a disinformation campaign and to muddy the money trail. How often have we heard that scientists are “lying about climate change to get research money”? The reality couldn’t be further from that. The authors call for new legislation to improve funding transparency.

“Ultimately we have to get to the root of the problem, which is the huge imbalance in spending between climate change opponents and those lobbying for new solutions,” said Farrell. “Those interests will always be there, of course, but I’m hopeful that as we learn more about these dynamics things will start to change. I just hope it’s not too late.”

While the study was focused on climate change, similar strategies can be used to address all sorts of misinformation — from antivaxxing and homeopathy to astrology and conspiracy theories.

The study has been published in Nature.