Esquire - Fighting The Infodemic: How Fake News Is Making Coronavirus Even More Dangerous

Article by Mark Wilding for Esquire - published 25 April 2020

On 6 April, Tom Phillips was self-isolating at his home in south London when he saw the news that a mobile phone mast in Birmingham had been set on fire. Experts suspected that the action had been prompted by a conspiracy theory linking 5G technology to the emergence of Covid-19. As editor of Full Fact, an independent fact-checking charity, Phillips was not unduly concerned; this was just one story. He made a mental note to keep an eye out for further developments and got back to work.

A few hours later, the Guardian published an update: at least 20 mobile phone masts across the UK had been set on fire or vandalised. Posts seen in anti-5G Facebook groups had encouraged members to mount the attacks and post footage online. Telecoms workers had been abused by members of the public. In one viral video, a woman confronted two engineers as they installed 5G cables in the street. “You know when they turn this on it’s going to kill everyone and that’s why they’re building the hospitals?” she said. “Why do you think they’re building 25,000 concentration camps of death in London Excel right now? It’s because of this wire here.”

Now, Phillips was interested. But he wasn’t entirely surprised; there’d been similar attacks linked to concerns about the rollout of 3G and 4G. During 2019, Full Fact debunked a series of 5G conspiracy theories, including viral social media posts linking the technology with mass bird deaths and claims it kills trees. Nevertheless, this spate of attacks was disturbing. “It was that sense that, this is getting extremely real,” he tells me. “And there was a degree of frustration that this had got to this point.”

Though Phillips declined to apportion blame for the spread of these rumours, he could have been forgiven for doing so. In July last year, the charity published a report about its ongoing fact-checking partnership with Facebook, in which it called on the government to provide “authoritative public information” on topics including 5G. The report highlighted “a distinct lack of official guidance properly addressing some public concerns” about the technology and warned: “The absence of reliable and trustworthy information can create a vacuum in which misinformation is better able to spread.”

For months after the report’s publication, anti-5G activism remained a niche concern. The movement’s most prominent supporter was arguably David Icke, the goalkeeper-turned-conspiracy theorist who has long argued that the world is secretly run by a race of shape-shifting lizards. Then, the coronavirus pandemic broke out. The long-simmering 5G conspiracy theory mutated and took on startling new momentum. Adherents noted that Wuhan, the Chinese city where the coronavirus outbreak began, had been an early adopter of 5G – ignoring the fact that Iran, which saw one of the earliest outbreaks of the virus, does not yet have the technology. On 26 March, the Daily Star published a news story: “Coronavirus: Fears 5G wifi networks could be acting as 'accelerator' for disease” (the headline has since been changed).

Before long, YouTube, Facebook and Twitter were awash with claims that 5G is responsible for the spread of the virus, with videos racking up millions of views. Celebrities and influencers with huge social followings, including Woody Harrelson and Amir Khan, promoted the theory, and in mid-April, breakfast TV host Eamonn Holmes told viewers it may “suit the state narrative” to dismiss 5G concerns. For Will Moy, chief executive at Full Fact, it’s frustrating that the organisation’s warnings to the government had not been heeded. “The opportunity to tackle that problem was actually months ago,” he says. “We saw it coming. We knew we could tackle it. And it was a failure of good information, not just the prevalence of bad information, that caused the problem.”

The shifting landscape and lack of scientific consensus during the early stages of the coronavirus pandemic created an information vacuum that sucked in conspiracy theories and false health advice, opening up a new front on which governments and overstretched health authorities must fight. In mid-February, Tedros Adhanom Ghebreyesus, director general of the World Health Organization (WHO), warned the Munich Security Conference: “We’re not just fighting an epidemic; we’re fighting an infodemic. Fake news spreads faster and more easily than the virus, and it is just as dangerous.”

In the days and weeks after Ghebreyesus’s speech, a series of events illustrated the prescience of his warning. As UK telephone masts burned, damaging vital infrastructure at a time of national emergency, a small town in Ukraine erupted into two days of rioting after rumours spread that a plane had arrived carrying coronavirus patients from Wuhan. In Iran, hundreds of people have been reported dead after drinking methanol, in the misguided belief that it would offer protection from the virus.

Manlio De Domenico, a researcher at the Bruno Kessler Foundation in Italy, is an expert in how viruses and information spread. In mid-January, as news filtered out of China that a novel coronavirus had been identified in Wuhan, and the number of confirmed cases began to grow exponentially, he had a hunch about what lay ahead: a global catastrophe, but also a rare opportunity to observe how information spreads during a pandemic. “I have to say I was hoping for this to burn out just in China and that’s it,” he says. “But I am a researcher. I have a kind of duty to get the data, to understand phenomena.”

On 21 January, De Domenico began collecting social media posts about the emerging virus. Over the next seven weeks, he and his colleagues used machine learning to analyse more than 112 million tweets in 64 languages, a flood of information that was spreading across the globe ahead of the pandemic. They observed that the level of misinformation grew exponentially, just like the virus, and found that information from unreliable sources accounted for up to 30 per cent of overall posts. In a paper published earlier this month, they wrote: “We found that waves of unreliable and low-quality information anticipate the epidemic ones, exposing entire countries to irrational social behaviour and serious threats for public health.”

De Domenico’s research focused on Twitter, which provides an easily accessible source of data. We may never know the true scale of the wider coronavirus infodemic. There have been numerous reports of unverified health advice circulating via private messaging apps, urging the public to hold their breath to test themselves for infection; to drink water every 15 minutes; and to take regular saunas – all actions that offer no proven protection from the virus. Conspiracy theorists have pushed claims on YouTube and in Facebook groups that Covid-19 is a bioweapon or even a myth, promoting seemingly fringe ideas that have nonetheless gained a foothold in the public consciousness. Meanwhile, the European Union’s disinformation unit has warned that state-sponsored actors are publishing fake news “to exploit the public health crisis to advance geopolitical interests”.

If followed at the expense of truly preventative measures, then well-meaning but inaccurate health advice – the kind currently working its way through family group-chats across the globe – could pose a mortal risk. Imran Ahmed, chief executive at the Center for Countering Digital Hate, which has turned its attention to tracking coronavirus misinformation during the outbreak, tells me that the pandemic has made clear what many communities already knew: that misinformation has dire consequences. “It’s one of those rare occasions in which we are completely interdependent, as a society, on each other and one bit of misinformation can cause huge amounts of damage to all of us.”

Governments and health authorities have scrambled to slow the flow by seeding social media networks with official advice. At the end of March, the government announced that its Rapid Response Unit, a joint Cabinet Office and 10 Downing Street team set up to fight disinformation, was dealing with up to 70 incidents a week of “false narratives” linked to the coronavirus threat. (The Cabinet Office did not respond to a request for further information about the unit’s work.) The WHO has launched an Information Network for Epidemics to promote verified information about the virus and the pandemic response.

Meanwhile, the NHS has sought to quickly verify its 800 official health service social media accounts while working with platforms to take down fakes. It’s an issue that came to prominence just this week, when a viral Twitter thread alleged the UK government was itself operating a network of fake NHS staff accounts in an attempt to demonstrate support for its handling of the pandemic. The Department for Health and Social Care has flatly denied the claim, describing it as “disinformation” and “categorically false”. When Full Fact investigated, it found “no publicly available evidence” to support the claim and said the source of the allegations had declined to provide further information. Two days after it was published, the allegation remained online and had attracted close to 25,000 retweets.

Social media companies, used to criticism for facilitating the flow of inaccurate information, are now taking unprecedented steps to promote verified news and advice. Searches for coronavirus-related topics on Facebook, Instagram, YouTube or Twitter see users pointed to official health authority websites. By late March, Facebook said it had directed more than one billion users to expert health resources and seen 100 million people click through to view official information during the course of the pandemic.

Still, misinformation continues to spread. Some have questioned whether social media companies should have done more – and sooner. Official health sources are promoted through Facebook search results, but conspiracy theories circulate in long-established groups popular with anti-vaxxers and 5G truthers. YouTube has committed to clamping down on videos claiming a link between the coronavirus and 5G, but only after the arson attacks on UK telephone masts. Twitter has said it will remove fake and harmful coronavirus content, but allowed a tweet from US tech entrepreneur Elon Musk which falsely claimed children are “essentially immune” to Covid-19.

And these are only the visible conspiracies. In late March, as coronavirus cases in the UK started to rise, a voice message began to spread on WhatsApp, supposedly recorded by a woman from the South East Coast Ambulance Service. It warned that a third of coronavirus cases were expected to be children; that ambulances wouldn’t be dispatched to people with breathing difficulties; and that ice rinks were being turned into mortuaries. After receiving panicked emails from members of the public, Marianna Spring, specialist disinformation and social media reporter at the BBC, investigated and discovered that the voice note was a hoax. Public Health England told her: “This is fake news, and we would urge people to ignore the message and not to share it further.”

Those forwarding the message on weren’t foil-hatters, state actors or Twitter bots. According to Spring, they were frightened people trying to share a warning with their loved ones, just in case it turned out to be true. “Fear is a very powerful emotion that can drive people to share things or even consider ideas that in normal times they would reject as rubbish,” she tells me. Medical myths and speculation about how hospitals are coping “are providing answers to people who are clutching at straws and wanting to work out what’s going on.”

At Full Fact, every day begins the same way. Its nine-strong fact-checking team reviews hundreds of false claims, inaccurate news reports, and questionable memes to produce a shortlist of 15-20 dubious claims, from which a handful will be chosen for the comprehensive Full Fact treatment. For more than a decade, Full Fact has served as an independent watchdog checking claims made by politicians and the mainstream media for truthfulness and accuracy, publishing the results of its enquiries in an attempt to correct the public record. In recent years, its focus has shifted slightly; viral social media posts are now as much a part of its remit as political statements made on Question Time.

On 27 January, Full Fact published its first fact check related to the coronavirus. “We’ve seen a number of claims on social media saying that a patent for coronavirus was filed in 2015, since a new respiratory illness was recently observed in Wuhan, China,” it read, going on to explain that the term ‘coronavirus’ refers to a “broad category of viruses” and that the patent in question had been filed by researchers looking at a different illness infecting birds and other animals. In the following days, the organisation debunked a series of claims about Covid-19, including a viral Facebook post that alleged the cleaning supplies manufacturer Dettol had predicted the emergence of the virus on the labelling of its antiseptic spray.

By early March, when the first cases of person-to-person transmission were confirmed within the UK, Full Fact was covering almost nothing other than claims about coronavirus. “That’s when it became apparent that this was going to be our world for the foreseeable future,” Phillips says. Full Fact has now published more than 80 fact checks on Covid-19 claims, and interest in the charity’s work has spiked. During March, traffic to the Full Fact website soared to an all-time high.

Before joining Full Fact, Phillips spent several years as UK editorial director at BuzzFeed, the news and entertainment website best known for producing highly shareable quizzes and listicles. When I spoke to Phillips in early April, I asked how the time he spent producing viral content had influenced the way he approaches his current role. I imagined the two jobs involved opposing objectives, but Phillips sees them in much the same way. “You can at least give the correct information as good a chance of achieving the kind of spread that it needs.”

Rumour always outpaces truth, because facts are rarely as neat. “Fact-checking often involves you denying a simple narrative and pointing out [its] complexity,” Phillips says. “So it's always a slight uphill battle.” Full Fact must check each claim against official data, or run it past independent experts in the relevant field. By that point, the statements under scrutiny have often been shared thousands of times. But as Grace Rahman, the organisation’s lead for online fact-checking, puts it: “We want to be fast, but we need to be right.”

Will Moy, who launched Full Fact in 2008 while working for a crossbench member of the House of Lords, tells me the charity’s aims have always extended well beyond “playing whack-a-mole with an endless stream of harmful misinformation”. Full Fact made 87 correction requests last year. This year, it got the headline of the Daily Star’s 5G story changed. Seeking corrections can often prevent a false claim being repeated, says Moy, but also creates what he describes as a “they know we check” effect. “Organisations and people change their behaviour when they anticipate scrutiny.”

Fact-checkers deal in hard evidence and data, but they also need empathy. If the truth is to stand a chance, you need to understand why the falsehood is so compelling. “It is not surprising that people are falling for misleading information right now. The world has turned upside down,” Phillips says. “You have to start from a position of empathy and understanding that people are frightened and it's okay and, frankly, rational to be frightened. So you need to be compassionate, I think, in your fact-checking. This isn't going to be a time for finger-wagging from a position of authority.”

On 7 April, the WHO held the first of two half-day Zoom webinars to discuss its response to the Covid-19 infodemic, bringing together hundreds of scientists, public health specialists, technology experts and journalists from all over the world. Many of the 10-minute presentations were on the subject of ‘infodemiology’ – a term used by researchers seeking to apply lessons learned from disease outbreaks to the spread of misinformation.

Epidemiologists consider viruses in terms of their basic reproduction rate, commonly referred to as R0, which is a measure of the average number of additional people expected to be infected by each individual case of a disease. The higher the R0 number, the faster and more widely a disease spreads. Things like a long incubation period and dense population will drive the number up; vaccines and social distancing will pull it back down. Drawing parallels, Kisoo Park, from the Korea University College of Medicine, suggested the R0 of an infodemic could be measured by similar factors: the plausibility of misinformation, the opportunities it has to spread, and the audience’s vulnerability to believing what they hear or read.

To that end, social media companies have pushed questionable information down users’ timelines or removed it entirely. Encrypted messaging services, however, provide a trickier challenge. Unable to moderate messages shared by users, WhatsApp has instead announced limits on message forwarding, after noticing an increase in the amount of viral content, which it said could “contribute to the spread of misinformation”. Regardless, it’s impossible to gauge just how much misinformation is rattling its way around private chats.

In 2018, Conservative MP Damian Collins launched a parliamentary inquiry into disinformation and fake news as chair of the Digital, Culture, Media and Sport committee. Since it published its findings last year, he has set up his own coronavirus fact-checking service, Infotagion, and hopes that creating an archive of Covid-19 misinformation will illustrate the need for more action. That might mean stricter penalties for those involved in publishing false information in the first place. “We’ve got to be more vigilant,” Collins tells me. “People knowingly manipulating the architecture of social media to post messages that could be harmful in a public health emergency? I think that should be a criminal offence.”

Still, tackling misinformation at its source may be but one part of the infodemic solution. Dr Julii Brainard, a researcher at the University of East Anglia who has modelled the impact of fake news on disease outbreaks, told the WHO webinar that authorities have two options, the first being to drown bad advice with good. “That's the basic strategy I see happening right now,” she said. “The other one is you make the people resistant, so they don't believe the bad advice. They don't share it, they don't act on it, they may even challenge it and try to stop it.” The infodemic equivalent of a vaccination programme.

How that might be done is an unanswered question. “Infodemiology is really in its infancy,” says Dr Sylvie Brand, director of global infectious hazard preparedness at the WHO. “We have far less tools than with epidemiology.” However, she stresses the importance of acting before rumours and conspiracies have a chance to take a hold, while echoing Phillips’ point about the need for understanding the emotional drivers of misinformation. “If you leave a void in your communication then it’s filled immediately,” she says. “What is really important is to make sure you listen carefully to what people are thinking or saying and then immediately answer their concerns.”

Even in the later stages of a crisis, clear communication from official sources can still play a vital role in fighting misinformation. Manlio De Domenico, the Italian researcher who tracked the spread of coronavirus news via Twitter, observed an intriguing phenomenon: in some countries, levels of unreliable information fell as the epidemic progressed. He suggested that, as the crisis escalates, people turn to more reliable sources of information, creating an opportunity for public health messages to cut through. De Domenico highlighted US president Donald Trump’s unproven assertion that the antimalarial drug chloroquine represents a “game changer” in the treatment of Covid-19. “Communication at the level of leaders is failing,” he says.

As the coronavirus outbreak continues its relentless spread, questions are increasingly being asked about what more should have been done in the earliest stages of the pandemic. Were social distancing measures imposed early enough? Why did we not stockpile more protective equipment? Why, as the virus tore through Hubei province in China, was the threat not taken seriously in the West? When the time comes, we will no doubt be left asking similar questions about our preparedness for the infodemic and examining what measures we might take for when the next crisis hits.

The solutions are unlikely to be easy. With fear and confusion so widespread, a lack of simple answers to complex questions has left a vacuum that misinformation has rushed to fill. But the coronavirus infodemic is also the product of a society in which the truth is no longer an objective measure, but something to be continually contested. In which a forwarded message from a questionable source carries the same weight as official health advice. Our ability to assess the veracity of information is distorted by our need for agency and clarity at a time when we feel we have little of either.

But like the ouroboros, endlessly eating its own tail, the mere existence of fact-checkers can set paranoia in motion. In early March, Phillips gave a radio interview to LBC’s Eddie Mair in which he debunked a series of conspiracy theories about the coronavirus. Footage of the interview was posted to YouTube, where it attracted more than 1,500 comments. Many of them accused Phillips of promoting his own 'false' claims. “Legend has it if you call yourself a fact-checker everyone believes what you say with no evidence,” said one. Another added: “Who to believe? Your own eyes or FAKE NEWS with fake experts?”

I asked Phillips if he found these comments disheartening. “You will never ever convince every conspiracy theorist that what they believe isn't true,” he replied. “And that's not the goal we should be judging ourselves by. What we should be judging ourselves by is: are we doing enough good work? Are we being responsive and engaged and human enough in that work, that people who may have been persuaded, who may have tipped over into believing this, maybe go the other way? What you can do is those basic bits of work, constantly, and empathetically, to try and balance the scales a little bit.”
