Facebook giving high profile users too much leeway - researcher

12:49 pm on 21 February 2021

Experts believe Facebook's "aggressive" stance against anti-vax posts is failing to stifle misinformation.

A woman logs into her Facebook account on a smartphone.

Photo: 123RF

This month the digital giant expanded the list of banned content it removes using artificial intelligence and content reviewers - yet it did not act on a widely shared anti-vax post from the Destiny Church leader this week until contacted about it by RNZ.

The updated blacklist now includes posts claiming that vaccinations are toxic or dangerous, cause autism, don't work against the disease they're meant to prevent, or are less safe than contracting Covid-19.

In a statement, Facebook said it enforced the rules via technology and reports from members, with the help of more than 15,000 content reviewers based around the world.

"We primarily rely on artificial intelligence to identify violating content on Facebook and Instagram, and our technology is often confident enough that a piece of content violates our standards to delete it automatically...often before anyone has seen it," it said.

That technology didn't pick up on a lengthy Wednesday morning spiel from Destiny Church leader Brian Tamaki, suggesting vaccines were "risky", "rushed", "not a reliable long term solution" and may lead to "experiments" on lower socio-economic communities and vulnerable people.

In New Zealand, the Pfizer Covid-19 vaccine has gone through the Medsafe approval process. Information about its safety, effectiveness and side effects can be found here.

Tamaki's post discussed people symbolically applying the blood of Christ to front doors to be protected from a previous "terrorising airborne" pandemic which he did not name, and implored Destiny Church followers to band together in their beliefs in Psalm 91 as a "simple" way to avoid contracting the virus.

"There is an option that allows you and your family to have a chance in the face of this fear and pandemics, and it has already been proven," he said, with no credible scientific evidence for his claims.

The post was online for 10 hours and was shared more than 180 times, with no removal or disclaimer.

It was removed after RNZ asked Facebook about it, but was reposted as a link to Tamaki's own webpage, and that repost has been reshared more than 80 times.

Science denial expert Dr Fiona Crichton said 10 hours was long enough for misinformation to spread all around the world, particularly given "Facebook is a place people go to, for this kind of information".

"People hear it, they spread it to others, and if you come along later and say 'actually that was a big miss' people are very reluctant to hear it.

"If they've taken that piece of information on board because, one, we don't like being wrong, two, if you start to believe something then it's much harder to reject that information. Belief is very hard to counter just with facts," she said.

"That's why there's an absolute onus on Facebook to act very quickly, and do it genuinely."

A medical worker fills a syringe with a dose of the Pfizer-BioNTech Covid-19 vaccine at the University Clinic for Infectious Diseases in Skopje, 17 February 2021.

It's critical Facebook acts quickly when misinformation is posted on topics like vaccines, Dr Fiona Crichton says. Photo: AFP

M X Dentith, a conspiracy theory researcher based in Auckland, said Facebook had competing interests when it came to high-profile public figures publishing misinformation.

They said prominent pages keep people spending time on Facebook, in turn bringing the website advertising revenue.

"So it turns out if you're a major figure on a social media platform the kinds of rules that you're meant to adhere to don't really seem to apply to you unless you do something so incredibly bad it forces you them to take you off the platform."

That's disputed by Facebook, which said its artificial intelligence also helped prioritise posts for its content moderators to review.

It said posts were followed up faster if they were more viral - quickly accumulating shares and views - more severe, or more likely to violate community standards.

However, Dentith said the sheer volume of misinformation on Facebook made it difficult to manage.

They worried the AI could also silence people with the right information, and the right intentions.

"It's laudable that Facebook has decided it's going to clamp down on claims of this type. But there's still this lingering worry here, that if they're doing it by algorithm they're also going to be clamping down on legitimate stories which might be interrogating those stories to explain why they're wrong."

Dentith said every piece of misinformation needed some level of response, to help inoculate people against it and help them understand why it is wrong.

Crichton said "countering myths with facts" wasn't always effective, and also urged people to sit down with family members or friends spreading misinformation, before giving them "tailored" credible information.

"Try and figure out what they're really worried about. Because usually they're worried about things like - is it safe for me, or my children, or I'm worried about the fact that my granny has got MS. There are all sorts of things that happen in people's lives when they take up misinformation."

Tamaki's re-shared post remained available as of Sunday morning.

Facebook urged people to continue reporting anything that violates its community standards, and said the vast majority of reports were reviewed within 24 hours.
