
How Facebook groups are destroying America

From Sunday Morning, 11:24 am on 28 June 2020

Facebook groups are built for privacy and community, but that's also what makes them dangerous, according to new research.

Those same features - privacy and community - are often exploited by bad actors, foreign and domestic, to spread false information and conspiracies.

Facebook and Google apps on a tablet. Photo: AFP

Nina Jankowicz is the disinformation fellow at the Woodrow Wilson International Center for Scholars in Washington, DC.

She has written an article ‘How Facebook groups are destroying America’, which was published in Wired magazine.

She told Colin Peacock that, after the Cambridge Analytica scandal in 2018 and Russian interference via Facebook in the 2016 US presidential election, Facebook shifted its focus to what it called the ‘privacy pivot’.

Groups became a big part of its new strategy, she says.

While some groups have genuine community concerns, others are more sinister, Jankowicz says.

“There is some quite salacious, divisive, indoctrinating and extremist messaging being spread in groups, and because they are segregated by interest and by location, they are really easy targets for bad actors, whether those are foreign actors or domestic disinformers.”

Facebook incentivises membership of such groups through notifications and suggestions, she says.

“Once you join one group you get notifications and suggestions to join other similar extremist or indoctrinating groups.”

Another problem is that many of these groups are private, she says.

“A lot of them are not only private, some of them are secret so you have to be invited to join them which means they have a little bit less oversight than a public group might have.

“And they tend to attract like-minded people.”

She noticed a number of groups springing up on Facebook advocating for the re-opening of the US.

“Of course many of our states are open now, but there were plenty of groups that popped up during our coronavirus lockdown that were all about reopening. They were people who had many different political beliefs, but they were united by this desire to reopen; sometimes they’d join these groups out of an economic interest.

“From that economic interest in reopening you might then be redirected to a group that was anti-vaccination, or that didn’t believe in wearing masks or social distancing.”

It was a short step from opposing the lockdown to groups convinced it was all a deep state conspiracy, she says.

“They saw restrictions as some sort of huge government control mechanism, and the next thing they believe will happen is we’ll all be vaccinated and microchips will be implanted in us so we can be tracked - and they think Bill Gates is behind that.

“From an economic interest in re-opening you get to tracking devices implanted in vaccines. And it’s not many degrees of separation away, I’m sorry to say.”

Facebook does have third-party fact-checking, but paradoxically it seems to reinforce misinformation, Jankowicz says.

“Facebook is fact-checking some of the content through its third-party fact-checking network, but when that label is applied to these posts … it kind of has the opposite effect in some of these communities, because they don’t trust Facebook. They think Facebook is the deep state, and any fact check that Facebook applies to that content will just make them believe it more, because they think it’s trying to censor them from the truth.”

These groups are inherently exploitable, she says.

“Over time these communities become very close-knit. They have several hundred posts a day, and a lot of them have key active members who become trusted in that community.

“It’s a bit like the cafeteria at work - these are people the users are seeking out on a daily basis. They choose to be part of that community, there is trust involved, and that’s exactly what makes them so exploitable, because there is that inherent trust and community in these groups.”

These extremist or conspiracy-minded groups thrive and grow through organic viral content, she says.

“Groups are made to create that engagement, to keep people on the platform, to keep them feeling like they are part of something … Zuckerberg calls it the digital living room; in reality I would say it’s more like a digital basement. It’s a covert, really shady vector for a lot of political movements, including white nationalists and other things we would prefer not to have a platform on our digital public square.”

As trust is built in the community, further engagement is fostered, she says.

“Over time those asks get bigger and bigger. So it starts as harmless memes, but a couple of months later the group or page moderators might ask people to change their profile picture in support of a political cause; from there they might ask them to sign a petition and make a donation.”

It is incumbent on Facebook to provide more transparency, she says.

“We need to provide people with more transparency, more information to navigate the flow of content that they are seeing. Why are they being targeted with this specific suggestion for this group? Why are they seeing certain content over other content?”

Users are unaware of the extent to which Facebook is controlling this flow of information, she says.

“People need a lot more control, and they don’t realise Facebook is making these decisions for them, usually with just economics in mind.”

Any group with a membership of more than 5000 should be public, Jankowicz says.

“There is much more likelihood that there would be, perhaps not a robust debate, but at least room for oversight and room for discussion - whereas right now, in these underground sorts of groups, there is not.

“Facebook isn’t really doing its due diligence in monitoring the content in closed and private groups. They claim they want people to have their privacy, but is a group really private when it has tens of thousands or hundreds of thousands of members? No.”