On the surface, Facebook's algorithm change to fill your newsfeed with friends-and-family news, and push down branded content and news items, sounds fairly harmless.
But the change is drawing increasing criticism for its unintended consequences – fake news amplified, ignorance spread widely, and user data captured en masse. In last year's Netflix documentary The Social Dilemma, several former big tech employees laid out more dramatic charges: that users are being nudged towards certain behaviours and thoughts, that they're being monitored and tracked, that their emotional buttons are being pushed without them realising it – and that if you want to control a population or country, there has never been a tool as effective as Facebook.
Then on March 31, former UK deputy prime minister Nick Clegg, who's now in charge of Facebook's global communications, wrote a blog piece that has been described as 'wilfully naïve' – saying the Facebook algorithm simply reflects back what you, the user, feed it.
If that’s the case, says digital technology expert Dr Andrew Chen, it’s a funhouse mirror. Chen, a research fellow at the University of Auckland’s Koi Tū – Centre for Informed Futures – talks to The Detail today about the impacts of a curated news feed on users.
“The ‘mirror’ is distorting what a user is showing to it,” he says. In the podcast, he explains how.
The Detail also speaks to Newsroom senior political reporter Marc Daalder who has analysed Clegg’s blog and written about the funhouse mirror.
“Theoretically we can turn off Facebook,” he says. “But it’s become such a central part of our social lives that it’s not really a viable option for most people. That means Facebook has responsibilities to us as our sort of ‘new public square’ that go beyond its responsibilities to its shareholders. These are public interest based responsibilities not profit-based ones … and that’s where regulation comes in.”
That would be a massive task for New Zealand, which doesn’t have a lot of heft against a gigantic multi-national.
Marc Daalder says the overall idea of the Christchurch Call is coming to a multi-lateral agreement on how to deal with distasteful content on social media – “and not just distasteful but content that has really negative, offline, impacts as a result of what’s happening online; the key example being the March 15 livestream which wasn’t just bad as an online product but had real offline traumatising impacts”.
Facebook’s algorithms aren’t inherently geared towards promoting, for example, anti-vax or white supremacist content.
“But often the algorithms are geared towards promoting whatever people are more likely to engage with, and that’s more likely to be something that’s provocative and outrageous and shocking.
“A lot of the spread of fake news, for example, has been found to be the result of people sharing it – not because they agree with it or believe it, but to say ‘can you believe this?’; ‘look at that!’; ’that’s ridiculous’ … but that means it’s still getting more reach, and more people might look at it without the same critical view, and fall for it.
“If you see a YouTube video that is fake, but it has 10 million views, then you might start to think, ‘well, would this many people be watching it if it was fake?’”
And Daalder says in the post-Trump era, “when real life gets wackier, it’s easier to believe the wacky stuff you get online.”