Ever since it began to assume the status of a global giant, Facebook has been at pains to downplay its own power and influence. It is desperate to present itself in simple terms: as a corporate entity making commercial decisions in the interests of its shareholders.

So when it comes to the News Media Bargaining Code, it presents itself as Company A in a commercial conflict with Company B, simply using the leverage available to it to force a resolution. That framing may have held water a decade ago. But today it gives no weight to the historically unique role Facebook now plays in information supply, free media, propaganda and democracy.

By hoovering up information on us for years, Facebook has built itself unprecedented power to influence the way people think and act en masse. Yet most of the huge scandals to hit Facebook in recent years, such as Cambridge Analytica and the live-streaming of the Christchurch massacre, have been the result not of meddling, but of inaction and neglect.

The decision to ban Australian news represents new territory. Turning off the news, overnight, to millions of people as a bargaining chip in a commercial dispute is a truly shameless demonstration of corporate might. Cards are now on the table.

This is Facebook intentionally pulling one of the huge levers of power it has developed to deliberately influence the way free information functions and flows in a sovereign democracy. No longer can it say: “Bad things are happening because we didn’t know.” Facebook must now concede: “Bad things are happening because we’re making them happen.”

And this is where it may have overplayed its hand. By using its enormous power intentionally, Facebook may have made it impossible for the world to continue ignoring that it has that power.

Zuckerberg himself has a stranglehold on Facebook’s corporate structure, meaning no one can fire him and activist shareholders have no power to pursue change. This much unregulated power should not rest in the hands of any one person, be they Mark, Rupert or the Mahatma.

So in trying to ward off a fairly limited piece of regulation, Facebook has shone a spotlight on the case for much more robust regulation. For example, if we really want to tackle misinformation on Facebook, we should start by measuring it.

Illustration: Simon Letch. Credit: The Sydney Morning Herald

Reset Australia’s proposal for a “live list”, under which the Australian government would compel Big Tech platforms to publish the most viewed content about COVID-19, is a very useful place to start. A live list wouldn’t get snagged in complex debates about conspiracy theories, disinformation, or what is right or wrong. It wouldn’t even take down posts. It would simply be a constantly updated list of the most widely shared links relating to COVID-19. But academics, journalists and public health officials could use it to begin to understand how algorithms amplify content and funnel us into filter bubbles.
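To make the proposal concrete, here is a minimal sketch of what producing such a list could look like, assuming a hypothetical export of raw share events from a platform. The ShareEvent record, the live_list function and the example links are illustrative inventions, not part of Reset Australia’s actual specification.

    from collections import Counter
    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical share event, as a platform might export it under a
    # transparency rule like the proposed "live list".
    @dataclass
    class ShareEvent:
        url: str            # the link that was shared
        shared_at: datetime  # when it was shared

    def live_list(events: list[ShareEvent], top_n: int = 10) -> list[tuple[str, int]]:
        """Rank the most-shared links in a batch of share events.

        Nothing is judged or taken down; the output is purely a
        popularity snapshot that outsiders could poll on a rolling basis.
        """
        counts = Counter(event.url for event in events)
        return counts.most_common(top_n)

    # Example: three shares of one link, one of another.
    now = datetime.now(timezone.utc)
    events = [
        ShareEvent("https://example.org/covid-study", now),
        ShareEvent("https://example.org/covid-study", now),
        ShareEvent("https://example.org/covid-study", now),
        ShareEvent("https://example.org/vaccine-rumour", now),
    ]
    for url, shares in live_list(events):
        print(f"{shares:>4}  {url}")

The point of the design is what it leaves out: no content judgments, no removals, just a rolling count of what is spreading, open to inspection.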

We still have very little understanding of how algorithms push us into those bubbles and amplify the most sensational, outrageous and conspiratorial content to keep us logged on. This kind of transparency is long overdue.

At my fingertips is live data that can tell me the air and water quality in my neighbourhood, the road congestion near my kids’ school, and the level of social distancing on the next train as I head to work. But if I wanted to know the most viral COVID-19 content circulating online, I would come up short. For an industry built entirely on data, this isn’t good enough.

Big Tech will tell us this is impractical. It will raise privacy concerns. It might even warn it will break the internet. But in truth, it’s because Big Tech is afraid of what we might see.

Stephen Scheeler is a former chief executive of Facebook in Australia and New Zealand. He is an adviser to Reset Australia, part of a global initiative to counter digital threats to democracy.
