https://www.article19.org/taming-big-tech-protecting-expression-for-all/
Governments around the world are seeking to regulate how social media companies address problematic content on their platforms, especially hate speech, harassment, and disinformation.
But while well-intentioned, their proposals risk doing more harm than good, and they fail to tackle the real problem: the excessive power of a few huge companies whose business models are inherently exploitative.
ARTICLE 19’s policies set out a solution that would not only protect freedom of expression and privacy online but also – finally – give us all a viable alternative to Big Tech.
The problem with platforms
Social media networks are a vital space for us to connect, share, and access information. But because the business models of large social media platforms rely on capturing our attention and selling it to advertisers, their algorithms are designed to keep us engaged for as long as possible – including by amplifying problematic content like hate speech and disinformation.
Governments have come up with various proposals to address this. Yet rather than tackling the flawed business model, many of their so-called solutions focus on what kinds of content people should and shouldn’t be allowed to post or access on social media.
This gives large platforms even more power to police what we see, say, and share online – with disastrous consequences for public debate, the free flow of information, and democracy itself.
The conversation about social media has become marred by ‘regulatory drama’: we don’t want state intrusion, but we do want better regulation.
How should this drama be resolved?
ARTICLE 19 has a two-pronged solution.
ARTICLE 19’s solution
1. How to regulate content moderation while protecting freedom of expression
First, our policy Watching the Watchmen sets out how governments can ensure their efforts to regulate platforms respect users’ freedom of expression, improve platforms’ transparency, accountability, and decision-making, and – crucially – avoid giving even greater power to the handful of companies that dominate the digital sphere.
Watching the Watchmen
How to regulate content moderation while protecting free expression
But setting human rights standards for social media services addresses only part of the problem.
Currently, a few platforms dominate the social media markets, exploit their users, and violate our rights to privacy, free expression, and non-discrimination. And the lack of viable alternatives locks us into these exploitative relationships.
To truly fix problems in the social media markets, we must tackle the excessive market power of the few huge corporations that control them.
2. How to tackle the excessive market power of social media giants
ARTICLE 19’s second policy, Taming Big Tech, shows how to do just that.
It lays out a pro-competition solution that would transform social media – from a closed space, controlled by a handful of exploitative companies and riddled with hate speech and disinformation, to an open and diverse space where we have a real choice between service providers and can walk away from exploitative relationships.
Taming Big Tech
How to tackle the excessive market power of social media giants
Taken together, these two proposals would protect freedom of expression, media pluralism, and diversity, and lead to more open, fair, decentralised platforms that enable the free flow of information.
This would be a win–win: for social media users, for smaller service providers, and for society and democracy more broadly.
Key questions answered
Isn’t it a good thing that governments are addressing problematic content on social media?
It’s certainly encouraging that governments want to tackle online abuse, hate speech, and other problematic content – problems that the biggest platforms have repeatedly failed to address, and that drive many users away.
But while their intentions might be understandable, many of their actual proposals would do more harm than good.
This is because, while they claim to be about regulating platforms, governments’ proposals are really more about regulating users’ speech. Effectively, governments are asking platforms to police us and decide what kinds of speech are ‘illegal’ – or even ‘legal but harmful’. This would give a few huge companies even more power.
ARTICLE 19’s policy Watching the Watchmen outlines how governments can regulate platforms’ content moderation and content curation in a way that protects users’ rights.
But current content-moderation and -curation systems are only part of the problem. Governments must also address the excessive market power of the dominant tech companies.
This excessive market power plays a central role in free-expression challenges. That’s why our proposals offer a two-pronged solution: one that prevents further concentration of that power, and one that sets out an innovative way to create more open, fair, decentralised social media markets.
As such, ARTICLE 19’s two policies represent two sides of the same coin – and both solutions are necessary to protect users’ rights.
If people don’t like social media, why don’t they just stop using it?
Worldwide, over three-quarters of people aged 13+ use social media, and we spend an average of 2.5 hours on the platforms every day.
Simply leaving is not an option for most of us because social media is so central to every aspect of our lives.
From staying in touch with friends and family to shopping, from keeping up with world news to participating in community forums, from pursuing education to partaking in our hobbies, and from organising a birthday party to exchanging information about protests, social media has become the digital village green, town square, and city hall.
The suggestion that people can simply leave social media therefore comes from a position of privilege. For many people, leaving would be akin to leaving our communities – even our societies – and being cut off from basic services.
Nor should we have to leave, given that there is no competing service to switch to – itself a result of Big Tech’s market domination.
We are stuck between a rock and a hard place.
The only escape route is to tackle the excessive power of Big Tech.
Why does it matter that just a few companies control our online spaces?
Monopolies of any kind are bad for society. They control the market, lock us into using their goods or services, and have no incentive to improve – after all, there’s no competition or alternative.
No single entity – private or public – should control the flow of information in society. Yet the excessive market power of the large social media platforms, coupled with their popularity as a source of information and their power over what we see, means they can do just that.
This makes the dominant platforms gatekeepers: not only of the market (because they can see off competitors and lock in users) but also of human rights (because they can grant or restrict our rights to privacy, free expression, and other fundamental rights).
To address these challenges, we must dilute this power and keep it in check.
What is ARTICLE 19’s solution to Big Tech’s excessive power?
ARTICLE 19’s policy, Taming Big Tech, offers a unique solution to both of these problems: the flaws in current content-moderation and content-curation systems on social media platforms, and the excessive market power of the companies that own them.
We propose separating (also known as ‘unbundling’) two services that large platforms currently provide as one package (one ‘bundle’): (1) hosting content, and (2) curating content.
Currently, platforms both host our content (i.e. we can create our profiles and post on their platform) and curate it (i.e. they use their own algorithms to create our timeline or newsfeed: what we see on their platform). They offer us no choice in this: they present hosting and curation – two distinct services – as one inseparable package.
But there is no reason why they should be inseparable. The only reason they currently are is because it allows the dominant companies to lock out competitors, lock in users, and maximise their already-outrageous profits (Meta, for example, made a profit of $39.37 billion in 2021).
Separating these two services would mean the large platforms could still host our content (i.e. we could still use our existing profile on their platform), but they would have to allow third parties to curate it (i.e. create our timeline or newsfeed). Third parties could then compete with the Big Tech giants to curate what we see in more diverse ways, offering us greater control and choice, and breaking up the current monopoly.
This would mean that, for instance, Facebook would have to ask us whether we want Facebook itself or another company – which we could freely select – to curate content for us. We could then select a company that prioritises privacy, or simply one that specialises in a subject we’re interested in (be that football, hip-hop, or climate change), to curate what we see in our newsfeed.
Of course, some users would be happy for Facebook to continue to both host and curate their content – and that would also remain an option. The crucial factor here is user choice.
But the current model is so lucrative for the large platforms that they aren’t going to change voluntarily. That’s why ARTICLE 19’s solution requires independent and accountable regulators to enforce and oversee it.
Crucially, we need both unbundling and human rights-compliant standards that all content-curation providers, from the smallest to the largest, would have to adhere to.
How would our solution benefit people, providers, and society?
As ARTICLE 19’s policy Taming Big Tech lays out, separating content hosting from content curation and providing access to competitors would have innumerable benefits:
- For individuals, it would finally give us concrete and viable alternatives to the biggest platforms’ content-curation systems. We could select a company to curate our content based on a variety of criteria – for example, how well they protect our privacy or give us access to a plurality of perspectives – conditions that are essential to making informed choices and protecting democracy. And we wouldn’t even need to leave the platform: we could keep our existing profile, friends, and followers.
- For smaller providers, it would grant easier access to users and an incentive to compete with one another to provide content curation that best serves users’ interests – including safeguarding our privacy and online speech.
- For society, it would lead to far more open, fair, diverse, and decentralised social media markets that enable the free flow of information – a healthier environment for free speech all round.
In other words, our solution would be a win–win for social media users, smaller providers, and society at large.
Who should fund our solution?
In 2021, Meta – which owns Facebook, Instagram, Messenger, and WhatsApp – made a profit of $39.37 billion.
That’s more than the GDPs of over half the world’s countries – and more than the combined GDPs of the poorest 31 countries.
Around 97% of Meta’s total revenue comes from advertising: that is, from monetising users’ attention and selling it on.
We work for them.
We work for free.
And they make their billions by invading our privacy, controlling what we can see and say online, and amplifying conflict, disinformation, and hate speech.
This is no accident; it’s their entire business model.
They intentionally create these problems – which are profitable for them yet disastrous for society – while pretending they are making the world a better place for us all.
Isn’t it about time they paid for solving the problems they created?
That’s why ARTICLE 19 believes the biggest platforms should foot the bill for separating content hosting from content curation.
And it’s why governments should impose a levy on the biggest platforms to fund our other solutions, like giving people access to dispute-resolution mechanisms when their content is wrongfully removed, and supporting new business models for platforms that create social value – rather than simply extracting it from us.
Podcast: Taming the Titans
Episode 5: Momentum is building – so where now?
This week, we talk about a crucial third force in this discussion – beyond state and business: civil society – groups of citizens advocating for human and democratic rights and for the good of societies. A global debate is coming: legislators and regulators in different parts of the world will need to adapt emerging regulatory tools and concepts to their own context and markets. What can we learn from the process of negotiations around the Digital Markets Act in Europe and how can it be replicated in other contexts? Is it even the right template? How can we weave together a global civil society to make sure that people’s voices are really heard in this growing conversation?
ARTICLE 19’s new proposals finally give users concrete and viable alternatives to the biggest platforms. They would allow us to select providers to curate our content based on how well those providers protect our privacy. Vitally, they would ensure that our communities and societies have access to a plurality of perspectives – essential conditions for making informed choices and protecting democracy.