Ilaria Fevola
Published: September 28, 2022
Ilaria Fevola is ARTICLE 19’s legal officer focused on transparency.
Today, Sept. 28, is the International Day for Universal Access to Information, which recognises the importance of every individual being able to access information held by public institutions. Traditionally, it has been an opportunity to examine whether countries have adopted a law on access to information and whether, in practice, public bodies adhere to their transparency obligations.
At present, 126 countries have access to information laws. This means 91% of the world’s population lives in a country where they can formally request information from a state or local authority.
But we know that the right to know doesn’t end with the existence of a law. It is a right that empowers people to participate in decisions that affect them; a tool to hold entities that make such decisions accountable. Information is power, as we often say.
However, it is not just governments that make those decisions. From the extractive industry to Big Tech, private corporations hold enormous power over individuals, both online and offline.
Yet no laws hold them accountable in the way that access to information laws hold public institutions accountable. They are not obliged to be transparent, and individuals do not have the guarantees that the right to know gives them. So far, only data protection laws have tried to fill this gap, allowing people whose data have been processed by companies to ask how it is being used.
Until very recently, the importance of the right to know for corporate transparency was neglected, or perhaps just ignored. Nowhere is this more visible than on social media.
Social media platforms are no longer seen as just private companies whose users simply accept terms and conditions. They are spaces where free expression, democratic debate and participation are realised. Given their power, the transparency of Big Tech firms is becoming increasingly important.
There is a massive concentration of market power in a handful of companies that have total control over the content that is distributed and consumed: they actively control, select and censor what we see online. This much power cannot be left unchecked.
The European Union has recently taken a major step towards addressing this issue by introducing regulations in the digital sphere. The Digital Services Act (DSA) is meant to create a safer online environment in which users' fundamental rights are protected, while the Digital Markets Act (DMA) is aimed at allowing businesses to compete freely and fairly online. The European Commission's initiative sets clear and transparent rules for companies, and outlines sanctions if they fail to respect them.
Crucially, the legislation places a strong emphasis on transparency, shedding light on issues such as platforms' content moderation practices. That way, users can operate in a predictable environment and know the possibilities and limits of their behaviour online. Platforms will be required to publish and explain their terms of service, explain their content moderation decisions, and make those decisions publicly available.
Social media companies will also have to produce transparency reports, including information about government requests to remove user-generated content, notices submitted to flag allegedly illegal content, and measures taken against misuse. This will benefit researchers, oversight bodies and the public, who will be able to better understand how content moderation decisions are made.
But while a law is a key and fundamental step, for it to be effective it requires implementation on two sides. First, institutions need to know the law well and approach it with a positive, rather than a suspicious, attitude. Second, people need to know that the law exists and that they can use it freely to ask for information.
The same needs to happen with Big Tech. Companies need to understand how transparency will benefit them in the long run, through building trust with their users and the public at large.
It is also crucial that users are aware of their right to know and understand how to use it. This two-way awareness of transparency is a long process, and we are just getting started. The road ahead is long, but one thing is clear: companies have a duty to respect human rights, and the right to know is one of them.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.