Aug. 31, 2023 /Internet/ — Online platforms, and the moderation teams that act on their behalf, could face lawsuits if they fail to protect users' privacy and data. The European Union's Digital Services Act (DSA) aims to create a safer online environment by holding platforms more accountable for the content they host. Under the DSA, platforms must act more quickly to remove illegal content, such as hate speech and child sexual abuse material. They must also be more transparent about how they moderate content and give users more control over their data.
Platforms that fail to comply with the DSA can be fined up to 6% of their global annual turnover. They can also be sued by individuals who have suffered harm as a result of a platform's failure to protect their privacy or data.
The DSA was adopted in 2022 and entered into force in November of that year. Its obligations began applying to very large online platforms on August 25, 2023, and will extend to all covered platforms from February 17, 2024. It is already having a significant impact on the way online platforms operate.
Here are some specific provisions of the DSA that could affect moderators:
- Content moderation: The DSA requires platforms to act expeditiously to remove illegal content, such as hate speech and child sexual abuse material, once they are notified of it. They must also take steps to prevent the re-publication of this content.
- Transparency: Platforms must be more transparent about how they moderate content. This includes publishing their content moderation policies, explaining how they identify illegal content, and telling users how they can appeal decisions to remove content.
- User control: Users must be given more control over their data. This includes the right to access their data, to have it deleted, and to object to its processing, rights that build on those already guaranteed under the GDPR.
The DSA is a significant piece of legislation that will reshape how online platforms operate. Moderators, and the platforms they work for, should understand its requirements so they can comply with it and avoid fines and lawsuits.