The European Commission this Tuesday designated 17 very large platforms, including Facebook and Twitter, and two search engines, Bing and Google, which will take on new responsibilities for moderating content and protecting users within four months.
These obligations follow the entry into force of the Digital Services Act in the European Union (EU) last November, under which “the Commission adopted the first designation decisions this Tuesday”, covering 17 very large online platforms, each with at least 45 million monthly active users, which will have to comply with the new rules: AliExpress, Amazon, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando.
It also designated two very large online search engines: Bing and Google Search.
Following this designation, which is based on user numbers reported up to last February, “companies will now have to comply within four months with the full set of new obligations under the Digital Services Act”, which aims to empower and protect users online, including minors, by “requiring designated services to assess and mitigate their systemic risks and provide robust content moderation tools”, the Commission sums up in a statement.
At stake is greater empowerment of users, who must now be given clear information and be able to easily report illegal content they encounter on the platforms; tech companies are also required to label advertisements and disclose who is promoting them.
With regard to the protection of minors, platforms will have to redesign their systems to ensure a high level of privacy and security, and will no longer be able to target advertising at children based on profiling.
Another responsibility that now falls on these very large platforms is the fight against disinformation: they must take measures to counter the spread of fake news, address the risks associated with the dissemination of illegal content online and the negative effects on freedom of expression and information, and provide a mechanism that allows users to flag such content.
To monitor compliance with all these new responsibilities, external and independent audits are planned; platforms will also have to give researchers greater access to their data and publish transparency reports.
“Within four months of notification of the designation decisions, designated platforms and search engines must adapt their compliance systems, resources and processes, set up an independent compliance function, and carry out their first annual risk assessment and report it to the Commission”, concludes Brussels.
The new Digital Services Act entered into force last November and was created to protect the fundamental rights of online users. It is an unprecedented piece of legislation for the digital space that holds platforms accountable for illegal and harmful content.
The new law applies to technology “giants” with 45 million or more users in the EU, representing about 10% of the bloc’s population, but also touches new services such as the artificial intelligence tool ChatGPT, which, although not considered a platform in itself, operates in conjunction with them.
Source: DN
