The European Commission on Thursday gave Meta and TikTok until October 25 to report on the measures they have adopted to combat the dissemination of illegal content and disinformation, within the framework of the Digital Services Act.
In a statement, the EU executive warns Meta of the need to comply with its obligations on risk assessments, “especially with regard to the dissemination and amplification of illicit content and disinformation.”
Meta must provide the requested information to the Commission by October 25, 2023 for questions related to its crisis response following the attack by the Islamist movement Hamas on Israel, with the deadline extending to November 8 for questions about protecting the integrity of elections.
For its part, TikTok, also subject to the Digital Services Act (DSA), will have to inform Brussels about the measures adopted to prevent the dissemination of, in particular, terrorist and violent content, as well as hate speech.
The deadlines are the same: October 25 for measures adopted in relation to the aforementioned conflict, and November 8 for those related to protecting the integrity of elections and protecting minors online.
Based on the responses received, Brussels will evaluate the next steps and may formally open proceedings and impose fines under the DSA for incorrect, incomplete or misleading information supplied in response to a request for information, or for failure to reply at all.
Both Meta, owner of Facebook and Instagram, and TikTok are designated as Very Large Online Platforms, so they must comply with all the provisions introduced by the DSA.
X (formerly Twitter) was already the subject of a similar investigation.
Source: TSF