London
Saturday, November 23, 2024

After X, Meta and TikTok get EU request for info on response to Israel-Hamas war

Meta and TikTok have each been sent formal requests for information by the European Union under the bloc’s Digital Services Act (DSA), the Commission said today.

In recent days, EU regulators have expressed concern about illegal content and disinformation circulating on social media platforms following attacks in the Middle East and the ongoing Israel-Hamas war.

Last week the Commission took the same formal step of asking X (formerly Twitter) to submit info on how it’s complying with requirements set out in the DSA — after publicly warning Elon Musk’s platform about its legal obligations to diligently respond to reports of illegal content and mitigate risks related to disinformation.

It has also issued similar warnings to Meta, TikTok and YouTube (but an EU official confirmed no formal request for info has been sent to the Google-owned platform).

In Meta’s case, the Commission has also expressed public concerns about its approach to election security.

Larger platforms (19 in all) are already subject to the bloc’s rebooted content moderation regulation, including Meta-owned Facebook and Instagram and ByteDance’s TikTok, as well as Musk’s X. The Commission itself is responsible for oversight of these so-called Very Large Online Platforms (VLOPs) and search engines (VLOSEs). Compliance for a wider sweep of digital services, those with fewer than 45 million monthly active users, will kick in early next year.

Failure to comply with the pan-EU governance regime carries substantial risk: platforms can be fined up to 6% of annual global turnover for confirmed breaches. The DSA also gives the EU powers to block access to infringing services in cases of repeated serious violations of the rules, so the stakes are high.
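To get a sense of the scale that 6% ceiling implies, here is a minimal sketch. The 6% rate comes from the DSA itself; the turnover figure used below is purely hypothetical, chosen only for illustration:

```python
# The DSA caps fines at 6% of a company's annual global turnover.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound on a DSA fine for a confirmed breach."""
    return annual_global_turnover_eur * DSA_MAX_FINE_RATE

# Hypothetical example: a platform with EUR 100 billion in annual
# global turnover could face a fine of up to EUR 6 billion.
print(f"EUR {max_dsa_fine(100e9):,.0f}")  # → EUR 6,000,000,000
```

For companies of Big Tech scale, that ceiling quickly reaches into the billions, which is why these are not fines that can be shrugged off as a routine cost.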

In short, these aren’t the sorts of compliance risks that might be easily written off by Big Tech as a cost of doing business.

The Commission’s formal requests for information under the DSA are not the same as opening formal investigations, but the development could prefigure such a step.

In a press update today, the Commission said it’s asked Meta to provide it with more details on the measures it has taken to comply with DSA obligations related to “risk assessments and mitigation measures to protect the integrity of elections and following the terrorist attacks across Israel by Hamas, in particular with regard to the dissemination and amplification of illegal content and disinformation”.

Its request to TikTok, the Commission said, relates to the platform’s obligations to apply “risk assessments and mitigation measures against the spreading of illegal content, in particular the spreading of terrorist and violent content and hate speech, as well as the alleged spread of disinformation”. The Commission also said its request to TikTok addresses compliance with other elements of the DSA, especially in relation to online child protection.

Meta and TikTok were contacted for a response to the Commission’s requests for information.

Meta has previously published a blog post detailing some of the steps it’s taken in response to events in Israel and Gaza — such as saying it would prioritize checks on livestreaming tools.

A TikTok spokesperson sent us this statement: “We just heard from the European Commission this morning and our team is currently reviewing the RFI [Request for Information]. We’ll publish our first transparency report under the DSA next week, where we’ll include more information about our ongoing work to keep our European community safe.”

Yesterday the EU’s executive published DSA-related recommendations for Member States — which will be looped into oversight of the regime next year via a network of national watchdogs when the general compliance deadline kicks in for in-scope services.

The Commission is urging Member States not to wait for the official deadline (February 17, 2024) to designate the independent authority that will form part of the Digital Services Coordinators (DSC) network, but to appoint a local watchdog ahead of time. The development suggests the Commission is feeling the heat — and may have bitten off more than it can chew — when it comes to its major new oversight role over larger platforms’ content moderation efforts in the midst of so many volatile geopolitical events.

“In the context of an unprecedented period of conflict and instability affecting the European Union, first with Russia’s war of aggression against Ukraine, and now with the terrorist attacks by Hamas on Israel, the Commission counts on Member States to join forces to enable prompt enforcement of the DSA,” it wrote in a press release yesterday.

The DSA demands a complex balancing act from in-scope platforms and services — as it’s intended to drive online businesses to respond diligently to threats posed by illegal or just potentially harmful content on their patch while demanding they respect fundamental rights, like freedom of expression and information. That in turn suggests enforcement of the DSA must be a delicate balancing act, too. But given the volume of public warnings from the Commission to tech giants in recent days, after an initial (shocked?) silence following the bloody surprise attacks in the Middle East, it’s not clear the Commission has figured out how to strike a sensible balance yet.

Add to that the criticism, visibly circulating on online platforms this week, that the EU has passed a “censorship law”…

In its recommendation to Member States yesterday, the Commission was essentially calling for enforcement reinforcements. It put out a direct ask for DSC support in ensuring larger platforms are compliant, ahead of the bulk of the DSCs’ official duties monitoring the DSA compliance of other (smaller) services next year. Although it remains to be seen how many authorities can be rushed into support work faster than legally required.

The Commission’s recommendation also proposes what’s described as an “incident response mechanism” be set up to outline how it and the DSC network should cooperate and work together in response to fast-moving situations where illegal content is being disseminated and “poses a clear risk of intimidating groups of population or destabilising political and social structures in the Union”.

“The mechanism would include regular incident response meetings to discuss good practices and methodologies, and regular reporting on and exchange of information collected at national level,” the Commission also suggested. “The information received from the network may provide the Commission with evidence to exercise its supervisory and investigatory powers pursuant to the DSA.”

Notably the Commission missive also reminds Member State agencies of existing powers to tackle illegal content — such as the Regulation on addressing the dissemination of terrorist content online, which has been in force since June 2022 — again suggesting it’s hoping to spread the enforcement burden.

“The Commission will continue to rely on existing structures, particularly for counterterrorism, such as the EU Crisis Protocol which coordinates responses to online developments stemming from a terrorist or a violent extremist act; and, at international level, the Christchurch Call and the industry-led Global Internet Forum to Counter Terrorism; to secure joined-up actions,” it also noted.

Its press release also includes encouragement to VLOPs and VLOSEs to draw up “incident protocols” for tackling what it dubbed “extraordinary circumstances — such as an international armed conflict or terror attacks”.

So it does read as if the Commission is struggling to get a handle on the patchwork response we’ve seen so far from Big Tech to the violence in the Middle East, and would much prefer they streamlined their responses. But good luck getting Musk to join any such club!
