San Francisco: European Commissioner Thierry Breton has sent a letter to Alphabet and Google CEO Sundar Pichai, reminding him of the company's obligations under the EU’s Digital Services Act (DSA) to keep illegal content and disinformation related to the Israel-Hamas war from being shared on YouTube.
In the letter, also addressed to YouTube CEO Neal Mohan, Breton said that following the terrorist attacks carried out by Hamas against Israel, “we are seeing a surge of illegal content and disinformation being disseminated in the EU via certain platforms”.
“I would like to remind you that you have a particular obligation to protect the millions of children and teenagers using your platforms in the EU from violent content depicting hostage taking and other graphic videos,” the commissioner said late on Friday.
This means having appropriate and proportionate measures in place to ensure a high level of privacy, safety and security for minors, he added.
Breton has already warned X, Meta and TikTok to remove terrorist propaganda and manipulated content, such as repurposed videos or clickbait, from their respective platforms.
He told Pichai that “when you receive notices of illegal content in the EU, you must be timely, diligent and objective in taking action and removing the relevant content when warranted”.
“Given the urgency, I also expect you to be in contact with the relevant law enforcement authorities and Europol and ensure that you respond promptly to their requests,” said Breton.
On elections, he told Pichai that the DSA requires the risk of amplification of fake and manipulated images and facts, generated with the intention of influencing elections, to be taken extremely seriously when designing mitigation measures.
“I invite you to inform my team on the details of the measures you have taken to mitigate any deepfakes, also in the light of upcoming elections in Poland, The Netherlands, Lithuania, Belgium, Croatia, Romania and Austria, and the European Parliament elections,” said Breton.
“As you know, following the opening of a potential investigation and a finding of non-compliance, penalties can be imposed,” he added.
Earlier, Meta said that since the terrorist attacks by Hamas on Israel and Israel’s response in Gaza, “expert teams from across our company have been working around the clock to monitor our platforms, while protecting people’s ability to use our apps to shed light on important developments happening on the ground”.
“In the three days following October 7, we removed or marked as disturbing more than 795,000 pieces of content for violating these policies in Hebrew and Arabic,” the social network said in a blog post.
Compared with the two months prior, in the three days following October 7, “we have removed seven times as many pieces of content on a daily basis for violating our Dangerous Organisations and Individuals policy in Hebrew and Arabic alone,” Meta added.
(IANS)