Terror content ordered to be taken off within an hour: EU to web firms

New recommendations also include steps to crack down on other harmful illegal content such as hate speech and images of child sexual abuse.


Brussels: Online platforms should take down "terrorist content" within an hour of it being reported, the EU said on March 1 in new recommendations to internet companies to stem the flow of harmful content on the web.

Brussels is looking for ways to combat online extremism amid growing alarm about the use of sites like YouTube, Facebook and Twitter as forums to radicalise and recruit, especially by the Islamic State group.

The European Commission, the bloc's executive arm, has already signed up a group of US internet giants to a plan to combat web extremism but warned it would consider legislation if the voluntary approach did not work.

"While several platforms have been removing more illegal content than ever before -- showing that self-regulation can work -- we still need to react faster against terrorist propaganda and other illegal content," the commission's vice-president for the Digital Single Market Andrus Ansip said.

This content remains "a serious threat to our citizens' security, safety and fundamental rights," added Ansip, a former Estonian prime minister.

Voluntary industry efforts have achieved results, the commission said, but there is still "significant scope for more effective action, particularly on the most urgent issue of terrorist content, which presents serious security risks".

The commission said "terrorist content" should be taken down within one hour of being reported by the authorities, such as police, and internet companies should do more to monitor and remove material themselves.

The new recommendations also include steps to crack down on other harmful illegal content such as hate speech and images of child sexual abuse.

Last month the commission said IT firms removed 70 percent of illegal content notified to them in the preceding few months.

This was compared to 59 percent before May 2017, and 28 percent in the months after the code of conduct was launched in 2016.
