Online firms face EU fine if extremist posts stay up over an hour
STRASBOURG – The European Union’s chief executive on Wednesday proposed hefty fines on Google, Facebook, Twitter and other online platforms if they fail to remove extremist content within one hour.
Brussels gave internet firms three months in March to show they were acting faster to take down radical posts. Industry lobby groups say the sector has made great strides since then and that imposing a fixed time limit would be arbitrary.
The European Commission wants content inciting or advocating extremist offences, promoting extremist groups, or showing how to commit such acts to be removed from the web within an hour of being flagged by authorities.
“One hour is the decisive time window in which the greatest damage takes place,” Jean-Claude Juncker said in his annual State of the Union address to the European Parliament.
In a proposal that will need backing from EU countries and the European Parliament, internet platforms will also be required to take proactive measures, such as developing new tools to weed out abuse and providing human oversight of content.
Service providers will have to provide annual transparency reports to show their efforts in tackling abuse.
Providers that systematically fail to remove extremist content could face fines of up to 4 percent of annual global turnover. Content providers will, however, have the right to challenge removal orders.
“We need strong and targeted tools to win this online battle,” Justice Commissioner Vera Jourova said of the new rules.
In turn, the draft rules will require the EU’s 28 national governments to put in place the capacity to identify extremist content online, as well as sanctions and an appeals procedure.
The industry has also been working since December 2015 in a voluntary partnership to stop the misuse of the internet by international extremist groups, later creating a “database of hashes” to better detect extremist content.
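The “database of hashes” works by sharing fingerprints of flagged files between platforms rather than the files themselves, so a file removed from one service can be recognised on upload to another. A minimal Python sketch of the idea, using SHA-256 as a stand-in (the industry system reportedly uses perceptual hashes, which also catch near-duplicates; all names here are illustrative):

```python
import hashlib

# Shared set of fingerprints of content already flagged as extremist.
# In the real system this database is shared across participating platforms.
known_hashes = set()

def register(content: bytes) -> str:
    """Record a flagged file's fingerprint in the shared database."""
    digest = hashlib.sha256(content).hexdigest()
    known_hashes.add(digest)
    return digest

def is_known(content: bytes) -> bool:
    """Check an upload against previously flagged fingerprints."""
    return hashlib.sha256(content).hexdigest() in known_hashes

register(b"flagged-file-bytes")
print(is_known(b"flagged-file-bytes"))  # True: exact re-upload is detected
print(is_known(b"slightly-altered"))    # False: exact hashes miss edited copies
```

The last line illustrates the known limitation that motivates perceptual hashing: an exact cryptographic hash changes completely if the file is re-encoded or trivially edited.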
Firms increasingly rely on a mix of machine learning, artificial intelligence and human moderators to spot and delete extremist content.
“We’ve made significant strides finding and removing their propaganda quickly and at scale, but we know we can do more,” Facebook said in a statement, adding “there was no place for terrorism” on the social media platform.
When content is taken down from one platform, it often crops up on another – straining authorities’ ability to police the web.
Smaller platforms, industry insiders warn, may not have the same resources to speedily comply with tougher EU rules.
The Commission will retain a voluntary code of conduct on hate speech agreed with Facebook, Microsoft, Twitter and YouTube in 2016. Other companies have since announced plans to join.
Separately, EU lawmakers voted on Wednesday to force Google and other technology firms to share more revenues with European media, publishers and other content creators in a shake-up of copyright rules.
(Reporting by Philip Blenkinsop, Daphne Psaledakis and Alissa de Carbonnel; Editing by Matthew Mpoke Bigg and David Evans)
Published at Wed, 12 Sep 2018 09:09:12 -0700