New German Law Can Lead To Millions In Fines For Facebook & Twitter For Not Deleting Hate Speech

Germany has passed a new law under which social networking sites like Facebook and Twitter can face substantial fines if they fail to remove hate speech and other illegal material promptly.

From Jan. 1, the Network Enforcement Act (commonly known as the NetzDG) will require social networking sites with over 2 million users to remove illegal content within 24 hours. The period can be extended to seven days in some special cases. Failing to do so can result in the networks facing fines as large as $60 million.

The German law bans hate speech, material inciting violence, and the spreading of propaganda.

The law, which was passed in July last year, does not, however, apply to messaging applications.

Social Media Platforms Criticised For Not Controlling Harassment And Fake News

Facebook and Twitter in particular have come under fire for failing to stop harassment and the spread of fake news on their platforms. According to a recent investigation by ProPublica, Facebook's enforcement of its own code of conduct was uneven at best.

A spokesperson for Facebook noted that the social media company shared "the goal of the German government to fight hate-speech" and stated that it was making "great progress in the removal of illegal content."

Facebook Launches New Process

In a statement, Facebook said it had established a reporting process separate from its existing Community Standards reporting in order to comply with the NetzDG.

It noted that by introducing this process, the company aims to "be transparent and make it possible for all people — regardless of whether they are registered on our platform or not — to report content" as per the NetzDG.

Twitter recently released new anti-abuse guidelines and has begun removing blue verification check marks from the accounts of white supremacist users.

In November last year, YouTube (owned by Google) began removing videos uploaded by extremists, even when those videos did not depict violence or hate speech.
