Remove Material Deemed Illegal or Face Sanctions Including Jail

Sun, 21st April 2019, 13:50

Laws framed as emotional responses to horrific events may be understandable but rarely survive the court challenges that are sure to follow. A new Australian law threatens social media and web hosting companies with fines of up to 10 percent of their annual global turnover, and their executives with imprisonment of up to three years, if violent content is not removed expeditiously. The European Parliament voted on Wednesday to fine firms up to 4 percent of their turnover if they persistently fail to remove extremist content within one hour of being asked to do so by authorities.

The new laws are in response to the terrorist attack on two mosques in Christchurch, New Zealand, which killed 50 people as they attended Friday prayers. The gunman broadcast his attack live on Facebook and it was widely shared for over an hour before being removed, a timeframe Australian Prime Minister Scott Morrison described as unacceptable.

Under the new Australian law it is now an offence for tech and social media companies to fail to remove, without delay, any videos or photographs that show murder, torture or rape. Companies must also inform Australian police within a "reasonable" timeframe. Australian Attorney-General Christian Porter described the laws as a "world first in terms of legislating the conduct of social media and online platforms".

While the text of the European law will not be finalized until after the election of the new European Parliament in May, EU officials moved to regulate because they believe internet companies are not doing enough under voluntary measures. Draft measures call on the bloc’s national governments to put in place the tools to identify extremist content and an appeals procedure. The one-hour rule would apply from the point of notification by national authorities.

In response to industry concerns that smaller platforms do not have the same resources to comply as speedily with tougher EU rules, lawmakers said authorities should take into account the size and revenue of companies concerned.

Yet the emotional knee-jerk response by politicians is already causing headaches for website owners.

Internet Archive denies hosting 'terrorist' content

The Internet Archive is reporting that it has received 550 "false" demands to remove "terrorist propaganda" from its servers in less than a week. The Internet Archive, a non-profit organisation that archives historical snapshots of the web, said the demands wrongly accused it of hosting terror-related material.

If the Archive does not comply with the notices, it risks being added to lists of sites that ISPs are required to block. The demands came via the Europol net monitoring unit and gave the site only one hour to comply.

Chris Butler of the Internet Archive said it had received notices identifying hundreds of web addresses stored on its servers as leading people to banned material. However, the reports were wrong about the content they pointed to, or were too broad for the organisation to comply with. Some of the requests referred to material that had "high scholarly and research value" and was not produced by terror groups. Even US Government-produced broadcasts and reports were flagged.

Initially the sender of the demands to remove the URLs was identified as Europol’s EU Internet Referral Unit (EU IRU). The sender was in fact the French national Internet Referral Unit, using Europol’s application, which sends the email from a Europol address.

Still, Butler states:

We are left to ask – how can the proposed legislation realistically be said to honor freedom of speech if these are the types of reports that are currently coming from EU law enforcement and designated governmental reporting entities? It is not possible for us to process these reports using human review within a very limited timeframe like one hour. Are we to simply take what’s reported as “terrorism” at face value and risk the automatic removal of things like THE primary collection page for all books on…

Other technology firms said they are already working on the issue. Digital Industry Group Inc (DIGI) - of which Facebook, Apple, Google, Amazon and Twitter are members - says the laws fail to understand the complexity of removing violent content. With the vast volumes of content uploaded to the internet every second, it says, this is a highly complex problem.

Companies rely on a mix of automated tools and human moderators to spot and delete extremist content. However, when illegal content is taken down from one platform, it often crops up on another, straining authorities’ ability to police the web.
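To illustrate why re-uploads strain those automated tools, the sketch below shows the simplest form of automated matching: comparing an upload's cryptographic hash against a shared blocklist of known violent content, in the spirit of the hash-sharing databases some platforms use. The `known_bad_hashes` set and `flag_upload` function are hypothetical illustrations, not any platform's actual API.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of previously flagged content,
# as might be shared between platforms via an industry hash database.
# (The sample entry below is the well-known SHA-256 of empty input.)
known_bad_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def flag_upload(data: bytes) -> bool:
    """Return True if the upload matches a known-bad hash and should be
    withheld pending human review."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in known_bad_hashes

# An exact byte-for-byte re-upload is caught immediately:
print(flag_upload(b""))      # matches the sample blocklist entry
# But changing even one byte defeats an exact-hash check:
print(flag_upload(b"\x00"))  # not in the blocklist
```

This is why, as the paragraph above notes, exact matching alone is not enough: a trivially re-encoded copy produces a different hash, so platforms layer perceptual hashing and human moderators on top, and content still slips between platforms.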