Get ready for a parental view of what is going on on the internet. Just maybe this is a good idea.
The UK government is taking a hard line when it comes to online safety, moving to establish what it says is the world’s first independent regulator to keep social media companies in check.
Companies that fail to meet the requirements will face huge fines, and senior directors proven to have been negligent will be held personally liable. Offending companies may also find access to their sites blocked.
The new measures, designed to make the internet a safer place, were announced jointly by the Home Office and the Department for Digital, Culture, Media and Sport. The introduction of the regulator is the central recommendation of a highly anticipated government white paper, titled Online Harms, published Monday in the UK.
The regulator will be tasked with ensuring social media companies tackle a range of online problems, including:
- Incitement of violence and the spread of violent (including terrorist) content
- Encouragement of self-harm or suicide
- The spread of disinformation and fake news
- Cyberbullying
- Children’s access to inappropriate material
- Child exploitation and abuse content
As well as applying to the major social networks, such as Facebook, YouTube and Twitter, the requirements will also apply to file-hosting sites, online forums, messaging services and search engines.
“For too long these companies have not done enough to protect users, especially children and young people, from harmful content,” UK Prime Minister Theresa May said in a statement. “We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.”
Google and Facebook didn’t immediately respond to a request for comment.
