
The proposals will be fleshed out in the coming months, and a person familiar with the matter told Bloomberg that the regulator is likely to be given the power to fine companies such as Facebook and Twitter if they fail to protect United Kingdom users from harmful content.

DCMS (Department for Digital, Culture, Media & Sport) Secretary of State Nicky Morgan said: "With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK".

The proposals were published as a first response to a consultation on the Government's online harms white paper, which was released last year and called for a statutory duty of care for internet companies to protect their users.

It will then be for Ofcom to decide when and how companies have breached that duty and what the punishment should be.

"The DCMS Committee in the last parliament led calls for urgent legislation to prevent tech companies walking away from their responsibilities to tackle harmful content on their sites", he said.

But Morgan did not say Wednesday whether the fines or more serious proposed penalties such as taking platforms offline were still being considered by May's successor, Boris Johnson. However, the likes of Facebook and Twitter still largely advocate self-regulation, with the belief that they can more quickly adapt their systems to suit users' needs.

The Government will announce the new regulatory proposals later on Wednesday.

"New rules are needed so that we have a more common approach across platforms and private companies aren't making so many important decisions alone", said Rebecca Stimson, Facebook's head of United Kingdom public policy.

Those processes will nonetheless be overseen by the regulator, which will require companies to have "effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary" (recognising concerns raised by consultation respondents about the potential risk of regulation to freedom of expression), but will not investigate or adjudicate on individual complaints relating to individual pieces of content. It could also result in "regulatory drift", with Ofcom empowered to intervene in all manner of online discourse in the name of enforcing online companies' duty of care.

Social media platforms have become hotspots for harmful content. Even so, fewer than five per cent of United Kingdom businesses will be in scope of the new rules.

Vodafone said it broadly welcomed the government move as a "step in the right direction", because the company has a "long-standing commitment to keeping people safe online, especially children and other vulnerable groups".

The government said it will now start drafting corresponding legislation and release more details in the coming months. Platforms' content policies are incredibly broad and restrict free speech far beyond the limitations set in law.

"We called for the new regulator to be completely independent from Government which is why we demanded a right of veto over the appointment".
