Facebook, Twitter, TikTok to face fines for failing to remove harmful content

The UK government has set out new powers under which tech companies could face fines of up to GBP 18 million (approximately Rs 180 crore) or 10 per cent of their annual global turnover, whichever is higher.

Under the new laws, announced by UK Digital Secretary Oliver Dowden and Home Secretary Priti Patel, social media sites, websites, apps and other services that host user-generated content or allow people to talk to others online will be required to remove and limit the spread of illegal content such as child sexual abuse material and terrorist content.

The UK's communications regulator, the Office of Communications (Ofcom), has now been confirmed as the regulator, with the power to block non-compliant services from being accessed in the country.
“We are giving internet users the protection they deserve and are working with companies to tackle some of the abuses happening on the web,” Patel said.

“We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms. Tech companies must put public safety first or face the consequences,” she said.

The new law also includes provisions to impose criminal sanctions on senior managers. The government has said it will not hesitate to use these powers if companies fail to take the new rules seriously — for example, if they do not respond fully, accurately and in a timely manner to information requests from Ofcom.

“I'm unashamedly pro tech, but that can't mean a tech free-for-all. Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation,” Dowden said.

“We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech. This proportionate new framework will not place unnecessary burdens on small businesses, but will give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives,” he said.

Under the regulations, tech platforms will be expected to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions that explicitly state how they will handle content that is legal but could cause significant physical or psychological harm to adults.

This includes dangerous and misleading information about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom chief executive Dame Melanie Dawes said: “Being online brings huge benefits, but four in five people have concerns about it. That clearly shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about being online, including free expression.

“We're gearing up for the task by acquiring new technology and data skills, and will finalise our plans as we work with Parliament.”

The government plans to bring the laws forward in an Online Safety Bill next year. These powers will be introduced through secondary legislation, following the response to the Online Harms White Paper consultation. The government says it is working with the Law Commission on whether the promotion of self-harm should be made illegal.

Companies will have different responsibilities for different types of content and activity, under a tiered approach based on the risk of harm posed by sites, apps and platforms. A small group of companies with the largest online presence and high-risk features — likely to include Facebook, TikTok, Instagram and Twitter — will be in Category 1.

These companies must assess the risk of legal content or activity on their services with “a reasonably foreseeable risk of causing significant physical or psychological harm to adults”. They must make clear in their terms and conditions what type of “legal but harmful” content is acceptable on their platforms, and enforce this transparently and consistently.

All companies will need mechanisms so that people can easily report harmful content or activity, and appeal the removal of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.

Financial harms, including fraud and the sale of unsafe goods, are excluded from this framework.

