

Posted By Global Banking and Finance Review

Posted on December 16, 2024


LONDON (Reuters) - Britain's online safety regime came into force on Monday, requiring social media companies like Meta's Facebook and ByteDance's TikTok to take action to tackle criminal activity on their platforms and make them safer by design.

Media regulator Ofcom said it had published its first codes of practice on tackling illegal harms such as child sexual abuse and assisting or encouraging suicide.

Sites and apps have until March 16, 2025, to assess the risks illegal content poses to children and adults on their platforms, Ofcom said.

After the deadline, they will have to start implementing measures to mitigate those risks, such as better moderation, easier reporting and built-in safety tests, Ofcom said.

Ofcom Chief Executive Melanie Dawes said the safety spotlight was now firmly on tech companies.

"We'll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year," she said.

The Online Safety Act, which became law last year, sets tougher standards for platforms such as Facebook, YouTube and TikTok, with an emphasis on child protection and the removal of illegal content.

Under the new code, reporting and complaint functions will have to be easier to find and use. High-risk providers will be required to use automated hash-matching and URL detection tools to identify child sexual abuse material, Ofcom said.
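For illustration only, the minimal Python sketch below shows the general shape of the two techniques Ofcom names: comparing an upload's digest against a set of known-bad hashes, and scanning text for blocked URLs. The data sets and function names are hypothetical; real deployments rely on curated industry hash databases and typically use perceptual rather than plain cryptographic hashing.

# Illustrative sketch only: the hash set, URL set and function names are
# hypothetical. Production systems use curated industry hash databases and
# perceptual hashing rather than a plain SHA-256 lookup.
import hashlib

# In practice these sets would be supplied by specialist organisations,
# not maintained in application code.
KNOWN_BAD_HASHES: set[str] = set()   # hex digests of known illegal material
BLOCKED_URLS: set[str] = set()       # URLs known to host illegal material


def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 digest is in the known-bad set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES


def contains_blocked_url(message_text: str) -> bool:
    """Return True if the message references any URL on the blocklist."""
    return any(url in message_text for url in BLOCKED_URLS)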

The regulator will be able to issue fines of up to 18 million pounds ($22.3 million) or 10% of a company's annual global turnover against companies that fail to comply.

Britain's Technology Secretary Peter Kyle said the new codes were a "material step change in online safety".

"If platforms fail to step up the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites," he said.

(Reporting by Paul Sandle; Editing by Emelia Sithole-Matarise)
