Anonymous ID: 8613a8 Aug. 10, 2024, 10:48 p.m. No.21389611   >>9614 >>9617 >>9625 >>9627 >>9639 >>9649 >>9724 >>9737 >>9798 >>9816

UK - The Online Safety Act 2023 (the Act) is a new set of laws regulating social media companies and search services, whether based within or outside the UK

 

The Online Safety Act 2023 (the Act) is a new set of laws that protects children and adults online. It puts a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms.

 

The Act will give providers new duties to implement systems and processes to reduce the risk that their services are used for illegal activity, and to take down illegal content when it does appear.

 

The strongest protections in the Act have been designed for children and will make the UK the safest place in the world to be a child online. Platforms will be required to prevent children from accessing harmful and age-inappropriate content and provide parents and children with clear and accessible ways to report problems online when they do arise.

 

The Act will also protect adult users, ensuring that major platforms will need to be more transparent about which kinds of potentially harmful content they allow, and give people more control over the types of content they want to see.

 

Ofcom is now the independent regulator of Online Safety. It will set out steps providers can take to fulfil their safety duties in codes of practice. It will have a broad range of powers to assess and enforce providers’ compliance with the framework.

 

Providers’ safety duties are proportionate to factors including the risk of harm to individuals, and the size and capacity of each provider. This makes sure that while safety measures will need to be put in place across the board, we aren’t requiring small services with limited functionality to take the same actions as the largest corporations. Ofcom is required to take users’ rights into account when setting out steps to take. And providers have simultaneous duties to pay particular regard to users’ rights when fulfilling their safety duties.

 

The Act also introduced some new criminal offences – details are set out below.(1)

 

Who the Act applies to

The Act’s duties apply to search services and services that allow users to post content online or to interact with each other. This includes a range of websites, apps and other services, including social media services, consumer file cloud storage and sharing sites, video sharing platforms, online forums, dating services, and online instant messaging services.

 

The Act applies to services even if the companies providing them are based outside the UK, provided those services have links to the UK. A service has such links if it has a significant number of UK users, if the UK is a target market, or if it is capable of being accessed by UK users and there is a material risk of significant harm to such users.
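The "links to the UK" test above is effectively a three-limb boolean condition, any one of which brings a service in scope. A minimal sketch, assuming illustrative parameter names (the function and its arguments are not from the Act itself, just a reading of the explainer's wording):

```python
def has_uk_links(significant_uk_users: bool,
                 uk_target_market: bool,
                 accessible_from_uk: bool,
                 material_risk_of_significant_harm: bool) -> bool:
    """Illustrative reading of the Act's scope test: a service is in
    scope if any one limb is satisfied. The third limb requires both
    UK accessibility AND a material risk of significant harm."""
    return (significant_uk_users
            or uk_target_market
            or (accessible_from_uk and material_risk_of_significant_harm))


# A service merely reachable from the UK, with no material risk of
# harm to UK users, would not be in scope on this reading.
print(has_uk_links(False, False, True, False))   # False
print(has_uk_links(False, False, True, True))    # True
```

Note that under this reading the company's place of incorporation never appears in the test; only the service's relationship to UK users matters.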

 

(1)The criminal offences introduced by the Act came into effect on 31 January 2024. These offences cover:

 

encouraging or assisting serious self-harm

cyberflashing

sending false information intended to cause non-trivial harm

threatening communications

intimate image abuse

epilepsy trolling

 

These new offences apply directly to the individuals sending the offending communications, and convictions have already been secured under the cyberflashing and threatening communications offences.

 

https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer

Anonymous ID: 8613a8 Aug. 10, 2024, 11:26 p.m. No.21389701   >>9727 >>9747 >>9754

>>21389639

>More than 741 people have now been arrested in connection with the unrest, of which 302 have been charged, the National Police Chiefs' Council said on Friday.

 

Listening to the judges, they assume intention rather than prove it, which is not the same as in the U.S.

 

"The overall tone of the posts clearly reveals your fundamentally racist mindset," she said. She added: "I am sure that when you intentionally created the posts you intended that racial hatred would be stirred up by you."

https://news.sky.com/story/jordan-parlour-facebook-user-jailed-for-riot-related-social-media-posts-13193894

 

If charged, a person cannot argue…

Anonymous ID: 8613a8 Aug. 10, 2024, 11:38 p.m. No.21389724   >>9737 >>9772 >>9778 >>9798 >>9816

>>21389611

 

How the UK Online Safety Act will be enforced

 

Ofcom is now the regulator of online safety and must make sure that platforms are protecting their users. Once the new duties are in effect, following Ofcom’s publication of final codes and guidance, platforms will have to show they have processes in place to meet the requirements set out by the Act. Ofcom will monitor how effective those processes are at protecting internet users from harm. Ofcom will have powers to take action against companies which do not follow their new duties.

Companies can be fined up to £18 million or 10 percent of their qualifying worldwide revenue, whichever is greater. Criminal action can be taken against senior managers who fail to ensure their companies comply with information requests from Ofcom. Ofcom will also be able to hold companies and senior managers (where they are at fault) criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.
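The fine cap described above is a simple maximum of two quantities: a fixed £18 million floor and 10 percent of qualifying worldwide revenue. A minimal sketch of the arithmetic (the function name and return convention are illustrative, not from the Act):

```python
def max_fine_gbp(qualifying_worldwide_revenue: float) -> float:
    """Illustrative: the cap on an Ofcom fine under the Act is the
    greater of £18 million or 10% of qualifying worldwide revenue."""
    return max(18_000_000.0, 0.10 * qualifying_worldwide_revenue)


# For a provider with £50m revenue, the £18m floor dominates;
# for £1bn revenue, the 10% limb does.
print(max_fine_gbp(50_000_000))      # 18000000.0
print(max_fine_gbp(1_000_000_000))   # 100000000.0
```

The crossover point is £180 million in qualifying worldwide revenue; below that the fixed £18 million cap applies, above it the percentage limb does.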

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.

 

How the Act affects companies that are not based in the UK

The Act gives Ofcom the powers they need to take appropriate action against all companies in scope, no matter where they are based, where services have relevant links with the UK. This means services with a significant number of UK users or where UK users are a target market, as well as other services which have in-scope content that presents a risk of significant harm to people in the UK.

 

How the Act tackles harmful algorithms

The Act requires providers to specifically consider how algorithms could impact users’ exposure to illegal content – and children’s exposure to content that is harmful to children – as part of their risk assessments.

Providers will then need to take steps to mitigate and effectively manage any identified risks. This includes considering their platform’s design, functionalities, algorithms, and any other features likely to meet the illegal content and child safety duties.

The law also makes it clear that harm can arise from the way content is disseminated, such as when an algorithm repeatedly pushes content to a child in large volumes over a short space of time.

Some platforms will be required to publish annual transparency reports containing online safety related information, such as information about the algorithms they use and their effect on users’ experience, including children.

 

https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer#how-the-act-affects-companies-that-are-not-based-in-the-uk