Free Speech Dead in the UK and EU
Segment #591
The Online Safety Act (UK) and the Digital Services Act (EU) are a major step toward the complete abolition of freedom of speech. It is insane, and as the stories of arrests continue, it will keep many travelers from around the world from visiting these countries. It is ironic that most of the vitriol is directed at conservatives, who are labeled as fascists. It sounds to me like Europe and the UK are embracing the very ideology they purport to hate.
Fascism is characterized by a dictatorial leader, centralized government control, militarism, and the forcible suppression of opposition.
References
The Digital Services Act - European Union
The Digital Services Act (DSA) is a landmark piece of European Union legislation that aims to create a safer and more transparent online environment. It's designed to modernize existing rules and address the growing challenges posed by illegal content, disinformation, and a lack of accountability from online platforms.
Key Goals and Principles
The DSA's main objectives are:
Protecting Users: It establishes new standards for user safety by safeguarding fundamental rights online and giving users more control over their digital experience.
Tackling Illegal Content: It sets out clear rules for how digital services must handle and remove illegal content, including illegal goods, services, and hate speech.
Increasing Transparency and Accountability: The DSA creates a strong framework for platform transparency, particularly regarding content moderation, advertising, and algorithmic systems.
Creating a Level Playing Field: It applies a consistent set of rules to all digital services operating in the EU, regardless of where they are based, to ensure fair competition.
Who Does the DSA Apply to?
The DSA's obligations are tiered, meaning they apply differently depending on the type and size of the service provider. The law applies to all "intermediary services" that are offered to recipients in the EU, even if the service provider is not based there.
The four main tiers of service providers are:
Intermediary Services: This is the broadest category and includes services like internet service providers (ISPs), domain name registrars, and cloud services.
Hosting Services: This includes services that store information provided by a user, such as web hosting providers and cloud service providers.
Online Platforms: A subset of hosting services, these are services that store and publicly disseminate information, such as social media platforms, online marketplaces, and app stores.
Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): These are platforms and search engines with at least 45 million average monthly active users in the EU. Due to their "systemic impact," they face the most stringent obligations under the DSA.
Major Provisions and Requirements
The DSA imposes a wide range of obligations on these service providers, including:
Content Moderation: Services must have mechanisms for users to easily flag illegal content. They must also provide users with clear reasons for content removal and an internal complaint-handling system to challenge decisions.
Transparency: All regulated services must publish annual transparency reports on their content moderation activities. They also need to be more transparent about how their recommender systems (e.g., news feeds) and online advertising work.
Protection of Minors: The DSA includes specific provisions to protect children. For example, it bans targeted advertising based on profiling of minors.
Targeted Advertising: It places limits on targeted advertising by prohibiting the use of "sensitive" personal data (e.g., political views, sexual orientation) for profiling.
Banning "Dark Patterns": The law explicitly prohibits the use of "dark patterns," which are deceptive or manipulative online interfaces that trick users into making choices they don't intend to.
Risk Assessments: VLOPs and VLOSEs are required to conduct regular, independent risk assessments to identify and mitigate risks to fundamental rights, public health, and civic discourse.
Traceability for Online Marketplaces: Marketplaces must implement "Know Your Business Customer" (KYBC) requirements to verify the identity of traders selling on their platforms, helping to prevent the sale of illegal or counterfeit goods.
Enforcement and Penalties
The DSA is enforced by national authorities in each EU Member State, with the European Commission having direct oversight and enforcement powers over VLOPs and VLOSEs. Failure to comply can result in significant fines, with the most serious breaches potentially leading to fines of up to 6% of a company's annual global turnover.
###
The Online Safety Act - United Kingdom
The Online Safety Act (OSA) is a significant piece of legislation in the United Kingdom designed to regulate online content and protect users from a range of online harms. It's often compared to the EU's Digital Services Act but has some distinct features, particularly its focus on child safety.
Key Goals and Principles
The main objectives of the Online Safety Act are:
Creating a "Duty of Care": It places a legal responsibility on online platforms to protect their users, especially children, from illegal and harmful content.
Tackling Illegal Content: The act requires all in-scope services to take robust action against a wide range of illegal content, including child sexual abuse material, terrorism, and hate speech.
Protecting Children: This is a core focus of the act. It introduces specific duties for platforms to prevent children from encountering content that is harmful to them, even if it's not strictly illegal for adults. This includes content related to suicide, self-harm, and eating disorders.
Empowering Adult Users: The act gives adults more control over the content they see. Larger platforms are required to provide tools that allow users to filter out content they find abusive or unwanted.
Increasing Accountability: It gives the UK's communications regulator, Ofcom, significant new powers to enforce the rules and hold companies accountable.
Who Does the Act Apply To?
The OSA's reach is broad, affecting any online service that is available in the UK and allows users to post or interact with content. It categorizes services into different tiers with varying obligations:
User-to-User (U2U) Services: This is the most expansive category, including social media platforms, forums, messaging apps, and gaming platforms.
Search Services: This covers search engines and services that incorporate a search engine.
Pornography Services: These sites, which publish or display pornographic content, have specific and strict obligations.
The duties are tiered based on a service's size and functionality, with the largest platforms facing the most stringent requirements.
Major Provisions and Requirements
The Online Safety Act introduces a number of key obligations for platforms:
Risk Assessments: Services must conduct and maintain up-to-date risk assessments to identify potential harms to their users, particularly children.
Illegal Content: All services must have systems in place to prevent and remove illegal content. This includes a clear and easy-to-use reporting mechanism for users.
Child Protection: Services likely to be accessed by children must take measures to protect them from "priority" harmful content, such as that promoting self-harm or eating disorders. This often requires platforms to implement "highly effective" age assurance technologies.
Age Verification: For services with content that is legally restricted by age, such as pornography, the act mandates the use of highly effective age verification or estimation.
Content Moderation and Transparency: Platforms must be transparent about their content moderation policies and provide users with a clear complaints procedure if their content is removed.
Criminal Offenses: The act also introduces new criminal offenses for individuals, such as "cyberflashing," sending threatening communications, and encouraging or assisting serious self-harm online.
Duty to Protect Freedom of Expression: The act includes a duty for platforms to protect journalistic content and content of "democratic importance."
Enforcement and Penalties
Ofcom is the designated regulator responsible for overseeing the act's implementation and enforcement. It has the power to:
Require platforms to provide information and undergo audits.
Impose significant fines for non-compliance, up to £18 million or 10% of a company's annual global turnover, whichever is greater.
Apply to the courts for service restriction orders to block access to non-compliant websites.
The implementation of the act is a phased process, with Ofcom publishing detailed codes of practice and guidance for different parts of the law.