In the United Kingdom, another parliamentary committee has weighed in on the government’s divisive plan to regulate Internet content with an overarching emphasis on safety.
A detailed report released today by the DCMS Committee expresses “urgent concerns” that the draft legislation “neither adequately protects freedom of expression nor is clear and robust enough to address the various types of illegal and harmful content on user-to-user and search services.”
One of the committee’s many concerns is the fuzziness with which the bill defines different types of harm, including illegal content and other designated harms, with MPs criticizing the government’s failure to include more detail in the bill itself. That omission makes the impact harder to judge, because critical components (such as Codes of Practice) will be implemented through secondary legislation and are not yet on the table.
Given this general ambiguity, and the complications of the “duty of care” approach (which, as the committee points out, is split into several specific duties: on illegal content; on content that poses a risk to children; and, for a subset of high-risk user-to-user services, on content that poses a risk to adults), the committee warns that the proposed framework may fail to deliver the intended “comprehensive safety regime.”
According to the committee, the bill also poses dangers to freedom of speech. It has proposed that the government include a balancing test for the regulator, Ofcom, to examine whether platforms have “duly balanced their freedom of expression requirements with their decision-making.”
The committee also appears to be taking note of one of the most frequent criticisms leveled at the bill: that platforms, faced with sudden, ill-defined liability over broad swathes of content, may respond by over-removing speech, with a chilling effect on freedom of expression in the United Kingdom.
To bring the bill into compliance with international human rights law, it recommends that the government reframe definitions of harmful content and relevant safety obligations to try to safeguard against the risk of over-removal by providing “minimum standards against which a provider’s actions, systems, and processes to tackle harm, including automated or algorithmic content moderation, should be judged.”
Even on the topic of child safety, which UK ministers have frequently emphasized as a critical component of the law, the committee finds “weaknesses” in the legislation that it says result in a proposed regime that “does not map sufficiently onto the reality of the situation.”
In this regard, it has urged the government to go further, recommending that the bill be expanded to cover “technically legal” practices such as breadcrumbing (described in the report as “where perpetrators deliberately subvert the thresholds of criminal activity and for content removal by a service provider”), and citing witness testimony that while the practice is not, in fact, illegal, it “nonetheless forms part of the sequence for online CSEA [child sexual exploitation and abuse].”
The committee takes a similar view on protections for women and girls, arguing that the law should go further to shield them from certain sorts of online harassment and abuse aimed at them (such as “tech-enabled ‘nudifying’ of women and deepfake pornography,” among other things).
The committee also believes Ofcom’s powers to investigate platforms should be strengthened. It is calling for changes to give the regulator the authority to “conduct confidential auditing or vetting of a service’s systems to assess the operation and outputs in practice” and to “request generic information about how content is disseminated using a service,” with MPs further arguing that the bill should specify in more detail the types of data Ofcom can request (presumably to head off platforms seeking to evade effective oversight).
On enforcement, the committee is concerned about a lack of clarity over how Ofcom’s (expected to be) sweeping powers might be applied to platforms that fall outside the bill’s remit. It has proposed several changes, including clarifying that these powers apply only to services within the legislation’s scope.
Members of Parliament are also calling for a redrafting of the provisions on so-called “technology notices,” which would allow the regulator to mandate the use of new technology (following “persistent and prevalent” failures of the duty of care). They say the scope and application of this power should be “more tightly” defined, with more practical detail on the steps required to bring providers into compliance and on how Ofcom will test whether issuing such a notice is appropriate.
Here, the committee also raises concerns about powers that could cause business disruption, and recommends that the government take time to consider whether these powers are “appropriately future-proofed” in light of technologies such as virtual private networks (VPNs) and DNS over HTTPS.
In addition, the report calls for more clarity in the bill’s language around judicial review and redress.
A separate recommendation from the committee is that the government refrain from establishing a dedicated joint committee to oversee online safety and digital regulation, arguing that parliamentary oversight is “best served by the existing, independent, cross-party select committees, as evidenced by the work we have done and will continue to do in this area.”
It remains to be seen whether the committee’s suggestions will be taken up by the government, though Nadine Dorries, the secretary of state for digital, has already said she is willing to consider parliamentary feedback on the sweeping package of measures.
The DCMS Committee’s report follows earlier recommendations from a joint parliamentary committee tasked with scrutinizing the bill, which cautioned in December that the draft law risked falling short of the government’s safety goals.
The draft Online Safety Bill was published in May 2021, and it outlines a long-delayed plan to impose a duty of care on Internet platforms to protect users from a wide range of harms, whether related to (already illegal) content such as terrorist propaganda, child sexual abuse material, and hate speech, or more broadly problematic but not necessarily illegal content such as bullying or content promoting eating disorders or suicide (which may create disproportionate risks for younger users of social media platforms).
During her testimony before the joint committee in November, Dorries predicted that the law would usher in a systemic shift in Internet culture, warning MPs and peers that it would bring “massive, tremendous” changes to how Internet platforms operate.
Currently making its way through parliament, the bill targets a broad range of Internet platforms and envisions enforcing safety-focused governance standards through regulated Codes of Conduct, which Ofcom would oversee in an expanded role, including the ability to impose substantial penalties for violations of the standards.
The broad scope of the regulation — the intention for the law to target not only illegal content spreading online but also content that falls into more of a grey area where restrictions risk impinging on freedom of expression and speech — has resulted in widespread opposition from civil liberties and digital rights organizations, as well as from businesses concerned about liability and compliance burdens, among others.
The government has also stepped up its attacks on platforms that use end-to-end encryption, deploying rhetoric that implies robust security is a barrier to catching child predators (see, for example, its recently unveiled NoPlaceToHide public relations campaign, which aims to turn the public against E2E encryption). Critics argue, among other things, that ministers are attempting to sabotage Internet security and privacy by recasting best practices as obstacles to the goal of enforcing “child protection” via extensive digital monitoring.
As part of this effort, the Home Office has been sprinkling some taxpayer money in recent months to try to encourage the development of technologies that could be applied to E2EE systems to scan for child sexual abuse material — technologies that, according to the Home Office, could provide a “middle ground” between robust security and law enforcement’s data access requirements.
Opponents of the bill already argue that wielding the fabricated claim of “child protection” as a populist lever to strip the most crucial security and privacy protections from all Internet users, while simultaneously encouraging a cottage industry of commercial providers to spring up and advertise “child protection” surveillance services for sale, comes closer to gaslighting than to safeguarding children.
Returning to the big picture, there is a great deal of anxiety about the possibility of the United Kingdom overregulating its digital economy.
Additionally, there is a risk that the bill will become a parliamentary “hobby horse” for every type of online grievance, as one former minister of state put it — with the potential for complex and poorly defined content regulation to end up as a disproportionate burden on UK startups compared to tech giants like Facebook, whose self-serving algorithms and content moderation fueled calls for Internet regulation in the first place, in addition to being extremely harmful to UK Internet users’ human rights.