Pornography legislation has long been an emotive and contentious topic, with no consensus as to the correct approach to regulation. Measures must strike a delicate balance, upholding rights such as free speech, privacy and adult autonomy against the responsibility to protect vulnerable individuals from associated harms.
What happened to regulation under the Digital Economy Act 2017?
In 2017, Part 3 of the Digital Economy Act (“DEA”) introduced measures requiring any commercial provider of internet pornography in the UK to prevent access to that material by a person under 18. It empowered the Secretary of State to identify a regulator to implement these age verification measures, impose financial penalties for breach, and issue notices requiring the Internet Service Provider (“ISP”) to block any contravening material. The British Board of Film Classification (“BBFC”) was appointed to regulate implementation of the DEA, a significant expansion of its original mandate of certifying films by age category. The appointment prompted concerns that the BBFC would be ill-suited to such broad new discretion: it had historically operated through a mutually cooperative relationship with filmmakers, whereas pornographers and ISPs were unused to creating and distributing content with regulatory guidance in mind.
The proposals in the DEA invited critical discussion of the effectiveness of legislative attempts to regulate pornography in the evolving internet age, both in human rights and practical terms. Some of the major criticisms of Part 3 of the DEA were its hasty adoption and anomalous nature as part of a “catch-all” statute, the appointment of an unspecialised ad hoc regulator in the BBFC, its failure to address the prevalence of overseas Internet Service Providers, the dangers of private browsing and the potentially imbalanced commercial impact on small-scale providers.
Whether as a result of this debate, or because of an overambitious legislative agenda, Part 3 of the DEA never came into force, and was repealed six years later by the Online Safety Act 2023.
A second bite of the cherry – the Online Safety Act 2023
On 26 October 2023, the age verification debate was reignited as the Online Safety Bill received royal assent and became the Online Safety Act 2023 (“OSA”). With a raft of new protective measures, the OSA focused on protecting minors from harmful material, and created new offences to target image-based sexual abuse online, such as “deepfake” pornography. It also made provision for regulation by Ofcom, arming the regulator with enforcement powers not dissimilar to those proposed under the DEA.
The OSA is undoubtedly more targeted and thorough in its measures than the repealed Part 3 of the DEA, dividing online services into categories with distinct routes to implementation through “technically accurate, robust, reliable children’s access assessments”, purporting to ensure an outcome of “highly effective age assurance”. Part 5 of the OSA sets out the duties of all ISPs which are not exempt. Those duties include ensuring, by use of age verification or “age estimation”, that children are not normally able to encounter regulated pornographic content.
Ofcom was tasked with producing guidance and Codes of Practice to assist ISPs in complying with the duties and on 16 January 2025 announced that by 16 April 2025 all user-to-user and search services must carry out a children’s access assessment to ensure effective age assurance is in place[1]. Also published on 16 January 2025 was Ofcom’s planned enforcement programme. Ofcom has set out that it will be writing to all ISPs that display or publish pornographic content, as defined by Part 5 OSA, to inform them of their obligations and request confirmation of the age assurance provisions being implemented to achieve compliance[2]. Ofcom’s guidance has been met with mixed reviews.
Practical challenges
From a practical perspective, some of the concerns raised in the infancy of Part 3 DEA appear to remain under the new legal framework, while others have been addressed. For example, Ofcom appears better placed than the BBFC to extend its regulation of communications services to include online safety, given its experience of the evolution of digital communications and online media. There has also been a more comprehensive attempt to categorise the duties which apply to each type of service covered by the OSA.
However, as with the DEA, pornography is not the primary focus of the OSA, and ensuring that it operates effectively against the backdrop of the continuing and rapid evolution of technology – and of pornography itself – presents a serious challenge. The OSA has not simplified an already crowded legal and regulatory environment. This was highlighted by Professor Clare McGlynn, who has expertise in the legal regulation of pornography, sexual violence and online abuse[3] and was among those making early calls for further-reaching reform soon after the OSA came into force.
Further practical challenges come from the implementation of age verification measures. While there is no shortage of platforms which purport to provide age verification software, and Ofcom has published a “non-exhaustive” list of technologies which might be used to verify age, there are serious questions about the protection of service users being asked to input personal data.
A major concern in the industry is how Ofcom intends to ensure that age assurance is enforced equally across ISPs of all sizes to create a level playing field. The age verification requirements, and the speed at which they must be rolled out, could be unfeasible for small-scale non-mainstream providers with fewer resources to adapt to robust regulatory changes. Even established big players have concerns, with the BBC reporting that Aylo, parent company of Pornhub, described age verification requirements as “ineffective, haphazard, and dangerous”. This unequal bargaining power may also come into focus when it comes to enforcement of the new regime, as the OSA does not specify what will be an “appropriate” sanction so long as the penalty reflects the seriousness of the breach and is below the higher of £18 million or 10% of the ISP’s revenue for the relevant accounting period. Ofcom is required by the Communications Act to produce guidance for use in determining any penalties it imposes, but that guidance has not yet been updated to reflect its new powers.
A widely shared concern appears to be that age verification will not prevent digitally literate minors from seeking out explicit content, but may instead drive them away from commercially and legally responsive ISPs to seek out pornography through alternative means, including Virtual Private Networks (“VPNs”). VPNs are renowned for their resilience, facilitating user file sharing without detection, and frequently withstand attempted blocking by governments. In this way, minors may become more likely to come across harmful, extreme and potentially illegal content.
Finally, there is the difficulty of one jurisdiction seeking to tackle a global phenomenon. Part 3 DEA only empowered the BBFC to enforce sanctions against platforms operating from within the UK. This was a major loophole in the legislation, as only a very small number of adult websites accessed from the UK could be regulated in this country, with most of the pornography industry hosted in less regulated jurisdictions. The OSA appears to have considered this, and includes within its scope of regulated providers of pornographic content any internet service which “has links with the United Kingdom”[4], meaning either that the service has a significant number of UK users or that UK users form one of its target markets. It remains to be seen whether and how enforcement action could be taken against an ISP in another jurisdiction if it were found to be in breach of the provisions.
Legal challenges
One of the major challenges to the regulation of pornography is whether it is at odds with the UK’s human rights commitments. For example, one of the stated aims of the OSA is to protect children from the harms associated with exposure to certain types of material. An evidential threshold must be met to demonstrate causative links between pornography and harm to minors, such as to justify an intervention that infringes freedom of expression and the right to privacy for adults. There is no shortage of studies examining the gap between the perception of harm associated with young people’s exposure to pornography and evidence of harm itself. Any interference with free expression, by limiting adult autonomy to publish and access explicit material, must be carefully balanced against child protection justifications. Similarly, any infringement of private interests – for example, the surveillance of an individual’s sexual preferences through a requirement to verify their identity each time they access content, thereby undermining anonymity – must be justified as necessary, proportionate and in pursuit of a legitimate aim.
It is therefore possible that legislation regulating pornography, or the practical application of Ofcom’s guidance, could become subject to legal challenge on the grounds that, by breaching the interdependent core interests of privacy and free expression, pornography legislation and regulation obstruct the full application of the European Convention on Human Rights. However, while in principle pornography legislation can be argued to be a flawed enterprise in human rights terms, in practice, given the margin of appreciation granted to states and the legitimate aim of protecting children, it is unlikely that domestic legislation will be found incompatible with the Convention.
An unresolved debate
Pornography’s commercial evolution and expansive subject matter, coupled with the accessibility of the internet in minors’ own homes, continue to be the subject of public debate. The OSA, with Ofcom as its watchdog, marks the latest attempt to tackle a historic phenomenon situated within a rapidly developing digital age.
The key question remains: whether a combined approach of legislation and regulation can ever be an appropriate tool to effectively manage the harms associated with pornography, or whether practical unworkability and human rights issues, exemplified in the failed attempt of the DEA, will ultimately defeat the OSA as well. Either way, ISPs face a potentially complex road to compliance, and Ofcom a potential risk of challenge on human rights grounds, as its regulatory rigour is tested.
[1] Quick guide to children’s access assessments – Ofcom
[2] Enforcement Programme to protect children from encountering pornographic content through the use of age assurance – Ofcom
[3] Pornography, the Online Safety Act 2023 and the Need for Further Reform, by Professor Clare McGlynn, Lorna Woods and Alexandros Antoniou (SSRN)
[4] Section 80(2)(c) Online Safety Act 2023.