November 14, 2025

Strangulation in pornography: the ban, the science, and the trade-offs — a long read

By Ephraim Agbo 

The UK government has moved from warning to lawmaking. In June it signalled it would amend the Crime and Policing Bill to make the publication and possession of pornography that depicts strangulation or suffocation a criminal offence, and has designated such material a “priority offence” under the Online Safety Act — meaning platforms will be legally required to detect and remove it proactively, with Ofcom responsible for enforcement. That is a dramatic escalation in how states seek to police sexual material online, and it has re-ignited a debate that sits at the intersection of public health, gendered violence, free speech and the realities of content moderation.

This long read examines why UK ministers made the decision, what the clinical evidence says about the risks of partnered strangulation, how performers and sex-worker advocates have responded, and whether the proposed legal and regulatory architecture is up to the delicate balancing act policymakers now claim to face.


The policy: what ministers have put on the table

The government’s stated ambition is simple: where evidence shows a form of sexual behaviour is both widely depicted online and has demonstrable harms, the state should act to reduce access to the imagery that appears to normalise that behaviour. The immediate legal changes announced are twofold:

  1. an amendment to the Crime and Policing Bill to criminalise publishing or possessing pornographic material that depicts strangulation or suffocation; and
  2. the designation of choking/strangulation content as a “priority offence” under the Online Safety Act, imposing proactive duties on platforms and giving Ofcom the power to enforce compliance and levy fines.

The policy flows from an independent review of pornography commissioned by government and led by Baroness Gabby Bertin. The review argued that, because non-fatal strangulation is criminalised in domestic law, online depictions of the practice should not remain freely accessible in ways that normalise or promote it. The review also catalogued the scale and tone of modern mainstream porn — which, it said, increasingly includes violent or degrading content that would not have been widely distributed a generation ago. The government says these findings supplied both the moral and legal case for urgent action.


The data and science: real physiological risk, incomplete causal chains

A key reason this debate is not purely cultural or moral is that clinicians and neuroscientists have raised clear medical alarms. Non-fatal strangulation — whether occurring in intimate partner violence or in sexual contexts — can cause immediate and delayed neurological injury: carotid artery dissections, strokes, hypoxic brain injury, memory problems, and persistent cognitive symptoms have all been documented in clinical case literature. Recent focused work looking at patterns of choking during sex has reported associations with headaches, cognitive dysfunction and post-traumatic stress symptoms among affected women. More recent neuroimaging studies are beginning to explore whether repeated exposure to breath-restriction during sex correlates with differences in brain activation patterns on cognitive tasks. Those early signals are enough to lend urgency to the policy question.

But the science is not a neat, single causal chain that instantly justifies broad criminal sanctions. Much of the research to date is cross-sectional, reliant on self-report, or focused on clinical samples that over-represent people who have already experienced harm. There are unresolved empirical questions that matter for policy design: how common is mimicry (young people copying what they see online) versus independent experimentation? What proportion of people who view such content go on to attempt dangerous behaviour, and in what contexts? How significant are cumulative neurological effects from intermittent, consensual, supervised breath-restriction versus repeated, coercive episodes? Answering those questions requires longitudinal, population-level work that is only now starting to accumulate. In short: medical evidence supports alarm about the practice itself; it is suggestive, but not yet definitive, about the precise contribution of online imagery to population-level harms.


What the review found (and what it did not)

Baroness Bertin’s review — and the government summary that followed — emphasised three findings that shaped ministers’ response:

  • depictions of choking or breath-restriction are widespread on mainstream porn platforms;
  • frontline professionals (teachers, youth workers, clinicians) report seeing young people imitate or normalise these practices; and
  • the UK’s existing patchwork of obscenity and “extreme pornography” laws does not consistently capture such depictions when they are posted online, producing an enforcement gap.

Those are powerful base facts for a lawmaker. But critics of the review complain that its remit, and the evidence it relied on, lean heavily on anecdote, activist submissions and content analysis rather than randomised, longitudinal social science that tracks behaviours over time. That matters because once legislation is written, its drafting details determine who is caught by the law, and what forms of content platforms must remove or block. The review recommended criminalisation because it saw strangulation porn as a form of violent material that is already illegal offline; the policy question is whether the online form should be treated identically, without nuanced thresholds or safe-harbour carve-outs.


Voices on the ground: survivors, campaigners and performers

The response has not been monolithic.

Campaigners and survivors’ groups — including organisations that specialise in violence against women and girls — have welcomed the ban as a long overdue step. For many survivors, the imagery normalises violence and has recurring, real-world effects: clinicians report patterns where victims cannot always point to a single initial episode but do report cumulative coercive dynamics that often include breath-restriction. For these groups the policy advances victim safety, signals societal disapproval, and closes a loophole that allowed illegal behaviour to be depicted with impunity online.

Performers, sex-worker advocates and some free-speech defenders offer a different take. Their concern is practical and rights-based rather than dismissive of the harms. They argue that blunt criminalisation will:

  • drive consensual creators underground and criminalise producers who document negotiated, safety-first practices that might serve an educational purpose;
  • push consumers toward unregulated or encrypted channels where harm is harder to monitor; and
  • in practice, create an incentive structure for platforms to over-remove content (false positives), because machine filters are the cheapest way to avoid fines. Those outcomes, they warn, could reduce transparency and make it harder to find harm-reduction guidance online. FiLiA, sex-worker collectives and some civil-liberties commentators have all registered versions of this concern.

This tension — safety versus visibility — is not merely semantic. The debate about “silencing” versus “protecting” parallels other regulatory dilemmas (drugs, self-harm content, extremist material): where platforms and regulators mandate takedown, they will rely on automated detection systems that struggle with nuance. The result can be blunt enforcement that removes educational or clearly consensual material while failing to catch cleverly disguised harmful content hosted offshore.


Enforcement at scale: tech reality checks

Designating strangulation porn a priority offence under the Online Safety Act raises practical questions about how platforms will comply. The Act’s regime pushes for proactive detection: platforms must prevent user exposure rather than merely remove content after complaint. In practice that implies a mix of automated image/video detection, hashed databases (like those used for child sexual abuse material), and human moderation.

Automated detection is improving but is not a panacea. Visual classifiers can detect objects, faces, and some actions, but distinguishing a staged, consensual performance (with visible safety checks and verbal consent) from a violent, non-consensual assault is often impossible from pixels alone. False positives are inevitable. Over-blocking has real costs: it silences educators, harms creators, and displaces consumers. Under-blocking is equally costly: a platform that misses content risks prosecution and heavy fines. The global nature of hosting — where content can be mirrored across jurisdictions — further complicates enforcement and raises questions about proportionality and extraterritorial reach.
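To make the hashed-database idea concrete: matching works by reducing each image to a compact fingerprint and comparing fingerprints, not pixels. The toy Python sketch below uses a deliberately simplistic "average hash" (real systems such as PhotoDNA use far more robust perceptual hashes) to illustrate the principle, and its limit: a known image can still be matched after minor edits, but the hash carries no information whatsoever about context or consent, which is exactly the nuance problem described above. All names and thresholds here are illustrative assumptions, not any platform's actual pipeline.

```python
# Toy sketch of hash-database matching. Assumption: images arrive as
# grayscale pixel grids (lists of rows of 0-255 ints). Not a real system.

def average_hash(pixels):
    """Fingerprint an image as a bit string: 1 where a pixel is brighter
    than the image's mean brightness, 0 otherwise."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return ''.join('1' if p > avg else '0' for p in flat)

def hamming(h1, h2):
    """Count differing bits between two equal-length fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

# A "known" image in the database, a lightly edited copy, and an
# unrelated image (all hypothetical 2x2 examples).
known     = [[10, 200], [30, 220]]
edited    = [[14, 204], [34, 224]]   # small uniform brightness shift
unrelated = [[200, 10], [220, 30]]

db = {average_hash(known)}
THRESHOLD = 1  # max bit differences still counted as a match (assumed)

def matches(pixels):
    """True if the image's fingerprint is near any database entry."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= THRESHOLD for k in db)

print(matches(edited))     # True:  near-duplicate of a known image
print(matches(unrelated))  # False: different content, no match
```

The sketch shows why hash databases work well for previously identified material yet cannot, even in principle, distinguish a consensual staged scene from an assault: consent is not a property of the pixels being hashed.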


Policy design: how to make a blunt tool less blunt

If the government’s goal is to reduce real harm without collateral damage, the law’s implementation must be surgical. Six design features could improve the odds:

  1. Operational definitions: Legislation and Ofcom guidance must define clearly what constitutes illegal depiction — for example, distinguishing realistic acts that cause or intend breath restriction from simulated gestures or clearly educational demonstrations. Ambiguity invites overreach.
  2. Educational carve-outs and certification: Create certified channels for demonstrably educational material (public-health content, clinician guides, performer safety videos) that platforms can whitelist after a verification process. That lets harm-reduction information remain accessible.
  3. Target distribution, not private curiosity: Focus enforcement on uploading and deliberately sharing violent pornographic content rather than criminalising individual private consumption, unless clear intent to distribute illegal material exists. Criminalising possession raises thorny proportionality issues.
  4. Sunset and review clauses: Build statutory review points (e.g., two years after implementation) to measure enforcement outcomes, false-positive rates, and impacts on victims and performers. Data should drive adjustments.
  5. Technology standards and appeals: Require platforms to publish detection accuracy metrics and provide transparent, timely appeal routes for wrongly removed content. Independent auditing of platform compliance would increase public trust.
  6. Parallel investment in prevention: Laws must be paired with sex-education funding, public health messaging, and survivor services. Removing content without improving consent literacy among young people fights the last battle.

Case studies: how the problem manifests

Case study 1 — school reports and imitation: Multiple reports collected by the independent review and by teachers’ groups describe classrooms where pupils (some under 16) repeat sexual scripts learned online. In some recorded instances, teachers reported hearing children ask how to “choke” a partner — a pattern the review flagged as evidence of normalisation. While anecdotal, these episodes are repeated enough across jurisdictions to be politically salient and to motivate policy action.

Case study 2 — clinical aftermath: Emergency departments and sexual-assault services report patients who present with headaches, visual disturbance or confusion after non-fatal strangulation episodes during sex. Medical literature on non-fatal strangulation in assault contexts shows that visible external injuries are often absent even when internal vascular injury has occurred — a fact that clinicians warn should make us cautious about trivialising any breath-restriction. Neuroimaging studies exploring working memory differences in people with histories of repeated choking during sex are early but concerning.

Case study 3 — performers and safety narratives: Some consensual performers and educators argue that visible negotiated scenes — where performers explain safewords and show harm-reduction steps — can be instructive to viewers. Banning all depictions risks losing those safety-first demonstrations without necessarily stopping the production of clandestine, irresponsible material. That is the core of performers’ objections. (Advocacy groups for performers and sex workers have urged that any regulatory regime include voices from within the industry.)


The geopolitics of content: other jurisdictions and export effects

The UK is not alone in wrestling with harmful pornographic content. Legislatures in Australia (New South Wales among them) and in several EU countries have investigated links between violent porn and harm; some have tightened age-verification rules and platform obligations. Where the UK’s approach differs is in its simultaneous criminalisation-plus-platform-duty model: criminalising possession or distribution while obliging platforms to proactively police content. That mix intensifies the enforcement burden and heightens the risk of cross-border friction. Platforms with a global footprint will likely default to wholesale blocking algorithms for UK users: a defensive tactic, not always a discriminating one.


A judgment: defensible aim, delicate implementation

The government’s impulse is defensible. There is credible medical and social evidence that partnered strangulation is dangerous and that its normalisation is socially harmful. Survivors, clinicians and many campaigners have legitimate reasons to press for legal change. Criminalising depictions — and obliging platforms to act — could remove a vector that seems to be facilitating dangerous behaviour among young people.

But the devil is in the drafting and the delivery. Without crystal-clear definitions, transparent enforcement standards, carve-outs for educational and consensual safety material, and robust review mechanisms, the law risks becoming a blunt instrument that silences useful speech, displaces harm into shadowy channels, and damages the livelihoods and safety of performers it did not intend to punish. The technological reality of automated moderation and the international structure of hosting mean that poorly designed regulation will produce perverse incentives and poor outcomes if implemented without continuous oversight and stakeholder engagement.


What to watch next

  1. Drafting details in the Crime and Policing Bill — the precise statutory language will determine thresholds for illegality, mens rea (intent), and whether possession is criminalised absent distribution. Follow the Lords debates and the Bill’s final text.
  2. Ofcom guidance and technical standards — Ofcom’s formal guidance on how priority offences should be detected will show whether regulators expect hashed-image databases, automated classifiers, or human review and what transparency will be required.
  3. Early enforcement metrics — once platforms begin to remove content, independent auditors should publish data on takedowns, false positives, and appeals. That evidence will determine whether the policy achieves its aims without unacceptable collateral damage.
  4. New empirical studies — robust longitudinal research that links exposure to imagery with subsequent behaviour will be the only way to move certainty beyond plausibility. Funders and universities should prioritise this work.

Conclusion

The UK’s decision to criminalise depictions of strangulation in pornography is a consequential policy choice that responds to real harms. It is also an index case for how modern democracies regulate risky cultural content in an era of algorithmic distribution and global platforms. The choice now is not whether to act — the political momentum suggests action was inevitable — but how to act in ways that maximise protection, minimise unintended harm, and preserve the channels that allow education and consent literacy to flourish. If ministers accept that the law is only the start, not the finish line, and pair criminalisation with careful drafting, transparent enforcement and serious investment in education and services, then the policy may well reduce harm without producing the very harms its critics fear. If they do not, the measure risks becoming a symbolic win for policymakers and a messy reality for the people it affects most.

