What the Report Found

The report, titled “Combatting New Forms of Extremism” and published on April 1, represents the committee’s most comprehensive investigation into the evolving extremism landscape. Its central finding is blunt: Prevent was designed as a counter-terrorism tool and remains trapped in a counter-terrorism mindset, even as the nature of the threat has fundamentally changed. The committee found that the programme is struggling to process a flood of referrals that do not fit traditional categories of Islamist or right-wing extremism, and lacks the multi-agency infrastructure to triage cases effectively.

The numbers tell the story. In 2024, Prevent received approximately 7,500 referrals. Current trends suggest the figure will exceed 10,000 in 2026. But the committee found that the majority of these referrals involve individuals — overwhelmingly young men — who are not motivated by a coherent ideological framework. Instead, they are drawn to violence through a cocktail of online influences: incel communities, accelerationist messaging, true-crime glorification, and algorithmically served extremist material that blends elements of multiple ideologies into what researchers call “hybridised” belief systems.

The Online Dimension

The committee reserves its sharpest criticism for the role of social media platforms. It describes a landscape in which algorithms “push a steady stream of harmful content” towards vulnerable young people, while advances in generative AI are lowering the barriers to producing extremist and harmful material. Children who begin by consuming edgy memes or violent gaming content can be algorithmically funnelled towards increasingly extreme material within days. The committee heard evidence that some children as young as 12 had been referred to Prevent after being found consuming content promoting mass violence.

The report calls for a “fundamental reset” in how social media companies are regulated in relation to extremist content, arguing that the Online Safety Act’s framework is too slow and too narrowly focused on proscribed terrorist organisations to catch the new wave of ideologically fluid radicalisation. It recommends that Ofcom be given specific powers to require platforms to audit and disclose how their recommendation algorithms interact with extremist content, and that the government establish mandatory reporting requirements for patterns of radicalisation detected by platform systems.

The Neurodiversity Question

One of the report’s most sensitive findings concerns the over-representation of neurodivergent individuals among those referred to Prevent. The committee heard evidence that autistic young people and those with ADHD are disproportionately represented in the caseload — not because neurodivergence causes extremism, but because certain cognitive traits can make individuals more susceptible to the obsessive engagement patterns that algorithmic radicalisation exploits. The committee calls for specialist training for Prevent officers and Channel panel members, and for a formal review of how neurodivergence intersects with vulnerability to radicalisation.

This is politically delicate territory. Previous attempts to link neurodiversity with security risk have been criticised by disability rights campaigners. The committee is careful to frame the issue as one of safeguarding rather than suspicion — arguing that neurodiverse young people are being failed by a system that does not understand their needs, rather than posing a unique threat. Whether this distinction survives contact with tabloid headline writers remains to be seen.

The Iran War Backdrop

The report does not dwell at length on the Iran conflict, but its timing is impossible to ignore. The war has supercharged both Islamist and far-right online radicalisation. Anti-Muslim hate crime has surged in the UK since February. Simultaneously, pro-Iranian and anti-Western content has proliferated on platforms popular with young people. The committee warns that the current conflict is creating a “permissive environment” for radicalisation on multiple fronts — and that Prevent, already struggling, is about to face its biggest test yet.

What Comes Next

The government has 60 days to respond to the committee’s recommendations. Home Secretary Yvette Cooper has previously indicated sympathy for strengthening online platform regulation, but her department is consumed by the Iran war’s domestic security implications and the May 7 local elections. The committee’s call for a “multi-agency reset” would require coordination between the Home Office, Department for Education, NHS England, and Ofcom — the kind of cross-departmental effort that Whitehall rarely delivers quickly. Meanwhile, the referral numbers keep climbing, the algorithms keep serving, and the children keep watching. The committee has sounded the alarm. The question is whether anyone in government is listening above the noise of the war.