Domestic abuse services are reporting a sharp rise in cases where perpetrators use artificial intelligence and everyday connected devices to exert control over women, with Refuge saying its specialist caseload grew markedly in the closing months of 2025. The charity recorded a 62% rise in its most complex technology-enabled abuse referrals, reaching 829 women, and a 24% increase in referrals of those under 30. According to Refuge, offenders are exploiting wearables, smart-home systems and AI tools to stalk, impersonate and otherwise manipulate victims. (Sources: Refuge, Refuge background).
Frontline workers describe a widening toolkit of digital tactics. Wearable gadgets such as smartwatches, Oura rings and Fitbits have been used to monitor locations through synced cloud accounts, while smart heating and lighting have been weaponised to unsettle and intimidate survivors. Refuge’s technology team warns that devices designed without safeguards can be readily co-opted into patterns of coercive control. (Sources: Refuge, Refuge background).
Survivors’ accounts underline the practical and psychological toll. One woman, identified by Refuge as Mina, said she left a smartwatch behind when fleeing and was subsequently tracked via linked cloud accounts to emergency accommodation. "It was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent. It created a constant sense of paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my movements weren’t private," she said. Police returned the device, but Mina reported further breaches, including a private investigator locating her at a refuge; she said police told her no crime had been committed because she had "not come to any harm". (Sources: Refuge, Refuge background).
Refuge also warns that AI is amplifying older forms of abuse. The charity’s tech team highlights examples where videos are manipulated to make survivors appear intoxicated or erratic, which can be used to discredit them with social services or others. The organisation has also seen AI-generated documents and spoofed communications sent to survivors to coerce them into actions or to lure them into locations. Emma Pickering, head of Refuge’s tech-facilitated abuse team, said: "Time and again, we see what happens when devices go to market without proper consideration of how they might be used to harm women and girls. It is currently far too easy for perpetrators to access and weaponise smart accessories, and our frontline teams are seeing the devastating consequences of this abuse." (Sources: Refuge, event overview).
International bodies warn that these trends are not confined to the UK. The United Nations Office at Geneva and UN Women have highlighted a global surge in digital violence against women, noting that AI-driven deepfakes and anonymous online tools are increasingly used to produce non-consensual imagery and to harass and coerce female targets. According to these UN bodies, the vast majority of deepfakes online are non-consensual and overwhelmingly target women, underscoring the scale of the problem. (Sources: UN Office at Geneva, Refuge).
Refuge has expanded its response capacity: its Technology-Facilitated Abuse and Economic Empowerment Team, established in 2017, provides specialist support, bespoke safety planning and advice to survivors as well as to agencies and tech developers. The charity runs accredited training for professionals covering topics from online misogyny and the so-called manosphere to the specific risks posed by AI companions, and recently hosted an online session titled "Power and Control in the Age of AI" to share emerging insights. Meanwhile, industry initiatives have begun to surface, with some technology firms and non-profits announcing partnerships to better protect victims from digitally enabled abuse. (Sources: Refuge background, Refuge training, industry initiative).
Refuge is pressing for stronger regulatory and law-enforcement responses, calling for investment in digital investigation teams and for the tech industry to be held to account for design and safety failures. "It is unacceptable for the safety and wellbeing of women and girls to be treated as an afterthought once a technology has been developed and distributed. Their safety must be a foundational principle shaping both the design of wearable technology and the regulatory frameworks that surround it," Pickering added. The UK government said tackling violence against women and girls, including when facilitated by technology, is a priority and pointed to its VAWG strategy and work with Ofcom on online harms. (Sources: Refuge, Refuge event, government statement as reported).
Source: Noah Wire Services