
Tech Corner: The Intersection of Technology and Domestic Abuse: Legal Implications in the UK
Published: 18/03/2025 06:00
In today’s digital age, technology has undeniably transformed the way we interact, work, and live. While technological advancements offer numerous benefits, they have also opened new avenues for individuals to exert control over, manipulate, and harass their partners and ex-partners in cases of domestic abuse. In this article, we will explore the unfortunate intersection between technology and domestic abuse within the context of England and Wales family law.1 We will discuss how perpetrators of abuse employ tactics such as call spoofing, tracking victims with Apple AirTags, and installing spyware on mobile phones. The article will also outline how emerging AI technologies, such as voice cloning, are being used to facilitate abuse. Most of these methods fall under s 76 Serious Crime Act 2015 (SCA 2015), as technology is weaponised for the purpose of coercion and control, alongside other cyber-enabled offences. Likewise, many of these offences will be tied to cyber-enabled Violence Against Women and Girls (VAWG), including new and updated offences under the Online Safety Act 2023 (OSA 2023).2
Call spoofing
Call spoofing is a deceptive technique used by abusers to manipulate caller ID information, presenting themselves as someone else when contacting their victims. This technology has seen an alarming increase in its use by perpetrators to maintain control and intimidate their partners and ex-partners. Perpetrators can use various methods and applications to disguise their identity or impersonate individuals known to the victim.
Example: Sarah received a call from her child’s school, but upon answering, she found herself speaking to her ex-partner. The abuser had exploited call spoofing to cause confusion and distress.
Call spoofing has become a pervasive issue in the landscape of domestic abuse in England and Wales. With the widespread availability of spoofing apps and services, abusers can manipulate the information that appears on the recipient’s caller ID display, altering the name, the phone number, or even the area code so that a call appears to come from a different person or location. This practice poses serious challenges for both victims and the legal system.
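To see why the displayed name and number cannot simply be trusted, it helps to know that in internet-based (SIP) telephony the caller ID travels as ordinary text headers set by the originating equipment. The following minimal Python sketch is illustrative only: it parses an invented SIP ‘From’ header (using a number from Ofcom’s drama-reserved range) to show that the values a handset displays are unverified assertions by the sender.

# Illustrative only: parse a SIP "From" header to show that the display name
# and number presented as caller ID are plain text set by the originator's
# equipment; nothing in the message itself proves either value.
import re

# An invented header as it might appear in a SIP INVITE carrying a call.
# +44 20 7946 0000 is in Ofcom's range reserved for drama/fiction.
from_header = 'From: "Riverside Primary School" <sip:+442079460000@carrier.example>;tag=a1b2'

match = re.match(r'From:\s*"(?P<display>[^"]*)"\s*<sip:(?P<number>[^@]+)@', from_header)
if match:
    print("Displayed name:  ", match.group("display"))
    print("Displayed number:", match.group("number"))
    # Both values are whatever the sender chose; the receiving handset
    # displays them without any verification of its own.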
In England and Wales, call spoofing is currently covered by s 127 Communications Act 2003 and the false communications offence in ss 179 and 182 OSA 2023.3 In line with the OSA 2023, Ofcom has in the last year updated its rules for communications providers, requiring them to identify and block calls with inaccurate Calling Line Identification (CLI) in order to protect victims from illegal call spoofing.4 The framework came into effect on 29 January 2025; however, its scope does not cover WhatsApp or other internet calling providers, which may make spoofing over those services harder to identify.5 These rules were designed to target scammers rather than individuals using the technology as a means to abuse, although Ofcom is due to publish further guidance under the OSA 2023 relating to VAWG in the first half of 2025.6
For victims, call spoofing can be particularly difficult to evidence; however, WomensLaw.org gives the following recommendations on patterns and behaviour that may strengthen a claim.7 For each of these, it will always be beneficial to keep a log with the date and time of each individual event (a minimal sketch of this kind of log analysis follows the list):
- Do the calls and texts from the spoofed numbers arrive at similar times to this person’s previous patterns of contact?
- Does the content of the call or text reveal information that would only be known by those close to the recipient, or does it otherwise seem familiar, eg through hidden clues or a recognisable writing and messaging style?
- Is the timing suspicious? Did the calls start immediately after a breakup or following a certain event?
- Are there external factors that could help demonstrate a pattern?
- Can phone records be accessed for evidence? (This will vary between providers, and for incoming call data a court order may be required under UK GDPR.)
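Where such a log has been kept, even a very simple analysis can make a pattern visible. The sketch below is a minimal illustration in Python, assuming hypothetical CSV files with a timestamp column in ISO format; the file and column names are invented for the example and are not part of any standard tool.

# A minimal sketch of pattern analysis over a victim-kept call log.
# Assumes hypothetical CSVs with a "timestamp" column in ISO format
# (eg "2025-02-14T21:37:00").
import csv
from collections import Counter
from datetime import datetime

def hour_pattern(path: str) -> Counter:
    """Count calls per hour of day, so calls from suspected spoofed numbers
    can be compared against the known contact's historical pattern."""
    hours = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            hours[ts.hour] += 1
    return hours

# Example use: compare the two distributions side by side.
known = hour_pattern("known_contact_log.csv")    # calls known to be from the ex-partner
spoofed = hour_pattern("spoofed_calls_log.csv")  # calls from the suspected spoofed numbers
for hour in range(24):
    if known[hour] or spoofed[hour]:
        print(f"{hour:02d}:00  known={known[hour]:<3} spoofed={spoofed[hour]}")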
If in doubt, the individual should hang up the phone and call the number of the person the caller was claiming to be; call spoofers will not be able to intercept this outgoing line. Ofcom’s current recommendation is to report instances of spoof calling to Action Fraud.8
Tracking devices
Apple AirTags, initially designed to track personal belongings, have been weaponised by abusers in England and Wales to monitor and stalk their victims. These inconspicuous devices can be discreetly hidden within personal items or affixed to a vehicle, enabling perpetrators to track their targets’ movements in real time.
Example: A survivor of domestic abuse, Jane, discovered an AirTag secretly placed in her purse. Her abuser had been using it to monitor her whereabouts, thereby violating her privacy and sense of security.
AirTags have gained popularity as a discreet and effective tracking tool for abusers in England and Wales. Their small size, long battery life, and seamless integration with Apple’s ecosystem make them an attractive choice for those seeking to engage in stalking behaviour.
The use of tracking devices is currently covered by ss 2–5 Protection from Harassment Act 1997 and is clearly specified as a cyber-enabled VAWG offence.3
Apple has built-in detection for its own devices with location sharing: an iPhone will alert an individual if another device has been travelling with them and give them the option to play a sound from that device. Raising awareness of this capability is a first step in the right direction in preventing harm from these devices. There have been examples of abusers removing the speaker from a tracker so that the sound will not play; however, this does not prevent the notification reaching the other Apple device. If the tracker is not made by Apple, or if the victim does not have an Apple phone, these notifications will not appear. If users suspect a tracking device, there are other options for detecting it, such as a detection app downloaded onto their phone or a scanning device purchased online. Likewise, looking at coincidences that can only be explained by a party having unsolicited knowledge of the victim’s whereabouts may help the detection of these devices, as seen in Re A and B (Children: ‘Parental Alienation’) (No 5) [2023] EWHC 1864 (Fam).
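For completeness, a generic Bluetooth scan can surface nearby trackers regardless of handset make. The sketch below is a rough aid rather than a substitute for the dedicated apps mentioned above: it uses the open-source bleak Python library, and its interpretation of the advertisement payload rests on published reverse-engineering of Apple’s Find My protocol, so it may not match every device or firmware version.

# A rough sketch of a Bluetooth LE scan for possible Find My trackers,
# using the open-source "bleak" library (pip install bleak). The payload
# check is based on published reverse-engineering of the Find My protocol
# and may not match every device or firmware version.
import asyncio
from bleak import BleakScanner

APPLE_COMPANY_ID = 0x004C  # Bluetooth SIG company identifier for Apple

async def scan(seconds: float = 10.0) -> None:
    devices = await BleakScanner.discover(timeout=seconds, return_adv=True)
    for address, (_device, adv) in devices.items():
        payload = adv.manufacturer_data.get(APPLE_COMPANY_ID)
        # 0x12 is the advertisement type researchers associate with
        # Find My network broadcasts from separated devices (eg AirTags).
        if payload and payload[0] == 0x12:
            print(f"Possible Find My tracker: {address}  RSSI={adv.rssi}")

asyncio.run(scan())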
Tracking devices may not be obvious and can also be planted on a child or their belongings. Recent cases have included devices being sewn into school uniforms or bags without the child’s knowledge.10 Devices such as AirPods and other wireless accessories can also be used maliciously in this way. When everyday items are flagged up like this, it may not be obvious that they have been used to track a location, so it is important to be aware of the possibility.
Spyware
The proliferation of spyware applications has allowed abusers to surreptitiously invade the privacy of their victims by monitoring their mobile phones. These malicious apps can gain access to text messages, call logs, GPS locations, and even activate the phone’s microphone and camera without the victim’s knowledge.
Example: John installed spyware on the smartphone of his spouse, Lisa, gaining unauthorised access to her messages, calls, and location data. This breach of her privacy allowed him to exercise control and manipulate her actions.
Spyware has become a pervasive threat in domestic abuse cases in England and Wales, with abusers using these covert applications to gain unauthorised access to their victims’ personal information and communications.
The current legislative arms for spyware offences are ss 1 and 2 Computer Misuse Act 1990 (CMA 1990) and ss 2–5 Protection from Harassment Act 1997, which also pertain to further cyber-enabled VAWG offences.3
As with detecting trackers, considering coincidences that can only be explained by a party having unsolicited knowledge of the victim’s discussions and whereabouts may help to detect this software. Free apps like Avast One can scan phones and other devices for spyware.12 Additionally, individuals should be cautious of account sharing in iCloud (or non-Apple equivalents). Some apps designed to help with child safety can also be misused as spyware, where messages and locations from a child’s phone or other device are linked to the private and secure details of one of their parents.
If a device’s battery or data allowance is depleting much faster than usual, or if the phone feels physically warmer, these can be additional indicators that spyware is present, as the phone is working harder to copy its activity to another device. It is also possible to detect spyware by looking for apps on the phone that do not seem familiar, or by checking the phone’s privacy settings to see where data is being shared.
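On Android handsets, one way to review installed apps methodically is over a USB connection with Android’s standard adb tool, which can list user-installed packages for comparison against what the owner recognises. A minimal sketch, assuming adb is installed on the computer and USB debugging is enabled on the phone:

# A minimal sketch: list third-party (user-installed) Android packages via
# adb, to help spot unfamiliar apps. Requires adb on the PATH and USB
# debugging enabled on the handset; "pm list packages -3" is adb's standard
# flag for third-party packages.
import subprocess

result = subprocess.run(
    ["adb", "shell", "pm", "list", "packages", "-3"],
    capture_output=True, text=True, check=True,
)
packages = sorted(line.removeprefix("package:") for line in result.stdout.splitlines())
for name in packages:
    print(name)  # review anything not recognised against the phone's app drawer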
If someone is concerned that their location or information is being shared through spyware, several apps are available to detect it; for example, Avast One, which has worked alongside Refuge in recent years to educate on the ways in which everyday technology can be used to perpetrate abuse.13 It is important to note that purchasing a new device is not always a simple solution: when data is copied between devices, the spyware can copy over too. It is safer to reset the device entirely, removing all data and settings. Likewise, it is possible to remove spyware; Avast One also offers a free spyware removal tool.
If a location is being shared through spyware, the spyware will still be present after a device has been turned off and on again; however, using Airplane mode will prevent the sharing of a location. If a victim wants to report the matter to the police before removing the spyware, a safe option is to enable Airplane mode in a public area such as a gym or supermarket before going to make the report. To the perpetrator, it would then appear that the victim remained in that public area for the duration of the report, rather than alerting them to the visit to the police.
The emerging role of AI
As artificial intelligence continues to advance, abusers are increasingly exploiting AI-powered technologies to manipulate and intimidate their victims. Voice cloning poses a particularly significant threat: with AI-driven software, abusers can convincingly mimic a victim’s voice, with potentially severe consequences.
Example: Lisa’s former partner used AI voice cloning to impersonate her during a phone conversation with her employer, causing her to lose her job and financial stability.
Artificial intelligence has opened up new avenues for abusers to perpetrate manipulation and harassment in England and Wales:
(1) Voice cloning: AI-based voice cloning technology allows abusers to mimic a victim’s voice convincingly, enabling them to impersonate the victim in various situations.
(2) Deepfake videos: AI can be used to create realistic deepfake videos that manipulate the appearance and speech of individuals, further enabling manipulation and deception.
(3) Social engineering attacks: AI-driven social engineering attacks, including phishing emails and fraudulent phone calls, can be used to manipulate victims into divulging sensitive information.
The above is an example, covered by s 77(1) Serious Crime Act 2015, of AI being used to cause reputational damage and loss of earnings.14 In recent years, consultations on legislating for AI have not progressed as quickly as AI has developed, and there are not yet specific offences that clearly deal with AI. The OSA 2023 clearly criminalises the use of deepfake technology in creating child sexual abuse material15 and in sharing intimate deepfakes of an adult.16 For other offences, the following legislative arms can be extended to harms that involve the use of voice cloning or deepfake technology:
- ss 1 and 2 Fraud Act 2006, where a false representation has been made to deceive and cause harm; and
- ss 179 and 182 OSA 2023 regarding a false communications offence.
The following eight questions were published by the MIT Media Lab as part of its Detect Fakes project to help individuals identify whether videos are genuine or have been created using deepfake technology:17
(1) Pay attention to the face. High-end DeepFake manipulations are almost always facial transformations.
(2) Pay attention to the cheeks and forehead. Does the skin appear too smooth or too wrinkly? Is the agedness of the skin similar to the agedness of the hair and eyes? DeepFakes may be incongruent on some dimensions.
(3) Pay attention to the eyes and eyebrows. Do shadows appear in places that you would expect? DeepFakes may fail to fully represent the natural physics of a scene.
(4) Pay attention to the glasses. Is there any glare? Is there too much glare? Does the angle of the glare change when the person moves? Once again, DeepFakes may fail to fully represent the natural physics of lighting.
(5) Pay attention to the facial hair or lack thereof. Does this facial hair look real? DeepFakes might add or remove a moustache, sideburns, or beard. But, DeepFakes may fail to make facial hair transformations fully natural.
(6) Pay attention to facial moles. Does the mole look real?
(7) Pay attention to blinking. Does the person blink enough or too much?
(8) Pay attention to the lip movements. Some deepfakes are based on lip syncing. Do the lip movements look natural?
Additionally, metadata and other surrounding information could help with the identification of a deepfake. Inconsistencies in information such as timestamps, editing history, and GPS coordinates could point to possible AI manipulation.18
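For still images, this kind of metadata check can be as simple as reading a file’s EXIF fields. The sketch below uses the Pillow Python library on a hypothetical file name; AI-generated or heavily edited images often carry no capture metadata at all, or fields that contradict one another, though an absence of EXIF data proves nothing on its own.

# A minimal sketch: read EXIF metadata from an image with Pillow
# (pip install Pillow). Missing or internally inconsistent fields
# (eg editing software named but no camera model, or timestamps that
# disagree) can be one signal of manipulation; absence of EXIF data
# is not proof by itself.
from PIL import Image, ExifTags

with Image.open("suspect_image.jpg") as img:  # hypothetical file name
    exif = img.getexif()
    if not exif:
        print("No EXIF metadata present.")
    for tag_id, value in exif.items():
        tag = ExifTags.TAGS.get(tag_id, tag_id)  # map numeric tag to a readable name
        print(f"{tag}: {value}")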
If a call sounds exactly like a family member but has come from a number the individual does not recognise, or arrives at an irregular time, they should consider asking the caller specific questions that only the real family member would know the answer to. Likewise, planning ahead by agreeing key words or questions and answers with close friends and family members can help to prepare for this scenario. As with call spoofing, ending the call and phoning the person directly can also help to identify whether the caller is genuine.
Conclusion
The intersection of technology and domestic abuse poses significant challenges within the framework of family law in England and Wales. Abusers are increasingly employing tools such as call spoofing, Apple AirTags, and mobile phone spyware to stalk, control, and manipulate their victims. Moreover, the emergence of AI technologies, like voice cloning, has further exacerbated these risks.
It is imperative for the legal community in England and Wales to acknowledge and address these threats. Whilst there have been some legislative updates in recent years, campaigning from organisations such as Refuge suggests more can be done to create a safer and more secure environment for victims of domestic abuse. Raising awareness of these harms, and of how individuals can better protect themselves, is the first step. Likewise, further training for legal professionals and the judiciary, to stay updated with emerging technologies where possible, can help reduce harms from the insidious misuse of technology. By adapting to the evolving landscape of technological abuse, the legal system in England and Wales can better protect the rights and well-being of those affected by domestic abuse in the digital age.