Continuing the Global Efforts to Prevent Non-Consensual Intimate Image Sharing

Committed to preventing the non-consensual sharing of intimate images online

  • Over 182,000 individuals in the past two years have protected themselves online by using StopNCII.org to prevent their intimate images and videos from being shared across multiple online platforms.
  • The groundbreaking tool continues to pave the way in protecting against intimate image abuse through its innovative use of hashing technology.
  • Platforms that have implemented StopNCII.org include Facebook, Instagram, TikTok, Bumble, OnlyFans, Reddit, Aylo, Threads and Snap Inc.


Montreal, Quebec (November 30, 2023) – Today marks the second anniversary of StopNCII.org, a groundbreaking online tool developed by SWGfL and Meta that can help prevent the non-consensual sharing of intimate images online. Since launching in 2021, over 182,000 adults across the world have created cases on StopNCII.org to protect their intimate images from being shared by perpetrators of intimate image abuse.

To mark this milestone, we are continuing to recognise the global efforts being taken to put a stop to non-consensual intimate image sharing. Industry partners, who have all joined within this time, have allowed more and more people across the world to learn how StopNCII.org can support those being threatened by intimate image abuse, enabling them to take preventative action and remove control from perpetrators. Over the past two years, StopNCII.org has been implemented by Facebook, Instagram, TikTok, Bumble, OnlyFans, Reddit, Aylo, Threads and Snap Inc.

How Does StopNCII.org Work?

StopNCII.org uses world-first, on-device hashing technology. People concerned about intimate image abuse can create unique identifiers of their images (also known as ‘hashes’ or digital fingerprints) from their own device.

Only the hashes are then shared with StopNCII.org, not the original image (or video), ensuring that the intimate content never leaves the user’s device. When a case is opened, hashes are presented as a string of letters and numbers, rather than the image itself, to protect the user’s privacy.
Hashes are then shared with participating partners Facebook, Instagram, TikTok, Bumble, OnlyFans, Reddit, Aylo, Threads and Snap Inc. If an image uploaded to one of these sites matches a corresponding hash and meets partner policy requirements, it will be sent for moderation. If it meets the criteria of an intimate image, it will be removed and blocked from being shared further across partner platforms.
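The on-device hashing and platform-side matching flow described above can be sketched in a few lines of Python. This is an illustrative sketch only, not StopNCII.org's implementation: the real tool uses perceptual hashing (so visually similar copies of an image still match), whereas this example uses a SHA-256 cryptographic hash as a stand-in, which only matches byte-identical files. All function names here are hypothetical.

```python
import hashlib


def hash_image(image_bytes: bytes) -> str:
    """Create a hash (digital fingerprint) of an image on-device.

    Stand-in for the tool's perceptual hashing: SHA-256 matches only
    byte-identical files, but it shows the key property that the output
    is a string of letters and numbers, not the image itself.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def check_upload(uploaded_bytes: bytes, case_hashes: set[str]) -> bool:
    """Platform-side check: flag an upload for human moderation if its
    hash matches one submitted through a case (hypothetical helper)."""
    return hash_image(uploaded_bytes) in case_hashes


# The original image never leaves the device: only the hash string is
# shared with the service and its partner platforms.
private_image = b"\x89PNG...intimate image bytes..."
case_hashes = {hash_image(private_image)}  # created on the user's device

assert check_upload(private_image, case_hashes)            # match: send to moderation
assert not check_upload(b"unrelated photo", case_hashes)   # no match: no action
```

Note the privacy property the press release emphasises: the platform only ever receives and compares hash strings, so a match can trigger moderation without the user's intimate content ever being transmitted.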

Since its launch, StopNCII.org has received over 434,000 hashes (an almost 1,000% increase in 2023 over the previous year).


“When StopNCII.org was first developed with Meta back in 2021, we knew that it would, in time, act as a sea-change solution to the global concern of intimate image abuse. Now, two years down the line, we are seeing a distinct ripple effect, with more awareness of what StopNCII.org can achieve and more people taking protective action against perpetrators.
We celebrate the work of our industry partners in implementing StopNCII.org, and we continue to encourage more of the industry to join the initiative and ensure their users feel protected when they go online. Everyone should benefit from technology free from harm, and StopNCII.org can help us get even closer to achieving that goal.”

- David Wright, CEO of SWGfL

“At the Revenge Porn Helpline, we know first-hand the impact that StopNCII.org is having for our clients. For every person who comes to us fearing the sharing of their private, sexual images, the knowledge that there is an effective and powerful way to protect themselves is incredibly reassuring. One of the most harmful aspects of this abuse is the sense of powerlessness it creates, but StopNCII.org puts power back in the hands of survivors and away from perpetrators of abuse.”

- Sophie Mortimer, Revenge Porn Helpline Manager at SWGfL

“StopNCII.org has become a vital tool for people to protect themselves and their intimate imagery online, and we’re proud to partner with SWGfL and our industry peers on this important work. More and more people are using the tool to prevent the spread of their intimate images online, and we hope others in our industry will recognize its value and join our effort.”

- Cindy Southworth, Head of Women’s Safety at Meta

“Non-consensual intimate imagery is not allowed on TikTok for the safety and well-being of our community members, and partnering with StopNCII.org strengthens our efforts to stop this harmful behavior and support those who've fallen victim to it.”

- Julie de Bailliencourt, Global Head of Product Policy, TikTok

“The sharing of intimate images without consent is vile, illegal and preventable. The numbers released today demonstrate the huge impact StopNCII.org is having on the lives of those who are - or fear becoming - victims of intimate image abuse. At OnlyFans we are incredibly proud to work in partnership with those who seek to prevent the illegal sharing of non-consensual intimate images and believe that all social media platforms have a role to play in helping to protect people online.”

- Keily Blair, CEO of OnlyFans

“Reddit’s mission is to bring community, belonging, and empowerment to everyone in the world. Non-consensual intimate media has no place on our platform, and we continually evolve our policies and tooling to ensure the safety of our users. Initial results of our partnership with StopNCII.org, paired with our layered content moderation system, have proved effective in detecting and removing NCIM through its hash-matching tool, and we continue to explore ways to expand our tooling and capabilities. As reported in our bi-annual Transparency Report, between the months of January and June, we removed more than 152,000 pieces of NCIM, the majority of which was detected through automated means.”

- Reddit

“Consent is paramount to protecting the safety of our users and the integrity of our platforms, and continued participation in StopNCII.org is a key part of our suite of robust trust & safety measures to prevent the distribution of non-consensual intimate images. In marking the two-year anniversary of StopNCII.org, we see what can be achieved when cross-sector industry works together, and we encourage more platforms to join this initiative to empower victims and help make the internet a safer space for everyone.”

- Alex Kekesi, VP Brand and Community, Aylo

Further information and contact point:

About StopNCII.org

StopNCII.org is a free global tool with over 90 global NGO partners. Intimate image abuse, also commonly referred to as ‘revenge porn’, is a growing global concern and one that StopNCII.org actively addresses. In the UK alone, the Revenge Porn Helpline (operated by SWGfL) has removed over 300,000 intimate images that were shared without consent. The Revenge Porn Helpline was established in 2015 alongside the introduction of new UK legislation that made disclosing private sexual images and videos without someone’s consent illegal. The helpline was a world first and is seen as a leading support service, actively addressing intimate image abuse through the removal of non-consensual intimate images online. This work formed the basis for StopNCII.org, encompassing a global approach that allows adults to feel empowered and take protective action from their own devices.

About SWGfL
SWGfL is a charitable trust with an international reputation, working to ensure that everyone can benefit from technology, free from harm. More than 22 years old, the charity is responsible for award-winning services, resources and campaigns, as well as operating three helplines that each support victims of differing online harms. It works with various government departments both at home and abroad and has addressed conferences across Europe, the Americas, Asia, and Africa. SWGfL, alongside partners Childnet and the Internet Watch Foundation, leads the UK Safer Internet Centre, the national awareness centre responsible for raising the nation’s attention to online safety issues.