SASHA’s mission to end image-based abuse is reshaping the rules of the digital world

SASHA’s privacy-first technology doesn’t just detect abuse — it empowers victim-survivors to take action, trace misuse, and demand accountability.

Sometimes — not often — I meet someone building technology so profound it could reshape the future of the internet. 

SASHA is one of those rare companies, developing a privacy-first tool that redefines how we govern, interact with, and protect content online.

By giving individuals control over their images and embedding accountability into digital sharing, it has the potential to fundamentally shift how we participate in the digital world.

The technology makes it possible to prove ownership of an image, and potentially identify the first-link leaker if an image is abused online — even if the image is manipulated or screenshotted. 

I spoke to Thomas Eriksson, CEO and founder of SASHA, to learn more.

From a personal privacy breach to public mission 

SASHA (short for Safe Share) was founded in 2020 to prevent and address online image-based abuse (IBA) and identity theft. Its aim is to empower and support victim-survivors to take back control of their images and to hold perpetrators legally accountable for their actions.

The company was inspired by the experience of a close friend of Eriksson’s whose ex-boyfriend had leaked intimate images of her online. It was heartbreaking. 

Eriksson admits, “I wanted to help her. I said, ‘You should report it,’ and started looking for solutions.

"I was shocked at how big the problem was — how widespread image-based abuse is — and how hard it was to get anything taken down. We went to the police, but they told us it was difficult to do anything because perpetrators hide behind fake accounts.

Platforms said they’d take it down if it were public, but in private messages? That was harder — too many barriers.

Eventually, we got the content removed, but within 10 minutes, it reappeared under a new fake account with slight alterations. And it happened again and again.“

“No consequence, no control”: the problem SASHA was built to solve

Eriksson is a Danish concept developer and serial entrepreneur with a background in medtech, edtech, and digital innovation. He has created products and campaigns for both creative and commercial organisations, including Roskilde Festival — where he led the "Roskilde Fever" campaign during the COVID-19 lockdown — Odense University Hospital, and the Danish Centre for Learning Materials.

He wondered, Why hasn’t this problem been solved?

“Reverse image search works, but only on publicly available content. There are watermarking tools, but they’re either visible, fragile, or centralised. And none of them help with the burden of proof.

“Eventually, I realised the issue boiled down to a lack of consequence. There are no consequences for sharing images without consent online. That’s the root.”

In the physical world, we’ve built systems — laws, social norms — that enforce consequences. Online? Not so much. 

“So I asked, 'What kind of technology could restore consequences?'”

According to Eriksson, first, we need to document the intent of a share.

“If I send a photo of my kids to my mother-in-law via Messenger, and it ends up elsewhere, I should be able to trace it back. Just like if I lend you my car and you give it to someone else who crashes it — you’re still responsible.”
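Eriksson doesn’t detail SASHA’s record format, but the idea of documenting intent can be sketched as a signed record created on the sender’s device at the moment of sharing. Everything below (the field names, the HMAC-based signature, the device key) is an illustrative assumption, not SASHA’s actual scheme:

```python
# Hypothetical sketch of a share-intent record; not SASHA's format.
import hashlib
import hmac
import json
import time

DEVICE_KEY = b"per-device secret held only by the owner"  # assumption: never leaves the device

def share_record(image_bytes: bytes, sender: str, recipient: str) -> dict:
    """Document who shared what, with whom, and under what terms."""
    record = {
        "image_digest": hashlib.sha256(image_bytes).hexdigest(),
        "sender": sender,
        "recipient": recipient,        # the intended first link in the chain
        "timestamp": int(time.time()),
        "reshare_allowed": False,      # the owner's declared intent
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record
```

If the image later surfaces elsewhere, a record like this is what would let the owner point to the first link in the chain, much like Eriksson’s car-lending analogy.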

Second, the tech had to be robust. Images online go through compression, rotation, filters, and memes. 

“Metadata gets stripped the moment you upload. Most watermarking dies if you rotate the image 11 degrees or draw on it.”

Third, it had to be scalable. 

“We share 14 billion images a day. So we have to build something that works on the edge — on the user’s device — not in the cloud. That protects privacy and scales infinitely.”

How SASHA works

With SASHA, the watermark — or signature — is embedded directly into the image pixels. 

Eriksson explained:

“We’ve hidden what we call ‘needles in a haystack’, where the haystack is the size of a football field, and the needles are practically invisible.”

SASHA’s signature is resilient to both compression and manipulation, maintaining data integrity. The system protects images directly on-device, through a decentralised network.

“Even after compression, rotation, and screenshotting, our signal stays intact. In our beta, we can even detect the signature from a photo taken of another phone screen.”
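SASHA’s embedding scheme is proprietary, but a classic spread-spectrum watermark gives a feel for the “needles in a haystack” idea: a faint, key-dependent pattern is spread across every pixel and later recovered by correlation. The toy sketch below, including its seed and strength parameters, is an assumption for illustration, and it round-trips the image through JPEG to show how robustness to re-encoding can be checked:

```python
# Toy spread-spectrum watermark; illustrative only, not SASHA's algorithm.
import io

import numpy as np
from PIL import Image

def keyed_pattern(shape, seed: int) -> np.ndarray:
    """Pseudorandom +/-1 pattern derived from a secret seed: the 'needles'."""
    return np.random.default_rng(seed).choice([-1.0, 1.0], size=shape)

def embed(pixels: np.ndarray, seed: int, strength: float = 3.0) -> np.ndarray:
    """Add a faint keyed pattern across every pixel of a grayscale image."""
    marked = pixels.astype(np.float64) + strength * keyed_pattern(pixels.shape, seed)
    return np.clip(marked, 0, 255).astype(np.uint8)

def detect(pixels: np.ndarray, seed: int) -> float:
    """Correlate against the keyed pattern: scores near the embed strength
    mean the signature is present; scores near zero mean it is not."""
    residual = pixels.astype(np.float64) - pixels.mean()
    return float((residual * keyed_pattern(pixels.shape, seed)).mean())

def jpeg_roundtrip(pixels: np.ndarray, quality: int = 70) -> np.ndarray:
    """Simulate a platform re-encoding the image on upload."""
    buf = io.BytesIO()
    Image.fromarray(pixels).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return np.asarray(Image.open(buf))

photo = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)  # stand-in photo
marked = embed(photo, seed=42)

print(detect(marked, seed=42))                  # ~3.0: signature found
print(detect(photo, seed=42))                   # ~0.0: no signature
print(detect(jpeg_roundtrip(marked), seed=42))  # typically still well above the unmarked baseline
```

A real scheme has to survive far harsher transformations (rotation, cropping, screen photography), which is where SASHA’s research effort lies; this toy only survives re-encoding.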

Further, as image abusers become more sophisticated, so does SASHA. Its iterative learning process analyses new attacks and applies those insights to prevent future ones.

Importantly, SASHA complies with strict EU privacy laws, including the GDPR. It uses only an imprint of each image for recognition, ensuring the images themselves remain private.

The company cannot access or store the original image. Users can share their catalogued images through the app.

If another user tries to share an image through the SASHA app, embedded AI checks whether that particular image file is a derivative of one explicitly marked as “not shareable” by another user. If it is, the app blocks the share.
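SASHA hasn’t published how its imprint works. A common stand-in for this kind of recognition is a perceptual hash; the sketch below models the imprint as a 64-bit average hash and screens a share against a do-not-share list by Hamming distance. The function names and the threshold are illustrative assumptions:

```python
# Hypothetical imprint check; SASHA's real recognition scheme is proprietary.
from PIL import Image

def imprint(path: str) -> int:
    """64-bit average hash: shrink to 8x8 grayscale, one bit per pixel vs. the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def is_blocked(candidate: int, do_not_share: list[int], max_distance: int = 8) -> bool:
    """Block the share if the imprint is near-identical to a protected image's imprint."""
    return any(bin(candidate ^ protected).count("1") <= max_distance
               for protected in do_not_share)
```

Because only the compact imprint is stored and compared, neither SASHA nor the screening step ever needs the original image, which is the privacy property described above.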

This means that if someone finds their image somewhere it shouldn’t be, they can scan it and know exactly who shared it. They can send a takedown request — or a cease and desist letter — directly through the app. It gives people the power to act, not just wait for a platform to maybe care.

Embedding digital consent at the point of share

SASHA is currently concentrating on the B2C model, but its ultimate goal is the B2B2C model, which will make the B2C model redundant.

According to Eriksson, “right now, content moderation is reactive. Someone reports an image, a moderator reviews it, and decides if it breaks policy.

“That’s slow, expensive, and subjective.

“With SASHA, platforms can just ask the image: ‘Should you be here?’ And the image answers: ‘No, I was intended for this person, in this context.’ Boom — delete.

“No need to see the content. No need to guess. Just enforce the owner’s intent.”
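To make that concrete, here is a hedged sketch of what such a platform-side hook could look like: the upload pipeline reads the owner’s declared intent out of the image’s signature and enforces it without inspecting the content. The SharePolicy fields and the should_allow function are hypothetical, not SASHA’s API:

```python
# Hypothetical platform-side enforcement; not SASHA's actual API.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class SharePolicy:
    owner_id: str
    allowed_recipients: frozenset   # who the image was intended for
    reshareable: bool               # the owner's declared intent

def should_allow(policy: Optional[SharePolicy], uploader_id: str) -> bool:
    """Answer 'should this image be here?' from its embedded policy."""
    if policy is None:
        return True                 # no signature found: fall back to normal moderation
    if uploader_id == policy.owner_id:
        return True                 # owners may always post their own images
    return policy.reshareable and uploader_id in policy.allowed_recipients
```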

The biggest shift for SASHA will be when the tech is embedded into platforms. The company is in discussions with some of the biggest tech players to build this into messaging and upload functions.

The B2B2C model aims not only to make it easier for victim-survivors to identify a perpetrator of non-consensual image sharing, but also to prevent such sharing in the first place.

Imagine if, when someone tries to forward an image, their app says: “This photo was shared with you privately and cannot be reshared.” As Eriksson put it, “it’s like embedding consent into the image itself.”

The company is also in close dialogue with insurance and telco partners that offer digital safety services; they will sponsor SASHA premium access for their customers. SASHA will also offer APIs to platforms that want to integrate its protection.

The next step for SASHA is deepfakes. Eriksson explained that because SASHA’s system knows the origin and intent of an image, it can detect when content has been altered or used maliciously. “Deepfakes often start with real content. We can trace it.”

Embedding oversight into the infrastructure of trust

The team is acutely aware that technologies designed to protect users can also be misused — or unintentionally expose already vulnerable populations to new risks.

SASHA took the novel step of engaging Tech Legality, a consultancy firm specialising in human rights and digital technologies, to carry out an independent human rights impact assessment (HRIA) of the SASHA product. 

The assessment was tasked with analysing, through a human rights lens, the role SASHA’s product could play in preventing and responding to IBA and identity theft, and with identifying potential adverse human rights impacts and making recommendations to prevent or mitigate them, including risks to SASHA’s users and others affected by its products, in particular vulnerable groups, journalists, and political activists.

For example, there is a risk that governments could approach SASHA with requests for specific user data, either under national laws that may not always comply with international human rights law, or through informal requests that have no legal basis but still exert pressure.

Further, homosexuality is criminalised in an increasing number of jurisdictions (e.g. Ghana, Malaysia, Bangladesh, Pakistan). In these places, LGBTQ+ SASHA users who share intimate images of themselves may risk leaving an evidence trail tied to the production of materials deemed illegal under national law.

In these cases, the SASHA technology could be used to identify specific users for the purpose of a criminal investigation. 

Mindful of this risk, Eriksson explained, “If a regime came to us and asked who took a picture of a protest — we couldn’t tell them. Only the owner can unlock that.”
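One way to obtain that property, sketched below under stated assumptions, is envelope encryption: identifying data tied to a signature is stored only as ciphertext, under a key that never leaves the owner’s device. The sketch uses the Python cryptography package’s Fernet as a stand-in; SASHA’s actual scheme isn’t public:

```python
# Hypothetical owner-held-key design; not SASHA's actual cryptography.
from cryptography.fernet import Fernet

owner_key = Fernet.generate_key()    # assumption: generated and stored on-device only
on_device = Fernet(owner_key)

# The server stores only this opaque blob alongside the image signature.
ciphertext = on_device.encrypt(b'{"shared_with": "user-123"}')

# Without owner_key, the operator (or a government request) cannot read it.
print(on_device.decrypt(ciphertext))  # decryption is only possible on the owner's device
```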

Further, the SASHA product has been designed with the intention that adult users can use it as an avenue to safely share images, including sexual images, while protecting their images from unwanted onward sharing. 

However, children globally also share sexual images of themselves, commonly referred to as ‘sexting’. In many jurisdictions, including some countries in Europe, ‘sexting’ is criminalised because it is deemed self-production of child sexual abuse material. This means that if children used the SASHA app, it could produce evidence that may be used against the child in criminal proceedings.

Children may be incentivised to use the app because it promises the ‘safe sharing’ of intimate images, implying that their images will be kept private and that they will be safe from criminalisation.

In response, SASHA is made available only to users aged 16+, in line with European privacy laws. 

Eriksson admits, “That was a tough decision. Younger teens are vulnerable, but legally we can’t offer it to them without their guardians.”

"We’ve designed the system so that we cannot provide identifying information for minors under 16, even if asked. Our terms of use make that clear. We want to prevent harm, not contribute to it.”

Eriksson acknowledges, “If we succeed at scale, we’ll need oversight. We’re designing our system so that independent third parties — like EU regulators — can verify how data is used and accessed. The database of image signatures must not be misused or exploited.”

A new wave of tools turning digital evidence into justice

SASHA joins a growing wave of technologies designed not just to document harm, but to actively support justice, dignity, and agency in the digital age.

For example, in terms of offender identification and prosecution, TraffickCam is a mobile app that helps combat child sex abuse and human trafficking by letting travellers upload photos of hotel rooms they stay in. These images help build a searchable database used by investigators to match photos from trafficking cases with specific hotel locations, aiding in the identification and prosecution of offenders.

UK-founded eyeWitness to Atrocities equips human rights defenders, journalists, and investigators with tools to securely document and verify evidence of war crimes and human rights abuses. Its Android app captures photos, video, and audio with embedded metadata (e.g. GPS, time, motion), ensuring authenticity.

 All files are encrypted and uploaded to a secure repository with a verifiable chain of custody, suitable for legal use. eyeWitness then compiles this material into structured dossiers to support international justice efforts, working with courts, civil society groups, and global institutions.

This is just the beginning for SASHA, a mission-driven company changing the lived reality of the internet for many.

“Honestly, I’m really proud. Not just of the technology, but of the team. We’ve brought on some of the world’s best cryptographers and watermarking researchers, including people formerly at Google. They could work anywhere, but they chose SASHA because they believe in the mission,” shared Eriksson.

“We’ve created a tool that gives power back to individuals. That embeds responsibility into digital content. That lets us build an internet where privacy, consent, and accountability are not just ideals, but defaults.”

Lead image: Thomas Eriksson, CEO and founder of SASHA. Photo: uncredited. 
