March 8, 2024

The advancement of generative AI tools has created a new problem for the internet: the proliferation of synthetic nude images that resemble real people. On Thursday, Microsoft took a major step toward giving victims of revenge porn a tool to stop its Bing search engine from returning these images.

Microsoft announced a partnership with StopNCII, an organization that lets victims of revenge porn create a digital fingerprint of explicit images, real or synthetic, on their own device. StopNCII's partners then use that fingerprint, or "hash" as it's technically known, to remove the images from their platforms. Bing joins Facebook, Instagram, Threads, TikTok, Snapchat, Reddit, Pornhub and OnlyFans in partnering with StopNCII and using its fingerprints to stop the spread of revenge porn.
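StopNCII's exact hashing pipeline is not public, but the general technique is perceptual hashing: a fingerprint that survives re-encoding and small edits, so a platform can match copies of an image without ever receiving the image itself. Below is a minimal sketch of that idea using the open-source Python imagehash library; the function names and the distance threshold are illustrative assumptions, not StopNCII's implementation.

```python
# Illustrative sketch only -- not StopNCII's actual code. The open-source
# `imagehash` library computes perceptual hashes: compact fingerprints that
# stay similar when an image is resized, re-encoded, or lightly edited.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    # The hash is computed locally; only this fingerprint would ever need
    # to leave the victim's device, never the image itself.
    return imagehash.phash(Image.open(path))

def is_match(candidate: imagehash.ImageHash,
             known: imagehash.ImageHash,
             max_distance: int = 8) -> bool:
    # Hamming distance between the two 64-bit hashes; a small distance
    # means the images are near-duplicates. The threshold of 8 is an
    # assumed value for illustration.
    return (candidate - known) <= max_distance
```

Because the matching happens on hashes rather than pixels, a search engine can filter known images at query time without ever storing a copy of the original.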

In a blog post, Microsoft says it has already taken action on 268,000 explicit images that were being returned through Bing image search, in a pilot with StopNCII's database that ran through the end of August. Previously, Microsoft offered a direct reporting tool, but the company says that approach has proven insufficient.

“We have heard concerns from victims, experts and other stakeholders that user reports alone cannot effectively scale for impact or adequately address the risk that images can be accessed via search,” Microsoft said in its blog post Thursday.

You can imagine how much worse that problem would be on a significantly more popular search engine: Google.

Google Search offers its own tools to report and remove explicit images from its search results, but has faced criticism from former employees and victims for not partnering with StopNCII, according to a Wired investigation. Since 2020, Google users in South Korea have reported 170,000 search and YouTube links for unwanted sexual content, Wired reported.

The AI deepfake nude problem is already widespread. StopNCII's tools only work for people over the age of 18, but "undressing" sites are already creating problems for high school students across the country. Unfortunately, the United States has no federal law on AI deepfake pornography to hold anyone accountable, so the country relies on a patchwork of state and local laws to address the problem.

San Francisco prosecutors announced a lawsuit in August seeking to take down 16 of the most popular "undressing" sites. According to Wired's tracker of deepfake porn laws, 23 US states have passed laws to address non-consensual deepfakes, while proposals in nine states have failed.
