Grappling with the dark side of AI: Google Search and Microsoft Bing's deepfake porn 'problem' explained
The prevalence of nonconsensual deepfake pornography on popular search engines underscores the urgent need for effective measures to combat this disturbing trend.
In recent years, the internet has become a breeding ground for the disturbing phenomenon of nonconsensual deepfake pornography, raising concerns about privacy, consent, and the misuse of technology. A recent investigation by NBC News has shed light on how accessible such explicit content is through popular search engines like Google and Microsoft's Bing, prompting a closer examination of the steps these tech giants are taking to address the issue.
Deepfake pornography involves the use of advanced artificial intelligence to superimpose an individual's face onto explicit content, often creating deceptive and disturbing scenarios. NBC News discovered that nonconsensual deepfake images featuring the likenesses of female celebrities were prominently displayed in search results for various women's names when combined with terms like "deepfakes," "deepfake porn," or "fake nudes."
The investigation focused on 36 popular female celebrities and their search results on both Google and Bing. Shockingly, the results revealed that nonconsensual deepfake images and links to deepfake videos surfaced prominently in the top search results for 34 searches on Google and 35 searches on Bing. Over half of the top results directed users to popular deepfake websites or competing platforms.
Searching for terms like "fake nudes" on Google reportedly returned links to various apps and programs designed for creating and viewing nonconsensual deepfake porn within the first six results. Bing, on the other hand, provided dozens of results related to nonconsensual deepfake tools and websites before presenting an article discussing the harms associated with this disturbing phenomenon.
Both Google and Microsoft acknowledged the distress caused by such content and outlined their commitment to addressing the issue. A Google spokesperson said the company designs its ranking systems to avoid shocking users with unexpected harmful or explicit content, and is building broader safeguards so that known victims no longer have to request content removals one by one.
“We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search. Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for. As this space evolves, we’re in the process of building more expansive safeguards, with a particular focus on removing the need for known victims to request content removals one-by-one," a spokesperson told NBC News.
Similarly, Microsoft emphasized its prohibition of non-consensual intimate imagery (NCII) on its platforms and services. Microsoft explicitly stated that the distribution of NCII is a gross violation of personal privacy and dignity, and they are actively working to combat it.
"The distribution of non-consensual intimate imagery (NCII) is a gross violation of personal privacy and dignity with devastating effects for victims. Microsoft prohibits NCII on our platforms and services, including the soliciting of NCII or advocating for the production or redistribution of intimate imagery without a victim’s consent," the Microsoft spokesperson told NBC News.
As technology continues to evolve, it is imperative for tech companies to stay ahead of the curve in addressing the challenges posed by nonconsensual deepfake pornography. Stricter safeguards, comprehensive content moderation, and educational initiatives are essential to protect individuals from the devastating effects of this malicious misuse of technology.
Ultimately, the ease with which this content surfaces on major search engines underscores how much work remains. While Google and Microsoft have expressed their commitment to addressing the issue, the road ahead requires continuous effort to keep pace with technological advancements and protect individuals from the harmful consequences of nonconsensual deepfake content. The fight against deepfake pornography is a collective responsibility, shared by technology companies and society as a whole.