Deepfake images that graft a child’s face onto sexually explicit material are easily found in top image search results on leading search engines and mainstream social media platforms, despite a U.S. law that appears to ban such material.

Such images often feature faces of well-known celebrities before they turned 18 combined with an adult’s nude body. Two of the top 10 image search results for the term “fake nudes” on Microsoft’s Bing were sexually explicit deepfakes of female celebrities from when they were ages 12 and 15, according to a review conducted by NBC News.

One of those images also appeared on the first page of Google image search results for one of the celebrities’ names plus “fake nudes.”

Microsoft and Google both said they removed the specific material identified by NBC News from top search results. However, searches for terms like “fake nudes” and for specific celebrities’ names plus “fake nudes” and “deepfakes” still return pages of similar nonconsensual sexually explicit images on both platforms.

The remaining deepfake results include faces that appear to be adults as well as young-looking faces that NBC News has not investigated.

“Microsoft has a long-standing commitment to advancing child safety and removing illegal and harmful content from our services. We’ve removed this content and remain committed to strengthening our defenses and safeguarding our services from abusive content and conduct online,” a spokesperson for Microsoft said in a statement.

“Google Search has strong protections to limit the reach of abhorrent content that depicts CSAM [child sexual abuse material] or exploits minors, and these systems are effective against synthetic CSAM imagery as well,” a Google spokesperson said in a statement.

“We proactively detect, remove and report such content in Search, and have additional protections in place to filter and demote content that sexualizes minors. In line with our policies, we’ve removed the image that was reported to us, and we are constantly updating and improving our algorithms and policies to combat these evolving threats.”

In January, NBC News found two examples of sexually explicit fake media featuring teen Marvel star Xochitl Gomez that appeared at the top of X (formerly Twitter) search results for her name and the word “nudes.” X didn’t respond to a request for comment at the time, but the results have since disappeared.

A deepfake is fake or misleading media generated or edited with AI tools. “Nonconsensual sexually explicit deepfakes” generally refers to material created with AI technology, as distinguished from the computer-generated or manually edited fake nude photos that predate the current AI wave.

A federal law that two legal experts say prohibits the knowing production, distribution, reception or possession of computer-generated sexually explicit images of children has been in the U.S. Code since the late 1990s.

But the experts say the law has not been enforced against tech companies hosting images like the ones NBC News identified; rather, it has been used occasionally to convict defendants in cases involving individual perpetrators. There are a variety of reasons such material stays up on social media platforms and search engines, the experts said.

“Most people don’t really know what the law on child pornography is, because it’s not obvious. It doesn’t just mean ‘naked pictures of kids,’” said Mary Anne Franks, a professor at George Washington University Law School and a frequent adviser on legislation about nonconsensual sexually explicit material.