Taylor Swift Search Blocked on X/Twitter After Deluge of Nude AI Fakes

X, the social network owned by Elon Musk and formerly known as Twitter, has finally taken a rudimentary step at the platform level to try to slow the spread of fake graphic images of Taylor Swift.

As of Saturday, searches on X that include the text “Taylor Swift” returned an error message that said, “Something went wrong. Try reloading.” However, as users pointed out, X appears to be blocking only that specific text string; a query for, say, “Taylor AI Swift,” is still allowed on X.

Regarding the change to block searches for Taylor Swift, X head of business operations Joe Benarroch said in a statement to the Wall Street Journal, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.”

The move came several days after sexually explicit AI-generated images of Swift went viral across X, as well as other internet platforms.

On Friday, SAG-AFTRA issued a statement condemning the Swift fake images as “upsetting, harmful and deeply concerning” and said “the development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal.” Microsoft CEO Satya Nadella, in an interview with NBC News, called the fake Swift porn images “alarming and terrible” and said that “we have to act” and that “irrespective of what your standing on any particular issue is I think we all benefit when the online world is a safe world.”

The White House also weighed in on the issue. Asked if President Biden would support legislation making such AI-generated porn illegal, White House press secretary Karine Jean-Pierre responded, “We are alarmed by the reports of the circulation of images that you just laid out… There should be legislation, obviously, to deal with this issue.”

Sexually explicit deepfakes of Swift went viral on X on Wednesday, Jan. 24, generating more than 27 million views in 19 hours before the account that originally posted the images was suspended, NBC News reported.

In a post in the late evening on Jan. 25, X’s Safety team said the company was “actively removing” all identified images of nonconsensual nudity, which it said is “strictly prohibited” on the platform.

“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” the Safety on X account wrote in a post. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We’re committed to maintaining a safe and respectful environment for all users.”



