San Francisco: Microsoft has introduced more protections to its AI text-to-image generation tool Designer, which users were exploiting to create nonconsensual sexual images of celebrities.
The changes come after AI-generated nude images of American singer-songwriter Taylor Swift went viral on X last week. The images were traced to 4chan and a Telegram channel where people were using Designer to create AI-generated images of celebrities, reports 404 Media.
“We are investigating these reports and are taking appropriate action to address them,” a Microsoft spokesperson was quoted as saying.
“Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles,” the spokesperson added.
Microsoft stated that an ongoing investigation was unable to confirm whether the images of Swift on X were created using Designer. However, the company is continuing to strengthen its text prompt filtering and address the misuse of its services, the report mentioned.
Meanwhile, Microsoft Chairman and CEO Satya Nadella has said that the explicit Swift AI fakes are “alarming and terrible”.
In an interview with NBC Nightly News, Nadella said: “I think it behooves us to move fast on this.”
Swift is reportedly weighing possible legal action against the website responsible for generating the deepfakes.