The ad watchdog has joined the war against AI-powered “nudification” after issuing the first ban on an ad for one tool that implied viewers could digitally remove a woman’s clothing.
The YouTube ad for PixVideo AI Video Maker, seen in January, showed a “before” and “after” image of a young woman, with red scribble overlaid on her midriff in the former, and parts of her bare skin exposed in the latter. Text across the bottom of the picture stated: “Erase anything” followed by a heart-eyes emoji.
The Advertising Standards Authority received eight complaints from people who believed the ad sexualised and objectified women, and challenged whether it was irresponsible, offensive and harmful.
In response, PixVideo said it understood the ASA’s concerns about how the creative could be interpreted, particularly around the objectification of women and the implication that someone’s body could be digitally altered or exposed without consent. It recognised those interpretations were harmful and unacceptable, and understood why the ad was considered likely to cause serious offence under the CAP Code.
However, the firm said the concerns related to the ad’s presentation and messaging, rather than the intended or permitted use of its product. Its terms prohibited the creation of nude or sexually explicit content, and it had automated AI-based detection and blocking in place to prevent exposed or explicit imagery from being generated. It also said the app did not support, and was not designed to enable, the removal of clothing or the creation of nude imagery.
Even so, the ASA banned the ad. “Because the ad implied that viewers could use an app to remove a woman’s clothing, we considered it condoned digitally altering and exposing women’s bodies without their consent,” it said, adding that the ad included a “harmful gender stereotype and was likely to cause serious offence”.
The company has agreed not to show the ad again and has paused all advertising while it carries out an internal review.