The ASA has issued a ruling about an ad for an AI tool.
A paid-for YouTube ad for PixVideo – AI Video Maker showed a side-by-side "before" and "after" comparison image of a young woman. In the "before" image a red scribble was overlaid over the woman's midriff and appeared to originate from an animated cursor. In the "after" image the area that had been covered by the red scribble now revealed the woman's bare skin, including underneath her shorts. Text across the bottom of the image stated "Erase anything [heart-eyes emoji]".
Eight complainants, who believed the ad sexualised and objectified women, challenged whether it was irresponsible, offensive and harmful.
The company made clear that its terms of use prohibited the creation of nude or sexually explicit content and that it used automated AI-based detection and blocking to prevent exposed or explicit imagery from being generated. It also said the app did not support, and was not designed to enable, the removal of clothing or the creation of nude imagery. However, it accepted that the ad did not reflect those safeguards or restrictions and that its wording and visuals risked implying uses the company did not support or allow, which was a failure in creative execution and oversight.
The company said it had already removed the ad and voluntarily suspended all advertising across all media platforms to carry out a comprehensive internal audit and rectify its marketing materials, and that no advertising had resumed since. Alongside withdrawing the ads, it was upgrading its advertising review and approval processes, including stricter creative guidelines, enhanced internal review, and mandatory compliance checks to ensure future ads did not imply, encourage or normalise harmful, sexualised or non-consensual portrayals.
Whilst the ASA understood that the app did not permit users to create nude or sexually explicit content, it nonetheless considered that, by implying viewers could digitally remove a woman's clothing and expose her body, the ad reduced the woman to a sexual object. Furthermore, because the ad implied that viewers could use an app to remove a woman's clothing, it considered it condoned digitally altering and exposing women's bodies without their consent. As a result, it upheld the complaint.
AI-powered platforms that generate videos and images are an area where all the UK regulators (and the government) are focusing their attention, particularly where there is scope for those technologies to be used in ways that could cause serious harm or widespread offence. There is particular concern over the use of those technologies to create or adapt sexualised content based on real people who have not consented to that use. We have written about the government's plans to bring AI tools within the remit of the Online Safety Act 2023.
The European Parliament has called for a ban on nudification tools as part of the changes to the EU's AI Act in the Digital Omnibus proposals.
Generally speaking, you need to make sure that your processes for checking and approving marketing content are robust, whether you use humans, AI or both to generate it. If you are marketing tech and AI products, we can help you make sure that you don't make any claims that could be considered offensive or inappropriate.
