VSP update – A trio of recent developments for Video Sharing Platforms (and a glimpse ahead to what enforcement under the Online Safety regime might look like)

27 November 2023

The Online Safety Act 2023 recently received Royal Assent and Ofcom has begun the process of consulting on various codes of practice. Most, if not all, Video Sharing Platforms (VSPs) will be caught by the Act. However, in the meantime, VSPs still need to comply with the existing Video Sharing Platform rules, and recent activity from the regulators indicates that they are keen to remind VSPs that they cannot rest on their laurels until the Online Safety regime kicks in. Regulatory activity can also be taken as an indication of how enforcement action may be approached under the Online Safety regime, as there are many parallels between the two sets of regulations.

Ofcom investigates VSP “My Media World Ltd” for non-compliance

Ofcom has opened an investigation into My Media World Ltd, regarding the VSP service Onevsp (previously known as Brand New Tube). The investigation will consider My Media World Ltd’s compliance with its statutory obligations as the provider of a VSP service under Part 4B of the Communications Act 2003 (as amended by the Audiovisual Media Services Regulations 2020).

VSP providers must take and implement measures set out in Schedule 15A of the Communications Act that are appropriate to protect:

  • under 18s from videos containing “restricted material” which includes pornographic material and other material that might impair their physical, mental or moral development; and
  • the public from “relevant harmful material” which includes material that is likely to incite violence or hatred, or material that would amount to a criminal offence under laws relating to terrorism, child sexual exploitation and abuse, and racism and xenophobia.

Schedule 15A lists, amongst other measures, the inclusion of terms and conditions to the effect that:

  • if a person uploads a video to the service that contains any restricted material, the person must bring it to the attention of the provider of the service; and
  • a person must not upload a video to the service that contains relevant harmful material.

Where providers of VSP services take a Schedule 15A measure, they are required to implement it in such a way as to achieve the purpose or purposes for which the measure is appropriate.

Ofcom has concerns about the implementation and effectiveness of Onevsp’s terms of use. The investigation will therefore examine whether there are reasonable grounds for believing that My Media World has failed to take and/or implement such of the Schedule 15A measures as are appropriate to protect its users from relevant harmful material and/or under 18s from restricted material. Ofcom will gather further information and publish an update on the investigation in due course.

It's not just smoke…

In another regulatory development, the Advertising Standards Authority (ASA) recently issued a topical ruling about a vaping ad. Vaping is a current issue, with the Tobacco and Vapes Bill being part of the recent King’s Speech legislative programme and there being societal concern about the environmental impact of disposable vapes. However, the decision is also of interest for its reference to the video sharing regime.

A paid-for online ad for Daniels Vapes featured a video depicting someone dreaming about a shop with shelves of brightly coloured e-cigarette products.

The ASA challenged whether:

  • the ad breached the advertising codes by directly or indirectly promoting unlicensed nicotine-containing e-liquids and their components in online media; and
  • the platform had breached the VSP rules by including an ad for electronic cigarettes.

The ASA issued an upheld ruling on both points.

Rule 22.12 of the UK Code of Non-broadcast Advertising and Direct and Promotional Marketing (CAP Code) reflects a legislative ban in the Tobacco and Related Products Regulations 2016 regarding the advertising of unlicensed, nicotine-containing e-cigarettes in certain media (except for ads to the trade).

The ban means that ads placed in paid-for social media placements, advertisement features and contextually targeted branded content are likely to be prohibited. The CAP Code also requires marketers to ensure that their ads are compliant.

The ASA considered whether the ad directly or indirectly promoted a nicotine-containing e-cigarette. Unlicensed e-cigarettes featured prominently in the ad, which was for an e-cigarette store. The ASA considered that the ad contained promotional content for the product and that the restriction on online media under rule 22.12 consequently applied. Daniels Vapes had placed the ad in error. However, because the ad had the direct or indirect effect of promoting e-cigarettes that were not licensed as medicines in non-permitted media, the ASA concluded that it breached the CAP Code.

The ASA is the day-to-day regulator of VSP-controlled advertising. The CAP Code’s VSP Appendix, rule 31.3, states that advertisements for electronic cigarettes and electronic cigarette refill containers are prohibited. It applies to advertising that is “marketed, sold or arranged” by VSPs that are subject to statutory regulation in the UK under the Communications Act 2003 (as amended by the Audiovisual Media Services Regulations 2020). VSPs within UK jurisdiction are responsible for ensuring that ads appearing in paid-for space on their platforms comply with the Appendix rules.

In this case, the relevant platform's moderation system had not identified the ad as an e-cigarette promotion, but once notified the platform had taken swift action to remove it. However, because the ad was for a prohibited product, the ASA concluded that it breached the VSP Appendix.

Interestingly, the Online Safety Act only directly addresses ads which are fraudulent, and certain unlawful financial services adverts (and then only places obligations on the larger platforms). The government’s Online Advertising Programme is intended to complement the Online Safety Act but the government is planning a further consultation and draft legislation before we will understand exactly what will be covered. There’s definitely more to come in this space.

For now though, this decision shows that VSPs and other platforms need to take steps to make their moderation systems as robust as possible. It also illustrates (again) the importance of targeting ads correctly.

Pornography in the spotlight

Focussing now on another vice, back in July the government announced a review to investigate any gaps in UK regulation which allow exploitation and abuse to take place online, as well as to identify barriers to enforcing the criminal law. While the criminal law has been updated in recent years to tackle extreme and revenge pornography, there are currently different regimes addressing the publication and distribution of commercial pornographic material offline, such as videos, and online. The government wants to ensure that any pornography legislation and regulation operates consistently for all pornographic content.

The further scope of the review will be set out in due course; the government aims to complete it within a year. The Online Safety Act will already require pornography websites to prevent underage access, including by using age verification technologies. The age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child, and the Act requires Ofcom to produce guidance on this. In January 2023, Ofcom published a call for evidence ahead of a consultation, which is due to launch in December 2023. More recently, Ofcom has undertaken a survey on barriers to proving age on porn websites. This seems a sensible move and will no doubt inform the age-gating guidance, which will be eagerly anticipated by the porn industry. It will also be of interest to other VSPs and services caught by the Online Safety Act which are looking to understand what practical steps will be required, or will amount to best practice, to limit under-age access to certain content.
