November has proven to be a busy month in the Australian online safety regulatory landscape, with the launch of:

  • the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Department)'s consultation on proposed amendments to the Online Safety (Basic Online Safety Expectations) Determination 2022 (BOSE); and
  • the eSafety Commissioner’s consultation on the introduction of two new enforceable industry standards.

In this instalment of our online safety series, we summarise each of the consultations and their implications for affected service providers. You can see our previous briefing on the BOSE and industry codes generally here.

BOSE consultation

As part of the BOSE consultation, the Australian Government has published a draft amendment to the BOSE along with a consultation paper. Submissions on the BOSE consultation close on 16 February 2024. The draft amendment includes the following key proposals to amend the BOSE:

Generative AI and recommender systems (i.e. systems that prioritise content or make personalised content suggestions on online services)

Take reasonable steps to:

  • consider end-user safety and incorporate safety measures in AI capabilities and recommender systems on the services; 

  • proactively minimise the extent to which generative AI capabilities may be used to produce material or facilitate activity that is unlawful or harmful; and

  • ensure that recommender systems are designed to minimise the amplification of material or activity on the service that is unlawful or harmful. 

User empowerment

Take reasonable steps to make available controls that give end-users the choice and autonomy to support safe online interactions (e.g. blocking and muting controls, opt-in and opt-out measures for specific types of content, and the ability to change privacy and safety measures).

Child safety

  • Take reasonable steps to ensure that the best interests of the child are a primary consideration in the design and operation of any service that is used by, or accessible to, children.

  • Introduce new examples of the kinds of reasonable steps that could be taken to ensure that technical and other measures are in effect to prevent access by children to class 2 material (e.g. pornography), including the implementation of appropriate age assurance mechanisms based on the level of risk and harm of the material.

Additional requirements with respect to safety of services

Introduce new examples of the kinds of reasonable steps that could be taken to ensure end-users are able to use a service in a safe manner, including:

  • having processes for detecting and addressing hate speech;

  • assessing whether business decisions will have an adverse impact on the ability of end-users to use the service in a safe manner;

  • investing in systems, tools and processes to improve the prevention and detection of material or activity that is unlawful or harmful; and

  • having staff, systems, tools and processes to action reports and complaints within a reasonable time.

Transparency reporting

Publish transparency reports at regular intervals (of no more than 12 months) with information regarding safety measures deployed on the service, the effectiveness of those measures, and the enforcement of terms of use and standards of conduct. Such transparency reports would need to be specific to Australia, unless it would not be reasonably practicable to do so.


New industry standards consultation

Since our last update on the industry code process, the eSafety Commissioner has registered the internet search engine code (available here). Following the Commissioner’s finding that the proposed industry codes for ‘relevant electronic services’ and ‘designated internet services’ did not contain appropriate community safeguards, the Commissioner has now published draft standards for these services, which are the subject of this consultation. Submissions on these new industry standards close on 21 December 2023.

The proposed standards establish minimum compliance measures to address, minimise and prevent harms associated with access to the most harmful forms of online material (e.g. child sexual exploitation, pro-terror and drug-related material). Their application would be broad:

  • ‘Relevant electronic service’ is defined in the Online Safety Act 2021 (Cth) (Act) as a service that enables end-users to communicate with one another, including by email, instant messaging, SMS, MMS and chat services, as well as through online games and online dating services.
  • ‘Designated internet service’ is defined in the Act as a service which allows end-users to access material using an internet carriage service or which delivers material to persons who have equipment appropriate for receiving that material. This includes many apps and websites, as well as online storage services.  

What next?

Given the wide-ranging implications of the proposed amendments and industry standards, affected service providers should begin assessing how they would affect their services and their businesses more generally. They should also consider engaging in the consultation processes.

Australian online safety series

A spotlight on Australian regulation, specifically addressing online safety

Key contacts

Kwok Tang, Partner, Sydney

Tania Gray, Partner, Sydney

Christine Wong, Partner, Sydney

Rachel Holland, Solicitor, Sydney

Nayan Bhathela, Solicitor, Sydney

Eric Kong, Solicitor, Sydney
