If proposed changes to the Privacy Act are adopted, operators and users of AI could face far-reaching implications.

With interest in Artificial Intelligence (AI) surging following the advent of generative AI language models such as ChatGPT, businesses are starting to explore other uses for AI. Although there is currently no AI-specific legislation in place in Australia, the general law will still apply. But how does that law affect AI? This article looks at some of the future regulatory requirements applicable to the use of AI in Australia if the Government adopts the Attorney-General’s proposals under the Privacy Act Review Report1 (Report).

There are a number of features of AI that are potentially relevant to the proposed changes. In particular, AI systems (such as machine learning and deep learning models) can be self-trained by processing large datasets through a feedback loop, often without human supervision. Through that training, the AI can modify the algorithms it uses to make decisions or to generate output. The logic applied by an AI system is therefore often unknown, the so-called ‘black box’ effect.

This lack of transparency means that operators and users of AI systems may find it difficult to comply with some of the changes proposed in the Report. Other proposed changes will affect closely related areas that are likely to involve AI processing, such as data analytics and the use of biometric data.

Explaining the AI

Several of the recommendations in the Report require the operator to give the user (or customer) information relating to the type of data that is processed and the manner of processing.

Direct marketing - In the area of direct marketing, it will be necessary to provide information about targeting, including clear information about the use of algorithms and profiling to recommend content to individuals. The operator will therefore need to disclose the principles that apply to the targeting algorithm. However, this recommendation does not appear to require disclosure of the logic applied in the algorithm.

Automated decisions - A more stringent approach is set out in relation to automated decisions, although it does not go as far as providing an ‘opt-out’ right for individuals, as suggested earlier in the consultation process and as provided under the EU’s GDPR. The information that must be provided includes:

  • specifying the types of personal information that will be used in automated decisions; and
  • the provision of meaningful information about how such decisions are made.

This requires a greater level of detail to be provided in respect of decisions and, where decisions are made using black box AI, the operators of that system may not be able to provide meaningful information on how the decisions are made.

Can the AI process fairly and reasonably?

The Report also envisages a more general requirement that any collection, use or disclosure of personal information must be fair and reasonable in the circumstances. This requirement is likely to create hurdles where personal information is processed by an AI application.

Considerations in determining whether the processing is fair and reasonable include:

  • whether an individual would reasonably expect the personal information to be collected, used or disclosed in the circumstances – in this case, consider whether existing datasets can be used for training an AI or for data analytics where there was previously no expectation that this was likely;
  • whether it is reasonably necessary for, or directly related to, the functions and activities of the organisation – this might be relevant if a data owner wanted to partner with a third party to conduct data analytics on a dataset to identify potential alternative commercial offerings; and
  • the kind, sensitivity and amount of personal information being collected, used or disclosed.

A business using AI may find it difficult to identify the information that is collected or used, or even to know how much data is retained. In addition, if the AI is self-trained, it will be difficult to determine whether the AI-generated information is necessary for, or directly related to, the business’s functions or activities.

Targeting individuals

Applying the fair and reasonable requirement to the targeting of individuals, the Report recommends that businesses should provide clear information about the use of algorithms and profiling. As noted above in relation to automated decisions, use of black box AI will make it difficult for a business to provide the relevant information.

Data analysis is collection

The Report also recommends that the Act should be amended to clarify that ‘collection’ of data includes obtaining information from any source, including inferred or generated information. The Report expressly confirms that this change includes data generated using data analytics and machine learning. It seeks to clarify that any ‘collection’ will be from the point at which the data is generated.

Data analytics is frequently performed on existing datasets to identify patterns. Consequently, this use does not occur at a point where the customer can review a privacy notice. The Report acknowledges that this can lead to practical challenges, especially given that Australian Privacy Principle 5 requires notice (and possibly consent) at the time of collection. Businesses will need to assess the impact of this clarification on future data analytics and plan the required consents in advance.

Closely related areas

There are other recommendations that will have a potential impact in areas that are likely to involve the use of AI, such as:

  • the collection of precise geolocation data (which will require consent); or
  • the use of biometric data (which may be considered an activity with a ‘high privacy risk’ and would require the business to conduct a privacy impact assessment before starting the activity).

For those businesses that are looking to harness the benefits of AI, it will be important to understand the impacts of these upcoming changes. HSF can help guide you through the process.

  1. The Privacy Act Review Report, December 2022, published by the Attorney-General’s Department.

Key contacts

Alex Lundie, Senior Associate, Melbourne

Julian Lincoln, Partner, Head of TMT & Digital Australia, Melbourne

Kaman Tsoi, Special Counsel, Melbourne

Kwok Tang, Partner, Sydney

Susannah Wilkinson, Regional Head, Emerging Technology (APAC), Brisbane

Raymond Sun, Solicitor, Sydney
