As online booking site HealthEngine faces AUD$2.9m fine, pressure is on businesses to ensure privacy statements have rigour
Online health booking platform HealthEngine has been fined AUD$2.9m for deceptive conduct relating to the disclosure of personal information, and editing consumer reviews.
What is HealthEngine and what happened?
HealthEngine provides an online platform that connects consumers with over 70,000 health practitioners in Australia. Up until June 2018, it also allowed consumers to review the quality and services of health practitioners. However:
- Between 30 April 2014 and 30 June 2018, HealthEngine earned AUD$1.8m from disclosing non-health-related personal information to insurance brokers without obtaining consumer consent. The information, relating to approximately 135,000 consumers, included names, dates of birth, phone numbers and email addresses.
- Between 31 March 2015 and 1 March 2018, HealthEngine edited approximately 3,000 consumer reviews by removing negative comments about health practitioners, and failed to publish approximately 17,000 additional reviews.
The Australian Competition and Consumer Commission (ACCC) began investigating HealthEngine in July 2018 and subsequently launched legal proceedings in the Federal Court in 2019 for misleading and deceptive conduct.
ACCC’s call to action?
The ACCC successfully claimed that HealthEngine deprived consumers of the opportunity to control the disclosure of their personal information to insurance brokers. It was found that HealthEngine did not adequately disclose to consumers that their information would be shared with insurance brokers, meaning consumers were unable to make an informed choice regarding the handling of their personal information.
It was also found that HealthEngine manipulated how consumers made important health care decisions. Because HealthEngine failed to disclose other consumers' experiences with health practitioners listed on its platform, consumers may have chosen a health practitioner they otherwise would have avoided.
HealthEngine also received referral fees from insurance brokers, as well as indirectly receiving a fee from each consumer who booked a health practitioner through its platform.
ACCC’s growing role as a privacy/data regulator
The Office of the Australian Information Commissioner (OAIC) remains Australia’s primary privacy regulator, with responsibility under the Privacy Act 1988. However, the ACCC has increasingly been getting involved in privacy and data matters, often in collaboration with the OAIC. Examples include the new Consumer Data Right scheme and the Digital Platforms Inquiry which, amongst its many findings, recommended a number of Privacy Act reforms.
The ACCC is also increasingly taking action against organisations in relation to misleading and deceptive conduct concerning the manner in which such organisations handle personal information.
In addition to the HealthEngine case, the ACCC has commenced other cases in the Federal Court alleging misleading and deceptive conduct based on inconsistencies between representations made to consumers about the handling of personal information and the actual conduct in relation to that information.
In this context, the ACCC is seemingly taking on a role similar to its United States counterpart, the Federal Trade Commission (FTC). On its ‘Privacy and Security Enforcement’ page, the FTC describes its role as:
When companies tell consumers they will safeguard their personal information, the FTC can and does take law enforcement action to make sure that companies live up to these promises. The FTC has brought legal actions against organizations that have violated consumers’ privacy rights, or misled them by failing to maintain security for sensitive consumer information, or caused substantial consumer injury. In many of these cases, the FTC has charged the defendants with violating Section 5 of the FTC Act, which bars unfair and deceptive acts and practices in or affecting commerce.
Of course, in the US there is no equivalent regulator to our Privacy Commissioner or the OAIC.
Recent examples of FTC enforcement action relating to deceptive practices in the context of privacy include cases relating to:
- companies misrepresenting the level of control consumers had over their personal information;
- an email management company falsely telling consumers their personal emails would not be ‘touched’, then sharing their email receipts with its parent company;
- an education provider using deceptive lead generators who set up fake military recruiting websites to induce consumers to provide personal information, which was then used to promote the provider’s schools; and
- a phone manufacturer collecting SMS message contents which it did not need and which was contrary to promises made to consumers.
The potential impact of regulation on the use of personal information increasingly needs to be assessed not through a single regulatory lens but with a much broader focus.
Steps towards safer ground
To avoid being accused of misleading and deceptive conduct relating to personal information, we recommend organisations:
- Provide clear information to consumers on how personal information will be handled and used in their privacy policies, notices and consents.
- Avoid making representations that commit the organisation to a position that does not reflect how personal information is actually used, or a position that goes beyond what is required by applicable law.
- Be aware of representations that have already been made and the extent to which these may impact the scope of handling personal information.
- Ensure that any representations made regarding the use of personal information are comprehensive and updated regularly for accuracy, noting that an omission to provide sufficient detail can itself be considered a misleading representation.
- Communicate how personal information will be used in a clear manner and through an appropriate channel for the intended audience. Requiring consumers to take proactive steps to find the information or protect their privacy may undermine the ability to demonstrate that the information was communicated clearly.