Using facial recognition technology – lessons from the Bunnings Case and a new OAIC guide on assessing privacy risks

Brenton Steenkamp, Monique Azzopardi and Utsab Banskota
05 Dec 2024
6 minutes

In using technologies, it is important to be aware that the privacy risks can be heightened where that technology involves the collection, use or disclosure of sensitive information.

The Privacy Commissioner’s determination in Commissioner Initiated Investigation into Bunnings Group Ltd (Privacy) [2024] AICmr 230 (29 October 2024) (the Bunnings Case) offers several lessons for entities proposing to use, or already using, facial recognition technology (FRT). The Bunnings Case serves as a reminder of the importance of assessing technologies (especially newer technologies) through a privacy lens to confirm they are legally compliant, and of ensuring that any personal information (including sensitive information) collected, used or processed through such technologies is handled in accordance with applicable privacy laws.

The Office of the Australian Information Commissioner (OAIC) has also recently published “Facial recognition technology: a guide to assessing the privacy risks” (the OAIC FRT Guide), which focuses on the use of FRT in a commercial or retail setting. The OAIC FRT Guide complements the Privacy Commissioner’s findings in the Bunnings Case and offers practical guidance on navigating and addressing privacy risks associated with the use of FRT.

The Bunnings Case

In the Bunnings Case, the Privacy Commissioner, Carly Kind, determined that Bunnings Group breached the Privacy Act 1988 (Cth) by collecting personal and sensitive information through the use of FRT, contrary to the Australian Privacy Principles (APPs). Although Bunnings argued that its use of this technology was necessary for the safety of its staff and others, the Privacy Commissioner found that Bunnings’ use of FRT was privacy intrusive and disproportionately interfered with individuals’ privacy.

The Bunnings Case follows an earlier 2021 determination by the then Australian Information Commissioner and Privacy Commissioner, Angelene Falk, that 7-Eleven had collected sensitive biometric information (through an FRT tool used while surveying customers about their in-store experience) and that this interfered with consumers’ privacy, as the collection was not reasonably necessary for 7-Eleven’s functions and was conducted without the notice required by the APPs.

Bunnings’ use of FRT in a nutshell

  • Between November 2018 and November 2021, Bunnings operated FRT across multiple stores in Victoria and New South Wales. FRT, via CCTV, captured the facial images of individuals within those Bunnings stores, including customers, staff and contractors. The FRT system was a means by which Bunnings could create and maintain a database of individuals (Database) it considered posed a threat (for example, individuals who had demonstrated violence or engaged in “serious cases of theft”) and who could be detected when those individuals entered a store where FRT was in use.
  • Bunnings reportedly applied an algorithm to the facial image of each individual who entered its stores to create “searchable data” (a vector set) representing that individual’s facial image. Each vector set was compared against vector sets previously extracted from the faces of individuals whom Bunnings had enrolled in the Database. Where a match was identified, an alert was created in respect of that matched individual.
  • Bunnings reported that information of non-matched individuals was automatically deleted and it claimed that the transient processing of an image which occurred as part of the matching process took an average of “4.17 milliseconds”.
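The matching workflow described above (extract a vector from each face, compare it against the enrolled vectors, alert on a match, and discard non-matches) can be sketched in simplified form. The snippet below is an illustrative toy, not Bunnings’ actual system: the vectors, threshold and function names are all hypothetical, and real FRT systems use high-dimensional embeddings produced by trained models.

```python
import math

def euclidean_distance(a, b):
    # Distance between two face "vector sets" (embeddings).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_database(probe_vector, database, threshold=0.6):
    """Compare a probe vector against enrolled vectors.

    Returns the enrolled ID whose vector is closest, if within the
    threshold; otherwise None (the non-match case, in which the probe
    data would be deleted immediately after the comparison).
    """
    best_id, best_dist = None, float("inf")
    for person_id, enrolled_vector in database.items():
        dist = euclidean_distance(probe_vector, enrolled_vector)
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id if best_dist <= threshold else None

# Hypothetical enrolled "Database" of vector sets.
database = {
    "enrolled_1": [0.1, 0.9, 0.3],
    "enrolled_2": [0.8, 0.2, 0.5],
}

# A face close to an enrolled vector triggers an alert...
print(match_against_database([0.12, 0.88, 0.31], database))  # enrolled_1
# ...while an unrelated face produces no match (data then deleted).
print(match_against_database([0.0, 0.0, 0.0], database))  # None
```

Note that even in this simplified sketch, every probe vector must be computed and compared before a non-match can be discarded, which illustrates the Privacy Commissioner’s point that information is “collected” even if held only transiently.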

What did the Privacy Commissioner determine?

In summary, the Privacy Commissioner determined that:

  • Bunnings collected the sensitive information of enrolled individuals, as well as the sensitive information of all individuals who entered a relevant store during the period that FRT was in use. The sensitive information comprised biometric information and information or an opinion about the criminal records of a subset of enrolled individuals.
  • Bunnings interfered with the privacy of individuals and breached several APPs by:
    • not taking reasonable steps, as required by APP 1.2(a), to implement practices, procedures and systems to comply with the APPs;
    • collecting individuals’ sensitive information without their consent, in breach of APP 3.3;
    • not taking such steps as were reasonable in the circumstances to notify individuals of the facts, circumstances and purposes of collecting their personal and sensitive information, and the consequences of not collecting that information, in breach of APP 5; and
    • failing to include, at the relevant time, information within its privacy policy about the kinds of personal information that it collected and held, and how it collected and held that personal information, in breach of APP 1.3 and APP 1.4.

The Privacy Commissioner also addressed Bunnings’ defence that it could rely on one of the “permitted general situation” exceptions under items 1 and 2 of section 16A of the Privacy Act. One core requirement that Bunnings needed to prove to rely on these items was that it reasonably believed that the collection or other activity was necessary (respectively):

  • to lessen or prevent a serious threat to the life, health or safety of an individual, or to public health or safety; or
  • for Bunnings to take appropriate action in relation to unlawful activity or misconduct of a serious nature.

The Privacy Commissioner highlighted the limitations of the FRT system in achieving the above purposes and noted that it affected a much broader cohort of persons than the alternative options available would have. The Privacy Commissioner ultimately found that Bunnings could not have held a reasonable belief that the collection of personal information through the use of FRT was necessary to achieve those purposes. In doing so, the Privacy Commissioner noted that the term “necessary” requires something more than a collection, use or disclosure being merely helpful, desirable or convenient. While FRT may have been an efficient and cost-effective option available to Bunnings at the time, its deployment of FRT was the most privacy intrusive option available to it and was not proportionate to the benefit gained from its use.

The Privacy Commissioner stated: “the impact on the privacy of individuals outweighed the benefits that were or could be realised by the use of the FRT system in respect of lessening or preventing serious threat situations.” Bunnings’ use of the FRT system involved an “indiscriminate collection of personal information” in order to “take appropriate action in respect of actual or suspected unlawful activity by a relatively small number of individuals and in a limited set of circumstances”.

Bunnings has issued a statement on its website indicating that it will seek a review of the Privacy Commissioner’s decision before the Administrative Review Tribunal.

Facial recognition technology and the statutory tort for serious invasions of privacy

In its September 2023 response to the Privacy Act Review Report, the Australian Government agreed that further consideration should be given to enhanced risk assessment requirements in the context of FRT and other uses of biometric information. The Australian Government also agreed in principle that the collection of biometric information for use in FRT should be an exception to the small business exemption. Those specific reforms did not form part of the tranche 1 privacy reforms which were recently passed by Parliament. However, the statutory tort for serious invasions of privacy does form part of the tranche 1 reforms which recently passed (with some amendment). A high bar needs to be met for the tort to be established and a number of defences and exemptions are available. However, we can anticipate some situations where the use of FRT, or information obtained through FRT, could potentially give rise to a cause of action under the tort (for example, where information collected through the use of FRT is misused in an intentional or reckless manner and this results in a serious invasion of privacy in the absence of a countervailing public interest).

FRT is not fail-proof, and technical challenges such as false positives pose legal and other risks beyond privacy (for example, the risk of bias and discrimination). In addition to considering the privacy impacts associated with FRT, we recommend that entities ensure that the use and application of FRT is legally compliant more broadly.

Key takeaways

The Bunnings Case offers several lessons for organisations both from the perspective of using and deploying FRT and technologies more broadly. Set out below are some key takeaways for entities subject to the Privacy Act:

  • In using technologies, it is important to be aware that the privacy risks can be heightened where that technology involves the collection, use or disclosure of sensitive information. Sensitive information is generally afforded a higher degree of protection under the Privacy Act. For example, consent is generally required for the collection of sensitive information, unless a relevant exception under the Privacy Act applies. Whether an exception applies needs to be assessed having regard to the factual context and whether all elements of the exception are met. Entities should not rely on an exception under the Privacy Act in the absence of sufficient grounds.
  • Even if personal information is only held for a very brief or transient period of time, it can still be considered “collected” for the purposes of the Privacy Act.
  • Before deploying FRT systems, organisations should conduct a privacy impact assessment (PIA) to identify and consider potential privacy risks and impacts, and implement measures to address those risks and impacts.
  • Entities should develop written policies and procedures governing their use of FRT before implementing and adopting it.
  • An entity’s transparency in respect of the collection, use and disclosure of personal information is important, especially where newer technologies and/or sensitive information are involved. Entities subject to the Privacy Act should regularly review their privacy notices and consents to ensure they are up-to-date, accurate, consistent with the Privacy Act and address all relevant matters required under the APPs.
  • Entities should train relevant staff in relation to the privacy issues associated with FRT, the entity’s obligations under the APPs and its policies and procedures relevant to the operation of FRT systems.
  • Entities should periodically review and consider the efficacy of any ongoing use of FRT and conduct regular privacy reviews to re-assess any new or emerging privacy risks and confirm that the use of FRT remains in conformance with the APPs.
  • When selecting and implementing technologies like FRT, proportionality is critical: organisations must balance their objectives against their privacy and broader legal obligations. Organisations should only use technologies that involve the collection, use, disclosure or processing of sensitive information where those technologies have proven utility, are necessary, and comply with all relevant laws, including the Privacy Act.
  • Good privacy governance is vital. This will put organisations in a better position to manage privacy risks, avoid potential privacy breaches and build community trust.

The OAIC FRT Guide reiterates the above considerations and offers guidance and several tips for private sector organisations considering deploying FRT to carry out facial identification in a commercial or retail setting. The guide centres on a few core themes, including accuracy, necessity and proportionality, consent, and transparency and accountability.

As FRT involves the collection of sensitive information, one of the core challenges associated with its use is consent. The OAIC FRT Guide provides tips for ensuring that consent, in the context of FRT, is informed, voluntary, current and specific, and that the persons providing consent have the requisite capacity to do so.

Disclaimer
Clayton Utz communications are intended to provide commentary and general information. They should not be relied upon as legal advice. Formal legal advice should be sought in particular transactions or on matters of interest arising from this communication. Persons listed may not be admitted in all States and Territories.