This article was co-authored with Gabe Abfalter.
The Office of the Australian Information Commissioner (OAIC) has issued guidance to private sector organisations that are considering using facial recognition technology (FRT) for identification purposes in commercial or retail settings. The guidance follows a determination by the Privacy Commissioner, which found that a retailer's use of FRT breached the Privacy Act 1988 (Cth) (Privacy Act).
OAIC guidance on facial recognition technology
Biometric templates and biometric information, which are collected by FRT, are considered sensitive information under the Privacy Act. The Privacy Act imposes more stringent requirements on the collection, use and disclosure of sensitive information than apply to other personal information.
Relevantly, individuals must consent to the collection of sensitive information (unless an exception applies under the Privacy Act), the collection must be reasonably necessary for one or more of the organisation’s functions or activities, and the information must only be used for the purpose for which it was collected or for a directly related purpose.
The OAIC guidance states that organisations should implement a ‘privacy by design’ approach and consider the following factors regarding FRT.
- Necessity and proportionality (APP 3) – personal information for use in FRT must only be collected when it is necessary and proportionate in the circumstances and where the purpose cannot reasonably be achieved by less privacy-intrusive means.
- Consent and transparency (APP 3 and 5) – individuals need to be proactively provided with sufficient notice and information to allow them to provide meaningful consent to the collection of their information.
- Accuracy, bias and discrimination (APP 10) – organisations need to ensure that the biometric information used in FRT is accurate and steps need to be taken to address any risk of bias.
- Governance and ongoing assurance (APP 1) – organisations that decide to use FRT need to have clear governance arrangements in place, including privacy risk management practices and policies that are effectively implemented and regularly reviewed.
The guidance states that it is best practice for organisations to undertake a privacy impact assessment prior to implementing FRT.
Privacy Commissioner’s determination
The Privacy Commissioner investigated the use of FRT to capture images of people’s faces from in-store CCTV footage, which were then converted into a “vector set” representing an individual’s facial features. Each vector set was compared against a database of individuals deemed to be a risk, for example, due to past crime or violent behaviour.
Notable features of the FRT system and determination included:
- Collection (APP 3). If no match was found, the facial image and associated vector set were automatically deleted within approximately 4.17 milliseconds. Notably, the facial image and vector set of non-matches were stored only in RAM for this brief period. The Privacy Commissioner found that this constituted collection for the purpose of inclusion in a record.
- Notification (APP 5.1). The retailer posted two types of notices at the entrances of its stores. One type stated that the retailer used “video surveillance” and the other stated that “Video surveillance, which may include facial recognition, is utilised”. The Privacy Commissioner found that the notices were insufficient because they did not explicitly state that FRT was being used, the purpose for which facial images were being collected, or the consequences for individuals if the information was not collected.
- Consent (APP 3.3). The FRT system did not involve obtaining consent from individuals entering stores. Instead, the retailer sought to rely on a ‘permitted general situation’ under the Privacy Act, which allows sensitive information to be collected without consent in limited circumstances.
The Privacy Commissioner found that neither the ‘serious threat situation’ nor the ‘unlawful activity or misconduct situation’ applied. Those situations required the retailer to hold a reasonable belief that collecting facial images and vector sets was necessary to (1) lessen or prevent a serious threat to the life, health or safety of any individual; or (2) take appropriate action in relation to suspected unlawful activity or serious misconduct.
The Privacy Commissioner held that the retailer could not reasonably have believed the FRT system was necessary, as the system:
- Only detected individuals who were involved in a previous incident and did not address one-off incidents.
- Did nothing to deter misconduct, as individuals did not know whether they were added to the retailer’s database.
- Was just one tool in an array of strategies that the retailer used to prevent and address incidents.
- Disproportionately invaded the privacy of individuals through the ‘wholesale and indiscriminate collection’ of personal and sensitive information to take action against a small number of individuals in a limited set of circumstances. The retailer’s database included 448 individuals at its peak, while it likely scanned the faces of hundreds of thousands of individuals.
- Privacy policy and privacy impact assessment. The determination also found that the retailer’s privacy policy did not sufficiently address the use of FRT and that undertaking a privacy impact assessment (PIA) was a reasonable step the retailer could have taken (or, at a minimum, a privacy threshold assessment documenting the reasons the retailer believed a PIA was not necessary).
In summary, the determination found that the use of FRT breached the Privacy Act because the retailer did not:
- Obtain consent from individuals to collect their sensitive information, where no exceptions under the Privacy Act applied (APP 3 and 3.3).
- Take reasonable steps to notify individuals of the facts, circumstances and purpose of the collection of their personal information (APP 5.1).
- Take reasonable steps to implement practices and procedures to ensure compliance with the APPs, in that it failed to conduct a privacy impact assessment (APP 1.2).
- Include in its privacy policy information about the kinds of personal information collected and how that information was collected and held (APP 1.3).
The Privacy Commissioner issued a declaration requiring the destruction of all personal information and sensitive information collected via the FRT system within 12 months and one day of the publication of a statement on the retailer’s website (which is to be made within 30 days of the determination).
The Privacy Commissioner also specifically noted that the respondent was cooperative and forthcoming throughout the investigation.
Retailer to seek review of determination
Following the determination, the retailer announced that it will seek review before the Administrative Review Tribunal, citing a 50 per cent increase in abusive and threatening encounters in its stores over the last year and describing FRT as an important and proven tool for employee and customer safety.
Considerations for organisations operating or considering using facial recognition technology
The use of FRT has expanded rapidly in areas such as airports, sports stadiums, shopping centres and public transport. The determination and OAIC guidance raise questions for organisations already using, or considering using, FRT.
It will be particularly important for organisations to ensure that:
- Use of any FRT satisfies consent and notification requirements under the APPs.
- FRT system design and associated policies take account of the fact that holding and using personal information for a brief period (milliseconds in the current case) can be considered collection under the Privacy Act.
- Privacy policies are up to date and address the use of FRT.
- Consideration is given to whether a PIA is required.
Australia’s privacy laws and enforcement continue to progress
The determination has parallels with other prominent examples where FRT was found to breach privacy laws, such as the determination against Clearview AI, whose facial recognition tool unlawfully amassed a database of over 3 billion images scraped from publicly available sources.
This determination reflects some of the risks of using FRT that we outlined last year in our Privacy Act Review report, as the Commonwealth Government strengthens Australia’s privacy laws to align with international standards.