This article was co-authored with Donnacha Egan and Isabella Dudkowski.
On 29 November 2024, the first tranche of sweeping Australian privacy reforms contained in the Privacy and Other Legislation Amendment Bill 2024 (Cth) (Bill) passed both Houses of Parliament. It received Royal Assent on 10 December 2024, and the Privacy and Other Legislation Amendment Act 2024 (Cth) (Act) is now in effect. We previously considered the Bill’s implications when it was tabled on 12 September 2024. In this update, we summarise key amendments to the Privacy Act 1988 (Cth) and other legislation, including some of the ‘eleventh hour’ additions. We also outline strategies that may help organisations refocus their privacy practices to comply with the new law.
The Act represents the most substantial change to Australia’s privacy regime since its inception. In most cases, it requires organisations to undertake detailed reviews and remediation not only in respect of current and proposed data collection, use, storage and disclosure, but also to raise enterprise-wide awareness of privacy and implement new measures to minimise litigation exposure arising from individual claims for serious invasions of privacy.
This update covers:
- A new tort of ‘serious invasions of privacy’
- Expansion of regulatory enforcement powers
- Automated decision-making transparency requirements
- Overseas disclosures using a ‘whitelist’ of safe jurisdictions
- Technical and organisational measures to ensure security
- New criminal offence for doxxing
- Eligible data breach declarations requiring information sharing
Timeframes
The Bill received Royal Assent on 10 December 2024, and the amending provisions came into effect upon assent, with two major exceptions: the tort of serious invasions of privacy commences six months after Royal Assent, and certain provisions relating to automated decisions have a two-year grace period from Royal Assent.
Tort of serious invasions of privacy
In this first tranche of major updates to the Privacy Act 1988 (Cth), a statutory tort of serious invasions of privacy has been introduced. This means privacy will now be a personal right; for the first time, Australians will have a personal right of action to sue another party where that party has invaded the individual’s privacy by intruding upon their seclusion or misusing information relating to them.
In summary, an individual will have a civil cause of action in tort (allowing recovery of damages and/or the obtaining of an injunction) against another party if:
- that party invaded the individual’s privacy by intruding upon the individual’s seclusion and/or misusing information that relates to the individual; and
- a person in the position of the plaintiff would have had a reasonable expectation of privacy in all the circumstances; and
- the invasion of privacy was intentional or reckless; and
- the invasion of privacy was serious; and
- the public interest in the individual’s privacy outweighs any countervailing public interest.
The balancing of the public interest in the individual’s privacy against any countervailing public interest was added to the Act as an express element of the cause of action in the ‘last minute’ amendments. The Act provides some guidance on what may constitute a ‘countervailing public interest’, such as freedom of expression (including political communication and artistic expression), freedom of the media, public health and safety, and national security.
The exemptions to the new statutory tort were broadened in the amendments made by the Senate. Those exempt from liability for the tort include:
- journalists in the collection, preparation for publication or publication of journalistic material, or publication or distribution of journalistic material prepared for publication by a journalist;
- agencies and state and territory authorities (and their staff members) provided the agency or authority has invaded the individual’s privacy in good faith to perform its functions or exercise a power;
- law enforcement bodies, and disclosures to and from law enforcement bodies;
- intelligence agencies, and disclosures to and from intelligence agencies; and
- persons under 18 years of age.
Significantly, the most recent amendments to the Act saw the introduction of the ability for a court, at any stage in the proceedings, to determine whether an exemption applies in relation to the invasion of privacy.
In terms of the remedies available in successful actions, the court must not award aggravated damages but may award damages for emotional distress, or award exemplary or punitive damages in exceptional circumstances. Further, damages for non-economic loss and exemplary or punitive damages are subject to a cap equal to the greater of $478,550 and the maximum amount of damages for non-economic loss that may be awarded in defamation proceedings under Australian law.
Expanded powers of Office of the Australian Information Commissioner (OAIC)
The Act includes new enforcement and investigative powers for the OAIC. The OAIC will have access to standard regulatory tools under the Regulatory Powers (Standard Provisions) Act 2014 (Cth), including powers to:
- request information from Australian Privacy Principles (APP) entities about actual or suspected eligible data breaches;
- conduct assessments of compliance with the notifiable data breaches scheme; and
- investigate and resolve privacy breaches.
The Act also introduces a new tiered civil penalty regime with medium- and low-level penalties and infringement notices that can be issued by the OAIC directly. Finally, the Senate amendments also give the OAIC the power to issue compulsory ‘compliance notices’ to an entity as an alternative to seeking civil penalty orders, accepting enforceable undertakings or issuing infringement notices. These powers give the OAIC more flexibility and enforcement options, which is likely to lead to a significant increase in enforcement action, particularly in respect of ‘administrative’ breaches such as the lack of a compliant APP privacy policy or a failure to draw an individual’s attention to the ability to opt out of direct marketing communications.
Automated decisions using personal information
In an area that will force many organisations to review their current uses of technology in detail, the Act requires those engaging in automated decision-making to provide individuals with adequate information about these decisions by updating their privacy policies to achieve transparency.
What is an automated decision?
An automated decision is when a computer program uses an individual’s personal information to make a decision, or to do a thing that is substantially and directly related to the making of a decision, and the decision could reasonably be expected to significantly affect the rights or interests of the individual, whether adversely or beneficially. In essence, it is when a decision that greatly impacts an individual is made by automated means without any human involvement.
Examples of automated decisions include:
- a decision made under a provision of an Act or a legislative instrument to grant, or to refuse to grant, a benefit to the individual;
- a decision that affects the individual’s rights under a contract, agreement or arrangement; and
- a decision that affects the individual’s access to a significant service or support.
Real-world examples include a computer system deciding whether or not to grant a loan to an individual based on the personal information they have provided, or an automated system marking an exam and determining an individual’s grade. It is important to note that automated decisions are not limited to the use of artificial intelligence tools: many existing technologies with ‘hard coded’ decision-making logic would likely be captured.
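To make the concept concrete, the sketch below shows the kind of ‘hard coded’ decision logic (no AI involved) that could fall within the definition: a program that uses an individual’s personal information to grant or refuse a loan with no human review. The field names, thresholds and criteria are hypothetical and purely illustrative; they are not drawn from the Act or from any real lender’s practices.

```python
# Hypothetical illustration only: a "hard coded" rules engine (no AI involved)
# that uses an individual's personal information to decide a loan application.
# All names and thresholds are invented for this sketch.

from dataclasses import dataclass


@dataclass
class Applicant:
    name: str                # personal information about the individual
    annual_income: float
    has_prior_default: bool


def decide_loan(applicant: Applicant, loan_amount: float) -> bool:
    """Return True to grant the loan, False to refuse it.

    The decision is made entirely by this program, with no human review,
    and it significantly affects the applicant's interests - the kind of
    processing the new transparency requirements are aimed at.
    """
    if applicant.has_prior_default:
        return False
    # Simple hard-coded affordability rule
    return loan_amount <= applicant.annual_income * 0.5


if __name__ == "__main__":
    applicant = Applicant(name="Jane Citizen", annual_income=90_000, has_prior_default=False)
    print("Loan granted" if decide_loan(applicant, 40_000) else "Loan refused")
```

An organisation running logic of this kind would, once the requirements take effect, need to ensure its privacy policy explains that such decisions are made by automated means using personal information.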
When does it come into effect?
There is a 24-month grace period from the commencement of the Act before the automated decision requirements come into effect. However, organisations should be aware that, once in effect, the requirements will apply to all automated decisions, regardless of whether:
- the arrangement of the computer program to make the decision was made before or after the commencement date of the new law;
- the use of personal information by the computer program occurred before or after the commencement of the new law; and
- the personal information was acquired or created before or after the commencement of the new law.
Overseas disclosure of personal information (Australia’s ‘adequacy’ list)
The Act introduces a new ‘whitelist’ mechanism for disclosing personal information overseas to recipients who are in approved countries. Countries and binding schemes may be approved by the Minister if:
- they provide the same, or at least substantially similar, level of protection as the APPs provide; and
- there are mechanisms the individual can access to enforce this protection.
The approval may be subject to conditions for certain classes of entities and kinds of personal information.
This ‘whitelist’ approach has parallels with the adequacy decisions of the European Commission under the General Data Protection Regulation (GDPR) which allow for the transfer of personal data from the EU to countries whose privacy laws provide a level of protection equivalent to the GDPR, without the need for any additional safety mechanisms.
Companies should note that the whitelist mechanism applies to information disclosed after the commencement date of the new law, regardless of when it was acquired or created. The mechanism will benefit organisations by reducing their compliance burden when disclosing personal information overseas to recipients in approved countries.
Technical and organisational measures to ensure security of personal information
APP 11 is essentially the cornerstone of the APPs under Australian privacy law, requiring organisations to take ‘such steps as are reasonable in the circumstances’ to keep personal information secure. The Act introduces APP 11.3, which requires APP entities to consider ‘technical and organisational measures’ among the steps taken to meet the requirements of APP 11, mirroring language used in the European Union’s GDPR.
The purpose of introducing APP 11.3 is to address the common misconception that IT security is a purely technical problem and that organisations can rely solely on IT security to protect personal information. That approach can leave gaps in an organisation’s privacy and cyber defences by overlooking the importance of organisational measures, such as continuous training of staff on key privacy and cyber security issues, and the introduction of policies, standards and procedures. The expectation is now clear: APP entities must take continuing and proactive steps, including training relevant personnel. This training should be documented and reviewed regularly to ensure staff are up to date with, aware of, and able to respond effectively to evolving threats.
It is no longer sufficient for entities to solely have strong technical defences or strategies; these now need to be coupled with built-in organisational measures to comply with APP 11.
Doxxing and a new criminal offence under the Criminal Code
In a world-leading move, the Act introduces new criminal offences for ‘doxxing’ by amending the Criminal Code Act 1995 (Cth) (Criminal Code). Doxxing is the targeted release of personal information in a malicious manner using a carriage service. There are two new offences relating to doxxing:
- releasing personal data in a way that is menacing or harassing towards an individual; or
- releasing personal data relating to one or more members of a group, where the release is motivated by the person’s belief that the group is distinguished by race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin.
‘Personal data’ in this context has been given a more expansive definition than ‘personal information’ as defined under the Privacy Act. Personal data encompasses information about an individual or group member that enables them to be identified, contacted or located, including names, photographs or images, work or business addresses, places of education and worship.
The maximum penalty for a doxxing offence against an individual is 6 years’ imprisonment, and an offence against one or more members of a group is punishable by up to 7 years’ imprisonment. It is immaterial whether the group is actually distinguished by any of the characteristics listed above. As the doxxing offences are contained in the Criminal Code, they are not subject to the exemptions under the Privacy Act, meaning small businesses and journalists could also be charged with, and found to have committed, a doxxing offence.
Eligible data breach declarations
The Act introduces the ability for the Minister to make a declaration allowing the sharing of information to prevent or manage a large data breach. A declaration could allow financial institutions to share information about exposed personal information (such as Tax File Numbers and passport details) with government agencies as well as other organisations, including other financial institutions or competitors. This tool is only available for specific data breaches, for limited periods and purposes. Seeking and obtaining such a declaration from the Minister may be cumbersome, and the mechanism is only suitable for large data breaches where harm mitigation requires disclosure of personal information.
Where to from here?
The Attorney-General’s Department has indicated it plans to begin consulting in December 2024 on the second tranche of privacy reforms to which the government has agreed or agreed in principle. These could include the removal or narrowing of the employee records and small business exemptions, expanded individual rights such as a right to erasure, and a controller/processor distinction mirroring the GDPR.
Should the second tranche incorporate the remaining reforms, this may allow Australia to be re-assessed for an ‘adequacy decision’ from the EU which would facilitate the transfer of data from the EU to Australia without the need for additional safeguards.
What to do now
Reviewing privacy practices and policies is a fundamental starting point to ensure they are compliant, particularly regarding the adequacy of the information contained in privacy policies and collection statements, as well as direct marketing and consent collection mechanisms.
Another important action item is reviewing decision-making processes to identify automated decision-making already in use or plans for procurement, and updating privacy practices and privacy policies sufficiently to achieve the required transparency. The use of AI will be particularly relevant as it is increasingly being used for some forms of decision-making.
For organisations that regularly collect or amass personal information, it is critical to begin reviewing the organisation’s conduct against the risk of serious invasions of privacy, the potential litigation exposure this may create, and how new mitigating practices could reduce that risk.
It is also important for organisations to consider introducing regular and comprehensive training for staff on privacy and cyber risks, and additional organisational measures to ensure security.
A final takeaway is that the law requires immediate action and a roadmap for future privacy compliance. As the Australian Privacy Commissioner Carly Kind recently stated, 2025 is going to be a big year for privacy, and it is also likely to be a big year for enforcement action.