Australia | Publication | September 2024
The Privacy and Other Legislation Amendment Bill 2024 (Bill), containing the most anticipated changes to Australian privacy law since its inception in 1988, was tabled on 12 September 2024. If passed, the new privacy laws will commence from the date of Royal Assent, with some being given a longer runway for compliance. Enactment of the full range of reforms arising from the Attorney-General’s Department’s Privacy Act Review Report remains more distant, most likely due to the complexities of getting businesses ready to comply and of balancing competing interests. For privacy enthusiasts, this interim update to federal privacy law offers something meaningful, if below expectations.
The major updates to the Privacy Act 1988 (Cth) include enshrining the public interest in the protection of privacy, a statutory tort of serious invasions of privacy, transparency requirements for automated decision-making, fines for minor privacy breaches and recognition of the privacy of children through the introduction of a Children’s Online Privacy (COP) code. The Bill coincides with the release of another bill amending communications laws to address online misinformation and disinformation. The Criminal Code Act 1995 (Cth) will also be amended to prohibit the malicious release of personal information online (doxxing), with criminal sanctions of up to seven years’ imprisonment.
This modernisation of privacy law follows a lengthy review of the Privacy Act 1988 (Cth) commenced by the Attorney-General’s Department four years ago. The Department completed its review in late 2022, publishing its report in February 2023. The Federal Government announced its response in September 2023, signalling an appetite for a regime aligned with the GDPR, which has set a high watermark for privacy protections throughout the EU. Since then, the rapid rise of general purpose artificial intelligence with generative capabilities has agitated new privacy issues globally, demonstrating the ever-changing world of privacy, its increasing interaction with technology and the challenge the law faces in keeping pace.
While the extent and timing of all recommended privacy reforms remain elusive, some amendments were expedited in 2022 and are currently live. These changes were seemingly a response to a stream of persistent and pervasive data breaches across corporate Australia, and introduced significantly greater fines for serious or repeated interferences with privacy: the greater of $50 million, three times the value of the benefit obtained from the breach, or 30% of adjusted turnover in the relevant period.
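By way of a purely illustrative sketch of how the “greater of” cap operates (the figures and the helper function below are hypothetical and not drawn from the legislation):

```python
# Purely illustrative: hypothetical figures showing how the "greater of" cap
# for serious or repeated interferences with privacy is determined.
def maximum_penalty(benefit_from_breach: float, adjusted_turnover: float) -> float:
    """Return the highest of the three statutory limbs (in AUD)."""
    return max(
        50_000_000,                # fixed limb: $50 million
        3 * benefit_from_breach,   # three times the benefit obtained from the breach
        0.30 * adjusted_turnover,  # 30% of adjusted turnover in the relevant period
    )

# Hypothetical example: a $5m benefit obtained by an entity with $1bn adjusted turnover
print(maximum_penalty(5_000_000, 1_000_000_000))  # 300000000.0 – the turnover limb applies
```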
Other fast-tracked amendments included expanding the extra-territorial reach of the Act so that it applies to any entity carrying on business in Australia regardless of where it is domiciled, and further empowering the OAIC with rights to request information from APP entities about actual or suspected eligible data breaches, to conduct assessments of compliance with the notifiable data breaches scheme, and to investigate and resolve privacy breaches.
To date, the 2022 reforms represent the most significant changes since the introduction of the notifiable data breaches scheme in 2018. The question now is how deeply the latest changes will affect both Australian citizens and Australian businesses, and what will come of the remaining reforms. Another pressing question is how the changes will impact Australia’s adoption of general purpose AI, with a new EU-style AI Act also flagged earlier in September for accelerated introduction, alongside the publication of a Voluntary AI Safety Standard and a proposals paper foreshadowing 10 mandatory guardrails for high-risk settings.
Significantly, the Federal Government was supportive of the majority of the Attorney-General’s Department’s proposed reforms, stating that it ‘agreed’ to 25 proposals, ‘agreed in principle’ to 56 proposals and noted eight. Where a proposal is ‘agreed in principle’, the government is indicating that it considers the idea sound but that more work is needed to ensure it works in practice. As the waiting game continues for some of the proposals, which of those in the current ‘GDPR-lite’ Bill will be the most impactful reforms, and why?
A change that could bite all APP entities is the OAIC’s new power to issue infringement notices for civil penalties where there has been a minor breach of the Act, such as a failure to have a privacy policy in place or a failure to have a compliant one.
A change that will impact both organisations and individuals is the creation of a statutory tort of serious invasions of privacy. This means that, for the first time, individuals will have a personal right of action under the Act and, if their claim is successful, the ability to be compensated by those who seriously invade their privacy.
Its introduction could also, in part, help to address the malicious practice of doxxing (i.e. publishing or distributing a person’s identity and/or sensitive information to encourage others to harass or send violent or threatening messages to that person). Doxxing recently became a prominent issue after an incident led to threats and harassment of members of the Jewish community. The personal tort would operate in addition to the criminal sanctions for doxxing.
While such a tort needs to be aligned with the privacy-related laws of the states and territories to ensure a consistent national approach, it affords individuals an important direct cause of action. At the state and territory level, privacy-related laws such as surveillance laws already exist, limiting the ability to monitor people by video or other recordings (but with gaps and inconsistencies around the country). However, those state and territory laws are generally concerned with punishment of the offender via criminal offences and not compensation of the affected individual. The new doxxing offences, which are intended to be inserted into the Criminal Code Act 1995 (Cth), have penalties of up to six years’ imprisonment, or seven years where the doxxing relates to members of groups distinguished by race, religion, sexual orientation and various other characteristics.
In contrast, the new tort for serious invasions of privacy can give rise to awards of compensation. However, awards of compensation for non-economic loss are capped, and claims must be brought within one year of the claimant becoming aware of the invasion of privacy. A tort, accompanied by an avenue for criminal prosecution in the case of doxxing, significantly raises the stakes for privacy protection in Australia.
For businesses, the impact of this tort and criminal offence could extend to the misuse of information obtained about an employee, where that misuse has the potential to cause detriment to the employee’s employment or promotion prospects. To manage this exposure, businesses will likely need to make far more stringent efforts to ensure personal information is kept secure, access to it is limited, and data retention practices are reviewed so that personal information is not retained for longer than it should be.
For media organisations, and specifically public interest journalism, defences have been included for ‘journalistic material’. These defences are not absolute, however, and so the change will likely affect ‘clearance’ protocols prior to publication, similar to those that currently surround decisions to publish potentially defamatory material. The statutory protection limiting the media’s exposure to the criminal offence is a critical win for the protection of freedom of expression in a democratic society.
The new law relating to automated decision-making involves transparency requirements and extends to notifying individuals when automated decision-making occurs, including where the decision has a legal effect. Automated decision-making is on the rise and often underpins processes such as speed and seat-belt cameras, insurance claims processing, and the detection of scam emails and fraudulent payments. Notification via an organisation’s privacy policy will be required. However, the precise nature of that notification, in the context of a transparency requirement, will need careful consideration and an explanation of how the automated decisions are made, rather than a simple reference to the fact of automation.
The topic of automated decision-making returns us to the impact of AI on privacy or, rather, the impact of privacy on AI, and to whether these privacy law updates will slow innovation in Australia. AI is increasingly the engine for automated decision-making, and the privacy law changes have the potential to slow the adoption of AI tools.
Developers and users of general-purpose AI models may struggle to articulate a specific-enough purpose for the collection of personal information and find it hard to assess whether that information is reasonably necessary for the functions or activities of the model. Despite that struggle, given the time it has taken to introduce privacy reform, it is unlikely that weaker privacy protections will be on the political agenda in the near future, or that enforcement action will be delayed. The adoption of general-purpose AI models will soon require more due diligence work than ever to ensure that a business understands and manages the increased privacy risks.