Slow and steady: 2024 Privacy Act Reform Bill released

17 September 2024

by James Patto, Khushboo Ruhal and Olivia Sasse

On 12 September 2024, after a range of inquiries, reviews and reports, and almost seven years after the last major reform to the Privacy Act 1988 (Cth) (the ‘Act’), being the introduction of the mandatory notifiable data breach scheme, the Attorney-General, Mr Dreyfus, introduced the highly anticipated Privacy and Other Legislation Amendment Bill 2024 (the ‘Bill’) to Parliament. If enacted, the Bill will amend the Act to implement some of the recommendations that were agreed to by the Government in its response to the Privacy Act Review Report (the ‘Response’). However, the Bill does not include some of the more substantive proposals that had been anticipated, given the Government had agreed, or agreed in principle, to those proposals in the Response.

In the current climate, privacy reforms in Australia have gained significant momentum. Recent major data breaches have heightened public awareness and concern, prompting the Office of the Australian Information Commissioner (the ‘OAIC’) to ramp up its enforcement activities. The rapid advancement of artificial intelligence has introduced new privacy challenges, further intensifying the call for comprehensive reform.

This urgency was underscored by the Attorney-General's speech at the Privacy by Design Awards in July, where he criticised the existing framework as "woefully outdated and unfit for the digital age" and outlined a range of key reforms the Government was considering, including some major changes to the Act. The combination of these factors suggested that we would see transformative changes to privacy laws in Australia, something for which there appeared to be a strong appetite.

Despite the calls for swift and substantial reform of Australia’s privacy regime, the Bill reflects a more cautious and measured approach, much like the Response did in the first instance. Of the 89 proposals for legislative change, the Response agreed to 25, agreed in principle to 56 and noted 8. The Bill itself aims to implement 23 of the 25 legislative proposals agreed to in the Response, together with one of the ‘agreed-in-principle’ recommendations.

While the Attorney-General has promised further consultation on remaining reforms, there is uncertainty about introducing additional tranches of legislation before the 2025 federal election. Moreover, a change in Government could significantly impact whether the agreed reforms in the Response that have not been included in the Bill will still be pursued in their current form.

Reforms contained in the Bill

Set out below is a snapshot of the Bill, including all reforms, together with a high-level list of some key reforms that are not included in the Bill but may be introduced at some point in the future:

[Diagram: summary of the main inclusions and exclusions of the Bill outlined in this article]

Key takeaways:

  • The Government continues to take a measured and cautious approach to refreshing Australia’s privacy regime, on the basis that practical considerations need to be examined to balance enhanced privacy protection against any increase in regulatory burden.
  • Many comprehensive reforms agreed to by the Government in the Response, such as organisational accountability, amendments to exemptions, broadening the definition of personal information, and the 'fair and reasonable' test for handling personal information, are not included in the Bill.
  • The new penalty and enforcement regime in the Bill will be crucial in addressing the OAIC's challenges in enforcing privacy laws and imposing civil penalties. The new tiered civil penalty system offers flexibility for the regulator to take a more nuanced approach to enforcing the Act. The success of the regime will depend on adequate funding of the OAIC and its strategic use of enforcement measures.
  • We anticipate the OAIC will update its guidance on APP 11 to define "reasonable steps" more clearly. Organisations should proactively secure information by implementing governance, processes, procedures, and technologies to meet these standards.
  • As AI technology becomes more prevalent, privacy concerns are rising, highlighted by cases like Clearview AI and Facebook's photo scraping. The changes relating to automated decision-making aim to enhance transparency in these processes, but the two-year lead-in period seems lengthy, potentially leaving a transparency gap. While the amendment is a step towards transparency, its delayed implementation and lack of additional individual rights may limit its immediate impact. Future guidance and legislative enhancements will determine its effectiveness.
  • In the absence of a complete set of reforms, a clear Government roadmap and timeline for when other major changes may be introduced would be invaluable. This would set a date for a decision on whether each reform will be part of the next legislative amendments, providing some certainty to organisations about their priorities. Such clarity would help businesses focus on necessary compliance activities and prepare for future legislative changes, reducing the current state of reform limbo.
  • Forward planning will be critical: organisations should take the initiative and begin scoping and planning early, especially given the increased enforcement powers and new tiered penalty regime. Organisations should continue to refresh their privacy and data protection practices as "no-regret" activities. These include establishing robust privacy frameworks, setting clear data retention periods, and conducting mandatory Privacy Impact Assessments for high-risk activities, in addition to improving, more generally, data governance and transparency, consent and control mechanisms for individuals, and refining security (in particular, in respect of destruction and de-identification).
  • Getting ahead of the reforms will place organisations in a good position for when these reforms are introduced, as compliance activities can be accelerated (as and when necessary), allowing for not only expedited compliance, but also enhanced business value and insights from data holdings.
  • The development and use of AI pose significant privacy risks, with Australian privacy laws currently lagging international standards like the GDPR. Reports have highlighted instances where Australian personal information has been used in ways that diverge from community expectations. The anticipated AI guidelines by the OAIC will operate within an existing legal framework struggling to keep up with rapid technological changes. Addressing these issues is crucial to ensure responsible AI use and alignment with community standards, especially as technologies like quantum computing introduce further complexities.
  • The reforms represent a positive, albeit modest, step forward. Despite over two years of consultation, businesses still lack clarity on the Act's ultimate direction and timeline. This uncertainty complicates data governance and compliance priorities, leaving organisations unsure whether to invest in privacy upgrades without a definitive Government stance on major reforms. A change in Government could further complicate the reform process, adding to the ambiguity for organisations trying to plan and fund improvements.

Key reforms in the Bill and our observations

Details around the key changes contained in the Bill, together with our observations, are set out below:

Children’s Online Privacy Code (COP Code)

Details of reform: Information Commissioner to develop and register a COP Code, which would be an enforceable APP code that sets out how one or more of the APPs are to be applied or complied with in relation to the privacy of children (currently, this is only contained in OAIC guidance materials and is therefore non-binding).

Our observations:
  • The development of a COP Code by the Information Commissioner is a positive step towards enhancing online privacy protections for children, particularly on social media and electronic services. This initiative aims to align with international standards, such as those in the UK, and is supported by $3m in funding over three years for the OAIC.
  • However, it remains uncertain if the new code will address the recommendation for entities to consider the best interests of the child when collecting, using, or disclosing information. It's surprising that this wasn't included as a requirement.
  • Additionally, the Federal Government plans to ban social media use for children under a certain age, potentially up to 16 years old. This social media banning legislation will be developed in collaboration with states and territories and will include trials of age verification technology, which raises further privacy concerns.
  • If you collect children’s personal information in the course of business, you should already be considering how you can put in place additional processes, procedures and technology to enhance your data governance and protection environment.

Clarification of reasonable steps

Details of reform: Clarifying that reasonable steps to protect information in APP 11 include implementing both technical and organisational measures (such as encrypting data, securing access to systems and premises, and undertaking staff training).

Our observations:
  • The clarification that APP 11 includes both technical and organisational measures, such as data encryption and staff training, formalises best practices from OAIC guidance into a legal requirement.
  • This aligns our legislation with the GDPR and other international standards. We anticipate the OAIC will update its guidance on APP 11 to define "reasonable steps" more clearly. Organisations should proactively secure information by implementing governance, processes, procedures, and technologies to meet these standards. A simple illustration of one such technical measure is sketched below.
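
By way of illustration only, the minimal sketch below shows one example of the kind of 'technical measure' contemplated by the clarified APP 11: encrypting personal information at rest. The choice of a Python environment and the open-source cryptography library is an assumption made purely for illustration; appropriate measures will depend on each organisation's circumstances and should sit within a broader security program that also covers the organisational measures noted above.

    # Minimal sketch of one "technical measure" under APP 11: encrypting personal
    # information at rest using the open-source 'cryptography' library (assumed here
    # for illustration only). Real deployments also require key management, access
    # controls, logging and staff training (the "organisational measures").
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # store securely, e.g. in a dedicated key management service
    cipher = Fernet(key)

    record = b"example personal information record"
    ciphertext = cipher.encrypt(record)            # store the ciphertext, not the raw record
    assert cipher.decrypt(ciphertext) == record    # decrypt only where access is authorised
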
Overseas data flows

Details of reform: Introducing a mechanism to prescribe countries and binding schemes as providing substantially similar protection to the APPs and then allowing disclosure of information to organisations in those jurisdictions. This will occur where a country or binding scheme:

  • has been prescribed in regulations as protecting the information in a way that, overall, is at least substantially similar to the way in which the APPs protect the information; and
  • provides mechanisms that the individual can access to take action to enforce that protection.

Our observations:
  • Introducing a mechanism to recognise countries and binding schemes with protections similar to the APPs is a positive step for facilitating cross-border data flows. This simplifies legal challenges for global organisations, especially those dealing with Europe (and other countries with GDPR-like laws), the UK, New Zealand, and parts of the US and Canada.
  • As the EU revisits its standard contractual clauses under the GDPR, we anticipate that we will eventually see a similar approach integrated into the Act. The key question is whether these will be mandatory or part of a broader framework for countries lacking adequate protections.

Eligible data breaches

Details of reform: Empowering the Minister to make a reasonable, proportionate and necessary declaration (operating for no more than 12 months) enabling entities to handle personal information in a manner that would otherwise not be permitted under the APPs or certain secrecy provisions, in order to prevent or reduce the risk of harm to individuals in the event of an eligible data breach, where the Minister is satisfied that the declaration is necessary or appropriate for that purpose.

Our observations:
  • This amendment, likely further inspired by the Optus cyber incident, allows the Minister to temporarily enable entities to handle personal information in ways not usually permitted under the APPs to mitigate harm from data breaches.
  • It's a positive change, ensuring laws don't hinder harm minimisation. However, it requires careful balancing to avoid increasing risks to individuals through these sharing activities. There are harsh penalties for organisations that receive this information and do not handle it in accordance with the scheme, so beware!

Penalties and enforcement

Details of reform:
  1. Providing a non-exhaustive list of matters that a court must take into account when determining whether an interference with privacy is serious, including the kind of information involved, its sensitivity, the consequences or potential consequences, the number and kinds of individuals affected, whether the act was repeated, and any steps that were not taken;
  2. Creating new civil penalty provisions as follows:
    1. where an entity engages in an interference with the privacy of an individual

      Maximum penalty of $660,000 (2,000 units) for individuals or $3.3m (10,000 units) for bodies corporate (section 13H).

      (Note: a court may find a contravention of this section where the seriousness element required for the existing serious interference provision has not been made out, or may find contraventions of both provisions.)
    2. an APP entity prepares an incomplete eligible data breach statement under section 26WH or breaches the following:
      - APP 1.3: Requirement to have an APP privacy policy
      - APP 1.4: Contents of APP privacy policy
      - APP 2.1: Individuals may choose not to identify themselves in dealings with entities
      - APP 6.5: Written notice of certain uses or disclosures
      - APP 7.2(c) or 7.3(c): Simple means for individuals to opt out of direct marketing communications
      - APP 7.3(d): Requirement to draw attention to the ability to opt out of direct marketing communications
      - APP 7.7(a): Giving effect to request in a reasonable period
      - APP 7.7(b): Notification of source of information
      - APP 13.5: Dealing with requests
      - any other provision that is prescribed in the future.

      Maximum penalty of $66,000 (200 units) for individuals or $330,000 (1,000 units) for bodies corporate (section 13K).
  3. Enabling the Information Commissioner also to issue infringement notices in relation to the civil penalty provisions described immediately above (section 13K)

    Maximum penalty of $3,960 (12 units) for individuals, $19,800 (60 units) for non-listed bodies corporate and $66,000 (200 units) for listed bodies corporate.
  4. Expanding the powers of the Court in civil penalty proceedings to make any order it sees fit, provided the Court is satisfied there has been a contravention of a civil penalty provision.
    1. Includes orders for compensation and to take steps to minimise further impacts to individuals impacted by the interference with privacy.
    2. Notably, such orders may be made on the Court’s own initiative during proceedings or on application by the Commissioner or an affected individual within 6 years of the contravention.
    3. These expanded powers also apply retrospectively.

Our observations:
  • This reform is a pivotal part of the current legislative changes. The introduction of a non-exhaustive list of factors for courts to consider when determining serious privacy interference provides much-needed clarity. This codification aligns with existing OAIC guidance, ensuring consistency in legal interpretation.
  • The new tiered penalty system offers the regulator greater flexibility to address privacy breaches. It allows for a nuanced approach, particularly beneficial for dealing with smaller organisations. Maximum penalties range from $66,000 to $660,000 for individuals and from $330,000 to $3.3m for bodies corporate, depending on the provision contravened and the nature of the breach (an illustrative penalty unit calculation is set out at the end of these observations).
  • For those smaller operators above the small business threshold, these reforms act as a significant deterrent. If the OAIC can efficiently issue infringement notices, it could lead to widespread compliance improvements across the economy. The penalties for incomplete data breach statements and breaches of specific APPs further reinforce this.
  • One potential risk is that, without increased funding, the OAIC might opt to rely on the simpler penalty tiers, potentially limiting the effectiveness of enforcement and the deterrent effect for larger organisations. The deterrent effect of massive GDPR fines has been considerable, and the OAIC should not abandon that approach due to the complexity and challenges of enforcement. It will be important for the OAIC to use an appropriate mix of the enforcement measures now available to it to maximise their role in driving compliance.
  • The changes to the Court's powers regarding compensation orders allow individuals to seek compensation directly when a civil penalty is applied. This could temporarily address the absence of a direct right of action. However, it still depends on the OAIC initiating and succeeding in civil penalty proceedings, although that is now somewhat easier given the lower threshold for the new civil penalties.
  • The retrospective application of expanded Court powers could significantly impact ongoing civil penalty proceedings. If civil penalties are successfully applied by the OAIC, individuals may bring direct compensation claims against these organisations under the new provisions. This aspect of the reform underscores the critical importance of compliance and highlights the potential consequences of privacy breaches. This serves as a reminder of the evolving legal landscape and the need for vigilance in adhering to privacy regulations today.
  • Overall, while the reforms provide a robust framework for privacy enforcement, their success largely depends on the OAIC receiving adequate funding to carry out its expanded enforcement activities effectively and the approach that they ultimately take in utilising these new enforcement measures.
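
By way of illustration only, the dollar figures above can be reproduced from the penalty unit amounts in the Bill. The short worked example below is a sketch that assumes a Commonwealth penalty unit value of $330, consistent with the calculations in this article (see the note at the end of this article); the penalty unit value is indexed and subject to change.

    # Illustrative only: converting penalty units into dollar amounts, assuming a
    # Commonwealth penalty unit value of $330 (the value underlying the figures in
    # this article; the unit value is indexed and subject to change).
    PENALTY_UNIT_VALUE = 330  # AUD, assumed

    penalty_tiers = {
        "Interference with privacy (s 13H) - individuals": 2_000,
        "Interference with privacy (s 13H) - bodies corporate": 10_000,
        "Administrative breaches (s 13K) - individuals": 200,
        "Administrative breaches (s 13K) - bodies corporate": 1_000,
        "Infringement notice - individuals": 12,
        "Infringement notice - non-listed bodies corporate": 60,
        "Infringement notice - listed bodies corporate": 200,
    }

    for tier, units in penalty_tiers.items():
        print(f"{tier}: {units:,} units x ${PENALTY_UNIT_VALUE} = ${units * PENALTY_UNIT_VALUE:,}")
    # e.g. 2,000 units = $660,000; 10,000 units = $3,300,000; 60 units = $19,800
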

Automated Decisions and Privacy Policies

Details of reform: The amendment to APP 1.7 and 1.8 introduces a requirement for entities using automated decision-making (ADM) to disclose in their privacy policies the types of personal information used, and the decisions made by computer programs that could significantly affect individuals' rights or interests. This requirement will take effect 24 months after the amending legislation receives Royal Assent.

Our observations:
  • As AI technology becomes more prevalent, privacy concerns are increasingly coming to the forefront. Issues such as those seen with Clearview AI and Facebook's public photo scraping highlight the potential risks associated with AI. This amendment aims to address these concerns by enhancing transparency around ADM processes. However, the two-year lead-in period seems lengthy for what is essentially a transparency measure. During this time, significant automated decision-making will likely occur, potentially leaving a gap in transparency.
  • Originally, this amendment was paired with a right for individuals to seek further information about ADM in the Response, which has not been included in the final version. It's unclear if this right will be introduced as part of individual rights in future legislation. Given the privacy risks posed by AI, more comprehensive and immediate requirements might have been expected.
  • Overall, while the amendment is a step towards greater transparency in ADM, its delayed implementation and the absence of additional rights for individuals may limit its immediate impact. The effectiveness of this measure will depend on future OAIC guidance and potential legislative enhancements.
Statutory Tort

Details of reform: Schedule 2 of the Bill introduces a new statutory tort for serious invasions of privacy. This provision allows individuals to take legal action against others who invade their privacy, either by intruding upon their seclusion or misusing their information. Defences are available if the accused acted with lawful authority or in situations involving consent, necessity, or the defence of persons or property. If the invasion involves publishing information, defences similar to those in defamation cases may apply. Additionally, if a defendant identifies competing public interests, such as freedom of expression, the plaintiff must prove that their privacy interest outweighs these.

Exemptions are in place for intelligence agencies, those disclosing information to such agencies, and individuals under 18. Journalists and certain associated persons, as well as enforcement bodies, are also exempt in specific circumstances.

The court can grant remedies, including capped damages, and may issue interim injunctions to prevent further privacy invasions. Proceedings can be summarily dismissed under certain conditions.

Our observations:
  • This tort extends beyond APP entities, applying to individuals and small businesses, which broadens its scope significantly. It's unclear why this was prioritised over a direct right of action, but it may be due to the extensive groundwork laid by the ALRC in developing the tort model, which has not been done for a direct right of action under the Act.
  • As this provision comes into effect, it's expected to trigger an initial wave of litigation, which will help establish the legal framework and set precedents for future cases.
Other reforms in the Bill

Other legislative changes contained in the Bill include:

  • allowing the Minister to direct the Commissioner to develop an APP code or temporary APP code to provide greater clarity and specificity about the application of, or compliance with, the APPs, where the Minister is satisfied that it is in the public interest to develop the code;
  • giving the Minister the power to issue a more targeted emergency declaration, requiring that the declaration specify the kinds of personal information that may be handled, the entities which may handle the personal information, the entities to which the personal information may be disclosed, and the permitted purpose of the collection, use or disclosure of the personal information;
  • enabling the Information Commissioner to conduct public inquiries into specified matters, where directed or approved by the Minister, including examining acts and practices that may illustrate systemic or industry-wide issues relevant to individuals’ privacy;
  • allowing the Information Commissioner to issue a determination requiring the entity to assist affected individuals in replacing compromised credentials, or to engage service providers such as identity theft and cyber support providers to give support to affected individuals for a certain time period after the incident; and
  • empowering the OAIC to use the more robust investigation and monitoring powers in Parts 2 and 3 of the Regulatory Powers (Standard Provisions) Act 2014, such as:
    • entering premises with a judicial warrant or the informed and voluntary consent of the occupier;
    • seizing evidence of a kind not specified in the warrant in specific circumstances; and
    • using such force against things as is necessary and reasonable in the circumstances when executing a warrant.

Criminalisation of doxxing

The Bill also seeks to introduce targeted criminal offences for doxxing, making it an offence to use a carriage service to make available, publish or otherwise distribute an individual’s personal data online in a manner that would be menacing or harassing towards that individual.

The Bill, which applies a maximum penalty of 6 years’ imprisonment for the new offence, also applies a higher maximum penalty of 7 years’ imprisonment where a person or group is targeted based on protected characteristics, such as race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin.

Our perspectives and insight

These reforms to Australia's privacy regime represent a positive, albeit modest, step forward. They lay a solid foundation for future, more comprehensive changes. However, it's disappointing that more ambitious reforms weren't included in the Bill. After over two years of consultation (without factoring in other reviews such as the Digital Platforms Inquiry), businesses still lack clarity on the Act's ultimate direction and timeline.

Our previous analysis pointed out the risk of increased uncertainty for organisations regarding data governance and compliance priorities. While some level of reform is anticipated, it's unclear if this justifies significant investment in privacy upgrades without a definitive Government stance on major reforms. Organisations face continued ambiguity about whether to act now or wait for clearer regulatory expectations. Additionally, a change in Government could further complicate the reform process, adding further uncertainty for organisations attempting to plan for, and fund, data governance improvements.

In the absence of concrete reforms, a clear Government roadmap and timeline for major reforms would be invaluable, setting out a date by which a go / no-go decision will be made on whether each reform will form part of the next tranche of legislative amendments. This would provide at least some certainty to organisations waiting to see what they should be prioritising in terms of their uplift activities.

In this vein, it's puzzling why straightforward reforms promoting good data practices weren't included in this tranche of reforms, despite the Government's willingness to consider some agreed-in-principle changes (e.g. the statutory tort). These practices are not just compliance activities; they enhance business value and insights from data.

We continue to recommend that industry proactively enhance compliance and data governance to address data risks, maximise data value and create a solid foundation for future compliance. Key reforms to consider adopting early include:

  • Organisational accountability: Organisations should establish robust privacy and data governance frameworks. This involves understanding and mapping all personal information, both structured and unstructured, across the full data lifecycle and appointing a single officer responsible for privacy.
  • Data retention: APP entities should establish and periodically review maximum and minimum retention periods for personal information, considering the type, sensitivity, and purpose of the information, as well as organisational needs and legal obligations. These retention periods should be clearly specified in privacy policies.
  • High-risk activities: Mandatory Privacy Impact Assessments (PIAs) should be undertaken for activities that significantly impact individual privacy. Organisations should ensure relevant teams understand the PIA process and their roles within it.

These examples highlight recommendations that could easily have been included in this round of reforms.

The doxxing offences were not part of the Privacy Act review or the Response, emerging instead from a high-profile incident in February 2024 in which the personal information of members of the Jewish community was maliciously leaked. While the new offences may be necessary, it's unfortunate that significant effort appears to have been dedicated to them rather than to addressing long-criticised systemic issues within the regime.

The scale of change required to align Australia's privacy laws with leading international regulations, such as the GDPR, is substantial. However, many of the proposed reforms in the Response mirror those mandated by the GDPR, which has been in place for six years. This means that implementing these reforms in Australia would likely impose a lesser burden on organisations, which are able to benefit from the EU's experience and related technology tools. Many Australian organisations that operate internationally already look to the GDPR as a benchmark against which to assess their compliance.

Another significant concern is the Australian privacy risks involved with the development, training and use of AI by both local and international organisations. Reports have already highlighted instances where Australian personal information has been collected and used in ways that diverge from community expectations. In some of these cases, the GDPR has provided a level of protection that Australian law currently lacks, and without clear rules, this issue may worsen and become more complex. The development of AI guidelines by the OAIC is eagerly anticipated, but these guidelines will operate within the existing legal framework, which is already struggling to keep up with rapid technological changes. This challenge is further compounded by the potential impact of quantum technologies, which could introduce even more complexities into the privacy landscape. Addressing these issues is crucial to ensure that organisations can navigate their AI acceleration journey responsibly and in alignment with community standards.

For now, organisations remain in reform limbo, awaiting clarity on what will be expected of them in the coming years.


The information contained in this article is general in nature and is not intended to be a substitute for legal advice. Readers should obtain independent legal advice as to their specific circumstances.

Note: Calculations are based on the penalty unit value as at the date of this article.

Contact us

James Patto

Director - Digital, Cyber and Tech Law, Melbourne, PwC Australia

+61 431 275 693


Adrian Chotar

Partner, Digital, Cyber and Technology Law, Sydney, PwC Australia

+61 457 808 068
