Lessons for New Zealand’s Privacy Laws
Written by Sweta Bhattacharya, a student in LAWCOM733: Special Topic: Shaping the Law in a Tech-Driven Era (2025). This blog summarises their longer academic essay.

In today’s digital landscape, where smartphones and social media are integral to daily life, concerns have intensified regarding how online platforms collect and use personal information. No platform has embodied this concern more than Facebook (now Meta), which has over 2.11 billion daily users and is known for its data-driven advertising algorithms.
The 2018 Cambridge Analytica scandal exposed the misuse of personal data belonging to millions of users, highlighted regulatory gaps, and triggered global legal controversies.
Following these revelations, Facebook faced lawsuits worldwide, including a significant legal case in Australia: Facebook Inc v Australian Information Commissioner [2022] FCAFC 9 (“Facebook Inc v OAIC”), an appeal from Australian Information Commissioner v Facebook Inc (No 2) [2020] FCA 1307.
The case provides insights for New Zealand, whose Privacy Act 2020 (“2020 Act”) shares principles with Australia’s Privacy Act 1988 (“1988 Act”). NZ’s earlier Privacy Act 1993 (“1993 Act”) was introduced four years after Australia’s and contained comparable provisions. However, the 1993 Act lacked clear provisions for overseas organisations without a physical presence in NZ. Recognising such shortcomings, NZ enacted the 2020 Act, extending its extraterritorial application in certain respects.
The Scandal That Shook Trust
At the heart of the Cambridge Analytica scandal was an application called “This Is Your Digital Life” (“the app”). On the surface, this was just a personality quiz. Beneath that, it was harvesting vast amounts of data, not only from those who downloaded it but also from their Facebook friends, without their consent. These data points were later used to target people with political advertising, fuelling election campaigns. Suddenly, the world understood the real nature of “free” social media. Australia’s privacy watchdog, the Office of the Australian Information Commissioner (“OAIC”), alleged that between 2014 and 2015 the data of 311,127 Australian users was disclosed, most of whom had never installed the app.
Facebook in Court – Australian proceedings
In 2020, the OAIC took legal action against Facebook Inc and its subsidiary Facebook Ireland (collectively “Facebook”), alleging violations of the 1988 Act. OAIC claimed breaches of the Australian Privacy Principles (“APPs”), specifically APP 6 (limiting use to the original purposes) and APP 11 (requiring reasonable data security).
OAIC alleged that Facebook contravened Section 13G of the 1988 Act, which covers serious or repeated interferences with privacy and carries civil penalties. Since Facebook Inc is a foreign entity, OAIC was required to establish an “Australian link” under Sections 5B(3)(b) and 5B(3)(c), demonstrating that Facebook was carrying on business in Australia and had collected or held personal information there, which was relevant to OAIC’s claims under APPs 6 and 11.
Facebook Inc applied to set aside the leave granted to serve proceedings outside Australia. On 14 September 2020, Thawley J dismissed the application, finding a prima facie case and that OAIC had sufficiently demonstrated an “Australian link”. He inferred that Facebook carried on business, and collected and held personal information, in Australia, based in part on its installation of “cookies” on users’ devices and its provision of the Graph API (“API”) to Australian app developers.
Facebook Inc appealed (Facebook Ireland did not). The Full Federal Court (“the FFC”) determined whether Facebook Inc was:
- carrying on business in Australia under Section 5B(3)(b); and
- collecting or holding personal information in Australia under Section 5B(3)(c) (now repealed).
On 7 February 2022, the FFC dismissed the appeal, finding that Facebook Inc was indeed carrying on business in Australia through a “Data Transfer and Processing Agreement” with Facebook Ireland. Further, the installation of cookies on Australian devices and the provision of the API to local developers, even without physical presence or local revenue, constituted carrying on business. Regarding data, the FFC agreed that APP 11 covers a wide scope of personal data collected or held in Australia, even if undisclosed to an app. Facebook’s cookies collected data locally, although, because Facebook did not control the devices or servers, it did not “hold” the data there. The FFC held that there was a prima facie case that an Australian link was present and that the 1988 Act applied to Facebook Inc’s extraterritorial conduct. It also held that Thawley J was correct to refuse Facebook Inc’s application to set aside service.
Relevance to NZ: What Should We Learn?
In 2018, the NZ Privacy Commissioner (“Commissioner”) found that Facebook had breached the 1993 Act by refusing users access to personal information on their accounts, despite Facebook’s claim that the Act did not apply to an overseas entity. The Commissioner identified Facebook as an agency under section 2 and noted the Act’s extraterritorial reach under section 10. The finding highlighted Facebook’s failure to comply with Principle 6, focusing on its monetisation of, and service provision to, NZ users.
The principles in Facebook Inc v OAIC are highly relevant for interpreting the 2020 Act, especially its extraterritorial scope under section 4. If similar reasoning is adopted, activities such as Facebook installing cookies on NZ devices or making APIs available could meet the ‘carrying on business’ threshold in NZ. While it remains unclear whether NZ-based developers accessed the APIs, NZ users interacted with the app built on the API, connecting Facebook’s processing role to NZ. Therefore, even if NZ developers did not directly use the API, the app relied on its functionality, implicating Facebook in the data processing.
Facebook’s operations are ongoing, repetitive, and systematic in collecting and using personal data from NZ users for advertising and personalisation. Its current privacy policies confirm the collection of emails, phone numbers, age, friends/followers and user-interaction data, which are used to personalise ads and recommendations. Facebook uses cookies for platform operation, security and ad effectiveness. Although it may not directly target NZ users, Facebook provides location-based content, collects geographical data, and has nearly 3.4 million NZ monthly users. Even if processed overseas, this activity affects NZ directly, satisfying the criteria for “carrying on business” in NZ.
Section 4 of the 2020 Act extends to any overseas agency “carrying on business” in NZ, regardless of where the information is collected or held, or where the individual is located. An agency may be considered as such even if it has no commercial operations, physical office, receipt of payment, or profit in NZ. This aligns with the FFC’s finding in Australia, indicating that Facebook can be regulated in NZ even without local contracts or a physical presence. Activities like installing cookies, offering app logins or making APIs available on NZ users’ devices could suffice to meet the threshold of carrying on business under the 2020 Act. It is also pertinent to note that approximately 64,000 NZ users were affected by this breach.
Where Facebook Likely Breaches NZ Laws
So, what would the Cambridge Analytica scandal look like under the 2020 Act? Several Information Privacy Principles (“IPPs”) appear relevant:
- IPP 2 (source of personal information): Data was harvested not just from consenting users who downloaded the app but also from their Facebook friends, who had never interacted with it.
- IPP 3 (collection transparency): Even users who consented to the app were unaware that their data would be exploited for political profiling and advertising.
- IPP 5 (storage and security): Facebook’s failure to prevent unauthorised third-party access to sensitive data may indicate insufficient safeguards. This aligns with Australia’s APP 11.
- IPP 10 (limits on use): Information was used for political advertising, not its original purpose, mirroring a breach of APP 6.
- IPP 11 (limits on disclosure): Facebook’s conduct facilitated the disclosure of user and non-user data to third parties without consent.
Further, from a technical standpoint, the FFC found that Facebook Inc’s deployment of cookies on Australian users’ devices was a key business activity, central to its global operations rather than incidental. Through cookies, Facebook tracked users for authentication, content delivery and advertising. Under the 2020 Act, setting cookies would likely count as collecting personal information, triggering transparency and consent requirements.
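To see why courts treated cookies as data collection rather than a neutral technical convenience, consider a minimal sketch of the mechanism. This is a toy model, not Facebook’s actual code; the cookie name and server logic are purely illustrative.

```python
# Toy sketch: how a persistent cookie identifier lets a server link
# separate requests from the same device into one behavioural profile.
import uuid


class TrackingServer:
    def __init__(self):
        self.profiles = {}  # visitor id -> list of pages visited

    def handle_request(self, page, cookies):
        # If the browser presents no identifier, issue one (the "cookie").
        visitor_id = cookies.get("visitor_id") or str(uuid.uuid4())
        # Every later request carrying this id is tied to the same profile,
        # supporting authentication, content delivery and ad targeting.
        self.profiles.setdefault(visitor_id, []).append(page)
        return {"visitor_id": visitor_id}  # the Set-Cookie equivalent


server = TrackingServer()
jar = {}  # the browser's cookie jar, sent back with each request
for page in ["/news", "/friends", "/ads"]:
    jar = server.handle_request(page, jar)

# One device, one identifier, a complete browsing history:
print(server.profiles[jar["visitor_id"]])  # ['/news', '/friends', '/ads']
```

The point the sketch makes is the legal one: once an identifier is written to a user’s device, every subsequent interaction is collected against that identifier, which is why placing the cookie itself can plausibly be characterised as collecting personal information in the jurisdiction where the device sits.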
The FFC also focused on Facebook Inc’s API, which enabled app developers in Australia to use Facebook login and indirectly facilitated the data collection behind the Cambridge Analytica scandal. The core problem stemmed from the design of the API’s extended permissions, which allowed third-party apps not only to access personal data from users who directly engaged with them, but also to collect extensive information about those users’ Facebook friends without their knowledge or consent. Such practices would likely breach NZ’s standards for lawful, transparent and fair data handling, especially in a digital context where global platforms operate seamlessly across borders. The FFC’s reasoning thus sets a vital precedent for regulatory reach and enforcement.
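The consent gap created by extended permissions can be illustrated with a small toy model. This is a hypothetical sketch, not the real Graph API: the data, the `api_fetch` function and the `friends_likes` permission string are assumptions made for illustration only.

```python
# Toy model: an "extended permission" lets an app that ONE user authorised
# reach that user's friends' data, although the friends never consented.
social_graph = {
    "alice": {"likes": ["politics"], "friends": ["bob", "carol"]},
    "bob":   {"likes": ["sport"],    "friends": ["alice"]},
    "carol": {"likes": ["music"],    "friends": ["alice"]},
}


def api_fetch(user, granted_permissions):
    """Return everything the app may see after one user consents."""
    harvested = {user: social_graph[user]["likes"]}
    if "friends_likes" in granted_permissions:  # the extended permission
        for friend in social_graph[user]["friends"]:
            # The friends never installed the app, yet their data flows out.
            harvested[friend] = social_graph[friend]["likes"]
    return harvested


# Only Alice consents, but the app receives Bob's and Carol's data too.
data = api_fetch("alice", {"friends_likes"})
print(sorted(data))  # ['alice', 'bob', 'carol']
```

The sketch shows the structural issue the blog describes: a single consent fans out across the social graph, so the number of affected individuals vastly exceeds the number who ever saw a consent screen.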
The Gaps in the 2020 Act
NZ’s 2020 Act strengthens protections for personal information but faces limitations in the digital age. Despite enhanced provisions, enforcement remains challenging due to the Commissioner’s restricted powers. While the Commissioner can investigate and mediate, they cannot issue binding rulings or compel compliance directly. Instead, unresolved complaints require referral to the Human Rights Review Tribunal (“HRRT”), creating delays and discouraging complainants. NZ’s small market size also reduces incentives for global tech firms to prioritise compliance, often treating NZ as a test case.
Further, NZ differs in its handling of sensitive personal information. While the 1988 Act provides additional safeguards for data such as ethnicity or political views, NZ does not categorise such data separately.
The penalties under the 2020 Act are limited, with maximum fines of NZD 10,000 for specific offences, an amount arguably too low to deter multinationals like Facebook. Further, there are no civil penalties under the 2020 Act, and the highest HRRT award to date has been NZD 168,000. In contrast, Australia’s 1988 Act imposes substantial civil penalties, up to AUD 50 million for corporations, giving regulators stronger enforcement tools.
The European Union’s General Data Protection Regulation (“GDPR”) sets even higher standards, including fines reaching millions or proportional to company turnover, alongside expanded individual rights and enforcement mechanisms. Such contrast highlights significant gaps in NZ’s framework, particularly around enforcement capacity and financial sanctions. Without stronger deterrents, the Act risks being less effective in holding powerful digital companies accountable in an increasingly complex global data environment.
A Call for Action
Considering the legal weaknesses exposed by Facebook Inc v OAIC, NZ requires reforms to strengthen the 2020 Act. One key reform is enhancing the powers of the Commissioner, who should have the authority to conduct investigations on their own initiative and to issue binding determinations and administrative penalties without relying on the HRRT. This would expedite enforcement and signal a proactive, protective stance for New Zealand.
Similarly, the penalty framework needs expansion. The current penalty levels are minuscule. Introducing a civil penalty regime with scalable fines, calculated on company turnover and the severity of the breach, would create a credible and proportionate deterrent.
Further, improving jurisdictional clarity, particularly around the term “carrying on business”, is essential. Without a clear and operational definition, foreign companies risk evading privacy obligations by exploiting jurisdictional ambiguities. Defining this term in statute would:
- enable regulators to confidently assert jurisdiction;
- assist courts in interpreting cross-border claims; and
- prevent companies from avoiding liability through technical loopholes.
The 2020 Act could also be amended to create a separate category of “sensitive personal information”, drawing inspiration from the 1988 Act and the GDPR. The amendment could be made under Section 7(1), encompassing data such as racial or ethnic origin, political or religious beliefs, sexual orientation, and criminal records. This would allow:
- heightened standards of consent;
- increased security obligations; and
- stricter limitations on the use of such information.
Conclusion
The Cambridge Analytica scandal revealed how easily personal data can be collected and misused through technical tools like cookies and APIs. Australia’s case against Facebook showed that courts can hold global platforms accountable, even without a local physical presence or revenue.
For NZ, the question is whether we have the legal tools to do the same. The 2020 Act was a step forward, but with minuscule penalties, limited enforcement powers and no sensitive-data category, it risks being too weak in practice. If we want to protect New Zealanders’ trust in the digital age, we need laws that see through the technical smoke and mirrors. Otherwise, we risk becoming a testing ground for companies willing to exploit legally grey areas.
The lesson from Cambridge Analytica is clear: when technology evolves faster than the law, it’s ordinary people who pay the price.




