Why Your Data Matters, and Can AI Protect It?

Sep 9, 2025 | Blogs

by Brenden Nielsen-Stoudt
Based on a paper for “Shaping the Law in Tech-Driven Era”, LAWCOM 733 @ University of Auckland
The Hidden Life of Your Data

Every click, swipe, and search you make leaves behind a trace. For years, most people ignored it, assuming their data was invisible or insignificant. Not anymore. A recent Time magazine survey found that 74% of Americans now consider their personal data “very important.”

But awareness is not the same as control. While people increasingly recognise the value of their information, many remain uncertain or indifferent about where it goes, who uses it, and to what ends. That uncertainty isn’t an accident. It’s the product of an industry designed to profit from our digital lives, often in ways we never agreed to.

Entire industries known as “data brokers” exist to collect, analyse, and sell personal information. They harvest details from public records, social media, purchases, location trackers, and almost anything we do online or offline, then package and sell this information to advertisers, insurers, employers, and even political campaigns. As Justin Sherman notes in “Federal Privacy Rules Must Get ‘Data Broker’”, this is not a fringe activity but a thriving sector worth $250 billion today and projected to reach $411 billion by 2030 (Maximize Market Research).

The question is not whether this industry exists but whether we should regulate it. And if we don’t, what are the consequences?

More Than Just Numbers: How Personal Data Affects You

Take something as ordinary as a smartwatch. It tracks steps, sleep, heart rate, and location. On its own, that might seem harmless.

But what happens if your health insurer gains access to that data and raises your premiums because you didn’t meet last month’s step goal? It sounds dystopian, but it’s already happening — often with our unwitting consent. Deloitte found that 90% of people agree to terms and conditions without reading them, essentially signing away rights to their own data.

This example illustrates why stronger, clearer, and enforceable data protection frameworks are essential. Safeguards must be transparent, accessible, and genuinely protective of individuals.

GDPR: New Rules of the Road

One of the most significant steps in this direction has been the EU’s General Data Protection Regulation (GDPR). It empowers people to access their data, understand how it’s used, and request deletion in certain cases.

The GDPR has also influenced other countries. New Zealand’s Privacy Act 2020 borrows similar principles but with far lighter penalties: while the EU can impose fines of up to €20 million (around NZD $40 million), New Zealand’s maximum fine is capped at just $10,000. One might well ask why the penalty is so low, and why the NZ Government would not want stronger protection for our data (and, indeed, its own).

Nothing is perfect, though, and regulation is always a work in progress. Even the GDPR faces challenges: its effectiveness depends on how “personal data” is defined. Much data that seems non-personal can easily be linked back to an individual and therefore misused. As such, the debate around data protection intensifies.

The Definition Dilemma

At the heart of the regulation of data lies a deceptively simple question: what counts as personal data?

Different jurisdictions answer differently. Names and addresses clearly qualify, but what about search histories, app usage, or household energy patterns? Ambiguity creates loopholes that companies can (and do) exploit, undermining regulations like the GDPR.

A more consistent approach would be to treat all user-provided data as personal data by default. This would require organisations to implement strict safeguards, from encryption and regular audits to automated deletion tools. Here, artificial intelligence could become not just a threat but a solution.
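To make the idea concrete, here is a minimal sketch of what an automated deletion tool might look like under the default-personal assumption above. It is purely illustrative (the names `RETENTION` and `purge_expired` are hypothetical, not drawn from any cited framework), and a real system would also need encryption, audit logging, and legal review of the retention period.

```python
from datetime import datetime, timedelta

# Hypothetical retention window; in practice this would be set by law
# or policy, not hard-coded.
RETENTION = timedelta(days=365)

def purge_expired(records, now=None):
    """Return only records still within the retention window.

    Each record is a dict with a 'collected_at' datetime. Because all
    user-provided data is treated as personal by default, no record type
    is exempt from deletion.
    """
    now = now or datetime.utcnow()
    return [r for r in records if now - r["collected_at"] < RETENTION]
```

The point of the sketch is the default: deletion happens automatically unless a record is refreshed, rather than requiring the user to ask.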

AI: Threat and Opportunity

AI currently enables companies to analyse personal data at massive scale. But what if we used AI to protect data instead?

Imagine a system where AI automatically detects when data could identify you, then masks, flags, or deletes risky elements before misuse occurs. Large language models could also support regulators, helping them keep pace with fast-moving companies and evolving challenges. By automating much of the monitoring process, regulators could enforce privacy laws more effectively without enormous staffing costs.
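The masking step described above can be sketched very simply. The patterns and function below are hypothetical: a production system would use trained models and far broader detection than two regexes, but the shape of the idea (detect identifying elements, replace them before the data moves on) is the same.

```python
import re

# Hypothetical patterns for obviously identifying elements; a real
# system would rely on trained models rather than regexes alone.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{2,4}[ -]?\d{3}[ -]?\d{3,4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace detected identifiers with a labelled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text
```

Run over data before it leaves an organisation, a tool like this flags or strips risky elements automatically, which is exactly the kind of routine work regulators could delegate to AI at scale.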

Of course, this vision requires more than just technical tools. Secure systems, robust ethical oversight, and strong legal frameworks must underpin any AI-based protections.

A Roadmap for the Future

So what would it take to secure personal data in a sustainable way? Two key steps stand out:

  1. Universal Definitions
    Establish clear, universal definitions of personal data. Ambiguity enables exploitation; clarity creates accountability. Users, companies, and regulators must share a common understanding.
  2. AI-Assisted Oversight
    Equip regulators with AI-powered tools to monitor and enforce privacy laws. Automating oversight can reduce costs, accelerate enforcement, and turn data protection from a paper promise into a practical reality.

Ultimately, data should be treated as critical infrastructure — like water, electricity, or roads. It underpins modern life and should not remain under the unchecked control of corporations operating in the shadows.

Protecting What Makes Us Human

As our digital lives expand, so must our ability to control how our data is used and shared.

If we succeed, the fusion of law and technology can safeguard something deeper than information: the very essence of our humanity. Data is not just numbers on a screen. It is human in origin, seeded by our choices, shaped by our actions, and etched by our presence in the digital realm. It is the living record of our struggles, joys, and daily existence.

If we fail, we risk ceding power over our identities and freedoms to systems we can’t see and companies we can’t hold accountable. If we succeed, we build a digital world where privacy and human dignity are not afterthoughts, but foundations.

In a world increasingly driven by data, protecting our data means protecting ourselves.

Sources and references available upon request. AI was used minimally for formatting and proofreading purposes.

 
