How do PETs influence the interpretation of “personal data” under DPDPA and other laws?

Introduction
Privacy-Enhancing Technologies (PETs) are transforming how organizations handle, analyze, and protect personal data. These technologies, including differential privacy, homomorphic encryption, secure multi-party computation, and federated learning, are designed to ensure that sensitive information is processed in a manner that preserves privacy, even during computation. In an age of data-driven innovation, PETs provide a promising way to balance the utility of data with privacy compliance.

One of the most profound legal implications of PETs is how they challenge and potentially reshape the definition and interpretation of “personal data”—especially under regulatory frameworks like India’s Digital Personal Data Protection Act (DPDPA) 2023, the EU General Data Protection Regulation (GDPR), and the California Consumer Privacy Act (CCPA). Understanding the interplay between PETs and legal definitions of personal data is crucial for businesses, regulators, and technologists navigating today’s compliance landscape.


Understanding “Personal Data” in the Legal Context

A. DPDPA’s Definition
Under Section 2(t) of India’s DPDPA 2023, “personal data” means any data about an individual who is identifiable by or in relation to such data. The focus is on identifiability—either direct (e.g., names, Aadhaar numbers) or indirect (e.g., behavioral patterns or device IDs).

B. GDPR’s Definition
GDPR Article 4(1) defines personal data as any information relating to an identified or identifiable natural person. Identifiability includes data that, when combined with other information, could reasonably lead to the identification of a person.

C. CCPA/CPRA (California)
Under the CCPA and its amendment, the CPRA, personal information is information that identifies, relates to, describes, or could reasonably be linked to a particular consumer or household.

Across all these laws, the idea of identifiability—even when indirect or probabilistic—is key. If PETs can eliminate or sufficiently reduce this identifiability, the processed data may fall outside the legal scope of “personal data.”


Impact of PETs on Identifiability

PETs work by removing, obfuscating, or protecting identifying elements of data. They do this through various mechanisms:

  • Differential Privacy: Adds calibrated statistical noise to query outputs so that the presence or absence of any single individual has a provably bounded effect on the result, making it extremely difficult to infer individual contributions.

  • Homomorphic Encryption: Enables computation on encrypted data without revealing the underlying data.

  • Secure Multi-Party Computation (MPC): Distributes pieces of data among parties to compute results without sharing actual data.

  • Federated Learning: Keeps data on local devices while only aggregating model updates centrally.

These techniques reduce or eliminate the risk of re-identifying individuals—thereby directly influencing whether the data is still considered “personal.”
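To make the first of these mechanisms concrete, the following is a minimal sketch of the Laplace mechanism behind differential privacy, in pure Python. The function name `dp_count` and the sample ages are illustrative, not from any library; a production system would use a vetted implementation rather than this toy.

```python
import math
import random

def dp_count(values, predicate, epsilon):
    """Differentially private count: the true count plus Laplace noise.
    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so the noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    u = random.random() - 0.5          # uniform in (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    # Inverse-CDF sampling from Laplace(0, 1/epsilon)
    noise = -(1.0 / epsilon) * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical query: "how many people in this dataset are over 40?"
ages = [23, 35, 41, 29, 52, 38, 47]
noisy_answer = dp_count(ages, lambda a: a > 40, epsilon=1.0)
```

The released `noisy_answer` hovers around the true count (3 here) but no single individual's record can be confidently inferred from it, which is exactly the property that weakens the identifiability analysis discussed above.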


1. Can PET-Processed Data Fall Outside the Definition of “Personal Data”?

Yes, under certain conditions. Legal regimes generally acknowledge that if data has been irreversibly anonymized—such that the individual is not identifiable “by any means reasonably likely to be used”—then it may no longer be considered personal data.

A. Under DPDPA 2023
DPDPA does not explicitly define “anonymized data,” but in practice, if PETs like differential privacy are applied robustly to render data non-identifiable, it would no longer qualify as “personal data.” This allows for more flexible use and processing without obligations like consent, purpose limitation, or data principal rights.

B. Under GDPR
According to Recital 26, anonymous information falls outside the regulation, provided the individual is no longer identifiable by any means reasonably likely to be used. The Article 29 Working Party and subsequent EDPB guidance support the use of PETs to anonymize data, but caution that the technique must be robust against re-identification.

C. Under CCPA/CPRA
If data is “de-identified,” i.e., it cannot reasonably be used to identify a person and is subject to technical safeguards, it is exempt from many of the Act’s provisions. PETs like federated learning and encryption may help meet this threshold.

In short:
If PETs are applied rigorously and in line with best practices, the processed data may no longer be subject to core data protection laws. However, this depends on the implementation quality, context, and risk of re-identification.


2. Grey Areas and Regulatory Caution

A. Pseudonymization vs. Anonymization
Many PETs (like encryption or tokenization) achieve pseudonymization, not full anonymization. This means that while direct identifiers are removed or replaced, the data can still be linked back to individuals given additional information, such as the key or mapping table.

  • Under GDPR and DPDPA, pseudonymized data is still personal data.

  • PETs must be evaluated on residual re-identification risk.
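The distinction can be seen in a small sketch of keyed tokenization: identifiers are replaced with opaque tokens, yet anyone holding the key can regenerate the same mapping, so the data remains pseudonymized, and therefore personal, under GDPR and DPDPA. The key and email address below are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical secret held by the data fiduciary. Whoever holds this key
# can re-derive the token for any known identifier, re-linking the data.
SECRET_KEY = b"example-pseudonymization-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    The mapping is deterministic, so records about the same person stay
    linkable -- this is pseudonymization, not anonymization."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("alice@example.com")
```

Because the same input always yields the same token and the key enables re-linking, residual re-identification risk persists, which is precisely why such data stays inside the scope of these laws.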

B. Risk-Based Approach
Regulators adopt a risk-based interpretation: even PET-protected data may be considered personal if:

  • It is linked to auxiliary datasets

  • The PET implementation is weak or reversible

  • The attacker is motivated and well-resourced

Therefore, simply applying a PET is not sufficient. The context, threat model, and technical robustness matter.
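The auxiliary-dataset risk in particular is easy to demonstrate. The sketch below, with entirely invented data, shows a toy linkage attack: a "de-identified" health record is re-identified by joining on quasi-identifiers (ZIP code and birth year) that also appear in a public dataset.

```python
# "De-identified" records: names stripped, but quasi-identifiers remain.
deidentified = [
    {"zip": "110001", "birth_year": 1985, "diagnosis": "diabetes"},
    {"zip": "560034", "birth_year": 1992, "diagnosis": "asthma"},
]

# Hypothetical public auxiliary dataset (e.g., a voter roll) with names.
voter_roll = [
    {"name": "R. Sharma", "zip": "110001", "birth_year": 1985},
    {"name": "P. Iyer", "zip": "560034", "birth_year": 1992},
]

def link(records, auxiliary):
    """Re-identify any record whose quasi-identifiers match exactly one
    auxiliary entry."""
    out = []
    for r in records:
        matches = [a for a in auxiliary
                   if a["zip"] == r["zip"] and a["birth_year"] == r["birth_year"]]
        if len(matches) == 1:
            out.append({"name": matches[0]["name"], "diagnosis": r["diagnosis"]})
    return out
```

Both records link back to named individuals despite the removal of direct identifiers, which is why regulators insist on evaluating the whole threat model, not just the presence of a PET.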


3. PETs as Evidence of Due Diligence

Even if PETs do not always push data outside the legal definition of personal data, they are viewed positively by regulators as part of a privacy-by-design approach.

A. Under DPDPA
The law encourages data fiduciaries to use technological safeguards. PETs:

  • Support purpose limitation by reducing unnecessary access

  • Minimize data collected and stored

  • Help meet reasonable security safeguards under Section 8

B. Under GDPR
PETs are recognized as “appropriate technical measures” (Articles 25 and 32), and may be considered during:

  • Data Protection Impact Assessments (DPIAs)

  • Audits and breach response evaluations

  • Determining severity of penalties

C. Under CCPA/CPRA
Organizations that adopt strong PETs may be seen as maintaining “reasonable security practices,” potentially shielding them from the Act’s private right of action for data breaches.


4. Role of PETs in Emerging Interpretations and Guidance

A. International Standards
Bodies like ISO, NIST, and the OECD are working to codify PET guidelines, influencing how regulators interpret anonymization and personal data boundaries.

B. Sandboxes and Regulatory Experiments
Authorities in Singapore, the UK, and the EU are testing PETs in sandboxes to refine their understanding of when PET-protected data can be considered non-personal.

C. Indian Outlook
The Data Protection Board of India (DPBI), though it has yet to issue detailed guidance, may follow global best practices in evaluating when PETs anonymize data sufficiently to exempt it from the DPDPA’s scope.


5. Practical Implications for Businesses

A. Greater Flexibility Post-Processing
Organizations can process data more freely (e.g., for analytics or AI model training) once PETs have anonymized it, thus reducing legal exposure and operational overhead.

B. Streamlined Consent and Data Subject Rights
If PETs are applied effectively to anonymize data, businesses may be:

  • Exempt from fulfilling access, deletion, or portability requests

  • Not required to maintain elaborate consent logs

C. Safer Cross-Border Data Use
PETs can facilitate compliant cross-border analytics by reducing identifiability, which may ease data transfer restrictions and, where data is genuinely anonymized, take it outside localization requirements.


Conclusion

Privacy-Enhancing Technologies are redefining the boundaries of what constitutes “personal data.” Under laws like India’s DPDPA, the EU’s GDPR, and California’s CPRA, if PETs are applied effectively and responsibly, the resulting data may fall outside the legal scope of personal data, offering organizations significant compliance advantages. However, this legal relief is contingent on context, technical implementation, and the threat environment. PETs are not a silver bullet but are essential tools in a broader strategy of privacy compliance, risk management, and ethical data stewardship.

Priya Mehta