Unit-V, Data Protection Law
The Digital Personal Data Protection Act, 2023 (DPDPA) is a pivotal piece of legislation in India, aimed at regulating the processing of personal data and ensuring privacy protection. Its journey began with the Personal Data Protection Bill, 2019, which was introduced by the government to create a comprehensive data protection framework. That bill was based on the recommendations of the Srikrishna Committee (2018). The ten-member committee was headed by retired Supreme Court Judge Justice B. N. Srikrishna, included members from government, academia, and industry, and had the mandate to propose a draft bill for data protection.
Members
The Justice B.N. Srikrishna Committee, constituted by the Government of India in 2017 to draft a data protection framework, included:
1. Justice B.N. Srikrishna (Chairman) – Retired Supreme Court Judge
2. Arghya Sengupta – Research Director, Vidhi Centre for Legal Policy
3. Rama Vedashree – CEO, Data Security Council of India
4. Dr. Ajay Kumar – Additional Secretary, Ministry of Electronics and Information
Technology
5. Sivarama Krishnan – Partner, PricewaterhouseCoopers India
6. Nehaa Chaudhari – Public Policy Lead, Ikigai Law
7. Sunil Abraham – Former Executive Director, Centre for Internet and Society
8. Malavika Raghavan – Lawyer and Researcher in data rights
9. Gopalakrishnan S. – Director-General, Software Technology Parks of India (STPI)
10. Additional representatives from the government and private sector experts in data and
privacy matters
The committee emphasized privacy as a fundamental right. The 2019 bill provided a strong regulatory
framework, proposed a Data Protection Authority, and set out clear guidelines for data
fiduciaries (entities that handle personal data). However, it also allowed significant exemptions
for state authorities in the name of national security and public order.
After extensive public consultations, the bill underwent revisions. The 2022 revision further
simplified the framework, reducing state regulatory burdens while making private organizations
more accountable for data protection. It also added provisions for government access to personal
data in specific circumstances, such as public welfare schemes, which caused concern over potential privacy invasions.
The final version, passed by both houses of Parliament in August 2023, made substantial
changes to previous drafts. Key features include:
1. Broad definition of personal data and requirements for obtaining user consent for
collection and processing.
2. Exemptions for government use of data for welfare schemes and national security
purposes.
3. Limited provisions for individual rights: The final law removed provisions such as the
right to data portability and the right to be forgotten, which were included in earlier
drafts.
Parliamentary Debates
The law was the subject of significant debates in Parliament, with key issues including:
• Concerns over government surveillance: Lawmakers criticized the provisions that allow government agencies to access personal data without consent, particularly in the context of national security and public welfare.
• Lack of independent oversight: The Data Protection Board, established under the law,
operates under the control of the Ministry of Electronics and Information Technology,
raising concerns about its independence and impartiality.
• International data transfers: The bill allows the central government to regulate cross-
border data transfers, but the criteria for approval remain vague, leading to fears of
unrestricted access to Indian citizens' data by foreign governments.
The law, while aiming to protect personal data and promote digital trust, has drawn criticism for
giving the state excessive power, especially regarding data access and surveillance. Critics argue
that the law's scope does not sufficiently protect citizens' privacy rights and may lead to
government overreach.
The Digital Personal Data Protection Bill, 2023 was introduced in Lok Sabha on August 3, 2023, by the Minister of Electronics & Information Technology. It was passed by Lok Sabha on August 7, 2023, and unanimously by Rajya Sabha on August 9, 2023, and received Presidential assent on August 11, 2023.
The previous Personal Data Protection Bills of 2019 and 2022, which attracted numerous amendments and raised several issues relating to data localization, transparency, compliance burden, etc., were withdrawn by the Central Government (CG). The present legislation came into being after the Supreme Court, in Justice K.S. Puttaswamy vs. Union of India (2017), upheld the ‘Right to Privacy’ as part of the fundamental ‘Right to Life’ enshrined under Article 21 of the Indian Constitution and suggested that the CG put in place an act/regime for the protection of Personal Data.
Object and Applicability of the Digital Personal Data Protection Act, 2023 (the Act)
The primary objective of the Act is to establish a comprehensive framework for the Protection
and Processing of Personal Data (as defined below).
“The Act provides for the processing of digital Personal Data in a manner that recognizes both
the rights of the individuals to protect their Personal Data and the need to process such Personal
Data for lawful purposes and matters connected therewith or incidental thereto”.
The Act shall apply to the processing of Personal Data in India, including both online and
digitized offline data, and shall further extend to the processing of such data outside India
relating to the offering of goods or services in India.
The Act also lays the foundation for various other laws such as the Digital India Act and other
industry-specific laws around privacy and data protection to augment India’s march towards the
adoption of Artificial Intelligence (AI) and other future technologies while protecting Personal
Data. The Act may also aid Indian businesses to enhance collaboration with other businesses
located internationally under reciprocal arrangements while safeguarding Personal Data.
Notably, the Act is the first-ever central law in India to use she/her pronouns while referring to
individuals.
Definition and Salient Features
1. Data: Any representation of information, fact(s), concept(s), opinion(s), and
instruction(s) which is capable of being communicated, interpreted, and processed by
human beings or by automated means. Further, any data about an individual (Data
Principal) who is identifiable by or in relation to such data has been referred to
as Personal Data in the Act.
2. Processing of Personal Data: Processing has been defined as the performing of a set of operation(s) by wholly or partly automated means on digital Personal Data and includes collection, storage, indexing, sharing, use, disclosure, dissemination, and erasure thereof. Such processing can only be undertaken for a ‘lawful purpose’ for which a Data Principal has given her consent or for certain legitimate uses as laid down in the Act.
3. Applicability: The Act shall apply to the processing of digital Personal Data within
India where such data is: (i) in digital form, or (ii) in non-digital form and is digitised
subsequently. However, the Act shall also apply extraterritorially to the processing of
digital Personal Data if such processing is in connection with any activity related to
offering goods or services to Data Principals within India. It shall not apply to Personal Data that is (i) processed by an individual for any personal or domestic purpose, or (ii) made or caused to be made publicly available by the Data Principal herself or by any other person who is under an obligation (under any law for the time being in force in India) to make such Personal Data publicly available.
4. Consent: It has been provided in Section 6 of the Act that Personal Data may be
processed only for the specified purpose and after obtaining the consent of the Data
Principal (individual). Such consent has to be free, specific, informed, unconditional,
and unambiguous with a clear affirmative action. A notice as per Section 5 must be given
by the Data Fiduciary before seeking consent, containing details about the Personal Data
to be collected and the purpose of processing. The individual whose data is being
processed can withdraw her consent at any point of time. Notably, such consent, as per
Section 7, shall not be required for ‘legitimate uses’ which inter alia include: (i) specified
purpose for which data has been provided by an individual voluntarily, (ii) for the State
to provide benefit or service such as subsidy, certificate, license, benefit, permits, etc.,
(iii) for the security of the State or in the interest of the sovereignty and integrity of the country, (iv) for responding to a medical emergency, treatment or health services, (v) for ensuring safety during a disaster or a breakdown of public order, and (vi) for purposes of employment. For individuals below eighteen (18) years of age and for persons with disabilities who have a lawful guardian, the Act provides that consent will be given by their parent(s) or legal guardian.
However, the State or any instrumentality of the State has been empowered to retain Personal
Data or reject any request made for the erasure of Personal Data vide Section 17(4).
5. Rights and Duties of Data Principal: An individual whose data is being processed shall
have certain rights as per Sections 12 to 14 which include the right to (i) obtain
information about processing, (ii) seek correction and erasure of Personal Data, (iii)
nominate another person to exercise rights in the event of death or incapacity, (iv) for any
grievance redressal and (v) withdraw her consent at any time during or after the
processing of Personal Data. Further, as per Section 15, Data Principals will be under an obligation not to: (i) register a false or frivolous complaint; (ii)
suppress any material information while providing her Personal Data; and (iii) furnish
any false particulars or impersonate in specified cases. The breach of said duties will
attract a penalty as per the Schedule to the Act.
6. Obligations of the Data Fiduciary: The Data Fiduciary, as per Section 8 of the Act, must: (i) process Personal Data only for a purpose for which the Data Principal has given her consent, or for certain legitimate uses (the ‘deemed consent’ concept of the earlier draft has been replaced by these legitimate uses); (ii) make reasonable efforts to ensure the accuracy and completeness of data; (iii) implement appropriate measures to protect Personal Data in its possession or under its control; (iv) respond to any communication from the Data Principal for the purpose of the exercise of her rights; (v) inform the Data Protection Board of India and affected persons in the event of a Personal Data breach; and (vi) erase Personal Data as soon as the purpose has been met and retention is no longer necessary for legal purposes (storage limitation); an illustrative sketch of such a retention check is given after this list. In the case of government entities, the storage limitation and the Data Principal's right to erasure will not apply. Any breach of the said obligations is to be dealt with in accordance with Section 33 of the Act read with the Schedule thereto.
7. Transfer of Personal Data outside India: Section 16 allows extraterritorial processing
and transfer of Personal Data, except to such countries restricted by CG through
notification.
8. Exemptions: As per Section 17 of the Act, the provisions contained in Chapter II (except Section 8(1) & (5)) and Chapter III (except Section 16) of the Act, i.e., the provisions related to ‘Obligations of Data Fiduciaries’ and ‘Rights & Duties of Data Principals’, have been made inapplicable (exempted) in specified cases, which inter alia include: (i) prevention, investigation or prosecution of offences; (ii) enforcement of legal rights or claims; (iii) processing of Personal Data of Data Principals not within the territory of India; and (iv) processing for the purpose of ascertaining financial information, assets, and liabilities. Further, as per Section 17(2), the provisions of the Act shall not apply to the processing of Personal Data: (i) by the State or any instrumentality of the State in the interest of the security of the State and public order, and (ii) necessary for research, archiving, or statistical purposes.
9. Data Protection Board of India: CG shall, in terms of Chapter V of the Act, establish
a Data Protection Board of India (Board) consisting of a Chairperson and other
members. The Board will exercise and perform such powers and functions laid down in
Sections 27 and 28 of the Act, which inter alia includes (i) directing urgent
remedial/mitigating measures in case of any breach of Personal Data (ii) inquiring into
such breach, and (iii) imposing penalties as per the Act. The Board will have the powers of a civil court and original jurisdiction to entertain complaints/matters pertaining to the Act, and any other civil court will be barred under Section 39 from entertaining any suit or proceeding in respect of any matter which the Board is empowered to adjudicate upon under the Act.
10. Appeals: Appeals against the decisions of the Board shall, as per Section 29, lie with the Telecom Disputes Settlement and Appellate Tribunal (TDSAT) established under the Telecom Regulatory Authority of India Act, 1997 (TRAI Act). The limitation period for preferring such an appeal is sixty (60) days from the date of receipt of the
Board's decision. Further, the Orders passed by TDSAT shall be appealable before the
Hon’ble Supreme Court as per Section 18 of the TRAI Act.
11. Penalties: The Schedule to the Act lays down the quantum of penalties to be imposed for
various offences and breaches committed under the Act. For instance, a penalty
amounting to (i) INR 200 Crore for non-compliance of obligations in relation to children;
(ii) INR 250 Crore for failure to take security measures to prevent data breaches, under
Section 8(5); and (iii) INR 200 Crore for failure to give notice of a Personal Data breach to the Board or the Data Principal under Section 8(6). Such penalties will be
imposed by the Board after conducting an inquiry under Section 33.
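The storage-limitation duty noted in item 6 lends itself to a simple periodic job that checks whether the purpose of processing has been served and the retention window has lapsed. The sketch below is only an illustration of that idea, assuming a hypothetical record layout and retention policy; it is not drawn from the Act or from any official guidance.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy per processing purpose (illustrative values only).
RETENTION_PERIODS = {
    "kyc_verification": timedelta(days=365),
    "order_fulfilment": timedelta(days=180),
}

@dataclass
class PersonalDataRecord:
    principal_id: str
    purpose: str          # purpose for which consent was obtained
    collected_at: datetime
    purpose_served: bool  # has the stated purpose been met?
    legal_hold: bool      # is retention still required by another law?

def is_due_for_erasure(record: PersonalDataRecord, now: datetime) -> bool:
    """Return True when the record should be erased under a storage-limitation policy."""
    if record.legal_hold:
        return False  # retention still necessary for legal purposes
    retention = RETENTION_PERIODS.get(record.purpose, timedelta(days=0))
    expired = now - record.collected_at > retention
    return record.purpose_served and expired

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    record = PersonalDataRecord(
        principal_id="DP-001",
        purpose="order_fulfilment",
        collected_at=now - timedelta(days=200),
        purpose_served=True,
        legal_hold=False,
    )
    print(is_due_for_erasure(record, now))  # True: purpose met and retention lapsed
```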
Impacts and Implementation
With this new Act, companies and businesses handling Personal Data in any manner would now have to develop standard operating procedures and train their personnel in order to comply with certain obligations, such as cooperating with the Data Protection Officer appointed by a Significant Data Fiduciary under Section 10 of the Act; hiring an independent data auditor; putting in place a consent management mechanism to collect, maintain, track, and update consent from individuals; conducting assessments to protect data; and maintaining valid contracts with data processors. However, the basis for classifying companies and start-ups as Significant Data Fiduciaries needs to be clarified, especially concerning thresholds and eligibility criteria such as net worth, assets, size, number of personnel, and their qualifications.
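As one illustration of what a consent management mechanism might involve, the sketch below records, verifies, and withdraws purpose-specific consent. The class, method names, and stored fields are hypothetical assumptions for illustration; the Act and its forthcoming Rules will govern what an actual consent artefact must contain.

```python
from datetime import datetime, timezone

class ConsentRegister:
    """Minimal in-memory register of purpose-specific consent (illustrative only)."""

    def __init__(self):
        # (principal_id, purpose) -> consent record
        self._records = {}

    def record_consent(self, principal_id: str, purpose: str, notice_version: str) -> None:
        """Store consent given after the Section 5 notice was shown (hypothetical fields)."""
        self._records[(principal_id, purpose)] = {
            "given_at": datetime.now(timezone.utc),
            "notice_version": notice_version,
            "withdrawn_at": None,
        }

    def withdraw_consent(self, principal_id: str, purpose: str) -> None:
        """Mark consent as withdrawn; the Act allows withdrawal at any time."""
        record = self._records.get((principal_id, purpose))
        if record and record["withdrawn_at"] is None:
            record["withdrawn_at"] = datetime.now(timezone.utc)

    def has_valid_consent(self, principal_id: str, purpose: str) -> bool:
        """Check consent before processing for the stated purpose."""
        record = self._records.get((principal_id, purpose))
        return record is not None and record["withdrawn_at"] is None

# Example usage
register = ConsentRegister()
register.record_consent("DP-001", "marketing_email", notice_version="v1.0")
print(register.has_valid_consent("DP-001", "marketing_email"))  # True
register.withdraw_consent("DP-001", "marketing_email")
print(register.has_valid_consent("DP-001", "marketing_email"))  # False
```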
A serious effort to protect Personal Data or an eyewash to gain Legitimate Control & Surveillance?
The Act in its present form prima facie proposes to protect Personal Data, but there may be concerns with the practical implementation of its provisions. For instance, as per Section 36, the CG has been empowered to call for ‘such information’ from the Board or any Data Fiduciary or intermediary. Such wide powers and broad terminology, viewed through a legislative lens, suggest an ingrained intent of surveillance by the CG. Moreover, Section 17(2)(a) empowers the CG to exempt any instrumentality of the State from the rigours of the provisions in respect of the processing of Personal Data. Additionally, since Section 8(1)(j) of the Right to Information Act, 2005 (RTI Act) is amended by Section 44(3) of the Act, the balance struck by the RTI Act between privacy and the right to information may be lost: the power of a Public Information Officer (PIO) has been widened, as a PIO can now reject an application made under the RTI Act on the ground that the information sought relates to Personal Data.
Balance between Individual Privacy and Governmental & Business Requirements
A major development in India's legal system is the Digital Personal Data Protection Act, 2023, which addresses the delicate balance between people's right to privacy and the data requirements of businesses and governments. The following is an examination of how this balance is struck:
1. Individual privacy protection
The Act guarantees that people have more control over their personal data by enshrining the right
to privacy as a fundamental value. Important clauses consist of:
Consent-Based Data Processing: Before collecting or using someone's personal information,
data fiduciaries (organizations that handle data) must get that person's express and informed
consent. This gives people the power to control how their data is utilized.
Right to Data Access and Correction: People are entitled to see the data that is kept about them by a data fiduciary and to ask for corrections if the information is out of date or erroneous.
Right to Data Erasure: When personal information is no longer needed for the reason it was
gathered, people have the right under the Act to request that it be deleted.
Data Localization: Earlier drafts required sensitive personal data to be processed and stored within India to increase security and lower the likelihood of cross-border data breaches; the 2023 Act instead permits transfers of Personal Data outside India except to countries restricted by the Central Government (see Section 16 above).
2. Providing Safeguards for Government Access
The Act acknowledges the necessity of government access to data for law enforcement, public
order, and national security and contains the following provisions:
Legitimate Grounds for Processing: In certain situations, such as maintaining national security
or handling public health emergencies, the government is permitted to process personal data without obtaining consent.
Independent Oversight: The creation of the Data Protection Board is intended to provide accountability and oversight when the government accesses or handles data.
Tests of Proportionality and Necessity: The Act requires that any information gathered by the
government be both necessary and appropriate for the intended use.
AI systems can learn from data, absorbing biases that can lead to discriminatory algorithmic
decisions that infringe on civil rights. This often arises because AI development teams and
training data lack diversity. To promote fairness, bias testing and diverse data/teams are required
when building AI systems. Advanced AI surveillance systems also present threats to privacy and
civil liberties, as techniques like facial recognition and predictive analytics are deployed without
sufficient consent or oversight. Policies governing responsible use of AI surveillance are needed
to prevent abuses and uphold privacy rights.
Widespread AI adoption can indirectly threaten privacy and civil rights through economic
disruption. Displaced workers may struggle to find new roles, forcing some into precarious
contract labor with fewer protections. Failing to manage the societal impact of AI thoughtfully
endangers rights and privacy. AI's broad integration demands proactive policies to prevent biased
decisions, misuse of surveillance technologies, and exploitation of vulnerable populations.
The evolving landscape of privacy in an AI-centric world is rife with new challenges, including
pervasive surveillance, nonconsensual accumulation of personal data, and the extensive influence
of Big Tech over data collection, analysis, and use. Big Tech must adopt transparency in data
practices and ethical, responsible data use, promoting inclusivity and accessibility instead of
dominance by the few.
AI systems have a profound impact on data collection, accumulating and leveraging vast
amounts of data to learn and improve. However, this escalating trend raises privacy and
protection concerns, as tools like ChatGPT use personal data without explicit consent. A key
issue is the lack of transparency around how AI uses personal data, making it difficult to
understand how data informs decisions affecting individuals. Organizations employing AI must
proactively safeguard privacy through robust data security, ethical principles, and transparency.
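One practical, if partial, safeguard organizations sometimes apply is to strip obvious personal identifiers from text before it is sent to an external AI service. The sketch below is only an illustration of that idea, with a few assumed, non-exhaustive patterns for e-mail addresses and Indian mobile numbers; real redaction pipelines need far broader coverage and review.

```python
import re

# Assumed, non-exhaustive patterns; real PII detection needs much broader coverage.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    "PHONE": re.compile(r"(?:\+91[-\s]?)?[6-9]\d{9}\b"),  # common Indian mobile format
}

def redact_pii(text: str) -> str:
    """Replace matched identifiers with placeholder tags before sending text onward."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Contact Asha at asha@example.com or +91 9876543210 about her loan application."
print(redact_pii(prompt))
# Contact Asha at [EMAIL] or [PHONE] about her loan application.
```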
Applying AI in surveillance has emerged as a debated issue, with critics contending that these
systems could enable individual surveillance and control, infringing on civil liberties. To mitigate
issues, stringent oversight and transparency must govern AI surveillance deployment. Clear
policies and independent oversight mechanisms should dictate appropriate usage, and individuals
must be able to access data collection details.
The rise of generative AI for text and image creation raises significant privacy concerns. Companies developing generative AI may amass and analyze user prompts and sensitive information, necessitating robust data protections. Likewise, companies like Google that collect location data must ensure compliance and rigorous protocols to preserve user privacy and data security.
Personalized recommendations and data access are another case study that highlights the
importance of ethics in assessing implications and safeguarding user privacy as reliance on AI
and big data grows.
AI in law enforcement, such as predictive policing and facial recognition, has faced criticism for
perpetuating biases and prejudices. Some systems have been flagged for unfairly targeting
minority groups, prompting allegations of discrimination. AI also raises privacy and civil liberty
concerns, with instances of false matches leading to wrongful arrests. As AI integrates further,
risks exist of amplifying societal biases and injustices. Regulations and oversight are essential to
ensure ethical, rights-respecting AI use. Robust accountability and transparency frameworks can
mitigate adverse AI impacts.
AI adoption in hiring and recruitment has surged, with companies increasingly using AI
screening and selection tools touted to increase efficiency and objectivity. However, widespread
use raises critical concerns about fairness and bias. Meticulous testing is crucial to prevent unfair
AI practices in hiring. As AI hiring usage grows, emphasizing transparency and accountability is
paramount to prevent biased outcomes and ensure workplace equity. Constructing a legal
framework mandating accountability in AI hiring tools is vital for non-discrimination.
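As a concrete, simplified example of the bias testing the paragraph above calls for, the sketch below computes the selection rates of an automated screening tool for two applicant groups and the resulting disparate impact ratio, which the informal "four-fifths rule" treats as a warning sign when it falls below 0.8. The data and threshold are assumptions for illustration; real audits rely on richer statistics and applicable legal standards.

```python
from collections import Counter

def selection_rate(decisions):
    """Fraction of applicants marked as selected (True) by the screening tool."""
    counts = Counter(decisions)
    total = counts[True] + counts[False]
    return counts[True] / total if total else 0.0

def disparate_impact_ratio(group_a_decisions, group_b_decisions):
    """Ratio of the lower selection rate to the higher one; < 0.8 is a common warning sign."""
    rate_a = selection_rate(group_a_decisions)
    rate_b = selection_rate(group_b_decisions)
    if max(rate_a, rate_b) == 0:
        return 0.0
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical screening outcomes for two demographic groups.
group_a = [True] * 60 + [False] * 40   # 60% selected
group_b = [True] * 36 + [False] * 64   # 36% selected

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.60 — below the 0.8 rule of thumb
```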
Thoughtful regulations are imperative to govern AI systems responsibly, including the EU's
GDPR, the proposed EU AI Act, the U.S. Algorithmic Accountability Act, Illinois' AI Video
Interview Act, and healthcare rules like HIPAA. These laws foster AI accountability on biases,
privacy, transparency, and human rights, but regulations must balance oversight and innovation.
As AI systems grow more advanced, processing expansive data, misuse and abuse risks heighten.
Effective oversight and regulation are vital to ensure AI development and use aligns with
individual rights and freedoms.
In the era of Artificial Intelligence (AI), privacy protection is a multifaceted challenge that
requires a collaborative and multifaceted strategy. Individuals play a pivotal role in safeguarding
their own privacy by understanding what data is collected and how it is used, using privacy tools
and settings in software and social media, and being mindful in online activities. Blockchain has
enabled promising decentralized AI possibilities, such as Ocean Protocol, SingularityNET, and
Deep Brain Chain, which enable secure, transparent, and affordable access to AI algorithms and
services.
Technological innovations such as decentralized AI platforms reduce vulnerabilities associated
with centralized data repositories, enhancing data security and user privacy. However, these
solutions are not a panacea and must be complemented by comprehensive regulatory frameworks
that define clear standards for data collection, processing, and sharing. The evolution of privacy
laws, such as the EU's General Data Protection Regulation (GDPR) and India's Digital Personal
Data Protection Act (DPDP), illustrates the global shift towards more stringent privacy
protections.
The proactive role of governments and regulatory bodies is crucial in navigating the privacy
implications of AI. By instituting robust oversight mechanisms, conducting regular impact
assessments, and fostering transparency, policymakers can ensure that AI technologies are
developed and deployed in ways that respect privacy rights. The involvement of industry
stakeholders in crafting and adhering to ethical guidelines is essential for building trust and
accountability in AI applications.
The societal implications of AI on privacy extend beyond technical and legal considerations,
emphasizing the need for a holistic approach that considers the ethical and social dimensions of
AI. Public awareness and engagement are vital in shaping the discourse on privacy and AI,
empowering individuals to advocate for their rights and participate in the development of
technologies that align with societal values.
Global Trend: Brief Comparison: India, EU, and US Data Protection Policies
European Union (EU)
• General Data Protection Regulation (GDPR): Implemented in 2018, the GDPR is
considered the gold standard for data protection globally. It provides robust rights to
individuals, including data portability, the right to be forgotten, and strict penalties for
non-compliance.
• Privacy First: GDPR emphasizes privacy by design and default, requiring data
controllers to integrate data protection into their processing activities.
United States (US)
• Sectoral Approach: The US lacks a comprehensive federal data protection law. Instead,
it relies on sector-specific regulations like the Health Insurance Portability and
Accountability Act (HIPAA) and the California Consumer Privacy Act (CCPA).
• Corporate Leverage: The US model is often criticized for being lenient towards
corporations, prioritizing innovation and market growth over stringent privacy
protections.
India
• Data Protection Act, 2023: India’s law falls between the EU and US approaches. It
offers significant individual rights but allows government access under specified
conditions, aiming to balance privacy with national and economic interests.
• Developing Framework: India’s framework, enacted in 2023, aligns with global trends,
joining 137 out of 194 countries that have data protection laws as of 2023.
India’s Ranking in Enacting Data Protection Laws
India’s Data Protection Act, 2023, positions it as a late entrant in the global data protection
landscape. While the EU’s GDPR remains the benchmark for stringent privacy laws, and the US
exhibits a fragmented approach, India’s Act is ranked as follows:
1. European Union: Leading due to GDPR’s comprehensive framework and early adoption
in 2018.
2. India: With its 2023 enactment, India has joined nations like Chile, New Zealand, and
South Africa in adopting GDPR-like laws. It ranks among the top 50 countries globally in
terms of data protection comprehensiveness.
3. United States: Lagging due to the absence of a unified federal law, with only state-
specific regulations like the CCPA in California.
Conclusion
The Data Protection Act, 2023, represents a progressive approach to data governance in India.
By embedding robust privacy protections for individuals, enabling lawful access for the
government, and regulating corporate data practices, the Act seeks to create a balanced
ecosystem that promotes trust, innovation, and security. Its implementation will require vigilance
and collaboration among stakeholders to address challenges and ensure the realization of its
objectives.
The Act marks a distinctive approach to safeguarding Personal Data, addressing longstanding
needs in the context of increasing internet users, data generation, and cross-border trade.
However, it is felt that various details regarding implementation need clarification, which may come with the establishment of the Data Protection Board of India and the promulgation of Rules under the Act. In its entirety, the Act signifies India's unique stance on modern data protection, enriched by extensive post-draft consultations. While the provisions of the Act are less detailed than the European Union's GDPR, it certainly mandates a significant shift in how Indian businesses should now approach privacy and Personal Data, while also legitimising the CG's power to control, retain, and monitor its citizens' personal information.
While the notification of the Sections of the Act for their implementation is still awaited, one has to wait and watch how the Courts interpret these wide empowering provisions and in what manner the Act evolves.