On January 3, the Union Ministry of Electronics and Information Technology (MeitY) released the draft Digital Personal Data Protection Rules, 2025,1 and opened it for public discussion and feedback from all stakeholders until March 5, 2025. The much-awaited Draft Rules, once finalised and notified, would implement and operationalise the Digital Personal Data Protection (DPDP) Act, 2023.2
First, to answer a nagging question on many minds: “why regulate now”, when most personal data are already in the public domain thanks to the collection frenzy of the past? The answer is simple: “better late than never!”
The day-to-day digital personal data protection legal ecosystem envisaged through the Act and the Draft Rules rests on three principal players: the Data Principal, the individual whose data is being collected; the Data Fiduciary, the organisation or agency that decides the purpose and means of processing the data; and the Data Processor, which “processes the data on behalf of the Data Fiduciary”. In the context of the Act and the Rules, the Consent Manager appears as a sui generis creation of the Indian state, one that “acts as a single point of contact” on behalf of the Data Principal for managing consent.
The proposed regulatory regime endeavours to enforce a healthy interplay by tying the collection of personal data to a definitive legal purpose, anchored in informed and transparent consent from the Data Principal. For instance, if one’s phone number is collected for the purpose of generating a bill, it cannot be used for any other purpose, such as sending promotional feeds. Any use for a purpose not consented to would amount to a breach under the Act. Similarly, only the data necessary for achieving the identified purpose should be collected. To continue with the same example, one need not collect both a phone number and an email ID to generate a bill, as that would be excess data-gathering for the purpose. The customer can instead be given a choice of the mode through which the invoice will be delivered, and only the data relevant to that mode collected.
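Purely by way of illustration, a minimal Python sketch can show how a Data Fiduciary’s system might tie each collected data item to a consented purpose and reject uses outside it; the record structure, field names, and checks below are hypothetical and not drawn from the Act or the Draft Rules.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Hypothetical record of what a Data Principal has consented to."""
    principal_id: str
    data_items: set      # e.g. {"phone_number"}
    purposes: set        # e.g. {"billing"}

class PurposeLimitationError(Exception):
    """Raised when data is sought for a purpose that was never consented to."""

def use_data(record: ConsentRecord, item: str, purpose: str) -> str:
    # Data minimisation: the item must actually have been collected.
    if item not in record.data_items:
        raise KeyError(f"{item} was never collected for this Data Principal")
    # Purpose limitation: the stated purpose must match the consent given.
    if purpose not in record.purposes:
        raise PurposeLimitationError(
            f"'{item}' may not be used for '{purpose}'; consent covers {record.purposes}"
        )
    return item  # a real system would return the stored value here

# A phone number collected for billing cannot be reused for promotional feeds.
consent = ConsentRecord("DP-001", {"phone_number"}, {"billing"})
use_data(consent, "phone_number", "billing")        # allowed
# use_data(consent, "phone_number", "promotions")   # raises PurposeLimitationError
```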
Such minimal data collection tied to strict purpose articulation may be a mammoth shift from the prevalent routine of many business entities, habituated to perennially collecting and exploiting customer data. Even though India’s data privacy regime had a long-laboured birth, the language in the Act and the Draft Rules now brings out the essence that the rights of the Data Principal reign supreme. As commonly attributed to the French writer Victor Hugo: “nothing is as powerful as an idea whose time has come”.
In a nutshell, compliance under this Act and the proposed Draft Rules gravitates towards consent and the avoidance of data breach across the lifecycle of personal data. The sheer novelty of the privacy regime, coupled with the paradigm shift in dealing with personal data, has thrown most stakeholders into a tizzy of incomprehension and pushed them into a wait-and-watch mode. No doubt it is seen to be a policy change ushered in with much implementation vigour. The effort involved in shaping the Act and the Draft Rules as a bare-bones model – fleshed out only with the fundamental principles of data privacy such as purpose limitation, data minimisation, lawfulness, fairness, transparency, confidentiality, integrity, accuracy, and accountability – is indeed laudable.
Scope for deliberation
1. Model Clauses for Processing Agreements
The Draft Rules support the intent of the Act well by adhering to the ethos of principle-based application, thereby consciously steering clear of a prescriptive style. While the principle-based approach to the legislation deserves full endorsement, the Rules themselves could afford to be somewhat more prescriptive. Such an approach would assist Data Fiduciaries in navigating the new regulatory waters. In their current avatar, the Draft Rules offer no guidance to Data Fiduciaries on the details to be woven into a data processing contract, mandated under Section 8(2) of the Act, which reads:
“ 8. (2) A Data Fiduciary may engage, appoint, use or otherwise involve a Data Processor to process personal data on its behalf for any activity related to offering of goods or services to Data Principals only under a valid contract.”3
For instance, model clauses akin to the Standard Contractual Clauses (SCC) issued by the European Commission may be considered. Such an approach can go a long way in guiding Data Fiduciaries while engaging Data Processors. The Rules, in their final form, can also mandate certain clauses in such contracts to ensure base-level standards that would hold processors accountable for recognising the rights of the Data Principal.
In this context, it would also be worthwhile to identify a commission or authority, if not the Data Protection Board of India4 (Board) itself, to draw up pre-approved standard clauses for data processing contracts. This would help set the tone for fiduciaries engaging processors, while providing sufficient guarantees for the rights of the Data Principal. In turn, it would encourage the ease of doing business in India from the data privacy perspective.
2. Consent Manager
The obligations of a Consent Manager in this framework are relatively new, particularly as envisaged by the Draft Rules. The Act defines the “Consent Manager” as follows:
“ 2. (g) “Consent Manager” means a person registered with the Board, who acts as a single point of contact to enable the Data Principal to give, manage, review and withdraw her consent through an accessible, transparent and interoperable platform.”5
The looming question is whether the introduction of a designated Consent Manager into the personal data protection ecosystem has complicated consent management options, or whether it has streamlined a special category of consent management businesses. This largely untested terrain needs to be kept open for further refinement based on how the mechanism unfolds.
While trying to understand this novelty, one may look at other geographies and their practices. Although the European Union’s General Data Protection Regulation (GDPR) provides a framework for consent management, it does not explicitly define or regulate ‘Consent Managers’. Many organisations nonetheless use third-party consent management platforms (CMPs), such as cookie consent banners and preference centres, to comply with the GDPR’s requirements on consent management. However, such platforms are categorised as ‘data controllers’ or ‘data processors’ depending on their role in deciding the purpose and means of processing; no additional status is ascribed to them.
The Consent Manager under the DPDP Act seems to be a unique genre by itself. The Act, read with the Draft Rules, envisages a Consent Manager as a body corporate registered with the Board that acts as a single point of contact for the Data Principal to give, manage, review, and withdraw consent through a platform that is accessible, transparent, and interoperable. Registration is tied to stringent conditions, such as being a company incorporated in India, having a minimum net worth of Rs. 2 crore, and demonstrating technical and operational capability. Additional conditions include the compulsory avoidance of conflicts of interest with Data Fiduciaries and prior approval of the Board for any change of control, among others.
A Consent Manager, being the single point of contact for the Data Principal, cannot be perceived as a data processor under this framework. The only available fit is that of a data-blind fiduciary with an extra mandate to deal with what is probably pseudonymised data, as seems to be the suggestion in Part B, 2 read with 3 and 4 of the First Schedule to the Draft Rules6. The referenced part mandates the Consent Manager to ensure that the manner of making available or sharing the personal data is such that its contents are not readable by it. However, it has to make available the information contained in its records in a machine-readable format to the Data Principal, on request.
Pseudonymisation is the process of removing personal identifiers from data and replacing those identifiers with placeholder values that remain convertible back into readable, identifiable form at any time. Under the present structure applicable to a Consent Manager in the Draft Rules, there is no room to argue that Consent Managers are not processing identifiable personal data merely on account of the mandate to store records in a non-machine-readable format. The reason is that the DPDP Act defines personal data in Section 2(t)7 as “any data about an individual who is identifiable by or in relation to such data”. Hence, keeping the data in a non-machine-readable format that remains convertible into a readable format will still attract the definition of personal data under the Act.
Jurisprudential support can be taken from Recital 26 of the GDPR8, which throws light on the attributes of pseudonymised data and the applicability of privacy regulations to it:
“Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person.”
This clarifies the distinction and assists in comprehending the differential treatment of anonymised and pseudonymised data in privacy laws. Anonymised data are specifically exempted from privacy regulations the world over, as anonymisation makes it impossible to link the data back to a specific individual. From the above deliberations, one can clearly deduce the applicability of the DPDP Act to pseudonymised data handled by data-blind fiduciaries such as the Consent Manager.
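A brief illustrative sketch, with entirely hypothetical field names and values rather than anything prescribed by the Act or the GDPR, shows why pseudonymised data remains personal data while anonymised data does not: so long as a separate key table can map the placeholders back to an identity, the individual remains identifiable.

```python
import uuid

record = {"name": "Asha", "phone": "98400-00000", "city": "Chennai"}
key_table = {}  # held separately; its existence is what keeps the data re-identifiable

def pseudonymise(rec):
    # Identifiers are swapped for a placeholder token, but the key table
    # retained elsewhere can reverse the substitution at any time.
    token = str(uuid.uuid4())
    key_table[token] = {"name": rec["name"], "phone": rec["phone"]}
    return {"token": token, "city": rec["city"]}

def re_identify(pseudo_rec):
    # With the key table, the natural person is identifiable again, which is why
    # pseudonymised data stays personal data under Section 2(t) and Recital 26.
    return {**key_table[pseudo_rec["token"]], "city": pseudo_rec["city"]}

def anonymise(rec):
    # Identifiers are irreversibly dropped and no key table exists,
    # so the data can no longer be linked back to a specific individual.
    return {"city": rec["city"]}

pseudo = pseudonymise(record)
print(re_identify(pseudo))   # original identity recovered -> still personal data
print(anonymise(record))     # {'city': 'Chennai'} -> outside the privacy regime
```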
Interestingly, Part B, 2 of the First Schedule to the Draft Rules calls for the Consent Manager to be necessarily data-blind, as it cannot hold such data in a machine-readable format. However, Part B, 4(b) of the First Schedule9 requires the Consent Manager to make such records available to the Data Principal in a machine-readable form on request, as per the terms of service.
A conjoint reading of 2 and 4(b) of Part B shows with clarity the Consent Manager’s obligation to either mask or pseudonymise data while storing and recording it. However, the corresponding obligation in Part B, 3(a) & (c)10 of the First Schedule to maintain a record of the consent given, or of the sharing of personal data with a transferee Data Fiduciary, can lead to confusion. It would be ideal to insert clarificatory language stating that the maintenance of such a record by the Consent Manager, as contemplated in Part B, 3(a) & (c), refers to the metadata of such personal data and not the personal data itself. This would go a long way in putting to rest the myriad possible interpretations aimed at exploiting this grey area as it stands now.
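To illustrate the clarification suggested above, the hypothetical sketch below (the field names are assumptions, not drawn from the Draft Rules) shows how a Consent Manager’s record under Part B, 3(a) & (c) could be confined to metadata about the consent and the sharing event, without the Consent Manager holding any readable personal data content.

```python
from datetime import datetime, timezone

# Hypothetical Consent Manager log entry: metadata about the consent and the
# sharing event only, with no readable personal data content held by the manager.
consent_log_entry = {
    "principal_ref": "DP-7d41",                 # opaque reference, not the identity itself
    "consent_given_at": datetime(2025, 1, 10, tzinfo=timezone.utc).isoformat(),
    "purpose": "account_opening",
    "data_categories": ["contact_details"],     # categories of data shared, not the values
    "transferee_fiduciary": "Example Bank Ltd.",
    "status": "active",                         # give / manage / review / withdraw
}
```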
It is noteworthy that the language of the Draft Rules ingeniously balances the risk of adding an intermediary to the channel while keeping it within the grasp of the regulatory framework as an independent Data Fiduciary. Further obligations are placed on Consent Managers, such as avoiding conflicts of interest with the Data Fiduciaries on their platform, or with the promoters and key personnel of such Data Fiduciaries. The nuances of an appropriate business model for a Consent Manager are currently beyond the imagination and comprehension of many; hence, most players are presently navigating a vortex of questions and scepticism in this regard.
3. Compliance-Heavy Incident Reporting
In the context of data breach, it is abundantly clear that Draft Rule 7 recognises any incident of personal data breach as a subject matter of intimation to the concerned Data Principal11. It does not contemplate a threshold based on the severity or impact of the breach, on a scale of critical, high, medium, or low risk, to the rights and freedoms of the Data Principal. This amplifies the sentiments of the draftspersons towards the rights of the Data Principal in the event of a data breach. Simultaneously, it imposes onerous obligations on Data Fiduciaries for want of a risk-based threshold; consequently, inadequately staffed Data Fiduciaries may face the risk of compliance overload.
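To illustrate the contrast, the short sketch below (hypothetical, not derived from the Draft Rules) juxtaposes the notify-every-breach position of Draft Rule 7 with the kind of risk-based threshold the Rules have chosen not to adopt.

```python
SEVERITY_LEVELS = ["low", "medium", "high", "critical"]

def must_notify_under_draft_rule_7(severity: str) -> bool:
    # As drafted, every personal data breach must be intimated to the
    # Data Principal, irrespective of its severity.
    return True

def must_notify_risk_based(severity: str, threshold: str = "high") -> bool:
    # Illustrative alternative only: notify when severity meets a set threshold.
    return SEVERITY_LEVELS.index(severity) >= SEVERITY_LEVELS.index(threshold)

for level in SEVERITY_LEVELS:
    print(level, must_notify_under_draft_rule_7(level), must_notify_risk_based(level))
```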
4. Child Protection and Anomalies in the Fourth Schedule
The much-debated verifiable consent for children’s data has also seen certain alterations in the Draft Rules at a granular level. The requirements pertaining to children’s data under Section 9(1) and (3) of the DPDP Act have been diluted to the limited extent contemplated in the Fourth Schedule12 of the Draft Rules.
Part A of the Fourth Schedule identifies classes of Data Fiduciaries to whom the provisions of Sub-sections (1) and (3) of Section 9 shall not apply, subject to the conditions specified. This means that Section 9(1) and 9(3) of the DPDP Act would not apply, in tandem, to such classes of fiduciaries processing children’s data under the conditions marked out in the said Schedule. Careful scrutiny is required to establish whether Part A of the Fourth Schedule falls short of guaranteeing the intent of the Act to children subjected to tracking and behavioural monitoring.
Section 9(1)13 of the Act mandates blanket verifiable consent of the parent or lawful guardian of the child before the Data Fiduciary embarks on processing children’s data. Similarly, Section 9(3) puts forth an unequivocal prohibition on Data Fiduciaries from undertaking tracking or behavioural monitoring of children, or targeted advertising directed at children. The wording of the heading to Part A of the Fourth Schedule, tied to Sl. Nos. 3, 4 and 5 therein, would amount to excluding the need for verifiable consent when the said fiduciaries are permitted to undertake tracking and behavioural monitoring of children.
This calls for grave attention: if educational institutions, childcare providers, or crèches can sidestep verifiable consent while undertaking tracking and behavioural monitoring, the consequences would be catastrophic. Plugging this gap would entail inserting language to the effect that the exemption from Section 9(3) of the Act, for the Data Fiduciaries specified in the Fourth Schedule, shall be subject to their obtaining verifiable consent from the parent or lawful guardian as per Section 9(1) of the Act.
Furthermore, the exemption in Part A, Sl. No. 3(a)14 of the Fourth Schedule for tracking and monitoring the behaviour of a child, even if permitted after taking verifiable consent, should be qualified with a purpose limitation tied to the educational purpose of the child rather than of the institution, as the wider purpose permitted by the Draft Rules may give room for misuse of such sensitive personal information. In the alternative, the Rules should specify the permitted educational activities of such institutions, covering services such as counselling for the child (behavioural or career) or other specified activities in the interest of the child’s education.
Yet another Achilles’ heel in the current language of the Fourth Schedule is the use of the words “in the interest of safety of children enrolled with such institution”. The phrase “interest of safety” has a very wide connotation, and it is advisable to taper the language or specify particulars, so as to contain rampant use of behavioural monitoring and tracking by such exempted groups of Data Fiduciaries.
State and its instrumentalities brought under the framework
Bringing the state and its instrumentalities under the duty of processing personal data with the backing of appropriate technical and organisational measures is a noticeable masterstroke. The Second Schedule to the Draft Rules15 lays down the standards for processing personal data by the state and its instrumentalities, except in cases falling under Section 17(2)(a)16 of the Act – i.e., by such instrumentalities of the state as the Central Government may notify, in the interest of the sovereignty and integrity of India. Otherwise, the state and its instrumentalities appear to be exempted primarily from the consent regime, as seen from Section 7(b)(i) & (ii), for linked welfare purposes as demonstrated by the Illustration under Section 7(b) of the Act17.
For instance, where processing for a maternity benefit programme is based on consent, the state can rely on the same consent to determine eligibility for any other prescribed benefit from the state, without the need to obtain fresh consent for the enlarged purpose. However, the state and its instrumentalities are duty-bound to process information in a lawful manner, adhering to the minimum data needed for the purpose, ensuring the accuracy of the data, and employing reasonable security safeguards to prevent personal data breach. Processing by the state under Section 7(b) also calls for intimation to the Data Principal of the point of contact and a communication link on the website or app, or both, enabling the Principal to exercise her rights under the Act.
The Second Schedule of the Draft Rules seems to have eluded the attention of many who still question why the state is generally exempted under the Indian data privacy framework.
Dark Patterns
Obtaining consent from a Data Principal may appear to be a direct and transparent process. However, that is not always the case. As the Act and the Draft Rules deal with matters of privacy of data shared by an individual, it is also worthwhile to ponder the need to address Dark Patterns – a term used to describe manipulative methods of obtaining consent from Data Principals.
Dark Patterns are deceptive online design patterns that steer the behaviour of the Data Principal at the time of granting consent. The Draft Rules recognise the umbrella principles of transparency, fairness, data minimisation, accountability, purpose limitation, and data privacy by design and default, as can be seen in Draft Rules 3, 12 and 13. These Rules can be connected to a Data Fiduciary’s obligation to steer clear of deceptive design patterns. However, the Draft Rules neither explicitly mention Dark Patterns nor provide guidelines for regulating such deceptive techniques, which are prevalent.
There is a need to deliberate on guidelines for Dark Patterns applicable to the privacy law regime, and mechanisms should be evolved to lay down standards to eradicate unfair practices that impinge on the rights of the Data Principal. There has been some attempt in India to address Dark Patterns in the Draft Guidelines for Prevention and Regulation of Dark Patterns, 2023, under the Consumer Protection Act, along with the Advertising Standards Council of India.18 However, those deliberations are not extensive enough to cover Dark Patterns relevant to privacy laws.
Overloading of options and information, skipping of privacy considerations, and stirring with visual or language nudges are among the common modi operandi of Dark Patterns. Chalking out specific categories of deceptive design patterns and laying out best practices for end-user interfaces can lend an added advantage to the effective implementation of India’s data privacy legislation. The European Data Protection Board’s (EDPB) Guidelines 03/2022 on Deceptive Design Patterns in Social Media Platform Interfaces, published in February 2023,19 may be a reference point for deliberations to protect the Data Principal from such dubious practices.
Conclusion
Despite much progress in creating a legal architecture for the protection of personal data in India, it is the need of the hour to accelerate the present momentum garnered for data privacy and take it closer to the goal of making India a geography with an adequate data privacy framework. This, in turn, would ensure comprehensive protection of personal data and lend a common ground for Indian businesses in the context of cross-border data flows.
[Sarah Abraham is a Partner at A.K. Mylsamy & Associates LLP, where she heads the Privacy Law vertical. An Advocate with over 25 years of expertise in corporate advisory and litigation practice, she is a Certified Information Privacy Professional (Europe)(CIPP/E).
As an MCPC-certified mediator by Mediation and Conciliation Project Committee under the Supreme Court of India, Ms. Abraham is associated with the Tamil Nadu Mediation and Conciliation Centre, annexed to the High Court of Madras. Her educational training includes dual degrees in law and commerce and master’s in business laws. She can be reached at [email protected]]
Endnotes:
1. Ministry of Electronics and Information Technology. 2025. Draft – Digital Personal Data Protection Rules, 2025, The Gazette of India – Extraordinary, January 3. [URL:https://www.meity.gov.in/static/uploads/2025/02/f8a8e97a91091543fe19139cac7514a1.pdf]. Return to Text.
2. Ministry of Electronics and Information Technology. 2023. The Digital Personal Data Protection Act, 2023, The Gazette of India – Extraordinary, August 11. [URL:https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf]. Return to Text.
3. op. cit. p.7. Return to Text.
4. For the purpose of functions under the DPDP Act 2023, the Data Protection Board of India “shall have the same powers as are vested in the civil court” and is empowered to inquire into breaches of personal data, issue directives, and impose penalties. Return to Text.
5. Ministry of Electronics and Information Technology. 2023. op. cit. p.2. Return to Text.
6. Ministry of Electronics and Information Technology. 2025. op. cit. p.39. Return to Text.
7. Ministry of Electronics and Information Technology. 2023. op. cit. p.3. Return to Text.
8. intersoft consulting. Recital 26 EU GDPR [https://gdpr-info.eu/recitals/no-26/]. Return to Text.
9. Ministry of Electronics and Information Technology. 2025. op. cit. p.39. Return to Text.
10. ibid. Return to Text.
11. op. cit. p.30. Return to Text.
12. op. cit. p.44. Return to Text.
13. Ministry of Electronics and Information Technology. 2023. op. cit. p.8. Return to Text.
14. Ministry of Electronics and Information Technology. 2025. op. cit. p.44. Return to Text.
15. op. cit. p.41. Return to Text.
16. Ministry of Electronics and Information Technology. 2023. op. cit. p.11. Return to Text.
17. op. cit. p.6. Return to Text.
18. Central Consumer Protection Authority. 2023. Guidelines on Prevention and Regulation of Dark Patterns, Department of Consumer Affairs, Ministry of Consumer Affairs, Food and Public Distribution, Government of India. [https://consumeraffairs.nic.in/sites/default/files/file-uploads/latestnews/Draft%20Guidelines%20for%20Prevention%20and%20Regulation%20of%20Dark%20Patterns%202023.pdf]. Return to Text.
19. European Data Protection Board. 2022. Deceptive design patterns in social media platform interfaces: how to recognise and avoid them - Version 2.0. [https://www.edpb.europa.eu/system/files/2023-02/edpb_03-2022_guidelines_on_deceptive_design_patterns_in_social_media_platform_interfaces_v2_en_0.pdf]. Return to Text.