Data protection and privacy guidelines for DIGIT implementations (administrative authorities)
DIGIT, an open-source platform, enables governments and service providers to build interdepartmental coordination and citizen-facing service delivery systems - currently in urban governance, sanitation, health, and public finance management.
As citizen data is collected and used for such governance services, data privacy and protection measures are required to ensure this data is managed responsibly and safely.
This document is created to be an online guide, providing guidelines for administrative authorities (government entities like state government bodies, local governing bodies etc.) to maintain data privacy and protect individuals’ data.
Readers can use this to identify the steps they must take, in their capacity as administrative authorities, to ensure data privacy and protection in the context of a DIGIT or DIGIT-like implementation.
It can also provide source material for privacy policies, which should be included in each portal & application.
This is not a technical reference or documentation. It serves as a policy guideline.
References made to DIGIT are also applicable to other platforms similar to DIGIT. Not all parts of the guidelines or featured content may match the reader's platform or context; the document may therefore be referred to in parts, as needed.
These guidelines are written from the perspective of the roles within an administrative authority’s (AA) offices during the journey of adopting a DIGIT-based system, or a platform similar to DIGIT, in a government entity or entities.
If a government authority adopts DIGIT as a citizen service platform, these guidelines apply directly. Some points in the guidelines may not be relevant to platforms other than DIGIT in the governance ecosystem; the guidelines should therefore be read as advisory.
Common roles of government entities currently part of DIGIT are:
Field level employee
Line Management employee
Administrative Officer (AO)
Domain-level data collector
Verifiers
Field inspectors
State or ULB-level head role/authority
State administrators
This is a non-exhaustive list of roles.
There may in the future be more roles involved from a government entity using DIGIT, including ward-level volunteers and/or intermediaries.
Guidelines to be read with the Digital Personal Data Protection Act, 2023
As the AA adopts platforms like DIGIT, it gains access to digital personal data and therefore comes within the ambit of the Digital Personal Data Protection Act, 2023 (DPDP Act). Under the DPDP Act, the AA can play the role of a Data Fiduciary and/or a Data Processor. Since the AA decides the purposes and means of processing the digital personal data, it is a Data Fiduciary under the Act. The obligations placed on Data Fiduciaries therefore have to be considered for an AA to remain compliant with the DPDP Act, 2023.
For these guidelines, we assume that the AA processes, or directs the processing of, digital personal data to provide certain benefits, services, certificates, licenses or permits (these cover most of the functions that DIGIT provides and are mandates of Urban Local Bodies).
It is to be noted that similar guidelines exist for ‘data controllers’ under the European General Data Protection Regulation (GDPR). Data controllers (bodies that decide the purpose for processing data and the means used to process it) must comply with a few basic, minimum requirements to ensure data protection by default and by design (DbD and PbD): data minimization, implementing safeguards for the protection of data, controlled access to data, limited purposeful storage, etc. In Europe, government entities are legally obligated to maintain DbD and PbD or else they are heavily penalised.
There are a few common guidelines that all roles must follow.
Data security measures and practices are the core, foundational guidelines for all roles to follow.
Data Privacy and Protection (DPP) is only possible when the systems and computer resources receiving and storing data are secure (safe from any harm). Some key data security measures include:
Access control: Establish audited and controlled access for personally identifying data, including authentication and authorization mechanisms. Authorize individuals only if they have a legal basis and a legitimate purpose to access the data.
Encryption: Encrypt PII data both in transit and at rest to protect it from unauthorized access and theft.
Data backup and disaster recovery: Regularly back up PII data (auditing whether the PII is still needed, and deleting it if not), and implement disaster recovery plans so that important data can be recovered in the event of data loss or a breach.
Network security: Implement firewalls, intrusion detection and prevention systems, and other network security measures to protect against cyber threats.
Vulnerability management: Regularly scan for vulnerabilities and patch systems and applications to minimize the risk of a breach.
Employee education: Educate employees/contractors/personnel on the importance of data security and provide training on best practices for protecting PII.
Physical security: Implement physical security measures, such as access controls and video surveillance, to protect against unauthorized access to data centres and other data facilities.
Third-party security: Carefully vet third-party vendors and service providers, to ensure that they have appropriate security measures in place to protect PII.
Incident response: Develop and test incident response plans to ensure that the organization can quickly and effectively respond to a security breach.
Compliance: Adhere to relevant security and privacy regulations, such as the IT Act, 2000, and its subsequent amendments and Rules. Conforming with standards such as ISO 27001 is also a good practice.
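As an illustration of the access-control measure above, a purpose-bound authorization check might look like the following sketch. The role, dataset and purpose names are hypothetical assumptions; a real deployment would back this with audited authentication and authorization infrastructure.

```python
# Illustrative purpose-bound access control (all names are hypothetical).
# Access is granted only when the role holds a grant for the dataset AND
# states a purpose the grant covers - mirroring the "legal basis and
# legitimate purpose" requirement described above.
from dataclasses import dataclass, field

@dataclass
class AccessGrant:
    dataset: str               # e.g. "property_tax_pii"
    allowed_purposes: set      # purposes with a documented legal basis

@dataclass
class Role:
    name: str
    grants: list = field(default_factory=list)

    def can_access(self, dataset: str, purpose: str) -> bool:
        return any(g.dataset == dataset and purpose in g.allowed_purposes
                   for g in self.grants)

verifier = Role("verifier", [AccessGrant("property_tax_pii", {"verify_ownership"})])

print(verifier.can_access("property_tax_pii", "verify_ownership"))  # True
print(verifier.can_access("property_tax_pii", "marketing"))         # False
```

Each denied or granted request would, in practice, also be written to an audit trail so that access can be reviewed later.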
The broad roles we cover under these guidelines include:
Administrative authority (AA) - the body mandated to provide for the government service, usually an agency or department of the State government. This can include bodies like the State Program Directorate and Program Monitoring Units. The above-listed roles (In Section D) come under this category.
Implementation agency - the body that implements the platform provided by the platform owners into the administering authority’s IT systems.
Maintenance team - can be an additional team that gets involved after the platform is implemented in the administering authority’s systems to maintain the system.
The previous document in this series covered the guidelines for platform owners/providers.
To understand what each role should do to safeguard DPP, it is important to understand what these roles do at each phase of the implementation of DIGIT.
What is a program?
A program can be a delivery of any government service/s which the AA is mandated to provide to citizens for which it requires a platform. Defining the scope of the program is within the power of an AA.
D.2.1.1 What happens in this stage?
A Memorandum of Understanding is signed between the AA and the platform owners.
The AA appoints a State Program Head/Nodal Officer.
Resources and funding for the program are identified.
The AA-specific procurement process is defined.
SI team onboarding is initiated.
D.2.1.2 AA and implementing agency’s role at Stage 0
Ensure onboarding of manpower and infrastructure as specified.
Lead the setup of the program and related governance structure
Appoint program steering committee and state nodal officer.
Initiate onboarding of implementation partner.
Initiate onboarding of Cloud infrastructure provider.
Identify module deployment priorities
Operational
Must-haves
The relevant department of the AA makes a financial commitment towards enabling data privacy, protection, and security in the system. This puts any measures taken for DPP on a firm operational footing.
Data use and governance terms and conditions are spelt out in all agreements with partners, e.g. MoUs and contracts.
MoUs and contracts clearly define who owns the data, who is responsible for data protection, and the extent and purpose of data access.
The AA documents the purposes for data collection, use and disclosure. Such purpose is to be found in a legally backed instrument or documentation (can be a government authority-sanctioned document pointing at a law/ policy that needs such data to be processed for citizen delivery purposes).
AA ensures that the design of the platform would involve data collection and processing only if there is a defined purpose for such data being processed.
The selection of a particular technological solution considers the risks inherent to that technology. For instance, the AA questions the risks that the tech system poses to citizens and employees when used for service delivery. Risks can be assessed by conducting an anticipatory data protection impact assessment (DPIA). The platform owner must be involved in this assessment, as they are best placed to evaluate the technologies that might be used and that might pose a high risk to the rights and freedoms of people, making the DPIA necessary.
AA creates measures that can be put in place to reduce the anticipated risks deduced from the above, taking into account the state of the current technology and the cost of applying them.
The other actors that AAs engage with to provide digital government service delivery are the Implementation Agencies (IA).
At this stage the AA -
Defines how the IA must ensure that DPP practices are maintained
Drafts a working agreement that commits the IA to safeguard the data and the right of citizens to data protection and privacy
A few clauses which the AA could ask the IA to contractually perform are:
Maintain transparency in data practices (API-based data received, shared or used should be visible to the AA)
Report any data security breach to the AA
Audit and create safeguards against non-authorized third-party access to data
Implement appropriate security controls like encryption at source, masking of data, ABAC logins, and conducting regular security audits and checks.
Regularly educate its employees on data privacy, data ethics and data protection.
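The masking control mentioned in the clauses above can be sketched simply; the field names and masking rule here are illustrative assumptions, not part of DIGIT or any contract template.

```python
# Illustrative PII masking helpers (field names are hypothetical).
def mask_phone(phone: str) -> str:
    """Keep only the last 4 digits visible (assumes at least 4 characters)."""
    return "*" * (len(phone) - 4) + phone[-4:]

def mask_record(record: dict, pii_fields: set) -> dict:
    """Return a copy of the record with listed PII fields masked,
    e.g. before display to roles without authorization for full PII."""
    return {key: (mask_phone(value) if key in pii_fields else value)
            for key, value in record.items()}

citizen = {"name": "A. Citizen", "phone": "9876543210", "ward": "W-12"}
print(mask_record(citizen, {"phone"}))  # phone shown as "******3210"
```

A contractual clause would then require that unauthorized roles only ever receive the masked view, never the raw record.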
The IA can preferably:
In its work with the AA, include at least one person with working knowledge of data security, privacy, and protection. IAs usually have an in-house team for data security and privacy; if there is no such team, the IA may bring in a person with the relevant skills for the project. If the AA has chosen an in-house entity to implement the project rather than an IA, the person in charge of the project should likewise ensure that someone with these skills is brought in.
At stage zero, the AA begins to consider which infrastructural systems they would be adopting. Cloud infrastructure providers are usually sourced by the IA or by the AA. Choosing a cloud storage provider with safe data governance practices is important to avoid data breaches.
A few must-haves the AAs must keep in mind before selecting a cloud infrastructure provider are:
The cloud provider has to be compliant with the obligations under the Digital Personal Data Protection Act, 2023 (DPDP Act).
Maintains and implements data protection policies.
Encrypts PII with private keys.
Conducts regular security checks and audits on data security and privacy
Safeguards data from being shared with unauthorized users
Maintains transparent data storage and governance models
Publishing of the program charter and implementation plan.
Master data collection kickoff in Pilot ULBs
Cloud Infrastructure is procured
Program branding is done (name, logo, tagline etc.)
1. Identify Pilot ULBs (urban local bodies) or any other local governmental agency or body that is mandated to provide the government service
2. Appoint the data collection team and initiate data collection from the Pilot ULBs/ bodies in the required format.
3. Organize implementation kick-off workshop with relevant stakeholders of identified Pilot bodies
4. Create the program charter and implementation plan
Operational:
Musts
At this stage, a best-practice model of master data collection (steps listed below) is designed, to be implemented when actual data is collected at the next stage. Here, master data is the primary data needed for module functionality.
The master data collection model design includes:
AA collects minimum personal data - they can do so by collecting personal data only once, using federated databases and interoperable systems to avoid re-collection of personal data.
The categories of datasets the AA entity would most necessarily require to provide a service are defined and fixed. This is done so as to adopt the practice of collecting only data that is necessary for the provision of a service at the next stage.
Data without any defined use or need is not collected (data minimisation).
Citizens of the ULBs are informed of the purpose of their data being collected.
Data collected at the next stage is stored safely (on paper or digitally). If collected on paper, then once the data has been entered into the digital system, the paper record or source device copy is destroyed.
Dashboards displaying the nature of data to be collected and their corresponding purposes and uses are built; (for transparency and awareness of citizens).
Specific roles are created for building accountability for safe, limited and purposeful data collection.
The program charter clearly states that the data provider i.e. the citizen is the owner of the data.
The implementation plan has DPP practices embedded in it. For example in the processes of data migration and data processing, the system does not permit sensitive data to be visible to unauthorized roles, strict logins are maintained, and the implementing partners' employees are trained in safe data handling.
Proof of consented collection of personal data (past and future) - either maintained by the AA (as a data fiduciary) or maintained by a separate data processor (any entity that processes personal data for the AA) and given to the AA as and when necessary.
Proof of the source of personal data (past and presently held) to be processed - showing it is sourced from a database, register or book maintained by the administering authority.
The DPDP Act permits the AA to process, or to have processed through a data processor, any personal digital data of citizens for providing a benefit, subsidy, service, certificate, license or permit (in this case any urban local body function such as a birth or death certificate, property license, building plan permit etc.), subject to the conditions below:
(i) she has previously consented to the processing of her personal data by the Administering authority or any of its instrumentalities for any subsidy, benefit, service, certificate, licence or permit; or
(ii) such personal data is available in digital form in, or in non-digital form and digitised subsequently from, any database, register, book or other document which is maintained by the administering authority or any of its instrumentalities and is notified by the Central Government. All of the above must follow standards that the Central Government may set as policies for processing.
Such previously consented evidence for personal data collection must be maintained for compliance.
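The consent-evidence requirement above can be illustrated with a minimal consent register. The structure is an assumption for illustration; a real system would use a durable, access-controlled store rather than an in-memory list.

```python
# Illustrative consent register (structure is an assumption, not a DIGIT API).
# Processing is attempted only after a lookup confirms prior consent for
# that purpose, matching the "previously consented" ground described above.
import datetime

consent_register = []  # in practice: a durable, audited, access-controlled store

def record_consent(citizen_id: str, purpose: str) -> None:
    """Append a timestamped consent record as evidence for compliance."""
    consent_register.append({
        "citizen_id": citizen_id,
        "purpose": purpose,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

def has_consent(citizen_id: str, purpose: str) -> bool:
    """Check for prior consent before processing for the stated purpose."""
    return any(c["citizen_id"] == citizen_id and c["purpose"] == purpose
               for c in consent_register)

record_consent("CIT-001", "birth_certificate")
print(has_consent("CIT-001", "birth_certificate"))  # True
print(has_consent("CIT-001", "property_tax"))       # False
```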
Before master data collection (which partly begins in this stage but is mostly conducted at the next stage), the AA arranges staff/employee training on information privacy protection and awareness of relevant policies regarding citizens' privacy protection.
The AA begins to draft a privacy policy, envisioning its data governance model.
AA ensures that the IA is onboarding a team with appropriate skill sets through reviews and initial expectation-setting
The implementation kickoff workshops with representatives of the AAs include training on purposeful master data collection (for the next stage) in an informed and transparent manner (letting citizens know why their data is being collected).
As suggested in the above stage, a secure cloud infrastructure provider is appointed.
Standardized ontologies (uniform terminology for easier understanding), processes and workflows are created.
Master data collected in the desired format.
Agreement on AA-specific product customisations is reached.
A detailed program plan is made and the tracking mechanism is finalised.
Standardize the ontology, processes and workflows in the states
Initiate policy change if required
Provide state-specific requirements, if any.
Lead solution design, impact analysis, and integration analysis.
Get designs and requirements signed off.
Operational
Musts
AA conducts assessments on data governance of the prepared processes and workflows before finalisation. This includes factors like:
Personally identifiable information (PII) is used in an encrypted/masked form throughout the workflows.
Processing of data takes place only if the benefits from the use of the processed data would be proportional to the risks that such processing produces.
Data that flows in the processes and workflows have strict access requirements
As per the DPDP Act, citizens have the right to obtain, on request -
(a) a summary of personal data that is being processed by the AA and the processing activities undertaken by that AA with respect to such personal data; (b) the identities of all other Data Fiduciaries and Data Processors with whom the personal data has been shared by the AA, along with a description of the personal data so shared; and (c) any other information related to the personal data of such Data Principal and its processing, as may be prescribed.
As per this right, the AA must maintain a record of the personal datasets it has captured
The above Act also empowers citizens with a right to correction, completion, updating and erasure of their data. The AA must, on receiving a citizen's request, correct inaccurate or misleading data, complete incomplete data, update the personal data, and erase the data unless a specific purpose or legal compliance requires its retention.
As per the above, the AA has to create processes to undertake such correction, completion, erasure or updation of data - for which keeping an audit log of what data is collected, why, for how long and where is crucial.
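The audit log described above can be sketched as an append-only record that answers what data was collected, why, where it lives, and what later happened to it. The field names are assumptions for illustration only.

```python
# Illustrative append-only audit log for data lifecycle events
# (collection, correction, completion, updation, erasure).
import datetime

audit_log = []  # in practice: an append-only, tamper-evident store

def log_event(citizen_id: str, action: str, dataset: str, reason: str) -> None:
    """Record what happened to whose data, where, and why."""
    audit_log.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "citizen_id": citizen_id,
        "action": action,     # e.g. "collected", "corrected", "erased"
        "dataset": dataset,   # where the data is held
        "reason": reason,     # purpose of collection / ground for change
    })

log_event("CIT-001", "collected", "trade_license_db", "license application")
log_event("CIT-001", "erased", "trade_license_db", "erasure request; no retention duty")

print([e["action"] for e in audit_log])  # ['collected', 'erased']
```

With such a log in place, correction and erasure requests can be serviced and evidenced without searching every system for the citizen's data.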
At this stage, the AA initiates the introduction of a state-level data policy for clear, legitimate and informed data governance practices it plans on adopting.
Product customization requested by the AA keeps in mind the risk of data and confidentiality breaches and the rights of citizens to data privacy. The AA studies the risks and harms that customisation would cause to the safety of the citizen through the use of his/her data. This assessment takes into consideration the impact that data use may have on individual(s) and/or group(s) of individuals, whether legally visible or not and whether known or unknown at the time of data use.
Configurations include a feature for collecting feedback from citizens when the platform proceeds to UAT. Citizens are asked for feedback on how their data is being handled and whether they are aware of why their data is being used.
Service Level Agreements include security checks at each level of implementation of the platform for data to be kept secure and safe.
Technical
AA implements privacy-enhancing technologies (PeTs), such as encryption, anonymization, and access control systems, to protect the personal data that is part of the master data.
The existence, nature, and anticipated period of retention of data, and the purpose of data used through the workflows and processes, are publicly disclosed and described in clear, non-technical language suitable for a general audience.
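One of the PETs mentioned above, pseudonymization, can be sketched with a keyed hash: a direct identifier is replaced by an HMAC so records can still be linked across datasets without exposing the identifier. The key handling shown is a simplifying assumption; in practice the key would live in a secrets manager, separate from the data.

```python
# Illustrative pseudonymization via keyed hashing (stdlib only).
# The secret key is an assumption for this sketch - store it in a key
# vault, never alongside the pseudonymized dataset.
import hashlib
import hmac

SECRET_KEY = b"store-me-in-a-key-vault"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable, non-reversible pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"citizen_ref": pseudonymize("ID-XXXX-1234"), "ward": "W-12"}
# The same input always yields the same pseudonym, so datasets can be
# joined on citizen_ref without either side holding the raw identifier:
print(pseudonymize("ID-XXXX-1234") == record["citizen_ref"])  # True
```

Note that pseudonymized data is still personal data if the key holder can re-identify individuals; full anonymization requires stronger measures.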
A configured/customized product is created that is ready for UAT (User acceptance testing).
Monitoring Reports and Dashboards are ready ( to understand the rollout of the modules).
Product artefacts like user guides are created.
Identification of participants for the UAT session.
Master data cleaning and validation.
Collect ULB-specific baseline data to measure performance and adoption.
Identify ULB level Nodal officers for day-to-day support.
Operational
Musts
At this stage, the user guides, and instruction manuals are created which include guidelines and best practices on DPP. These user guides could pick up practices from Table 2 here.
The master data cleaning and validation is the last opportunity for the AA officials to decide on the necessity of data collected for defined purposes. If no corresponding purpose is found for a dataset collected, the dataset does not proceed to flow through the UAT demo.
The nodal officers are made aware (through training and workshops) of the importance of data privacy and protection and are trained to manage data securely.
The AA’s privacy policy is visible in the UX (the privacy policy must be short and easy to understand - a sample privacy policy can be found here).
AA assesses the design of dashboards displaying data, and other data interactions, just as one would for other elements of the product experience.
AA conducts reviews and checks on whether people can easily access and understand relevant privacy information, whether participants feel they have the right information at the right time, and whether they are in the appropriate state of mind to take informed action.
As part of UAT, include questions that check whether the platform permits people to access their data, understand the purpose for which their data is used, who uses it, and what happens to the data after its use is over. AAs or IAs can use this sheet as an activity to assess whether they are providing these rights to citizens.
UAT Sign-off & Go Live permitted for Pilot ULBs or other mandated govt bodies for delivery of services
Setup of review & monitoring cadence.
Conduct user acceptance testing and provide sign-off.
Organise ULB employee training workshops.
Set up help desks and support mechanisms for ULB.
Lead the UAT / pilot / Go-live as required along with training key client personnel.
Operational
Musts
Ensure that the training program for AA personnel also includes DPP practices training (this can include training in ethical data collection, use and storage, communicating the purpose of data collection, etc.).
Conduct a data security check while/before conducting the UAT. A data security checklist should include questions like:
Is PII data encrypted when shared?
Is data stored in safe databases?
Do limited roles have access to PII?
Are employees aware of incident reporting?
Is there a data protection and privacy policy for hardware protection and external media devices? (Wherever removable physical media are used, document that use and maintain an accurate, up-to-date record of the user profiles created for users who have been authorised to access the information system and the personal data contained therein.)
As part of the review and monitoring cadence, the AA creates a DPP checklist. A DPP checklist can have items like:
A privacy policy has been established and approved by the AA
PIAs, or privacy risk assessments, are planned to be regularly performed
Data processing agreements have been established with all third parties that will process personal data
The software and infrastructure regularly undergo security risk and threat analysis
AA has a privacy education/awareness training program
AA is prepared to handle security incidents affecting personal data
The amounts of personal data that can be collected have been minimized
The purpose of data collection has been defined to be as specific as possible
Any sharing of personal data to third parties has been clearly specified
The retention date is no longer than necessary to fulfil the purpose of data collection (or to comply with existing legislation)
The privacy policy clearly states who is responsible for the personal data and how they can be contacted
The privacy policy is clearly written, making it easy for the intended end-users to understand
The length of the privacy policy is not excessive
The privacy policy can easily be retrieved by citizens at all times
The helpdesks provide simple material explaining DPP concepts to citizens and employees. They serve as one-stop spots for citizens and employees to understand DPP concepts like data privacy methods, masking, and limited, purposeful data sharing. Help-desk representatives are therefore trained well in DPP concepts and use cases before the platform goes live. They also become the first stop for reporting any incident of data privacy breach.
Statewide rollout in batches
Help desk effectiveness assured
Critical bugs fixed
Program success metrics tracking kick-started
Lead rollout, training, change management workshops, training activities and monitoring with client teams.
Plan phase-wise pan-state rollout.
Operational
There is transparency on measures taken for protecting citizens' data privacy and protection (dashboards showing steps taken, open display of the categories of data the AA can see and use, a grievance redressal officer appointed).
Awareness has been built among AA teams/departments and ULB employees on the DPP practices listed above, and employees are now held accountable for applying their training in their functions.
Regular Privacy Impact assessments (including gap assessments) and data security audits are conducted to check whether the state-wide platform is safeguarding DPP. Refer to Table 2 to understand if the AA has considered global practices and principles of data protection and privacy for adoption.
Feedback loops remain active and provide speedy solutions for any anticipated or reported privacy or data protection issue that comes up.
While transitioning from old/existing systems to new platform-based systems, the data migration process has taken into consideration data masking, encryption, data deletion, purposeful data retention, etc.
The first batch of ULBs has been made Live after the Pilot.
There is the adoption of the platform in the State among the ULB employees and the citizens.
The adoption tracking & review cadence is set up.
The adoption team at the State and ULB level are set up to drive the adoption of the system.
The state runs multi-channel awareness campaigns to drive adoption among the citizens of the state.
B.2.7.3 Role of Supporting Agency
At this stage, a Supporting Agency (SA) gets involved. An SA is one that provides support in any functional aspect required by the program with respect to the platform implementation (e.g. assistance in maintenance of the platform, technical or operational problem-solving, bug/error resolution).
An SA can be contracted at Stage 5 or at this stage.
It primarily assists, guides, firefights and resolves difficult problems for the AA while the platform is implemented/ getting implemented.
An SA can be given a specific role of overseeing and helping in the DPP practices adoption.
Operational
Musts
The AA team conducts reviews on the comfort or discomfort citizens are experiencing with the above DPP practices and design features implemented on the ground.
Feedback is received from the citizens. Such feedback is analyzed and relevant AA officials are alerted about such feedback for steps to be taken for improvement.
An SA oversees that the DPP features are live, and troubleshoots any problems arising for all stakeholders.
An SA whose duty is to put DPP practices into action must create impact reports, firefight any issues that arise and build operational awareness among all stakeholders within the AA.
Awareness campaigns are conducted to build citizens' awareness of their right to data protection and privacy.
There are continuous sessions for employees of AA and IA held on DPP training.
AA or implementation team holds community or ULB-based workshops and creates awareness materials like posters, videos, brochures etc.
There are a few guidelines that the AA should preferably follow but that are not must-dos. A few of them are:
Stage 1: Program Kick-off
Preferable:
The data collection model is designed to collect the consent of the citizen before collecting the data. Such consent is recorded in a log registry and maintained with each AA department.
There are privacy-specific KPIs or OKRs (objectives and key results) embedded. For example - setting privacy benchmark deliverables like privacy audits and security gap assessments. This would help the AA and IA monitor privacy issues and provide early warning of problems.
AA provides procedures for citizens to complain or inquire about their data
Stage 2: Solution design
Preferable:
AA and IA use privacy concepts as a prompt for generating ideas, such as a ‘crazy eights’ sketching exercise that explores how the platform might work if it processes no personal information (to reduce the amount of PII collected)
AA establishes an internal system for constant data updation and deletion of obsolete data, wherever appropriate and practically possible (to maintain data accuracy and quality).
Data protection and privacy guidelines for DIGIT implementations for platform owners
DIGIT, an open-source platform, provides a citizen-facing service delivery system in urban governance, sanitation, health, and public finance management.
As citizen data is collected and used for such governance services, data privacy and protection measures are required to ensure this data is managed responsibly and safely.
This document is created to be an online guide, providing guidelines for platform owners (providers of DIGIT-like systems) to maintain data privacy and protect individuals’ data.
Readers can use this to identify the steps they must take as platform owners or providers to ensure data privacy and protection in the context of a DIGIT or DIGIT-like implementation.
It can also provide source material for privacy policies, which should be included in each portal & application.
This is not a technical reference or documentation. It serves as a policy guideline.
References made to DIGIT are also applicable to other platforms similar to DIGIT. Not all parts of the guidelines or featured content may match the reader's platform, hence this document is open to be referred to in parts as needed.
Data is the information collected from an individual or about an individual. In DIGIT, data can be any information pertaining to a citizen, which is required for delivery of service and/or information exchange flow amongst citizen, employee, vendor and administrator. Data and information are interchangeably used.
Data security is the act of securing and safeguarding data. Active measures are taken to protect the data from misuse or harm. Such actions can be pre-emptive i.e. security steps taken before the data is collected or the actions could be reactive i.e. steps taken after the platform has been set up. DIGIT recommends pre-emptive steps as a best practice, and has data security measures embedded within its design.
Data privacy means keeping the data in the control of the individual and allowing the individual to decide what is to be done with their data. Data that is collected or shared without the consent of an individual, especially when exposed and combined with other data points, can cause citizens to be identified and targeted. That’s why protecting data indirectly protects the citizen as well.
Data Privacy requires protecting the data from unwarranted viewing, use, or sharing. This involves taking steps such as data minimisation (only collecting and storing data that serves a defined purpose), encrypting data, restricting access to the data, and/or masking what data is visible to users. DIGIT is designed to maintain the privacy of citizens and protect data by taking all the steps mentioned above and more.
PII stands for personally identifiable information; the term is used interchangeably with personal data.
Personal data is defined in the Digital Personal Data Protection Act, 2023 (DPDP Act) at Sec 2(t): “personal data” means any data about an individual who is identifiable by or in relation to such data.
Personal data is, in short, any information that makes a person identifiable. It includes data that is unique to each person and data which, when combined with other data, can help identify a person. For example, our biometric information, credit card details, passport details etc. are all PII.
The DPDP Act at Sec 2(u) states that “personal data breach” means any unauthorised processing of personal data or accidental disclosure, acquisition, sharing, use, alteration, destruction or loss of access to personal data, that compromises the confidentiality, integrity or availability of personal data.
A personal data breach means a breach of security leading to the accidental or unlawful destruction, loss, alteration, or unauthorised disclosure of, or access to, personal data. This includes breaches that are the result of both accidental and deliberate causes. It also means that a breach is more than just losing personal data.
Personal data breaches can include:
access by an unauthorised third party;
deliberate or accidental sharing of data, or of credentials that enable access to data, with an unauthorised person by a handler, collector or processor;
sending personal data to an incorrect recipient;
computing devices containing personal data being lost or stolen;
alteration of personal data without permission; and
loss of availability of personal data.
Both the above definitions can be read together, but the definition given in the DPDP Act takes precedence in case of a conflict.
The DIGIT platform does not handle data unless and until it is implemented in a particular context. Each implementation of DIGIT handles a combination of non-personal data and PII; the latter may further be classified into data which, in itself, identifies the individual to whom it pertains (“identified”) and data which, if combined with other data, can identify the individual to whom it pertains (“identifiable”)[3].
Specific data recorded in various applications that may be implemented on DIGIT include name, parent/spouse’s name, mobile number, city, email id, date of birth, door no/address, and pin code. Data from administrative record systems, such as revenue survey number or other property identifier, connection number, meter number etc. may also be recorded. In the event a person makes payments to the local government, information related to the payment, such as transaction number may also be recorded.
Any personal data collected and stored in DIGIT is encrypted, both in storage and transmission. When displayed, this data is to be masked by default. Users can request to unmask data; if they have the appropriate authorisation, they will be able to view the unmasked data, and this request to unmask will be logged for audit.
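The mask-by-default and audited-unmask flow described above can be sketched as follows. This is an illustrative sketch only, not DIGIT's actual implementation: the function names, the `ROLE_PERMISSIONS` table, and the audit-log structure are all assumptions made for this example.

```python
import datetime

# Roles permitted to view unmasked mobile numbers (assumed for illustration).
ROLE_PERMISSIONS = {"DOC_VERIFIER": {"mobile"}, "APPROVER": {"mobile"}}

audit_log = []  # every unmask request is recorded here, granted or not


def mask_mobile(number: str) -> str:
    """Mask all but the last four digits: the default display form."""
    return "*" * (len(number) - 4) + number[-4:]


def view_mobile(number: str, user: str, role: str) -> str:
    """Return the unmasked value only for authorised roles; log the request."""
    authorised = "mobile" in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "user": user,
        "role": role,
        "field": "mobile",
        "granted": authorised,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return number if authorised else mask_mobile(number)


print(view_mobile("9876543210", "emp42", "CLERK"))        # ******3210 (masked)
print(view_mobile("9876543210", "emp7", "DOC_VERIFIER"))  # 9876543210 (unmasked)
```

Note that even a denied request leaves an audit entry, which is what makes periodic access audits possible.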
DIGIT implementations are used to provide government services to residents of a given city, town, village etc. The data collected, as described above, are necessary for the delivery of such services. When a resident approaches their local government for a particular service, they can be required to share, or allow access to, the information necessary to provide that service.
DIGIT implementations also collect, store, process, and share information pertaining to employees of local government or other government agencies. Specific information can include name, age, gender, details of spouse or dependents, address, administrative details such as employee ID or other reference number, as well as information on bank accounts (used for processing salaries or pensions).
As employers, local governments may be required to maintain certain information on their employees and pensioners for tax purposes, such as PAN numbers; this information may be recorded and shared as well.
DIGIT implementations may derive, store, process, and share transactional data, which describes the progress of any ongoing task (e.g. any service requested by a resident) through the systems of the local government or other government agency. This can include information about the specific role to whom a given task is assigned (or with whom it is pending), the amount of time elapsed on processing / completing a given request (or task / sub-part thereof), and whether this task has been completed within the benchmark or designated amount of time. It can also include details about the channel through which a particular request was received, such as the ULB counter, service centre, website, helpline, mobile app, chatbot, etc.
DIGIT implementations may derive, store, process, and share aggregated data. These aggregates are derived from the transactional data. This includes data such as aggregate or cumulative revenue collection, aggregate or cumulative number of service requests, percentage of requests resolved within the benchmark time period, etc. The above data may be further analysed and presented in terms of the type of collection, request, or complaint, the channel through which it was received, etc.
DIGIT implementations may collect, store, process, and share telemetry data, which studies how much time is spent on a given screen or field in a workflow/user interface (UI). Such data is normally processed and shared in aggregate, and not used to identify specific individuals. In the event that specific individuals are sought to be identified, such as for user research, their consent will be sought for the further processing or sharing of their data.
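The aggregation of telemetry described above can be sketched as follows: per-screen figures are computed and user identifiers are dropped before anything is shared. The event shape and field names here are assumptions for this example.

```python
from collections import defaultdict

# Hypothetical raw telemetry events; in practice these come from the UI layer.
events = [
    {"user": "u1", "screen": "payment", "seconds": 40},
    {"user": "u2", "screen": "payment", "seconds": 60},
    {"user": "u1", "screen": "complaint", "seconds": 30},
]

totals, counts = defaultdict(int), defaultdict(int)
for e in events:
    totals[e["screen"]] += e["seconds"]  # user id is never carried forward
    counts[e["screen"]] += 1

# Only the aggregate leaves the system; no individual is identifiable from it.
avg_time = {screen: totals[screen] / counts[screen] for screen in totals}
print(avg_time)  # {'payment': 50.0, 'complaint': 30.0}
```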
DIGIT is designed to enable compliance with requirements under Indian law, as well as global best practices, as follows:
The requirement for a legal basis for the collection, storage, processing, use, and sharing of such data is fulfilled by the mandate of the local government or other government agency to provide a particular service.
The requirement of a legitimate purpose is also met by the mandate of the local government or other government agency to provide that particular service.
The requirement of proportionality is generally met by only collecting or accessing such data as is necessary for providing that particular service. This is also in line with the global best practice of data minimisation.
The requirement for safeguards is generally met by the security and privacy measures, such as encryption and masking, described above; it is further met by implementing role-based access controls, which are aligned with the global best practice of differentiated access.
The requirement for safeguards is further served by providing itemised notice, and by enabling any person to view what data of theirs has been collected/processed/shared (including the purpose for such collection/processing/sharing). Such provision of notice and visibility is further in line with global best practices of notice & user control. Such notice can only be provided in the context of a DIGIT implementation, and eGov provides a draft privacy policy to implementing entities.
With respect to employees and pensioners, the requirements for legal basis, legitimate purpose, and proportionality are generally met by collecting, storing, processing, using, and sharing only such information as is relevant for assigning roles and administrative responsibilities, processing salaries or pensions, and completing any mandatory tax-related reporting.
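The data minimisation requirement above (collecting only the data needed for a particular service) can be sketched as a per-service field whitelist. The service names and field sets here are invented for illustration; actual DIGIT modules define their own schemas.

```python
# Hypothetical whitelist: each service names the only fields it may store.
SERVICE_FIELDS = {
    "property_tax": {"name", "mobile", "address", "property_id"},
    "birth_certificate": {"name", "date_of_birth", "address"},
}


def minimise(record: dict, service: str) -> dict:
    """Keep only the fields that the named service actually needs."""
    allowed = SERVICE_FIELDS[service]
    return {k: v for k, v in record.items() if k in allowed}


submitted = {"name": "A. Kumar", "mobile": "9876543210",
             "address": "12 MG Road", "pan": "ABCDE1234F"}
stored = minimise(submitted, "property_tax")
# 'pan' is dropped: it serves no purpose for property tax in this sketch.
print(sorted(stored))  # ['address', 'mobile', 'name']
```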
The datasets flow from a point of service (POS) device to the integrator system, and from the backend of the integrator system to the DIGIT platform.
Data may be entered into a DIGIT registry or database in one of four ways:
A citizen directly enters the data into a web portal, application, chatbot etc., as part of requesting government services.
A local government or other government entity employee or contractor may enter the data into the system on their allocated device (computer at the service counter, mobile application in the field, etc.), as part of their work in providing government services.
Data is migrated/ported from previous systems, including paper-based systems, especially during the initial setup of the DIGIT-based system; this is known as data migration.
Data is shared in digital formats, such as through APIs or machine-readable spreadsheets, from some other system or platform to a system running on DIGIT.
Each module in DIGIT has different departmental employees feeding data at the field level.
These roles and their levels of access are determined by the administering authority (typically a local or state government), as part of the initial setup of the DIGIT-based system.
Existing roles may be modified and new roles may be created by persons who are authorised to act as system administrators following the initial setup. This means that the administering authority (or a person delegated this power by the administering authority) recognises these roles, and provides them with login credentials to access and collect citizen data for delivery of government services.
Examples of such roles that are recognised by our platform are: "EMPLOYEE", "CITIZEN", "DOC_VERIFIER", "FIELD_INSPECTOR", "APPROVER", "CLERK".
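Role-based access of the kind described above can be sketched as a mapping from roles to permitted actions. The role names are those quoted above; the permission sets themselves are invented for this example.

```python
# Hypothetical role-to-permission mapping; a real deployment is configured
# by the administering authority at setup time.
ROLE_ACCESS = {
    "CITIZEN": {"view_own_request"},
    "CLERK": {"view_own_request", "create_request"},
    "DOC_VERIFIER": {"view_request", "verify_document"},
    "FIELD_INSPECTOR": {"view_request", "record_inspection"},
    "APPROVER": {"view_request", "approve_request"},
}


def can(role: str, action: str) -> bool:
    """Check whether a role is permitted to perform an action."""
    return action in ROLE_ACCESS.get(role, set())


print(can("APPROVER", "approve_request"))  # True
print(can("CLERK", "approve_request"))     # False
```

A system administrator adding a new role would simply add an entry to the mapping, which mirrors how the administering authority recognises roles and grants credentials.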
In any given implementation of DIGIT, the administering authority (local or state government) is understood to have responsibility for the data and is treated as the main data fiduciary with respect to the data of residents that is collected, stored, processed, used, or shared through or by the DIGIT-enabled tools or systems in use in its jurisdiction.
The primary responsibility for ensuring compliance with legal obligations and best practices with respect to data security, data protection, and privacy is thus that of the administering authority. To the extent that any other party (including eGov Foundation) serves as an agent of the administering authority, they are similarly considered responsible for ensuring such compliance, to the extent relevant to the tasks that they are performing.
These guidelines are to be read through the eyes of specific roles in the journey of adopting a DIGIT-based system or platforms similar to DIGIT in a government entity/ies. For each such ‘role’, a reader can follow its tasks in a given stage of implementation of DIGIT or such platform, and the guidelines associated with that task and stage.
It is to be noted that similar guidelines exist for ‘data controllers’ under the General Data Protection Regulation (GDPR) of Europe. There are a few basic, minimum requirements that data controllers (bodies that decide the purpose for processing data and the means used to process it) must comply with to ensure data protection by design and by default (PbD and DbD): data minimisation, implementing safeguards for the protection of data, controlled access to data, etc. In Europe, entities are legally obligated to maintain PbD and DbD or else they are heavily penalised.
There are a few common guidelines that all roles in the platform implementation process must follow.
Data security measures and practices stand out as the core / foundational guidelines to follow for all roles.
DPP is only possible when the systems and computer resources receiving and storing data are secure (safe from any harm). Some key data security measures include:
Access control: Establish audited and controlled access for personally identifying data, including authentication and authorization mechanisms. Authorize individuals only if they have a legal basis and a legitimate purpose to access the data.
Encryption: Encrypt PII data both in transit and at rest to protect it from unauthorized access and theft.
Data backup and disaster recovery: Regularly back up PII data (auditing whether the PII is still required, and deleting it if not), and implement disaster recovery plans to ensure that important data can be recovered in the event of a data loss or breach.
Network security: Implement firewalls, intrusion detection and prevention systems, and other network security measures to protect against cyber threats.
Vulnerability management: Regularly scan for vulnerabilities and patch systems and applications to minimize the risk of a breach.
Employee education: Educate employees/contractors/personnel on the importance of data security and provide training on best practices for protecting PII.
Physical security: Implement physical security measures, such as access controls and video surveillance, to protect against unauthorized access to data centres and other data facilities.
Third-party security: Carefully vet third-party vendors and service providers, to ensure that they have appropriate security measures in place to protect PII.
Incident response: Develop and test incident response plans to ensure that the organization can quickly and effectively respond to a security breach.
Compliance: Adhere to relevant security and privacy regulations, such as the IT Act, 2000, and its subsequent amendments and Rules. Conforming with standards such as ISO 27001 is also a good practice.
The roles we cover under these guidelines are that of:
Platform owners (any individual or body that creates and owns the platform which facilitates/enables governments to provide for the delivery of services)
The next document in this series covers the following stakeholders:
Administering authority (AA) (the entity that is mandated to provide the government service, usually an agency or department of the state government). This can include bodies like the State Program Directorate and Program Monitoring Units.
Implementation agency (IA) / system integrator (the entity that implements the platform provided by the platform owner into the administering authority’s IT and operational systems).
Support agency (SA) (the entity that may be involved after the platform is implemented, to maintain the system and provide support/troubleshooting.)
To understand what each role should do to safeguard DPP, it is important to understand what these roles do at each phase of the implementation of DIGIT.
What is a program?
A program can be a delivery of any government service/s which the AA is mandated to provide to citizens for which it requires a platform. Defining the scope of the program is within the power of an AA.
D.2.1.1 What happens at this stage?
A Memorandum of Understanding (MoU) is signed between the AA and the platform owners.
A State Program Head/Nodal Officer is appointed by the AA.
Resources and funding for the program are identified.
The AA-specific procurement process is defined.
IA team onboarding is initiated.
D.2.1.2 Platform Owner at Stage 0
Share specifications for manpower and infrastructure.
Share program setup best practices.
Enable the AA on the product, configuration, and infrastructure.
D.2.1.3 To-Dos:
Operational:
Any kind of agreement to be signed between the platform owners and the AA (e.g. an MoU) must have clear terms of data management, liability, and governance.
Inform the AA of, and clearly explain, the platform’s in-house data practices and its by-design DPP policies and practices.
Create and maintain data use agreements with the AA (agreeing on the legal framework for data access, the permitted uses of the data, and the security controls to be applied).
Advisory:
Encourage and guide AA to adopt Privacy by Default
Encourage AA to include DPP-enabling tech/non-tech resources in infrastructure resource planning (e.g. budgeting and reserving resources for data privacy impact assessments on the modules to be deployed).
Encourage and guide the AA team in creating their data privacy and protection policy
Identify the risks and necessary measures with the AA to ensure that any present or future data processing conducted by the AA is safe by design of the platform (Conduct data protection impact assessment)
Guide and assist AA’s in training their employees in DPP practices
Platform team to write a pre-mortem – an imaginary article looking back from the future on the feature’s perfect launch or failure – to help the AA and IA to focus on the privacy aspects that will ensure success.
Encourage the AA to appoint privacy- and data-security-compliant implementation teams or IAs.
Technical:
Do not offer any service module unless it is compliant with relevant data privacy and protection (DPP) checks/standards and systems (see checklist in Appendix B of this memo).
D.2.2.1 What happens in Stage 1
Publishing of the program charter and implementation plan.
Master data collection kickoff in Pilot ULBs (Urban Local Bodies)
Cloud Infrastructure is procured
Program branding is done (name, logo, tagline etc.)
D.2.2.2 Platform Owner’s Role in Stage 1
Guide and review manpower, infra and program governance structure. Provide a stable platform version.
D.2.2.3 To-Dos:
Operational:
Assist and encourage the AA team in setting a standardised data collection framework (with DPP practices like minimal personal data collection, masking of data, and providing clear and simple notice to residents).
Assist the AA and IA teams in placing DPP principles and practices in the program charter
The implementation kick-off workshop must include training in DPP practices (minimum data collection, masking at field level, etc.).
Encourage field-level ULB data collection team to inform residents about what data is collected and for what purposes
Advisory:
Encourage the AA team to keep DPP as a priority in the implementation plan
Advise the AA teams to collect as little data as possible, to comply with the data minimisation principle (at the master data collection level).
Encourage AA employees to adopt privacy-enhancing technology (PET). Highlight the gains in trust and reputation from adopting PETs.
Encourage the IA to comply with the data privacy principles and practices listed here.
Technical:
Procured cloud infrastructure providers must be reviewed for DPP practices/compliance, especially around retention periods, access logs, and incident/breach management.
D.2.3.1 What happens in Stage 2
Standardized ontologies (uniform terminology for easier understanding), processes and workflows are created.
Master data collected in the desired format.
Agreement is reached on required program-specific product customisations.
A detailed program plan is made and the tracking mechanism is finalised.
D.2.3.2 Platform Owner’s Role in Stage 2
Guide and review the platform's extension/customisation, data collection/migration and accepted best practices.
D.2.3.3 To-Dos:
Operational:
While standardizing processes and workflows, encourage the AA and IA teams to conduct a Privacy Impact Assessment (PIA) to identify PII that is being collected, processed, and stored, and to assess the risks associated with these operations. This will help identify any privacy issues that need to be addressed in the workflows.
Develop privacy policies and procedures that outline the AA’s / program’s commitment to data privacy, and specify the measures that will be taken to protect personal data. These policies and procedures should be integrated into the workflows of the modules.
Train employees on data privacy principles, such as data minimization and privacy by design. This will help ensure that they are aware of the importance of protecting personal data and the measures they need to take to do so.
Assist the AA and IA in designing a clear data breach response plan (sets out procedures and clear lines of authority for responding to data breaches)
Any changes or additions requested by the AA team that are contrary to any of the data privacy and protection regulations or principles must not be agreed to.
Advisory:
Advise regular monitoring and auditing of workflows to ensure that privacy policies and procedures are being followed and that personal data is being protected. This will help identify areas where improvements can be made and ensure that data privacy is embedded in the organisation’s workflows. Create a role responsible for this monitoring.
Assist the AAs in updating policies and procedures: Regularly review and update privacy policies and procedures in response to changes in technology, and data privacy regulations. This will help ensure that the state stays up-to-date with best practices for protecting personal data.
Advise maintaining an ISO data security standard as a prerequisite (ISO 27001 is recommended)
Encourage the AA team to create strict access to workflows (login-based access, with access logs and periodic audits)
Encourage better privacy-by-design customisations (e.g. masking from the point of collection)
Technical:
Implement privacy-enhancing technologies (PeTs), such as encryption, anonymization, and access control systems, to protect personal data. These technologies should be integrated into the workflows of the module to ensure that personal data is protected throughout its lifecycle.
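One simple privacy-enhancing technique of the kind mentioned above is pseudonymisation: replacing direct identifiers with a keyed hash before records enter analytics workflows. The sketch below is an assumption-laden illustration, not DIGIT's implementation; the key, record shape, and truncation length are invented, and a production system would use vetted key management and proper encryption libraries.

```python
import hashlib
import hmac

# Hypothetical key for this sketch; in practice, keys live in a managed
# secret store, never in source code.
SECRET_KEY = b"replace-with-managed-key"


def pseudonymise(value: str) -> str:
    """Stable keyed hash: joins across datasets still work, but the
    original identity cannot be read back without the key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


record = {"name": "A. Kumar", "mobile": "9876543210", "ward": "W-12"}
safe = {**record,
        "name": pseudonymise(record["name"]),
        "mobile": pseudonymise(record["mobile"])}
# 'ward' stays usable for aggregation; name and mobile are now pseudonyms.
```

Because the hash is deterministic, the same resident maps to the same pseudonym across datasets, which preserves analytical utility while protecting identity.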
D.2.4.1 What happens in this stage?
A configured/customized product is created that is ready for UAT (User Acceptance Testing).
Monitoring reports and dashboards are ready (to understand the rollout of the modules).
Product artefacts like user guides are created.
Identification of participants for the UAT session.
D.2.4.2 Role of Platform Owner in Stage 3
Guide and review architecture, solutions and UX.
Support in the configuration/customisation processes
D.2.4.3 To-Dos
Operational:
Create user guides to maintain privacy as default (e.g. Appendix B in this memo can guide compliance with privacy principles.)
Advisory:
Advise for privacy by default features to be adopted
Guide in the protection of hardware from theft, damage, and unauthorised access
Ensure and assist the AA in making sure that all access controls have been set up
Advise the state to include data privacy and protection checklist questions in the UAT phase
Guide and assist the AA and IA in creating anonymised testing data
Guide the AA in setting up privacy-friendly UX such as creating options for citizens to provide active opt-ins, clearly separating between terms and conditions that are compulsory and those that are voluntary, or formulating and displaying meaningful privacy notices.
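Anonymised testing data, as advised above, means data that is realistic in shape but synthetic in content. A minimal sketch is shown below; the field names mirror those listed earlier in this document, while every value is fabricated, so UAT sessions need no live PII.

```python
import random

random.seed(7)  # reproducible test fixtures


def fake_record(i: int) -> dict:
    """Generate one synthetic resident record for UAT; no real person appears."""
    return {
        "name": f"Test Resident {i}",
        "mobile": "9" + "".join(random.choice("0123456789") for _ in range(9)),
        "address": f"{random.randint(1, 200)} Test Street",
        "pin_code": random.choice(["560001", "560002", "560003"]),
    }


uat_data = [fake_record(i) for i in range(50)]
# Records have the right shape for workflow testing, but contain no real PII.
```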
D.2.5.1 What happens in Stage 4?
UAT Sign-off & Go Live permitted for Pilot ULBs or other mandated govt bodies for delivery of services
Setup of review & monitoring cadence.
D.2.5.2 Platform Owner’s Role in Stage 4
Guide and review solution implementation, communication/branding, training/adoption.
D.2.5.3. To-Dos
Operational:
Assist the AA and IA in conducting training of personnel on DPP practices. Sessions can be held on use cases such as:
paper-based data management
maintaining DPP for paper-to-digital migration
identifying PII and classifying it,
creating a data usage policy,
monitoring access to sensitive data,
implementing encryption for data in transit and at rest,
regularly testing security measures,
providing general training and awareness around data protection,
using multi-factor authentication,
ensuring secure disposal of confidential information,
updating security policies on a regular basis
Check for any practices for DPP that are missed out or are not completely implemented into the designs
Set up a check for DPP compliance within the program review process
Assist the AA and IA in working on any issues or problems that arose from the UAT testing on DPP
D.2.6.1 What happens at Stage 5?
Statewide rollout in batches.
Help desk effectiveness assured.
Critical bugs fixed.
Program success metrics tracking kick-started.
D.2.6.2 What does the platform owner do at stage 5?
Guide and review in change management, incident management, adoption and awareness plans
D.2.6.3 To-Dos:
Operational:
Establish a clear procedure, based on the MoU and in compliance with existing legal requirements, to ensure that in case any data is shared with the platform owner (e.g. for support or bug-fixing purposes):
A secure channel is used for sharing this data
The data is shared by a person who has the authority to share it
The AA has given authorisation to the platform owner to receive it
No other/unauthorised person receives or accesses this data
Data is not retained after the purpose for its sharing has been completed/achieved
Assist the AA team and ULB employees in standardizing role creation, logins and access controls in change management
Advisory
Encourage AA to ensure:
Minimal, purposeful data is collected
Data collected, if sensitive, is encrypted and masked by default
‘One role, one login’, and such login must not be shared with anyone other than the authorised role
Citizens are made aware of the data policy, which is displayed on the UX
Data is retained only with a documented purpose
All DPP design features are working and used correctly (encryption, audit login records)
Assist the AA team in creating awareness on DPP practices through training workshops and guide the IA in conducting such workshops
Help the AA maintain and manage any DPP issues or data security issues (assist in incident management).
Become an advisory point of contact for the AA for any data security or privacy issue.
D.2.7.1 What happens at Stage 6?
The first batch of ULBs has gone live after the Pilot.
The platform is being adopted across the program’s jurisdiction, amongst its ULB employees and citizens.
D.2.7.2 What does the platform owner do at Stage 6?
Support and guide continued adoption.
Provide ad-hoc assistance in troubleshooting problems.
D.2.7.3 To-Dos
Operational
Assist if there is any issue or doubt that arises in data management, access controls, or security of the platform
If permitted, regularly scan for vulnerabilities and patch systems and applications to minimize the risk of a breach
Advisory
Guide the AA and IA in better data confidentiality and privacy management.
If the AA and IA request it, take part in leading training programs for employees on data best practices.
Data protection and privacy guidelines for DIGIT implementations (program owners)
DIGIT, an open-source platform, enables governments and service providers to provide interdepartmental coordination and citizen-facing service delivery systems - currently, in urban governance, sanitation, health, and public finance management.
As citizen data is collected and used for such governance services, data privacy and protection measures are required to ensure this data is managed responsibly and safely.
This document is created to be an online guide, providing guidelines for Program owners to maintain data privacy and protect individuals’ data.
Readers can use this to identify the steps they must take, in their capacity as program owners, to ensure data privacy and protection in the context of a DIGIT or DIGIT-like implementation.
It can also provide source material for privacy policies, which should be included in each portal & application.
This is not a technical reference or documentation. It serves as a policy guideline.
References made to DIGIT are also applicable to other platforms similar to DIGIT. Not all parts of the guidelines or featured content may match the reader's platform or context, hence this document is open to be referred to in parts as needed.
These guidelines are to be read through the eyes of roles that are part of the program owners' (Prog) offices in the journey of adopting a DIGIT-based system or platforms similar to DIGIT in a government entity/ies.
If a government authority adopts DIGIT as a citizen service platform, then these guidelines are apt. Some points in the guidelines may not be relevant to platforms other than DIGIT in the governance ecosystem. Hence these guidelines have to be read as advisory.
The previous document in this series covered the guidelines for platform owners (PO), implementing agencies (IA) and administering authorities (AA).
These guidelines share great similarities with the ones created for the AA.
To understand what each program owner should do to safeguard data privacy and protection (DPP), it is important to understand what a Prog does at each phase of the implementation of DIGIT.
Guidelines to be read with the Digital Personal Data Protection Act, 2023
As the Prog adopts DIGIT, it gets access to digital personal data and therefore comes into the ambit of the Digital Personal Data Protection Act, 2023 (DPDP Act). Under the DPDP Act, a Prog can play the role of a data fiduciary and/or of a data processor: if the Prog has control over deciding the purpose and means of data processing, it is a data fiduciary; if it has no such control, it is a data processor. Therefore, the obligations of both roles have to be considered for a Prog to remain compliant with the DPDP Act, 2023.
For these guidelines, we assume that the Prog processes digital personal data to provide certain benefits, services, certificates, licenses or permits (these cover most of the functions that DIGIT provides and are mandates of Urban Local Bodies).
What is a program?
A program can be a delivery of any government service/s which the AA is mandated to provide to citizens for which it requires a platform. Defining the scope of the program is within the power of an AA.
A Memorandum of Understanding is signed between the AA and the platform owners. A Prog can also be a party to the MoU, or may be an entity of equal standing with, or subordinate to, the AA (which signs the MoU).
The Prog appoints a State Program Head/Nodal Officer.
Resources and funding for the program are identified.
The Prog-specific procurement process is defined.
IA team onboarding is initiated.
Ensure onboarding of manpower and infrastructure as specified.
Lead the setup of the program and related governance structure
Appoint the program steering committee and nodal officers.
Initiate the onboarding of the implementation partner.
Initiate the onboarding of cloud infrastructure providers.
Guide, support or enable the identification of module deployment priorities
Must-haves:
Must fold in clauses and language in the MoU or data access/sharing agreement around:
Data confidentiality and privacy breach provisions with consequences (as prescribed under Sec 72 of the Information Technology Act, 2000)
For strict access controls, damage accountability, and consequences for any data privacy and security breach (as prescribed under Sec 43A IT Act, 2000)
Preferable practices:
Actions which the Prog should ensure are required of the IA (i.e. included as responsibilities of the IA in the contract) are:
Maintain transparency in data practices and mandate regular reporting
Create safeguards against non-authorized third-party access to data
Implement appropriate security controls like encryption at source, masking of data, RBAC logins, and conducting regular security audits and checks.
Conduct periodic audits of access
Report any data security breach to the Prog
Regularly educate the employees of the Prog on data privacy, data ethics and data security.
Conduct a risk assessment of the platform technology along with regular data protection impact assessment (DPIA). It is important that the platform owner is involved in this assessment, as they are probably the best placed to evaluate the technologies that might be used and that might involve a high risk for the rights and freedoms of people, making the DPIA necessary.
The cloud infrastructure provider should be selected on the grounds that it:
Maintains and implements data protection policies
Encrypts PII with private keys
Conducts regular security checks and audits on data security and privacy
Safeguards data from being shared with unauthorized users
Maintains transparent data storage and governance models
Publishing of the program charter and implementation plan.
Master data collection kickoff in Pilot ULBs (Urban Local Body).
Cloud Infrastructure is procured.
Program branding is done (name, logo, tagline etc.).
Appoint the data collection team and initiate data collection from the Pilot ULBs/ bodies in the required format.
Be a part of the implementation kick-off workshop or help organize it to include relevant stakeholders
Assist in creating the program charter and implementation plan
Must-haves:
Proof of consented collection of personal data (past and future)
Proof of source - presently held personal data and any past personal data which is to be processed is sourced from a database, register or book that is maintained by the administering authority
The DPDP Act permits the Prog to process any digital personal data of citizens for providing a benefit, subsidy, service, certificate, license or permit (in this case, any urban local body function such as a birth or death certificate, property license, building plan permit etc.) subject to the conditions below:
(i) she has previously consented to the processing of her personal data by the Administering authority or any of its instrumentalities for any subsidy, benefit, service, certificate, licence or permit; or (ii) such personal data is available in digital form in, or in non-digital form and digitised subsequently from any database, register, book or other document which is maintained by the administering authority or any of its instrumentalities and is notified by the Central Government. All of the above must follow standards that the Central Government may set as policies to follow for processing.
Such previously consented evidence for personal data collection must be maintained for compliance.
Include in the program charter:
That the data provider, i.e. the resident, is the owner of the data.
Include the duty of maintaining data privacy and confidentiality of data collected in the program charter to avoid any illegal breach
Include in the implementation plan
Access controls and data collection practices to avoid breach of privacy or confidentiality of data
Consequences for third party unauthorized access to data
Safeguard measures to avoid any breach of law
Training of data collection teams in topics of safe data access, collection and storage
Safe and audited channels for data sharing and transfer
Cloud infrastructure to have sufficient data safety and security features. It must have privacy by design inbuilt into its infrastructural design (encrypted storage, tight access controls, strict data security).
Preferable/Good practices:
At this stage, a best practice model of master data collection (steps listed below) can be designed (Here, master data is the primary data needed for module functionality).
The master data collection model design includes:
Collecting data only if it is needed for a specific legitimate reason and defined purpose (data minimisation).
Informing residents about the legal basis and reason/purpose for their data being collected (when collected directly from the resident).
Data encryption and masking when data is being migrated from paper to digital or old or new digital systems.
Strategies for safe storage of data (on paper or digitally).
Destroying paper-based data after a defined migration period (AA or Prog to define a data deletion period post-migration).
Maintaining dashboards that display the nature of data to be collected and their corresponding purposes and uses (for transparency and awareness of citizens).
Embedding DPP practices in the implementation plan. For example, in the processes of data migration and data processing, the system does not permit sensitive data to be visible to unauthorized roles, strict logins are maintained, and IA employees are trained in safe data handling.
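The requirement above that sensitive data not be visible to unauthorized roles can be sketched as a simple role-to-field allowlist. The roles and field names here are hypothetical, standing in for whatever role model the program defines:

```python
# Minimal role-based access control sketch (hypothetical roles and fields).
ROLE_FIELDS = {
    "clerk":   {"application_id", "status"},
    "officer": {"application_id", "status", "name", "address"},
    "admin":   {"application_id", "status", "name", "address", "phone"},
}

def visible_fields(record: dict, role: str) -> dict:
    """Return only the fields the given role is authorized to see."""
    allowed = ROLE_FIELDS.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

rec = {"application_id": "BP-101", "status": "pending",
       "name": "R. Kumar", "address": "Ward 4", "phone": "9876543210"}
```

The design choice worth noting is that the allowlist defaults to deny: a role not in the table sees no fields at all, rather than everything.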
Draft/adopt a data privacy policy.
Ensure, through scope setting and reviews, that the IA is onboarding a team with appropriate data privacy and protection safeguarding skill sets.
The implementation kickoff workshops include training on purposeful master data collection (for the next stage) in an informed and transparent manner (letting the residents know why they are collecting the data).
Standardized ontologies (uniform terminology for easier understanding), processes and workflows are created.
Master data collected in the desired format.
Agreement on program-specific product customisations is required.
A detailed program plan is made and the tracking mechanism is finalised.
Approve standardized ontologies, processes and workflows.
Implement the policies required for work to begin on implementation.
Enable and support the IA in solution design, impact analysis, and integration analysis.
Sign off on design and requirements.
Must-haves:
Check for factors like:
Data that includes personally identifying information (PII) is kept encrypted/masked throughout the workflows.
Strict data access requirements are in place (audit logs, restricted access points)
A data policy is created to ensure compliance with data protection law and guard against breaches of confidentiality and privacy.
Avoid customisation, workflows, processing etc. that will cause unauthorized access to PII.
As per the DPDP Act, citizens would have a right to obtain, on request -
(a) a summary of personal data that is being processed by the AA or the Prog and the processing activities undertaken by that Prog with respect to such personal data; (b) the identities of all other Data Fiduciaries and Data Processors with whom the personal data has been shared by the Prog, along with a description of the personal data so shared; and (c) any other information related to the personal data of such Data Principal and its processing, as may be prescribed.
As per this right, the AA must maintain a record of the personal datasets it has captured.
The above Act also empowers citizens with a right to correction, completion, updating and erasure of data. The AA must, on receiving the request of the citizen, correct the inaccurate or misleading data, complete the incomplete data, update the personal data, and also erase the data unless a specific purpose or legal compliance requires such data to be retained.
As per the above, the AA has to create processes to undertake such correction, completion, erasure or updation of data - for which keeping an audit log of what data is collected, why, for how long and where is crucial.
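The audit log of what data is collected, why, for how long and where can be kept as a structured register, from which the citizen-facing summary required by the DPDP Act is derived. This is a sketch; the field names and sample values are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DatasetRecord:
    """One entry in the register of personal datasets held (hypothetical fields)."""
    dataset: str
    purpose: str
    legal_basis: str
    storage_location: str
    retention_until: date

# Hypothetical register entry for a ULB function.
register = [
    DatasetRecord("birth_certificates", "issue birth certificates",
                  "DPDP Act s.7(b)", "state data centre", date(2030, 1, 1)),
]

def summary_for_citizen(register):
    """Produce the summary of processed personal data a citizen may request."""
    return [{"dataset": r.dataset, "purpose": r.purpose} for r in register]
```

Keeping purpose, legal basis, location and retention date together in one record is what makes correction, erasure and right-to-information requests answerable from a single source.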
Preferable/Good practices:
Conduct a risk assessment of the customizations, checking for risks and harms leading to a breach of confidentiality or data privacy. This assessment will take into consideration the impact that data use may have on an individual(s) and/or group(s) of individuals, whether known or unknown at the time of data use.
Push for configurations that include a feature asking citizens for feedback when the platform proceeds to UAT. Citizens are asked for feedback on how their data is being handled and whether they are aware of why their data is being used.
Ensure that service level agreements include security checks at each level of implementation of the platform for data to be kept secure and safe.
Define a data retention period, keeping in mind purpose and legal requirements.
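A retention period, once defined, can be enforced with a simple due-for-erasure check. The sketch below assumes a hypothetical 7-year window; the actual period must come from the purpose and legal requirements noted above:

```python
from datetime import date, timedelta

# Hypothetical retention rule: confirm the real window per applicable law.
RETENTION_DAYS = 365 * 7

def is_due_for_erasure(collected_on: date, today: date,
                       retention_days: int = RETENTION_DAYS) -> bool:
    """True once the retention window for a record has lapsed."""
    return today - collected_on > timedelta(days=retention_days)
```

A scheduled job applying this check to each dataset in the register is one way to make the retention policy operational rather than merely documented.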
A configured/customized product is created that is ready for UAT.
Monitoring Reports and Dashboards are ready (to understand the rollout of modules).
Product artefacts like user guides are created.
Identification of participants for the UAT session.
Supervises the master data cleaning and validation.
Enables collection of ULB-specific baseline data to measure performance and adoption.
Identifies ULB-level Nodal officers for regular support.
Must-haves:
All data-related processes at this stage are undertaken while maintaining the confidentiality of the data (masking, restricted access, role-based access control).
Testing data should not include PII.
User guides have clear steps on data access and sensitive data management.
The consequences of unauthorized access and breach of privacy and confidentiality are made clear to all team members.
Any customization or configuration that could lead to a breach of data and affect the data privacy of citizens is rejected.
Ensure that the privacy policy is visible in the UX (the privacy policy must be concise and easy to understand - a sample privacy policy can be found here).
Preferable/ Good practices:
Ensure that the nodal officers are made aware (through training and workshops) of the importance of data privacy and protection, and trained to manage data securely
Check for feedback from employees on access mechanisms and delivering services with proposed levels of data access, masking, etc (Can use this sheet as an activity to assess how they are ensuring the privacy rights of residents).
The User acceptance test is conducted and a sign-off and go-live permission is given for identified Pilot ULBs or other mandated govt bodies for the delivery of services.
Setup of review & monitoring cadence.
Enables or conducts user acceptance testing
Organizes ULB employee training workshops
Sets up help desks and support mechanism for ULB’s
Leads the UAT / pilot / go-live as required, along with training key client personnel
Must-haves:
Training of employees on data safety and privacy practices
Conduct data security checks before signing off on the UAT.
A data security checklist should include-
Personally identifying information (PII) data is encrypted/masked when shared
Data is stored in safe databases
Employees don’t openly share access logins
Limited roles have access to PII
Employees are trained in incident reporting
A data protection policy covers hardware protection and external media devices
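A checklist like the one above can be made machine-checkable before UAT sign-off. This is a minimal sketch with hypothetical item names; the real gate would be backed by audit evidence, not booleans alone:

```python
# Hypothetical pre-sign-off gate: every checklist item must be marked done.
CHECKLIST = {
    "pii_masked_when_shared": True,
    "databases_hardened": True,
    "no_shared_logins": True,
    "pii_access_restricted": True,
    "incident_reporting_training": True,
    "media_device_policy": True,
}

def ready_for_signoff(checklist: dict) -> bool:
    """UAT sign-off is allowed only when every item is satisfied."""
    return all(checklist.values())

# Items still failing, to be reported to the Prog before go-live.
failing = [item for item, done in CHECKLIST.items() if not done]
```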
The monitoring and evaluation cadence has data privacy and protection as a threshold for security checks. A report is submitted to Prog as part of the review and monitoring cadence for DPP:
The privacy policy is uploaded and displayed
The privacy policy clearly states who is responsible for the personal data and how that official can be contacted
Assessments for data breaches and security checks are planned to be regularly performed
Data processing and sharing agreements have been established with all third parties that will process personal data
The software and infrastructure regularly undergo security risk and threat analysis
The program has a privacy education/awareness training
SOP for security incidents affecting personal data is established
The amounts of personal data that can be collected have been minimized
The purpose of data collection has been defined to be as specific as possible
The data is retained only till there is a need for it
There are checks on data sharing, with verification that sharing is legally authorised and approved by the appropriate official
Preferable practices:
The help desks provide simple material to explain to citizens or employees the concepts of DPP. These help desks serve as one-stop spots for citizens and employees to understand DPP concepts like data privacy methods, masking, and limited and purposeful data sharing. Therefore the help-desk representative is trained well in DPP concepts and use cases before the platform goes live. They also become the first stop for any incident to be reported on data privacy breach.
Statewide Rollout in batches
Help desk effectiveness assured
Critical bugs fixed
Program success metrics tracking kick-started
Leads the rollout, training, and change management workshops
Monitors training activities
Must-haves:
Receive regular reports on any data breaches
Maintain a check on access controls
Regularly update and train employees on safeguards
Preferable/Good practices:
Maintain transparent practices for data governance
Work with employees to apply their DPP training in their functions.
Receive reports on Privacy Impact assessments (including gap assessments) and data security audits to check whether the program is safeguarding DPP. Refer to Appendix B in this memo to understand if the Prog has considered global practices and principles of data protection and privacy for adoption.
Maintain active feedback loops to provide solutions for any anticipated privacy or data protection issues that may arise
Manage data migration processes (while transitioning from old/existing systems to new platform-based systems) to maintain data safety and privacy best practices, i.e. data masking, encryption, data deletion, strict access controls, etc.
The first batch of ULBs have been made live after the Pilot.
The platform is adopted in the program’s jurisdictional zone and amongst its ULB employees and citizens.
Maintains the adoption tracking & review cadence.
Drives the adoption of the system.
Implements multi-channel awareness campaigns.
Must-haves:
Prog checks for all of the above data privacy and protection measures being maintained and continuously running
Prog reviews implementation of DPP practices and reviews issues in adoption by employees. Prog tries to balance service delivery and data privacy and security.
Preferable/Good practices:
Conduct awareness campaigns for residents on their right to data protection and privacy, and DPP measures being taken in the program.
Organize sessions for employees and contractors of Prog (and IA / SA if relevant) on DPP measures, principles, practices, etc.
Create awareness materials like posters, videos, brochures etc.
Draft data privacy policy for implementation agencies
This document is a sample data privacy and protection policy for the implementation agencies[1] to pick up and replicate on their user-facing web pages. Entities using this document may make modifications as relevant to the context in which they are using this document.
This document is a draft for reference and does not have any legal effect in and of itself.
eGovernments Foundation does not guarantee that this document will correctly represent all relevant laws or legal obligations, as these can vary across jurisdiction and time.
eGovernments Foundation does not guarantee that the use of this draft, with or without modifications, will cover any or all legal obligations of a specific entity in a specific jurisdiction.
Any entity or individual that uses this draft, with or without modifications, does so at their own discretion, and at their own risk.
By using the draft policy provided, you acknowledge and agree that:
The draft policy is provided for informational purposes only and does not constitute legal advice.
The draft policy is a general template and may not be suitable for your specific needs or circumstances. It is your responsibility to review and modify the draft policy to meet your requirements.
We make no representations or warranties, express or implied, regarding the accuracy, completeness, or reliability of the draft policy. We do not guarantee that the draft policy is up-to-date or compliant with current laws or regulations.
You assume all risks and liabilities associated with the use of the draft policy. We shall not be held liable for any direct, indirect, incidental, consequential, or special damages or losses arising from the use or reliance on the draft policy.
We recommend consulting with a qualified legal professional to obtain advice tailored to your specific circumstances before implementing any policy based on the draft provided.
By using the draft policy, you agree to release and discharge us from any claims, demands, liabilities, actions, or causes of action arising out of or in connection with the use of the draft policy.
It is crucial to seek legal advice to ensure that this policy meets your specific requirements and is enforceable in your jurisdiction.
<IMPLEMENTING AGENCY NAME> as an IMPLEMENTING AGENCY (IA)[2] (“we” or “us” or “our”) for <PROGRAM NAME> respects the privacy of its users (“user”, “you” or “your”). Hence, we maintain the highest standards for secure activities, user information/data privacy and security.
<PROGRAM NAME> is implemented through a collaboration between the <ADMINISTRATIVE AUTHORITY NAME>, us and the eGovernments Foundation. It is powered by DIGIT, which is an open-source software platform developed by eGovernments Foundation.
This privacy policy describes and determines how we deal with your personal and usage information in accordance with the applicable laws of India.
<PROGRAM NAME> refers to the services being provided through <PROGRAM NAME> website, mobile App, <OTHER CHANNELS AS RELEVANT>.
Through <PROGRAM NAME>, you can access and avail services offered by <STATE OR ULB NAME> Government departments, Central Government department, Local bodies & their agencies and corporate/private bodies (utility services) (Service Providers).
You can use <PROGRAM NAME> website/application/services in different ways such as service discovery, availing services, registering grievances, and so on.
The purpose of this policy is to maintain the privacy of and protect the personal information of users, employees, contractors, vendors, interns, associates, and partners of <PROGRAM NAME>. The <PROGRAM NAME> ensures compliance with laws and regulations applicable (partner to insert a list of laws they have to comply with) to <PROGRAM NAME>.
We adhere to the principles of accountability, transparency, purposeful and proportional collection, usage, storage and disclosure of personal data (“PD”)[3].
We do not collect any data/information directly from you. We only access, process and use datasets given to us by the Service providers.
We access data that is required for designing, testing, deployment and configuration. Such data may include PD such as your first name, last name, parent’s / guardian’s name, address, email address, telephone number, age, gender, and identification documents. We may access, process, or use your educational, demographic, location, device and other similar information.
The nature of the data would change as per the needs of the service providers/ULB/State or Central Government. The scope of the datasets/data points we access is thus defined by the Local Government / Program Owner / Administrative Authority who has contracted our services.
We also access, store, process, and share information pertaining to employees and contractors of our own organisation, as well as local government or other government agencies. Specific information can include name, age, gender, details of spouse or dependents, address, administrative details such as employee ID or other reference number, as well as information on bank accounts (used for processing salaries or pensions). As employers, local governments may be required to maintain certain information on their employees and pensioners for tax purposes, such as PAN numbers; this information may also be accessed and processed in case we are asked to perform or support the performance of any of those tasks and functions.
We derive, store, process, and share transactional data, which describes the progress of any ongoing task (e.g. any service requested by a resident) through the systems of the local government or other government agency. This can include information about the specific role to whom a given task is assigned (or with whom it is pending), the amount of time elapsed on processing / completing a given request (or task / sub-part thereof), and whether this task has been completed within the benchmark or designated amount of time. It can also include details about the channel through which a particular request was received, such as the ULB counter, service centre, website, helpline, mobile app, chatbot, etc.
We derive, store, process, and share aggregated data. These aggregates are derived from the transactional data. This includes data such as aggregate or cumulative revenue collection, aggregate or cumulative number of service requests, percentage of requests resolved within the benchmark time period, etc. The above data may be further analysed and presented in terms of the type of collection, request, or complaint, the channel through which it was received, etc.
We may collect, store, process, and share telemetry data, which studies how much time is spent on a given screen or field in a workflow/user interface (UI). Such data is normally processed and shared in aggregate, and will not be used to identify specific individuals. In the event that specific individuals are sought to be identified, such as for user research, their consent will be sought for the further processing or sharing of their data.
We are either handed over such data by the Service providers or are given authorized permissions to access, process or use such data. We may also collect data from Union, State, and Local governments, including their agents/employees as well as receive data that is available openly for public use.
We do not collect any PD directly from you. We are given authorized access logins by the Program Owners or the Service providers.
We stop accessing, processing, using and storing any PD from the time our role or functions cease i.e. either end of the agreed period or till the final handover.
We do not directly store any data in our systems. We are given access to the databases and systems of the program owner. Such access is required for our functional purposes. Once our purpose is served - i.e. the platform has been implemented, and/or the scope of work specified in our contract with the program owner / administrative agency has been completed or terminated, our access to such storage will be terminated as well.
With respect to data to which we have access, we maintain the following safeguards:
Maintain access audit logs of roles accessing the systems
Data from the platform implementation/program is not stored in our own systems. We have access to this data only through devices or systems approved by the program owner / administrative agency.
Data from the platform implementation/program to which we have access is not shared with any third parties, or in any channel, medium, or forum not specified within our contract with the program owner / administrative agency.
We use this data to enable the program owner with the requirements for which they have contracted our services, such as setting up hardware, customising, extending, configuring and installing the software, assisting in implementation, training etc. Specifically:
We access data (anonymised data or metadata) for:
Studying the feasibility of the requirements asked for by the Service providers
Creating an implementation plan (the plan would have no PD, but to design the implementation plan, a study of PD would be undertaken)
Measuring performance and adoption metrics
Training and awareness-building activities
We may access and process data for conducting research or analysing user preferences and demographics if asked to by the service providers (statistical data and not any individual data)
We access PD for:
Testing hardware and software to be deployed or integrated at identified urban local bodies or jurisdictions, as instructed by the service providers.
Master data cleaning and validation before deployment and integration
Resolving any disputes, troubleshooting any problems, and fixing critical bugs that may arise with respect to the use of the platform.
Sharing data in order to comply with the law or any legal process, including when required in judicial, arbitral, or administrative proceedings.
We will not process, disclose, or share your data except as described in this policy, or as otherwise authorized by the program owner / administrative agency.
By default, we do not display, share or store any PD. Only persons with the appropriate authorisation can access PD. We log each such request, thus creating a non-repudiable and auditable trail of each such access to PD[4]. We do not share any PD except as specified in our contract with the program owner / administrative agency.
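A non-repudiable, auditable trail of each PD access can be approximated with a hash-chained log, in which every entry commits to the previous one so that later tampering with the trail is detectable. This is a sketch with hypothetical roles and record IDs, not the platform's actual logging mechanism:

```python
import hashlib
import json
from datetime import datetime, timezone

def append_access(log: list, user: str, record_id: str, action: str) -> None:
    """Append an access entry whose hash chains to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"user": user, "record": record_id, "action": action,
             "time": datetime.now(timezone.utc).isoformat(), "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

def verify_chain(log: list) -> bool:
    """Recompute every hash; any edit to an earlier entry breaks the chain."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev or e["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = e["hash"]
    return True
```

Chaining the hashes is what makes the trail non-repudiable in practice: deleting or altering one entry invalidates every entry after it.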
Yes, this policy is subject to change at any time and without notice, provided that such changes are consistent with our contract with the program owner / administrative agency. This page will be updated with any such modified policy, and such changes will not be deemed to take effect until and unless they are shown on this page. You are reading the current version of the policy.
In case of any grievances, you may send your complaints for redressal over the Grievance Portal available at <LINK TO GRIEVANCE PORTAL / MECHANISM>.
An agency that deploys and configures a platform for the program owner (see below) is an implementing agency (IA). An IA may: set up the hardware necessary for the program; customise, extend, configure, and install/set up the software (platform) as per the needs of the program owner; train staff or contractors of the program owner on how to use the platform; perform other such functions to ensure program readiness as may be agreed upon between the implementing agency and the program owner and/or administrative authority responsible for such platform implementation. ↑
An IA is an agency that deploys and configures a platform for the program owner ↑
Once the PDP Bill is introduced, a formal/detailed Privacy Policy can be drafted and linked here. ↑
As above, this can be detailed in the Full Privacy Policy. ↑
Privacy Law in India: What does it mean for eGov?
The Constitution of India does not explicitly provide for the right to privacy.
Privacy has been read into the right to life and personal liberty (Art. 21) in the judgement of Puttaswamy v. Union of India. Following the judgement, the right to privacy is an inalienable and inherent right under the Constitution, though still an implied one (not explicitly mentioned as such). The judgement created a 4-fold test on the basis of which privacy practices can be created.
The Supreme Court has identified a four-fold test that can measure what potentially affects privacy. eGov should aim at satisfying these tests for privacy compliance.
To satisfy the tests of -
Legality (sanctioned by law)
Every dataset collected, stored, transmitted, analyzed or shared must be done on the basis of legal authority i.e. the entity that is collecting / storing / processing / sharing the data must be able to point to a legal instrument that gives it the power to do those things. For Urban Local Bodies (ULBs), the main source of authority would be Art. 12 of the Constitution, read with Part IXA (Art. 243P-243ZG) and the 12th Schedule.
Legitimate aim
For each item of data collected, stored, processed, or shared, there should be a clear purpose identified; this purpose must flow from a legitimate task that the entity collecting it (i.e. a ULB) is mandated and authorized to perform (hence, legitimate), and this purpose must be communicated to the citizen. This is closely related to the principles of purpose limitation and data minimisation.
Proportionality
Any form of data handling must be tested from a risk-benefit lens. Based on this assessment, we should ask the question: “Is there a less intrusive or lower-risk way to do this?” If yes, we should adopt that method.
Appropriate safeguards
The processes and assessments involved in all of these decisions must be documented. In addition, we can look at multiple layers of safeguards:
Role-based access controls
Indelible logs and auditability
Incident/breach management systems, including notice to legal / investigating authorities and to citizens
Consent frameworks
Security audits (software and process)
The IT Act creates a class of entities known as intermediaries, and places obligations upon them with respect to the receipt, storage, transmission, and processing of data.
Intermediary is defined as -
Sec 2 (w) ―intermediary, with respect to any particular electronic record, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes.
An intermediary is thus defined as an entity which, on behalf of another person,
Receives an electronic record
Stores the electronic record
Transmits the electronic record
‘provides any service with respect to electronic records’.
NOTE: The exact interpretation of ‘service with respect to electronic records’ has not been established yet. To the extent that eGov can demonstrate that it does not interact with any citizen’s data, this provision may not apply to eGov – i.e. eGov is not an intermediary.
The IT Act places certain obligations and penalties on any/all persons, irrespective of whether they are intermediaries.
Sec 43 makes any person liable to penalties and compensation for “damage, unauthorised access, illegal downloads, disruption, denial of access, the introduction of the virus among others to a computer, computer system, etc.”.
Sec 72 makes disclosure of electronic records, information, etc. without consent from the relevant person or authority punishable with imprisonment up to 2 years &/or a fine up to Rs. 1 lakh.
Sec 72A makes disclosure of information without consent and in breach of a lawful contract, with an intent to cause wrongful loss or gain, punishable with imprisonment up to 3 years &/or a fine up to Rs. 5 lakh.
NOTE: This section is relevant to all employees and contractors at eGov, in their personal capacity, as well as eGov as an organization. If such a breach occurs due to the actions of an eGov employee/contractor, eGov may be liable to fines.
Tighten access controls
Tighten security
Build in consent-taking mechanisms to avoid liability of wrongful disclosure
Maintain clear contractual liabilities and strictly abide by the contract.
If eGov does interact with citizen data (provides any service with respect to electronic records) then a few obligations as an intermediary are to be complied with -:
Sec 67C - Preservation and retention of information by intermediaries: (1) An intermediary shall preserve and retain information ‘as prescribed by the rules’. Intentional contravention of this section can lead to imprisonment of up to 3 years and a fine.
NOTE: The Rules relevant to this Section of the IT Act have not been prescribed yet; in any event, the section will not apply if eGov is not considered an intermediary.
Sec 79 of the Act allows intermediaries to be exempt from liability for third-party information – where such information is found to be illegal, criminal, harmful etc. – that they stored, transmitted etc., under certain conditions (sometimes known as ‘safe harbour’). The key to safe harbour is that the intermediary was not aware of the information, did not in any way modify or edit it, and did not make decisions about its transmission.
NOTE: To the extent that eGov would be looked at as an intermediary for processing data, it would fall outside the protection of this provision; however, the question of whether ‘services’ extend to automated processing is still in debate. In any event, the section will not apply if eGov is not considered an intermediary.
Sec 43A read with the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 mandates bodies corporate to provide a privacy policy, to collect, transfer, and disclose information in a mandated manner, and to maintain reasonable security practices and procedures as provided in the Rules.
NOTE: eGov is not a ‘body corporate’ within the meaning of the IT Act/Rules. Nonetheless, given that our software is intended to be used by governments, and will be used to collect/store/process large volumes of citizens’ personal data (including sensitive personal data), eGov should abide by the guidelines as a matter of responsibility & good practice.
The new changes in the IT law have now removed the 2011 rules and replaced them with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules obligate intermediaries to -:
Publish - in English or any language specified in the Eighth Schedule to the Constitution - their privacy policy, user agreement for access and use of our products and services, as well as the rules and regulations that apply to anyone using our products and services.
The rules eGov sets for the usage of its products and services must be designed on the following principles:
No one is allowed to host, display, upload, modify, publish, transmit, store, update, or share any information that -
belongs to another person and over which the user has no rights (third-party information);
is defamatory, obscene, pornographic, paedophilic, invasive of another‘s privacy (including bodily privacy), insulting or harassing on the basis of gender, libellous, racially or ethnically objectionable, relating to or encouraging money laundering or gambling, or otherwise inconsistent with or contrary to the laws in force;
is harmful to any child;
infringes any patent, trademark, copyright or other proprietary rights;
violates any law for the time being in force;
deceives or misleads the addressee about the origin of the message, or knowingly and intentionally communicates any information which is patently false or misleading but may reasonably be perceived as a fact;
impersonates another person;
threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign States, or public order, or causes incitement to the commission of any cognisable offence, or prevents investigation of any offence, or is insulting to another nation;
contains a software virus or any other computer code, file or program designed to interrupt, destroy or limit the functionality of any computer resource;
is patently false and untrue, and is written or published in any form with the intent to mislead or harass a person, entity or agency for financial gain or to cause injury to any person;
Inform users that non-compliance with any rules/privacy policy or user agreement would lead to immediate termination of access/usage of eGov products/services and deletion of such content.
The ‘Act’ is now in force as stand-alone legislation protecting digital personal data, i.e. all personal data available in digital form.
Definitions:
Personal data: means any data about an individual who is identifiable by or in relation to such data.
The Act also defines bodies such as the data fiduciary, the data processor, and the significant data fiduciary.
Processing: means a wholly or partly automated operation or set of operations performed on digital personal data and includes operations such as collection, recording, organisation, structuring, storage, adaptation, retrieval, use, alignment or combination, indexing, sharing, disclosure by transmission, dissemination or otherwise making available, restriction, erasure or destruction.
As we wait for further clarification and rules to be passed to better understand the law, we explore the parts of the law that are relevant to eGov, when it plays different roles in its functions.
As a platform owner, eGov would not deal with data at all. It would simply create the code base and hand over the platform to the administering authority or implementation agency for integration.
There would be no interaction with data - no personal data would be touched - and therefore the Act would not apply to eGov in this role.
eGov may decide to take up the role of a Supporting agency or Implementation agency.
In either of these two roles, eGov would be a data processor, i.e. it would process data on behalf of, or on the instructions of, the administering authority or program owner.
As an Implementation agency (IA): Where eGov is contracted to deploy and configure its platform into the administrative authority or program owner's systems, functioning as an implementing agency, eGov would be involved in setting up the hardware necessary for the program; OR customise, extend, configure, and install/set up the software (platform) as per the needs of the program owner; OR train staff or contractors of the program owner on how to use the platform; OR perform other such functions to ensure program readiness as may be agreed upon between the implementing agency and the program owner and/or administrative authority responsible for such platform implementation. As an IA, eGov would have access to data (the extent of such access may be defined in the agreement between the administering authority and eGov as an IA). So long as eGov does not decide the purposes and means of processing, it remains a processor and does not become a data fiduciary.
As a Supporting agency (SA): As an SA, eGov would provide support in any functional aspect required by the program owner with respect to that platform implementation (e.g. assistance in the maintenance of the platform, technical or operational problem-solving, bug/error resolution). SA will have access to such data as is necessary to perform their functions, and this shall normally be specified in the agreement/contract between the supporting agency and the program owner / administrative authority.
Acting as an IA or an SA makes eGov a data processor under the Act (refer to the definition of a data processor above).
C.1.3.1 Indirect obligations:
The Act holds the data fiduciary responsible for the actions and functions of the data processor. The fiduciary would hire the processor to conduct the relevant processing.
Section 8(1) applies to the data fiduciary, but the fiduciary discharges many of its obligations through the processor it engages. The obligations of data fiduciaries should therefore be read as extending, indirectly, to data processors as well.
Where an obligation applies to the data fiduciary but involves a processing function performed by the processor, it becomes an indirect obligation on the data processor too.
eGov may be instructed or mandated by the data fiduciary to:
Maintain the completeness, accuracy, and consistency of personal data [Sec 8(3)]
Implement appropriate technical and organisational measures to implement the Act [Sec 8(4)]
Intimate the data fiduciary of any personal data breach [so that the data fiduciary can inform the Board and the data principal about such a breach - Sec 8(6)]
C.1.3.2 Direct obligations/Must do’s for eGov as a data processor
Below are specific obligations the law places directly on data processors (and therefore on eGov as a processor):
Process any data only if there is a valid contract between the administering authority and eGov (data fiduciary and processor) [Sec 8(2)]
Maintain security safeguards to prevent personal data breaches [Sec 8(5)]
Follow the instructions given by the data fiduciary on data deletion
Follow processing standards issued through Central government policies (as issued under Sec 7(b)(ii) - yet to be issued)
Maintain a record of data processed (to assist the data fiduciary, i.e. the relevant administering authority, with its obligation under Sec 11 of the Act)
Draft data privacy policy for administrative authority and/or program owner
This document is a sample data privacy and protection policy for the administrative authority and/or the program owner to pick up and replicate on their user-facing web pages. Entities using this document may make modifications as relevant to the context in which they are using this document.
This document is a draft for reference and does not have any legal effect in and of itself.
eGovernments Foundation does not guarantee that this document will correctly represent all relevant laws or legal obligations, as these can vary across jurisdiction and time.
eGovernments Foundation does not guarantee that the use of this draft, with or without modifications, will cover any or all legal obligations of a specific entity in a specific jurisdiction.
Any entity or individual that uses this draft, with or without modifications, does so at their own discretion, and at their own risk.
By using the draft policy provided, you acknowledge and agree that:
The draft policy is provided for informational purposes only and does not constitute legal advice.
The draft policy is a general template and may not be suitable for your specific needs or circumstances. It is your responsibility to review and modify the draft policy to meet your requirements.
We make no representations or warranties, express or implied, regarding the accuracy, completeness, or reliability of the draft policy. We do not guarantee that the draft policy is up-to-date or compliant with current laws or regulations.
You assume all risks and liabilities associated with the use of the draft policy. We shall not be held liable for any direct, indirect, incidental, consequential, or special damages or losses arising from the use or reliance on the draft policy.
We recommend consulting with a qualified legal professional to obtain advice tailored to your specific circumstances before implementing any policy based on the draft provided.
By using the draft policy, you agree to release and discharge us from any claims, demands, liabilities, actions, or causes of action arising out of or in connection with the use of the draft policy.
It is crucial to seek legal advice to ensure that this policy meets your specific requirements and is enforceable in your jurisdiction.
<PROGRAM NAME> (“we”, “us”, or “our”) respects the privacy of our users (“user”, “you”, or “your”). Hence, we maintain the highest standards for secure activities, user information/data privacy, and security.
<PROGRAM NAME> is a collaboration between <PARTNER(S) NAME(S)> and eGovernments Foundation. It is powered by DIGIT, which is an open-source software platform developed by eGovernments Foundation.
This privacy policy describes and determines how we deal with your personal and usage information in accordance with the applicable laws of India.
<PROGRAM NAME> refers to the services being provided through <PROGRAM NAME> website, mobile App, <OTHER CHANNELS AS RELEVANT>.
Through <PROGRAM NAME>, you can access and avail of services offered by <STATE OR ULB NAME> Government departments, Central Government department, Local bodies & their agencies and corporate/private bodies (utility services).
You can use <PROGRAM NAME> website/application/services in different ways such as service discovery, availing services, registering grievances, and so on.
The purpose of this policy is to maintain the privacy of and protect the personal information of users, employees, contractors, vendors, interns, associates, and partners of <PROGRAM NAME>. The <PROGRAM NAME> ensures compliance with the laws and regulations applicable to <PROGRAM NAME> (<PARTNER TO INSERT LIST OF APPLICABLE LAWS>).
We adhere to the principles of accountability, transparency, purposeful and proportional collection, usage, storage and disclosure of personal data (“PD”)[1].
We collect and/or access, store, process, use, and share information/data (“data”) to improve and provide better services to you. We collect and process PII such as your first name, last name, parent’s / guardian’s name, address, email address, telephone number, age, gender, and identification documents. We may collect your educational, demographic, location, device and other similar information.
We collect information such as Internet Protocol (IP) addresses, domain name, browser type, operating system, date and time of the visit, pages visited, IMEI/IMSI number, device ID, location information, language settings, handset make & model, etc. However, no attempt is made to link these with the true identity of individuals visiting the <PROGRAM NAME> app, website, WhatsApp chatbot, etc.
The information collected by us shall depend on the services being used by you and may vary from time to time, which will be informed through changes in this policy.
We also collect, store, process, and share information pertaining to employees of local government or other government agencies. Specific information can include name, age, gender, details of spouse or dependents, address, administrative details such as employee ID or other reference number, as well as information on bank accounts (used for processing salaries or pensions). As employers, local governments may be required to maintain certain information on their employees and pensioners for tax purposes, such as PAN numbers; this information may also be recorded and shared.
We derive, store, process, and share transactional data, which describes the progress of any ongoing task (e.g. any service requested by a resident) through the systems of the local government or other government agency. This can include information about the specific role to whom a given task is assigned (or with whom it is pending), the amount of time elapsed on processing / completing a given request (or task / sub-part thereof), and whether this task has been completed within the benchmark or designated amount of time. It can also include details about the channel through which a particular request was received, such as the ULB counter, service centre, website, helpline, mobile app, chatbot, etc.
We derive, store, process, and share aggregated data. These aggregates are derived from the transactional data. This includes data such as aggregate or cumulative revenue collection, aggregate or cumulative number of service requests, percentage of requests resolved within the benchmark time period, etc. The above data may be further analysed and presented in terms of the type of collection, request, or complaint, the channel through which it was received, etc.
We may collect, store, process, and share telemetry data, which studies how much time is spent on a given screen or field in a workflow/user interface (UI). Such data is normally processed and shared in aggregate, and will not be used to identify specific individuals. In the event that specific individuals are sought to be identified, such as for user research, their consent will be sought for the further processing or sharing of their data.
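For implementers, the aggregation described above can be sketched as follows. This is an illustrative example only, not part of the policy text; the event fields (`user_id`, `screen`, `seconds`) are hypothetical names, not a DIGIT schema.

```python
from collections import defaultdict

def aggregate_screen_time(events):
    """Aggregate per-screen telemetry while discarding user identifiers.

    `events` is a list of dicts with hypothetical keys 'user_id',
    'screen', and 'seconds'. Only per-screen totals and counts survive
    aggregation; no user identifier is retained in the output.
    """
    totals = defaultdict(lambda: {"total_seconds": 0, "events": 0})
    for e in events:
        bucket = totals[e["screen"]]
        bucket["total_seconds"] += e["seconds"]
        bucket["events"] += 1
    # Derive average time per screen; the result contains no PII.
    return {
        screen: {
            "avg_seconds": round(v["total_seconds"] / v["events"], 2),
            "events": v["events"],
        }
        for screen, v in totals.items()
    }

sample = [
    {"user_id": "u1", "screen": "complaint_form", "seconds": 40},
    {"user_id": "u2", "screen": "complaint_form", "seconds": 20},
    {"user_id": "u1", "screen": "payment", "seconds": 90},
]
print(aggregate_screen_time(sample))
```

Because only the aggregate leaves the function, downstream sharing of this data cannot identify specific individuals, consistent with the commitment above.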
We collect data directly from you (when you use our services) as and when you register and login into the app/website. We may also collect data from Union, State, and Local governments, including their agents/employees as well as receive data that is available openly for public use.
Your data is stored in a secure manner. <PROGRAM NAME> is powered by the DIGIT platform, which has embedded privacy settings (such as encryption), which do not allow your data to be visible to anyone, except persons who are authorised to do so by virtue of their official role. Unless indicated otherwise, this data will be retained for a minimum period of <AS PER UNION / STATE / ULB LAWS> and a maximum period of <AS PER UNION / STATE / ULB LAWS>. You can review and edit your data, as well as delete your data from the app/website by following the procedures <AS PER UNION / STATE / ULB LAWS>.
You may delete your account at any time you wish. In case of deletion, we will remove all your PD from the system, so that it is not visible and/or accessible from any regular operation. However, the PD may be retained in an encrypted manner for the purpose of legal requirements/compliances <AS PER UNION / STATE / ULB LAWS> from the date of deletion/termination.
After deletion, should you wish to recreate your profile, you may do so; none of the previously captured information will be populated automatically, and you will need to register as a fresh user.
If you simply delete/remove the application from your mobile device but do not delete your profile or unregister yourself from the app/website, you shall continue to be a registered user of the app and we shall continue to send you all communications that you have opted for unless and until you opt-out of such communications, or <AS PER UNION / STATE / ULB LAW>.
In case you surrender/disconnect your <PROGRAM NAME> registered mobile number it is recommended to delete your profile or unregister yourself from the application also.
With respect to such data, we ensure the following safeguards:
Define roles for our employees, and grant access only to such data as they require to perform these roles (role-based access controls)
Maintain access audit logs of entities accessing data in our systems
Maintain safe retention/storage of datasets in authorised cloud systems
Conduct vetting of contractual agreements with third-party vendors/implementing agencies for data access and processing functions
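The first two safeguards above (role-based access controls and access audit logs) can be sketched as follows. This is a minimal illustration for implementers, not part of the policy text; the role names and permission strings are hypothetical, and a real deployment would load them from the platform's access-control configuration.

```python
import datetime

# Hypothetical role-to-permission mapping (assumption, not DIGIT config).
ROLE_PERMISSIONS = {
    "GRIEVANCE_OFFICER": {"complaints:read", "complaints:update"},
    "ACCOUNTS_CLERK": {"payments:read"},
}

AUDIT_LOG = []  # append-only record of every access decision

def access(employee_id, role, permission):
    """Grant access only if the role carries the permission,
    and record every attempt (allowed or denied) for later audit."""
    allowed = permission in ROLE_PERMISSIONS.get(role, set())
    AUDIT_LOG.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "employee": employee_id,
        "role": role,
        "permission": permission,
        "allowed": allowed,
    })
    return allowed

assert access("emp-101", "GRIEVANCE_OFFICER", "complaints:read") is True
assert access("emp-102", "ACCOUNTS_CLERK", "complaints:update") is False
assert len(AUDIT_LOG) == 2  # denied attempts are logged too
```

Logging denials as well as grants is the design choice that makes the audit trail useful: reviewers can spot attempted over-reach, not just successful access.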
We use this data to serve you with the best civic experience, such as providing grievance redressal mechanisms, efficient service delivery, creating dashboards of ULB activities, etc. Specifically:
We process this data as necessary to provide you with the services you are requesting.
We may process, disclose, or share certain metadata, as well as aggregated and anonymised data, in order to assess and improve the status of such service delivery over time.
We may disclose or share this data to/with employees and/or contractors of the urban local body, state government, or other government agencies, or service providers, whose role requires them to view or use this information in order to perform their official duties, including providing you the service(s) you are requesting.
We may use this data to resolve any disputes that may arise with respect to the transactions/deals you conduct using the app/website.
We may monitor user activity and preferences, as evidenced by your activity on the app, to provide a better experience in future.
We may detect, investigate, and prevent activities that violate our policies or that may be illegal or unlawful.
We may conduct research or analysis of user preferences and demographics as statistical data, not as individual data.
We may disclose or share this data in order to comply with the law or any legal process, including when required in judicial, arbitral, or administrative proceedings.
Payments made through the <PROGRAM NAME> App/website are processed via secure payment gateways.
We will not process, disclose, or share your data except as described in this policy or as otherwise authorized by you.
By default, we display PII in a masked format. Persons with the appropriate authorisation can request for this data to be unmasked. We log each such request, thus creating a non-repudiable and auditable trail of each such access to PD[2].
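The mask-by-default and logged-unmask behaviour described above can be sketched as follows. This is an illustrative example for implementers, not part of the policy text; the function names and the `ADMIN` role are hypothetical, not DIGIT APIs.

```python
import datetime

UNMASK_LOG = []  # non-repudiable trail of every unmask request

def mask_phone(number):
    """Display all but the last four digits as 'X' by default."""
    return "X" * (len(number) - 4) + number[-4:]

def request_unmask(requester, role, number,
                   authorised_roles=frozenset({"ADMIN"})):
    """Return the full value only for authorised roles,
    and log every request - granted or not - for audit."""
    granted = role in authorised_roles
    UNMASK_LOG.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "requester": requester,
        "role": role,
        "granted": granted,
    })
    return number if granted else mask_phone(number)

print(mask_phone("9876543210"))                        # XXXXXX3210
print(request_unmask("emp-7", "CLERK", "9876543210"))  # stays masked
print(request_unmask("emp-1", "ADMIN", "9876543210"))  # full value, logged
```

Because every unmask request is appended to the log regardless of outcome, each access to PD leaves the auditable trail the policy promises.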
Yes, this policy is subject to change at any time and without notice. This page will be updated with any such modified policy, and such changes will not be deemed to take effect until and unless they are shown on this page. You are reading the current version of the policy.
By using this service, you confirm that you have read, understood, and accepted this Policy, and that we may collect, process, disclose, and/or share your data as described in this Policy.
In case of any grievances, you may send your complaints for redressal over the Grievance Portal available at <LINK TO GRIEVANCE PORTAL / MECHANISM>.
Once any data-related legislation is introduced, a formal/detailed Privacy Policy can be drafted and linked here. ↑
As above, this can be detailed in the Full Privacy Policy. ↑
Draft data privacy policy for supporting agencies
This document is a sample data privacy and protection policy for supporting agencies[1] to pick up and replicate on their user-facing web pages. Entities using this document may make modifications as relevant to the context in which they are using this document.
This document is a draft for reference and does not have any legal effect in and of itself.
eGovernments Foundation does not guarantee that this document will correctly represent all relevant laws or legal obligations, as these can vary across jurisdiction and time.
eGovernments Foundation does not guarantee that the use of this draft, with or without modifications, will cover any or all legal obligations of a specific entity in a specific jurisdiction.
Any entity or individual that uses this draft, with or without modifications, does so at their own discretion, and at their own risk.
By using the draft policy provided, you acknowledge and agree that:
The draft policy is provided for informational purposes only and does not constitute legal advice.
The draft policy is a general template and may not be suitable for your specific needs or circumstances. It is your responsibility to review and modify the draft policy to meet your requirements.
We make no representations or warranties, express or implied, regarding the accuracy, completeness, or reliability of the draft policy. We do not guarantee that the draft policy is up-to-date or compliant with current laws or regulations.
You assume all risks and liabilities associated with the use of the draft policy. We shall not be held liable for any direct, indirect, incidental, consequential, or special damages or losses arising from the use or reliance on the draft policy.
We recommend consulting with a qualified legal professional to obtain advice tailored to your specific circumstances before implementing any policy based on the draft provided.
By using the draft policy, you agree to release and discharge us from any claims, demands, liabilities, actions, or causes of action arising out of or in connection with the use of the draft policy.
It is crucial to seek legal advice to ensure that this policy meets your specific requirements and is enforceable in your jurisdiction.
<SUPPORT AGENCY NAME>, as a SUPPORTING AGENCY (SA)[2] (“we”, “us”, or “our”) for <PROGRAM NAME>, respects the privacy of the users (“user”, “you”, or “your”). Hence, we maintain the highest standards for secure activities, user information/data privacy, and security.
<PROGRAM NAME> is implemented through a collaboration between the <ADMINISTERING AUTHORITY NAME> and/or <IMPLEMENTATION AGENCY NAME> and eGovernments Foundation. It is powered by DIGIT, which is an open-source software platform developed by eGovernments Foundation.
This privacy policy describes and determines how we deal with your personal and usage information in accordance with the applicable laws of India.
<PROGRAM NAME> refers to the services being provided through <PROGRAM NAME> website, mobile App, <OTHER CHANNELS AS RELEVANT>.
Through <PROGRAM NAME>, you can access and avail services offered by <STATE OR ULB NAME> Government departments, Central Government department, Local bodies & their agencies and corporate/private bodies (utility services) (Service Providers).
You can use <PROGRAM NAME> website/application/services in different ways such as service discovery, availing services, registering grievances, and so on.
The purpose of this policy is to maintain the privacy of and protect the personal information of users, employees, contractors, vendors, interns, associates, and partners of <PROGRAM NAME>. The <PROGRAM NAME> ensures compliance with the laws and regulations applicable to <PROGRAM NAME> (<PARTNER TO INSERT LIST OF APPLICABLE LAWS>).
We adhere to the principles of accountability, transparency, purposeful and proportional collection, usage, storage and disclosure of personal data (“PD”)[3].
We only access, store, process, use, or share any information/data (“data”) to assist the program owner/implementation agency. None of the above takes place without a written authorization agreement with the program owner/administering authority.
Data includes PII such as your first name, last name, parent’s / guardian’s name, address, email address, telephone number, age, gender, and identification documents. We may collect your educational, demographic, location, device and other similar information. The nature of the data would change as per the needs of the service providers/ULB/State or Central Government. We do not define the scope of the datasets; the Service Providers do.
We do not collect any data directly from you. We only access, process, use, store, and share datasets given to us by the Administering authority/Program owner/Service providers/Implementing Agencies.
We access data that is required for the functional working, upkeep, troubleshooting, and/or any functional support required by the program owner or administering authority.
We process data as a byproduct of the work we undertake for the program owner or administering authority.
We may store and share data that is required to execute the responsibilities given to us. Such storage is temporary and lasts only until our work ends. Sharing is restricted to internal teams; any third-party sharing of data requires authorisation from the administering authority or the program owner.
We also access, store, process, and share information pertaining to our own employees as well as employees of local governments or other government agencies. Specific information can include name, age, gender, details of spouse or dependents, address, administrative details such as employee ID or other reference numbers, as well as information on bank accounts (used for processing salaries or pensions).
We also access, store, process, and share transactional data, which describes the progress of any ongoing task (e.g. any service requested by a resident) through the systems of the local government or other government agency. This can include information about the specific role to whom a given task is assigned (or with whom it is pending), the amount of time elapsed on processing / completing a given request (or task / sub-part thereof), and whether this task has been completed within the benchmark or designated amount of time. It can also include details about the channel through which a particular request was received, such as the ULB counter, service centre, website, helpline, mobile app, chatbot, etc.
We derive, store, process, and share aggregated data. These aggregates are derived from the transactional data. This includes data such as aggregate or cumulative revenue collection, aggregate or cumulative number of service requests, percentage of requests resolved within the benchmark time period, etc. The above data may be further analysed and presented in terms of the type of collection, request, or complaint, the channel through which it was received, etc.
We may collect, store, process, and share telemetry data, which studies how much time is spent on a given screen or field in a workflow/user interface (UI). Such data is normally processed and shared in aggregate, and will not be used to identify specific individuals. In the event that specific individuals are sought to be identified, such as for user research, their consent will be sought for the further processing or sharing of their data.
We are either handed over such data by the Service providers or are given authorized permissions to access, process or use such data. We may also collect data from Union, State, and Local governments, including their agents/employees as well as receive data that is available openly for public use.
We do not collect any PD directly from you. We are given authorized access logins by the Program Owners or the Service providers.
We stop accessing, processing, using, and storing any PD once our role or functions cease, i.e. at the end of the agreed period or upon final handover.
We do not directly store any data in our systems. We are given limited-time access to the service providers' storage, as required for our functional purposes. Once our purpose is served, our access to such storage ceases.
With respect to the data that we access, we adopt the following safeguards:
Define roles for our employees, and only grant access to such data as is needed for them to perform their roles
Maintain logs of access to the program’s systems by these roles, and enable the program owner / administrative agency or other authorised agencies to audit these access logs periodically
We do not store data from the platform implementation/program in our own systems; the employees who have access to this data work on the devices and/or systems of the program owner
We do not share data from the platform implementation/program with any third party, except as specified in our contract with the program owner / administrative authority.
We use this data to enable the service providers with the requirements they appoint us for, such as setting up hardware, customising, extending, configuring and installing the software, assisting in implementation, training etc. Specifically:
We access data (anonymised data or metadata) for:
Studying the feasibility of the requirements set out by the Service providers
Creating an implementation plan for the administering authority (the plan itself contains no PD, but designing it involves a study of PD)
Measuring performance and adoption metrics
Training and awareness-building activities
We may also access and process data to conduct research or analyse user preferences and demographics, if asked to by the service providers (as statistical data, not individual data)
We access PII for:
Testing hardware and software to be deployed or integrated at identified urban local bodies or jurisdictions, as instructed by the service providers
Master data cleaning and validation before deployment and integration
Resolving any disputes, troubleshooting any problems, and fixing critical bugs that may arise with respect to the use of the platform
Sharing data in order to comply with the law or any legal process, including when required in judicial, arbitral, or administrative proceedings.
We will not process, disclose, or share your data except as described in this policy or as otherwise authorized by the Service Providers.
By default, we do not display or store any PD. Only persons with the appropriate authorisation can access PII. We log each such request, thus creating a non-repudiable and auditable trail of each such access to PD[4]. We do not share any PD unless asked to do so under a contractual understanding with the Service provider.
Yes, this policy is subject to change at any time and without notice. This page will be updated with any such modified policy, and such changes will not be deemed to take effect until and unless they are shown on this page. You are reading the current version of the policy.
By using this service, you confirm that you have read, understood, and accepted this Policy, and that we may access, process, disclose, and/or share your data as described in this Policy.
In case of any grievances, you may send your complaints for redressal over the Grievance Portal available at <LINK TO GRIEVANCE PORTAL / MECHANISM>.
A ‘Supporting agency’ is one that provides support in any functional aspect required by the program owner with respect to that platform implementation (e.g. assistance in maintenance of the platform, technical or operational problem-solving, bug/error resolution). ↑
An IA is an agency that deploys and configures a platform for the program owner ↑
Once any data-related legislation is introduced, a formal/detailed Privacy Policy can be drafted and linked here. ↑
As above, this can be detailed in the Full Privacy Policy. ↑
For all the roles defined here, draft templates of privacy policies are listed below for future use. These templates are for reference purposes only. Please seek legal advice before adopting them.
Global standards vary with the nature, functions, and deliverables of each stakeholder. The core agenda of this exercise was to identify a few globally certified standards in the DPP space that fit each of these roles:
Platform owner - a platform owner is an entity that owns, governs, or controls the platform's codebase. They are responsible for its architecture design, roadmap, and versions.
Implementing agencies - An agency that deploys and configures a platform for the program owner is an implementing agency (IA)
Program owners - a ‘program owner’ is the entity responsible for the delivery of specific public goods, services, or social welfare. A Program owner is usually a government entity.
Program owners per se require no certified standard to follow; compliance with the law of the land is sufficient for them to showcase their proactive steps on DPP.
Depending on the nature of the work the platform owner undertakes, the NIST Privacy Framework can be looked at as a direction for standardisation.
What is it?
The privacy framework is composed of three parts: Core, Profiles, and Implementation Tiers.
Each component reinforces privacy risk management through the connection between business and mission drivers, organizational roles and responsibilities, and privacy protection activities.
The Core enables a dialogue—from the executive level to the implementation/operations level—about important privacy protection activities and desired outcomes.
Profiles enable the prioritization of the outcomes and activities that best meet organizational privacy values, mission or business needs, and risks.
Implementation Tiers support decision-making and communication about the sufficiency of organizational processes and resources to manage privacy risk.
The advantages of NIST are:
It pushes for Privacy engineering functions to be embedded in the design of the software
It promotes transparency as the guidelines are clearly communicated to IA and POs
It enhances trust as it encourages proactive privacy measures to be taken from the design stage itself
It streamlines operations by embedding privacy into the functional and design practices, avoiding costly retroactive changes
An IA’s core responsibility is to deploy the platform. Its work involves hands-on customisation, configuration, training, and support, and the IA typically has complete access to citizens' data.
For an IA, certification under ISO 27701 is recommended. This requires certification to ISO 27001 as a first step.
Why ISO 27701?
The Digital Personal Data Protection Act would require companies that are eligible to be an IA to undergo steps similar to those in ISO 27701.
The steps/key components of ISO 27701's Privacy Information Management System (PIMS) are:
Privacy risk management: ISO 27701 would require an IA to identify and assess privacy risks associated with the processing of Personally Identifiable Information (PII) and implement appropriate controls to mitigate these risks.
Privacy policy and procedures: ISO 27701 requires an IA to develop and implement privacy policies and procedures that are aligned with the administering authority’s overall information security policies and procedures.
Data subject rights: ISO 27701 requires the IA to establish procedures for handling data subject requests, such as access, rectification, and erasure of personal data. With such a feature embedded, the citizens would be given a chance to exercise their right to privacy.
Privacy training and awareness: ISO 27701 requires an IA to provide privacy training and awareness programs to employees and other stakeholders to ensure that they understand their roles and responsibilities in protecting PII.
Incident management: ISO 27701 requires an IA to establish procedures for managing privacy incidents, including breach notification, investigation, and remediation.
Third-party management: ISO 27701 requires an IA to establish procedures for managing third-party relationships that involve the processing of PII, including due diligence, contract management, and monitoring.
Assurance: ISO 27701 provides assurance to senior members of administrative authorities, and other stakeholders, such as citizens and partners that the organization is committed to protecting Personally Identifiable Information (PII) and has implemented international best practices for privacy management.
Trust: ISO 27701 can help organizations build trust with stakeholders by providing tangible evidence of their commitment to protecting PII.
Compliance: ISO 27701 supports compliance with globally recognised data protection and privacy regulations such as GDPR, CCPA, and others.
Risk management: ISO 27701 helps the IA identify and mitigate privacy risks, reducing the likelihood of data breaches, reputational damage, and financial losses.
Global standard: ISO 27701 is a respected global standard for privacy information management and can be used by agencies of all sizes and from all sectors.
Integration: ISO 27701 is an extension to ISO 27001, meaning it can be integrated with an existing Information Security Management System (ISMS) to enhance privacy management and compliance efforts.
By getting certified under ISO 27701, implementing agencies can demonstrate their commitment to protecting PII, build trust with stakeholders, comply with data protection and privacy regulations, and improve their privacy risk management efforts.
Data protection and privacy guidelines for DIGIT implementations for implementing agencies
DIGIT, an open-source platform, enables governments and service providers to provide interdepartmental coordination and citizen-facing service delivery systems - currently, in urban governance, sanitation, health, and public finance management.
As citizen data is collected and used for such governance services, data privacy and protection measures are required to ensure this data is managed responsibly and safely.
This document is created to be an online guide, providing guidelines for Implementing agencies to maintain data privacy and protect individuals’ data.
Readers can use this to identify the steps they must take, in their capacity as implementing agencies, to ensure data privacy and protection in the context of a DIGIT or DIGIT-like implementation.
It can also provide source material for privacy policies, which should be included in each portal & application.
This is not a technical reference or documentation. It serves as a policy guideline.
References made to DIGIT are also applicable to other platforms similar to DIGIT. Not all parts of the guidelines or featured content may match the reader's platform or context, hence this document is open to be referred to in parts as needed.
These guidelines are to be read through the eyes of roles that are part of the Implementation Agencies (IA) offices in the journey of adopting a DIGIT-based system or platforms similar to DIGIT in a government entity/ies.
As per the Digital Personal Data Protection Act, 2023 (DPDP Act), an IA would be a data processor. If the IA gets involved in deciding the purpose and means of the data processing, it would become a data fiduciary. The guidelines below cover measures for compliance with the DPDP Act.
If a government authority adopts DIGIT as a citizen service platform, then these guidelines are apt. Some points in the guidelines may not be relevant to platforms other than DIGIT in the governance ecosystem. Hence these guidelines have to be read as advisory.
The previous document in this series covered the guidelines for platform owners (PO), and administering authorities (AA).
To understand what an IA should do to safeguard data privacy and protection (DPP), it is important to understand what the IA does at each phase of the implementation of DIGIT.
What is a program?
A program can be a delivery of any government service/s which the AA is mandated to provide to citizens for which it requires a platform. Defining the scope of the program is within the power of an AA.
A Memorandum of Understanding (MoU) is signed between the AA and the platform owners. A Prog can also be a party to the MoU, or may be an equal or subordinate entity of the AA (which signs the MoU).
The AA appoints a State Program Head/Nodal Officer
Resources and funding for the program are identified.
The program-specific procurement process is defined.
IA team onboarding is initiated.
At this stage, the IA becomes a part of the program.
An official MoU or contract is entered into detailing the terms and conditions between the IA and the AA or Prog
IA begins to understand the needs of the program
IA begins making an implementation plan, that shall be published in the next stage
Must-haves:
IA must ensure there is an authorization document/proof/contract (MoU) validating and authorizing the IA’s access to future data and its related compliances (in compliance with Sec 8 of the Digital Personal Data Protection Act, 2023)
IA presents its own data management and privacy policy to the AA or Prog. This would make the IA’s stand on DPP very clear and easier for the AA or Prog to design a data sharing/access agreement with the IA
The clauses and language in the MoU/ data access/sharing agreement with the AA or Prog must include:
Data will always be controlled by the AA or the Prog, and IA will never have data-controlling power (IA must not decide the purpose and means of the processing of the data)
IA will be restricted from third-party data sharing without authorization from the AA or Prog
IA will not collect personal identifying information (PII) from citizens directly or indirectly without written permission by the AA or Prog
Access to PII by the IA team should be role-based, through strict logins audited and reported to the AA or Prog
IA will access PII only for purposes specified and authorized by the AA or Prog
IA will not keep any PII backup or secondary copy of such data
Data breach consequences - who holds accountability for data breaches
In the implementation plan, the IA must push for maintaining the data safely and securely from the beginning of the program life cycle to avoid any data or confidential breach. For example - the IA can detail a data-sharing mechanism that masks direct PII from being visible to IA representatives
The IA should make clear the access, processing and sharing of data in the implementation plan to avoid future confusion on data accountability
At every step of the implementation plan, the IA must reduce or eliminate its access to PII
Privacy enhancing features like encryption, privacy by default steps including purposeful processing of data, data deletion post use and strictly restricted access to PII must find a big space in the implementation plan
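One way an implementation plan can make masking of direct PII concrete is to replace identifier fields with one-way hashes before records are shared with IA representatives. The sketch below is a hypothetical illustration only: the field names, salt, and record shape are assumptions, not part of any DIGIT API.

```python
import hashlib

# Hypothetical illustration: masking direct identifiers before a record is
# shared with an implementing agency. Field names and salt are assumptions.
PII_FIELDS = {"name", "phone", "aadhaar"}

def mask_value(value: str) -> str:
    """Replace a PII value with a salted one-way hash so records stay
    linkable (e.g. for deduplication) without exposing the raw identifier."""
    digest = hashlib.sha256(("demo-salt:" + value).encode()).hexdigest()
    return digest[:12]  # truncated hash; not reversible to the original

def mask_record(record: dict) -> dict:
    """Return a copy of the record with direct PII masked; other
    (non-identifying) operational fields pass through unchanged."""
    return {
        key: mask_value(val) if key in PII_FIELDS else val
        for key, val in record.items()
    }

citizen = {"name": "A. Resident", "ward": "W-12", "phone": "9800000000"}
shared = mask_record(citizen)
assert shared["ward"] == "W-12"           # operational data retained
assert shared["name"] != citizen["name"]  # direct PII never shared raw
```

In practice the masking/encryption scheme, salt management, and field list would be agreed in the data-sharing agreement between the IA and the AA/Prog.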
Preferable practices:
Assist/advise the AA/Prog in mapping out resources and funding needs for maintaining safe data protection and security structures ( hardware and software)
Embedding DPP practices in the implementation plan. For example, in the processes of data migration and data processing, the system does not permit sensitive data to be visible to unauthorized roles, strict logins are maintained, and IA employees are trained in safe data handling.
Help the AA or Prog make a program-specific data privacy policy (if they don’t have one made already for the specific program).
Publishing of the program charter and implementation plan.
Master data collection begins in Pilot (selected) ULBs (Urban Local Body)
Cloud Infrastructure is procured
Program branding is done (name, logo, tagline etc.)
Here data starts to be shared with the IA for the deployment of the modules
The IA and the AA/Prog publish the implementation plan
IA team begins looking for resources for the deployment of the modules
Must-haves
The IA restricts or disallows any direct PII from being sent to it. The IA intimates the AA/Prog representatives to mask or encrypt the data in the agreed manner.
IA trains the AA’s and its own employees in data best practices, such as purpose-based data access, strict password controls and data-sharing hygiene, and makes all aware of the legal consequences of a data breach.
To follow the DPDP Act:
the IA maintains an audit log of the data (to provide a summary of personal data processed to the data fiduciary)
Maintain the completeness, accuracy, and consistency of personal data [ Section 8(3)]
Implement appropriate technical and organizational measures to implement the Act [Sec 8(4)]
Intimate the data fiduciary on any personal data breach [so that the data fiduciary can inform the Board and data principal about such a breach - Sec 8(6)]
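The audit-log duty above can be sketched as an append-only log of personal-data access, which is then aggregated into the processing summary shared with the data fiduciary. This is a minimal illustration under assumed names (roles, record IDs, purposes are invented), not a prescribed DIGIT mechanism.

```python
import datetime

# Hypothetical sketch of an append-only audit log for personal-data access,
# supporting the DPDP Act duty to summarise processing for the data fiduciary.
AUDIT_LOG: list[dict] = []

def log_access(actor_role: str, record_id: str, purpose: str) -> dict:
    """Record which role touched which record, for what purpose, and when."""
    entry = {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "role": actor_role,   # a role, never an individual's PII
        "record": record_id,
        "purpose": purpose,
    }
    AUDIT_LOG.append(entry)
    return entry

def summarise_for_fiduciary() -> dict:
    """Aggregate access counts by purpose - the summary shared with AA/Prog."""
    summary: dict[str, int] = {}
    for entry in AUDIT_LOG:
        summary[entry["purpose"]] = summary.get(entry["purpose"], 0) + 1
    return summary

log_access("billing-clerk", "PT-1042", "property-tax-demand")
log_access("billing-clerk", "PT-1043", "property-tax-demand")
assert summarise_for_fiduciary() == {"property-tax-demand": 2}
```

A production log would additionally be tamper-evident and stored outside the reach of the roles it audits.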
Preferable/Good practices
IA encourages AA or Prog to:
Collect data only if it is needed for a specific legitimate reason and defined purpose.
Proactively inform the citizens about the legal basis and reason/purpose for their data being collected (when collected directly from the resident)
Data is encrypted or masked when data is being migrated from paper to digital or old or new digital systems
Strategies for safe storage of data (on paper or digitally) are set.
Paper-based data is destroyed after a defined migration period (AA or Prog to define a data deletion period post-migration).
Create a data dashboard to show the nature of data collected and their corresponding purposes and uses (for transparency and awareness of citizens).
IA onboards a team with appropriate Data privacy and protection safeguarding skill sets
The implementation kickoff workshops include training on purposeful master data collection (for the next stage) in an informed and transparent manner (letting the resident know why they are collecting the data).
Standardized ontologies (uniform terminology for easier understanding), processes and workflows are created.
Master data collected in the desired format.
Agreement on program-specific product customisations is required.
A detailed program plan is made and the tracking mechanism is finalized.
Product specifications with AA are finalized
IA begins the process of adopting the ontologies, designing/re-designing modules and workflow creations as per the needs of AA or Program.
Must-haves
In workflows and processes-
PII is kept in an encrypted/ masked manner through the workflows.
Strict data access requirements are in place (audit logs, restricted access points)
Data is maintained in secure storage
Data sharing is restricted through permitted devices, channels and to selected roles
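The access requirements above amount to deny-by-default, role-based access: a role sees a PII field in clear text only if an explicit grant exists. The sketch below illustrates this under assumed role and field names; it is not the DIGIT access-control implementation.

```python
# Hypothetical sketch of role-based access to PII: a role may read a PII
# field only if an explicit grant exists; everything else is denied.
PII_FIELDS = {"name", "phone"}

ACCESS_MATRIX = {
    # role             -> PII fields that role may see in clear text
    "grievance-officer": {"name", "phone"},
    "dashboard-viewer":  set(),  # aggregate views only, no direct PII
}

def can_view(role: str, field: str) -> bool:
    """Deny-by-default check: unknown roles and ungranted fields fail."""
    return field in ACCESS_MATRIX.get(role, set())

def redact(record: dict, role: str) -> dict:
    """Return the record with every PII field the role may not see redacted;
    non-PII operational fields pass through unchanged."""
    return {
        k: (v if k not in PII_FIELDS or can_view(role, k) else "***")
        for k, v in record.items()
    }

record = {"name": "A. Resident", "phone": "9800000000", "ward": "W-12"}
assert redact(record, "dashboard-viewer")["name"] == "***"
assert redact(record, "grievance-officer")["name"] == "A. Resident"
```

Pairing such checks with the audit log keeps every clear-text PII read both authorised and reportable.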
Preferable/Good practices
IA conducts a risk assessment of the customizations asked for by the AA or Prog, checking for risks and harms that may cause a breach of data privacy and confidentiality. The assessment will take into consideration the impact that data use may have on an individual(s) and/or group(s) of individuals, whether known or unknown at the time of data use.[8]
Include security checks at each level of implementation of the platform for data to be kept secure and safe.
A configured/customized product is created that is ready for UAT.
Monitoring Reports and Dashboards are ready (to understand the rollout of modules).
Product artefacts like user guides are created.
Identification of participants for the UAT session.
Delivers the product to the relevant team of the Pilot for User acceptance testing (UAT)
Helps the AA/Prog team deploy the product module/s in the ULB for testing
Assists in creating user guides for the Prog team to implement the product
Must-haves
Make the privacy policy visible on the product webpage
Ensure the above data safety and privacy enabling measures are incorporated in the implementation of the product
If the AA instructs, be ready to delete data that no longer serves any purpose [as per Section 8(7)]
Preferable/Good practices
Guides the nodal officers in data privacy and protection practices. Makes them aware of the importance of data privacy and protection and the legal consequences of breach.
Check for feedback from employees on access mechanisms and delivering services with proposed levels of data access, masking, etc (Use this sheet as an activity to assess how they are ensuring the privacy rights of residents).
The user acceptance test is conducted, a sign-off and go-live permission is given for identified Pilot ULBs.
Setup of review & monitoring cadence.
Helps the prog in organizing employee training workshops
Implements review and monitoring processes
Must-haves
Conduct data breach and security checks before the AA/Prog signs off on the UAT.
A data security checklist should include-
Personally identifying information (PII) data is encrypted/masked when shared
Data is stored in safe databases
Employees don’t openly share access logins
Limited, documented roles have access to PII
Employees are trained in incident reporting
Data protection policy for hardware protection, external media devices
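The checklist above lends itself to a machine-checkable pre-sign-off report that can accompany the UAT review. The item names below paraphrase the list and are assumptions for illustration; any missing item counts as a failure.

```python
# Hypothetical sketch: the pre-sign-off data-security checklist expressed
# as a machine-checkable readiness report. Item names are paraphrased.
CHECKLIST = [
    "pii_encrypted_or_masked_when_shared",
    "data_stored_in_secure_databases",
    "no_shared_login_credentials",
    "pii_access_limited_to_documented_roles",
    "incident_reporting_training_done",
    "hardware_and_media_protection_policy",
]

def readiness_report(results: dict) -> dict:
    """Summarise pass/fail per item; a missing item counts as a failure,
    so sign-off cannot proceed on an incomplete assessment."""
    failures = [item for item in CHECKLIST if not results.get(item, False)]
    return {"ready_for_signoff": not failures, "failing_items": failures}

results = {item: True for item in CHECKLIST}
assert readiness_report(results)["ready_for_signoff"]

results["no_shared_login_credentials"] = False
report = readiness_report(results)
assert report["failing_items"] == ["no_shared_login_credentials"]
```

Running such a report at each review cadence gives the Prog a consistent DPP artefact to archive alongside the sign-off.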
The monitoring and evaluation cadence has data privacy and protection as a threshold for security checks. A report is submitted to Prog as part of the review and monitoring cadence for DPP.
The privacy policy is uploaded and displayed.
The privacy policy clearly states who is responsible for the personal data and how that official can be contacted.
Assessments for data breaches and security checks are planned to be regularly performed.
Data processing and sharing agreements have been established with all third parties that will process personal data.
The software and infrastructure regularly undergo security risk and threat analysis.
The program has privacy education/awareness training.
SOP for security incidents affecting personal data is established.
The amount of personal data that can be collected has been minimized.
The purpose of data collection has been defined to be as specific as possible.
The data is retained only till there is a need for it.
There are checks on data sharing, with verification that sharing is legally authorised and approved by the appropriate official.
Preferable practices
IA continues to check for any issues in the data governance of the modules.
Statewide Rollout in batches
Help desk effectiveness assured
Critical bugs fixed
Program success metrics tracking kick-started
The IA finishes their implementation function and starts transitioning out of the program
Begins handovers and closing gaps if any
Must-haves
Hand over all data they hold, without making a second copy
Provides the Prog with an authorized letter confirming such handover, for credibility
Employees of IA begin surrendering logins and role controls
IA leaves no endpoint access for itself (unless permitted by the AA or Prog)
Preferable/Good practices
Avoids allowing its employees to see PII even while helping AA/Prog employees.
The first batch of ULBs have been made live after the Pilot.
The platform is adopted across the program’s jurisdictional zone, amongst its ULB employees and citizens.
IA implements and leaves the program
Must-haves
IA completely detaches itself from the program system (no backdoor entry/logins, no roles accessing PII, no backup of data).
Preferable/Good practices
IA documents how it enables privacy-preserving implementation modules and makes them available for other players in the implementation ecosystem to pick from.
Data privacy policy for eGov
eGov Foundation is a philanthropic mission that cares deeply about making the lives of ordinary citizens better, including health, sanitation, public finance management systems, and access and delivery of services from the government, by creating open digital infrastructure and ecosystems.
Digital Infrastructure for Governance, Impact & Transformation (DIGIT) is an open-source platform created by eGov. It helps governments collect revenues, deliver services, and manage operations and workflows in an efficient, reliable, and transparent manner.
The DIGIT platform is built and designed to:
Improve the delivery of services for citizens, making such services more accessible, inclusive, and easy to use
Simplify the work of frontline workers, through streamlined workflows and easy-to-use interfaces
Enable government employees to transition daily routine and repeatable work items from manual to digital systems, driving efficiency in public service delivery
Provide tools to improve local government revenue mobilisation, through dashboards, systemized demand generation, and collection of taxes, user charges, fees, licences etc.
Empower government administrators to access relevant data for timely decision-making and action.
For a full list of products/services available on DIGIT, see here.
For a description of data security measures built into DIGIT, see here.
As an open-source platform, DIGIT in & of itself does not collect, store, use, process, or share data.
DIGIT is deployed by national, state, or local government bodies. Such deployment may be done by in-house teams of the government, third-party service providers, or – in a small number of cases – by eGov.
Where eGov is participating in a deployment, whether as an implementing agency or supporting agency, this is on the basis of a Memorandum of Understanding (MoU) or agreement with the relevant government entity.
eGov is compliant with the relevant laws and regulations of India.[4]
When the DIGIT platform is implemented in a particular place, that particular implementation will collect, store, use, process, and share data relevant to the specific purposes for which it is being used (i.e. the services it is enabling the local government or other service providers to provide). The party that is given the authority to implement DIGIT at a government entity level is called the implementing agency. Each implementation is carried out at a program level by program owners. The administering authority defines what the scope of a program can be[5].
All data collected, stored, used, processed, or shared by a DIGIT implementation is the property of the individuals to whom it pertains. The administering authority that is responsible for the implementation is also responsible for the management of this data.
This data is not, under any circumstances, the property of the implementing agency or the supporting agency.
A DIGIT implementation/ program collects and processes personal data[6] such as the first name, last name, parent’s / guardian’s name, address, email address, telephone number, age, gender, and identification documents. It may collect your educational, demographic, location, device and other similar information.
The DIGIT implementation may also collect information such as Internet Protocol (IP) addresses, domain name, browser type, Operating System, Date and Time of the visit, pages visited, IMEI/IMSI number, device ID, location information, language settings, handset make & model etc.
A DIGIT implementation may derive, store, process, and share transactional data, which describes the progress of any ongoing task (e.g. any service requested by a resident) through the systems of the local government or other government agency. This can include information about the specific role to whom a given task is assigned (or with whom it is pending), the amount of time elapsed on processing / completing a given request (or task / sub-part thereof), and whether this task has been completed within the benchmark or designated amount of time. It can also include details about the channel through which a particular request was received, such as ULB counter, service centre, website, helpline, mobile app, chatbot, etc.
A DIGIT implementation can derive, store, process, and share aggregated data. These aggregates are derived from the transactional data. This includes data such as aggregate or cumulative revenue collection, aggregate or cumulative number of service requests, percentage of requests resolved within the benchmark time period, etc. The above data may be further analysed and presented in terms of the type of collection, request, or complaint, the channel through which it was received, etc.
A DIGIT implementation can also collect, store, process, and share telemetry data, which studies how much time is spent on a given screen or field in a workflow/user interface (UI). Such data is normally processed and shared in aggregate, and will not be used to identify specific individuals. In the event that specific individuals are sought to be identified, such as for user research, their consent will be sought for the further processing or sharing of their data.
Administering authorities and Program Owners - State or Central government, government department employees, and third-party contractors or agents (hired by governments): interact with the citizens, provide services, collect payments and feedback, and view real-time or compiled/aggregated data on city operations, which can inform performance management and policy-making.
Implementing agencies and Supporting agencies: implement, customise, and maintain and support a DIGIT implementation.
eGov employees, contractors, and consultants - when eGov is appointed as an implementing agency for implementing DIGIT in a few States, otherwise for support and maintenance.
eGov may access, use, and share any data made public by government entity/ies, including in combination with aggregate and/or derived data as described below.
eGov does not collect, use or share any personal data (other than that of its own employees).
eGov may access information about usage of the DIGIT platform, including any specific product or feature, for the purposes of understanding usage of the platform/product/feature, and for making or testing improvements to these.
Such information may include metadata; aggregate statistics on usage, revenue, and service requests; aggregate statistics on the status of a particular service/request/complaint type; and aggregate or anonymised information about user behaviour in navigating workflows or screens (‘telemetry data’).
Such information may also include derived data, e.g. averages or trends calculated from the aggregate statistics described above. This data may be derived by eGov, by the government entity, or by any authorised third party.
eGov may have additional access to data under three circumstances:
eGov may request access to contact information of specific individuals/users, in order to contact them as participants in research conducted or commissioned by eGov or eGov’s collaborators.
Upon receiving permission from the relevant government entity to access and use this information, eGov may contact these individuals, and seek their consent for participation in the research in question.
eGov may share this information with research partners, upon the signing of suitable data sharing/data use agreements, which impose upon such partner obligations and conditions for data access/use/sharing that are at least no less stringent than those applicable to eGov.
eGov may serve as an implementing partner for part or the entirety of a DIGIT implementation. For such areas where and for such duration when eGov is an implementing partner, as part of the implementation process, eGov collects, stores, processes, uses, and shares all data relevant to the implementation, including personal data.
All terms and conditions related to such data access shall be included in a formal agreement between eGov and the relevant government entity responsible for that implementation.
eGov shall not collect, store, process, use, or share such data for any purpose other than that specific implementation, and shall not share such data with any third party unless specifically authorised to do so in writing by the relevant government entity, or unless required to do so by law.
In all cases where eGov has access to personal data, eGov shall ensure that:
Such access is authorised on the basis of an agreement with the relevant government entity, or otherwise authorised in writing by the relevant government entity.
The purpose of such access and the nature of data that may be accessed are specified in the document that authorises such access.
The channel through which such data may be accessed/shared is specified in the document that authorises such access, and such channel is generally accepted as a secure channel for data sharing.
eGov may collect, store, process, use, access or share data from a DIGIT implementation to support said implementation.
All terms and conditions related to such data access shall be included in a formal agreement between eGov and the relevant government entity responsible for that implementation.
In addition to the data described above, eGov has access to two broad categories of data.
Employee data: eGov may collect, store, process, use, and share data pertaining to employees, contractors, and consultants of eGov. This includes:
Name, address, phone number, contact information, date of birth
Login IDs and passwords for work-related software
Information pertaining to academic and/or professional history
Information necessary for processing payments and taxes, including bank account, PAN, GST, Aadhaar, and Provident Fund information
Information necessary for providing insurance, including some health-related information of the employee; name, age, date of birth, and health-related information of the employee’s dependents or nominees.
Partner or contact data: eGov may collect, store, process, use, and share data of organisations or individuals with whom eGov has contact in the ordinary course of its business.
This includes name, organisation, designation, contact details, etc.
eGov shall use this information for normal business purposes, such as sharing business communications (emails, newsletters, etc.)
This policy is subject to change.
Please refer to this webpage for the latest version of the policy.
By using the products of eGov, you confirm that you have read, understood, and accepted this Policy, and that we may collect, process, disclose, and/or share your data as described in this Policy.
Name:
Address:
Phone Number:
E-mail:
Date:
This entity creates the infrastructural design of the platform (architects of the platform). They are responsible for the design of the platform.
Government bodies mandated to provide a government service, usually an agency or department of the State government; such agencies have the power to decide who has access to and control of the DIGIT systems.
For definitions of key terms – data, PII, platform, platform implementation, program, platform owner, implementing agency, support agency, program owner, administrative authority – please see GIU_DPP_DEFINITIONS_V1_04_07.
As defined in the Digital Personal Data Protection Act, 2023 - Sec 2(t): “personal data” means any data about an individual who is identifiable by or in relation to such data.
Create a reference source of globally adopted principles and practices for data privacy and protection (DPP) in e-government services;
Present recommendations for eGov and eGov partners (including state governments) to adopt in order to better align with global best practices
Audience: Internal - ExCo, especially CTO, COO, and Head of Product/External - data policy & privacy researchers, senior bureaucrats.
Create a succinct evidentiary-based research depository of globally adopted principles, requirements and practices for data privacy and protection (DPP) in e-government services
It is a base to advise state governments and local bodies to adopt principles and practices for DPP.
Requirement for DPG Certification.
Assess eGov’s readiness for adopting such principles, requirements and practices (checklist questions in Table 2 can be used as a readiness/assessment checklist as well)
Privacy, like the freedom of speech, is a fundamental human right (Gritzalis, 2004). Historically in Europe and North America, and based on the Fourth Amendment of the US Constitution, the right to privacy is seen as a defence against any “unreasonable” physical intrusion upon one’s private home, private papers, personal belongings and one’s body. Over the years, the legal and societal definition of the concept has broadened to encompass various types of information that could be available about an individual. These types of information include behavioural, financial, medical, biometric, consumer, and biographical. Additionally, privacy also covers information derived from the analysis of such data. This means that privacy interests are also linked to the gathering, control, protection, and use of information about individuals, and to the deliberate invasion of those interests.
In the absence of a single globally accepted definition of privacy, principles play an important role. A principle is a shared value upon which regulations, rules, and standards can be built for the protection of privacy. In many jurisdictions, laws have been shaped around such privacy principles.
The three pillars of India, i.e. the Legislature, the Judiciary and the Executive, have had their own journeys in discovering the role of privacy in India.
Legislature:
The law-making body of India is in the process of bringing in a data protection law. In the absence of a clear standalone law for data protection, the Information Technology Act, 2000, along with the IT Rules and Regulations, is read as the baseline legal compliance for data protection. Laws such as the Right to Information Act, 2005; the Indian Registration Act, 1908; the Aadhaar Act, 2016; and the Telecom Regulatory Authority of India Act, 1997, among others, are relied on for protecting the privacy of Indian residents/citizens.
Judiciary:
The Supreme Court, in Justice K.S. Puttaswamy v. Union of India (2017), declared the right to privacy a fundamental right of the citizens of India. This landmark judgement serves as a detailed guideline for the executive and the legislature to create a data protection regime in India on the basis of legality, legitimate state aim and proportionality.
Executive:
The Central government along with the State governments through various executive orders/circulars issued under the power given by the Constitution read with the IT Act, Aadhaar Act, TRAI Act, and Credit Information Companies Regulation Act (among others) have provided for data protection, security and the right to privacy for citizens.
E‐government is defined as the use of information technology (IT), especially the Internet, to deliver government services and information to citizens, businesses, and other stakeholders. One of the benefits of using the Internet for service delivery is the ability to easily collect, store, process, and disseminate citizens' personal information accurately and in real time. Personal information is information that makes a specific individual identifiable.
Government organizations rely on such information to increase the efficiency and effectiveness of service delivery, enhance transparency and accountability in service delivery, and empower innovation. Although government organizations gain benefits from the extended collection, storage, and dissemination of personal information, these activities often raise users' concerns about information privacy. As information technology advances, these concerns grow, as more data is collected and used for various purposes, including more in‐depth analyses intended to make services more efficient.
Research shows that ensuring the privacy of citizens' information and addressing their privacy concerns are crucial for the adoption of e‐government, as they influence users' trust and willingness to use e‐government services. To fully unleash the potential and benefits of e‐government, government organizations and third parties participating in the e-government ecosystem (such as eGov Foundation) need to adopt information privacy protection (IPP) practices to assure citizens that their personal information is protected. In developed countries, such practices are reflected in national and international regulations and are commonly used to protect users' privacy.
Information privacy principles concern conditions and/or guidelines for developing IPP practices, and have been identified as an essential baseline for assessing such practices. Numerous sets of such principles exist, with different scopes. On reviewing how IPP practices are assessed in different countries and regions, we found that the most commonly used sets of principles are:
(1) the Organization for Economic Co‐operation and Development (OECD) principles,
(2) the Fair Information Practice Principles(FIPP),
(3) the privacy principles developed by the International Organization for Standardization (ISO),
(4) the Generally Accepted Privacy Principles (GAPP) by the Canadian Institute of Chartered Accountants and the American Institute of Certified Public Accountants, and
(5) the European General Data Protection Regulation (GDPR).
These five sets of principles are widely recognized. In particular, globalization has increased the importance of shared principles and led to the GDPR, affecting most organizations in the world as it sets mandatory principles for not only the EU but for every organization doing any business with the EU. Each set is applied and considered important in a certain context or region.
Across these sets, when considering which data privacy principles may be the most important to address in any standards or procedures, the twelve that occur most frequently are: notice, use restriction, quality, retention, minimization, security, enforcement, access, consent, participation, transparency, and disclosure. Those that occur infrequently include information flow, context, identifiability, consolidation, sensitivity, confidentiality, breach, and accountability. The specific application of data privacy principles will vary with the context and goals of an individual country, jurisdiction, industry, or organization. However, by basing these applications upon an agreed set of principles, the privacy rights of the individual can be acknowledged.
When the above sets of principles are analyzed together, seven individual principles run commonly across them. These most frequently occurring, important principles are listed in the table below (Table 1).
They are:
Notice and awareness
Differentiated Access
User Control
Storage limitation
Safeguard, accuracy and security
Enforcement
Accountability
Column 2 of the table contains the broadly accepted meaning of each of the corresponding principles. Detailed definitions of all these principles are provided in Appendix A.
To note: in some cases, elements of one principle could be included in more than one category. For example, elements of openness, transparency, and notice were categorized under notice and awareness as well as accountability. The category of notice and awareness includes all elements concerning notifying users (data subjects, here citizens) about activities related to the collection, use, or extended use of their information, together with an explanation of why the information is used. Notice and awareness includes elements of the notice and collection principles in principle 2 of GAPP; purpose specification, individual participation, and openness in principles 3, 6, and 7 of the OECD set; openness, transparency, and notice in principle 7 of ISO; and notice and awareness in principle 1 of FIPP (all of these principles can be read in Appendix A).
In this way, we formulated a new set comprising the seven principles presented in Table 1, which eGov could refer to as a common principles table for DPP references.
Appendix B consists of a checklist for entities such as eGov Foundation, as well as other e-government service-providing organisations/government departments, to assess their readiness for and compliance with the common data privacy and protection principles (including the 7 defined in Table 1). The checklist consists of questions one can ask to assess the presence or absence of common practices in the data privacy and protection space.
References used in Table 1:
Entities - local governing bodies providing e-government services, third-party e-government service providers (including eGov Foundation) and implementing partners (e.g. EY and KPMG in Punjab)
Individuals - citizens, data subjects, users of e-government services (individuals or groups). The terms 'user' and 'citizen' are used interchangeably.
Abbreviations used below:
GAPP: Generally Accepted Privacy Principles;
OECD: Organization for Economic Co‐operation and Development;
ISO: International Organization for Standardization; FIPP: Fair Information Practice Principles;
GDPR: General Data Protection Regulation
Table 1- Informational Privacy Principles & Practices - Definition & Practices
DEFINITION OF PRIVACY PRINCIPLES
A1. Fair Information Practice Principles (FIPP)
In 1973, the US Department of Health, Education, and Welfare conducted a study that produced a set of privacy principles known as the FIPPs. The FIPPs later became the core of the US Privacy Act of 1974, and are currently mirrored in the laws of many US states, as well as many other nations and international organizations. The FIPPs are the most comprehensive set of privacy principles and were sufficiently influential to propagate governmental, private-sector, and self‐regulatory approaches to privacy policymaking in the United States. The FIPPs include:
1. Notice or awareness: Data collectors must alert individuals of the potential for capture, processing, and use of their information.
2. Choice or consent: The individual must be allowed options to control how their information is used. They must also be allowed to make the final decision before the collection and use of their information.
3. Access or participation: Individuals must be allowed to view the collected information and verify and contest its accuracy. This access must be inexpensive and timely in order to be useful to the consumer.
4. Integrity or security: Information collectors should ensure that the data they collect are accurate and secure.
5. Enforcement or redress: The data controller must identify enforcement measures and policies and adhere to those policies while collecting and processing users' information.
A2. Organization for Economic Co‐operation and Development
The OECD presented a set of privacy guidelines in 1980, revised in 2013. The OECD principles were the first attempt to protect information privacy at the global level. The OECD set comprises eight principles, defined as follows:
1) Principle of collection limitation: There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
2) Principle of data quality: Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete, and kept up‐to‐date.
3) Principle of purpose specification: the purposes for which personal data are collected should be specified not later than at the time of collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
4) Principle of use limitation: Personal data should not be disclosed, made available, or otherwise used for purposes other than those specified in accordance with principle 3, except: (a) with the consent of the data subject; or (b) by the authority of law.
5) Principle of security safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification, or disclosure of data.
6) Principle of openness: There should be a general policy of openness about developments, practices, and policies with respect to personal data. Means should be readily available for establishing the existence and nature of personal data and the main purposes of their use as well as the identity and usual residence of the data controller.
7) Principle of individual participation: An individual should have the right (a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him; (b) to have the data relating to him communicated to him; (c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and (d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed, or amended.
8) Principle of accountability: A data controller should be accountable for complying with measures which give effect to the principles stated above.
A3. Generally Accepted Privacy Principles (GAPP)
The American Institute of Certified Public Accountants and the Canadian Institute of Chartered Accountants introduced 10 information privacy principles, known as the Generally Accepted Privacy Principles (GAPP). GAPP was first published in 2003 and revised in 2004 and 2006.
Although GAPP was developed to address information privacy protection on a global level, it is commonly used in the United States and Canada (Dayarathna, 2013). The 10 GAPP principles are:
1) Management: The entity defines, documents, communicates, and assigns accountability for its privacy policies and procedures.
2) Notice: The entity provides notice about its privacy policies and procedures and identifies the purposes for which personal information is collected, used, retained, and disclosed.
3) Choice and consent: The entity describes the choices available to the individual and obtains implicit or explicit consent with respect to the collection, use, and disclosure of personal information.
4) Collection: The entity collects personal information only for the purposes identified in the notice.
5) Use, retention, and disposal: The entity limits the use of personal information to the purposes identified in the notice and for which the individual has provided implicit or explicit consent. The entity retains personal information for only as long as necessary to fulfil the stated purposes or as required by law or regulations and thereafter appropriately disposes of such information.
6) Access: The entity provides individuals with access to their personal information for review and update.
7) Disclosure to third parties: The entity discloses personal information to third parties only for the purposes identified in the notice and with the implicit or explicit consent of the individual.
8) Security for privacy: The entity protects personal information against unauthorized access (both physical and logical).
9) Quality: The entity maintains accurate, complete, and relevant personal information for the purposes identified in the notice.
10) Monitoring and enforcement: The entity monitors compliance with its privacy policies and procedures and has procedures to address privacy-related complaints and disputes.
A4. ISO privacy principles
In December 2011, the ISO published an international standard for privacy principles. The principles were derived from existing principles developed by various states, countries, and international organizations (ISO, 2011). Those privacy principles are:
1) Consent and choice: presenting to the personal information subject, the choice of whether or not to allow the processing of her personal information.
2) Purpose legitimacy and specification: ensuring that the purpose(s) complies with applicable law.
3) Collection limitation: limiting the collection of personal information to that which is within the bounds of applicable law and strictly necessary for the specified purpose(s).
4) Data minimization: minimizing the personal information which is processed, and the number of privacy stakeholders and people to whom personal information is disclosed or who have access to it.
5) Use, retention, and disclosure limitation: limiting the use, retention, and disclosure (including transfer) of personal information to that which is necessary in order to fulfil specific, explicit, and legitimate purposes.
6) Accuracy and quality: ensuring that the personal information processed is accurate, complete, up‐to‐date (unless there is a legitimate basis for keeping outdated data), adequate, and relevant for the purpose of use.
7) Openness, transparency, and notice: providing personal information principals with clear and easily accessible information about the personal information controller's policies, procedures, and practices with respect to the processing of personal information.
8) Individual participation and access: giving data subjects the ability to access and review their personal information, provided their identity is first authenticated.
9) Accountability: assigning to a specified individual within the organization the task of implementing the privacy‐related policies, procedures, and practices.
10) Information security: protecting personal information under an organization's control with appropriate controls at the operational, functional, and strategic level to ensure the integrity, confidentiality, and availability of the PII and to protect it against risks such as unauthorized access, destruction, use, modification, disclosure, or loss.
11) Privacy compliance: verifying and demonstrating that the processing meets data protection and privacy safeguards (legislation and/or regulation) by periodically conducting audits using internal or trusted third‐party auditors.
A5. General Data Protection Regulation (GDPR)
In 2016, the European Parliament and the European Council adopted the EU data protection framework in the form of a regulation, the GDPR. It replaced EU Directive 95/46/EC, under which all member nations were required to implement national privacy legislation; the new regulation applies in member nations without requiring national data protection regulations. The GDPR entered into force on May 24, 2016, and took effect on May 25, 2018. The framework contains principles on the protection of persons with regard to the processing of personal information and the flow of such information. These principles mainly concern:
1) Lawfulness, fairness, and transparency: Personal data shall be processed lawfully, fairly, and in a transparent manner in relation to the data subject.
2) Purpose limitation: Personal data shall be collected for specified, explicit, and legitimate purposes and not further processed in a manner that is incompatible with those purposes.
3) Data minimization: Personal data shall be adequate, relevant, and limited to what is necessary in relation to the purposes for which they are processed.
4) Accuracy: Personal data shall be accurate and, where necessary, kept up‐to‐date.
5) Storage limitation: Personal data shall be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed.
6) Integrity and confidentiality: Personal data shall be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorized or unlawful processing and against accidental loss, destruction, or damage, using appropriate technical or organizational measures.
7) Accountability: The controller shall be responsible for and be able to demonstrate compliance with the GDPR.
Table 2
Checklist for data privacy and protection assessment
Common requirements for the IPPs and questions to assess whether an organization meets these requirements are represented in Table 2 below.
Note
- The term ‘organization’ here is interchangeable with eGov Foundation (or any third-party service provider to local government bodies for e-government services) or the local government/municipal body itself.
- Grey boxes are not relevant to eGov but are relevant to local government bodies using e-government systems.
Table 2 (Requirement/ Practices for Principle implementation)
For the purposes of this document, data is any information shared by citizens or received from existing databases to enable government service delivery and government operations. It could be name, address, mobile number, age etc.
Under Indian law, the Digital Personal Data Protection Act, 2023 defines data as:
Sec 2(h) - “Data” means a representation of information, facts, concepts, opinions or instructions in a manner suitable for communication, interpretation or processing by human beings or by automated means.
This is very similar to the definition of ‘data’ in the Information Technology Act, 2000:
Data means a representation of information, knowledge, facts, concepts or instructions which are being prepared or have been prepared in a formalized manner and are intended to be processed, is being processed or has been processed in a computer system or computer network, and may be in any form (including computer printouts magnetic or optical storage media, punched cards, punched tapes) or stored internally in the memory of the computer.
For the purposes of this document, PII and personal data are to mean the same.
Personal data is defined under the Digital Personal Data Protection Act, 2023 as:
Sec 2 (t) - “Personal data” means any data about an individual who is identifiable by or in relation to such data;
In addition, a breach of personal data is defined in Section 2(u) as:
“Personal data breach” means any unauthorised processing of personal data or accidental disclosure, acquisition, sharing, use, alteration, destruction or loss of access to personal data, that compromises the confidentiality, integrity or availability of personal data;
Another definition of personal data is “any data that allows one to be recognised either directly or indirectly. It is defined as any information that relates to a natural person, which, either directly or indirectly, in combination with other information available or likely to be available with a body corporate, is capable of identifying such a person.”
Both of the above definitions of personal data are to be read together until the government settles on one.
Sensitive Personal data of a person means “...such personal information which consists of information relating to:
password;
financial information such as Bank account, credit card, debit card or any other payment instrument details;
physical, physiological and mental health conditions;
sexual orientation;
medical records and history;
biometric information;
any detail relating to the above clauses as provided to the body corporate for providing service;
and any of the information received under the above clauses by the body corporate for processing, stored or processed under lawful contract or otherwise
Provided that, any information that is freely available or accessible in the public domain or furnished under the Right to Information Act, 2005 or any other law for the time being in force shall not be regarded as sensitive personal data or information[...].”
For the purposes of this document, a ‘platform’ is a software system that can be used by a government entity, or by a contractor performing any tasks on behalf of that government entity. The term ‘platform’ refers to the software (code) itself, and NOT to any implementation of that code. Therefore, a platform does not collect, store, process, use, or share data.
For example, DIGIT is a platform.
For the purposes of this document, a ‘Platform Implementation’ refers to each instance of a platform/software system that has been implemented.
The activity of platform implementation involves roles having access to datasets flowing through the platform.
For example, MSeva (in Punjab), NagarSewa (in Uttarakhand), eChhawani (in Cantonment Boards), and SUJOG (in Odisha) are platform implementations – they are implementations of the DIGIT platform.
Any ongoing or to-be-executed delivery of government service or defined government operation is a program.
During the operation of a program, the program owner (and its staff and contractors) will collect, store, process, use, and share such data as is necessary for performing their tasks, and/or which they are required to do under prevailing law.
For example, when ULBs in Punjab use the MSeva platform to collect revenues, deliver ULB services, and redress grievances, this is a program.
We describe a platform implementation as progressing across 7 stages.
For the purposes of this document, a platform owner is an entity that owns, governs, or controls the platform's codebase. They are responsible for its architecture design, roadmap, and versions.
As a platform is code that has NOT been implemented, a platform owner has no access to data.
When a platform is implemented, becoming a platform implementation, the platform owner may have access to such data as is being collected, stored, processed, used, or shared by that platform implementation as may be agreed upon by the platform owner and the administrative authority responsible for that implementation.
In cases where a platform owner performs the roles of an implementing agency or support agency with respect to any platform implementation, the platform owner may have access to data as specified for those roles (see below).
For example, eGovernments Foundation is a platform owner.
An agency that deploys and configures a platform for the administrative authority or program owner (see below) is an implementing agency (IA). An IA may:
set up the hardware necessary for the program;
customise, extend, configure, and install/set up the software (platform) as per the needs of the program owner;
train staff or contractors of the program owner on how to use the platform;
perform other such functions to ensure program readiness as may be agreed upon between the implementing agency and the program owner and/or administrative authority responsible for such platform implementation.
During the implementation of a platform, the implementing agency may collect, store, process, use, and share such data as is necessary for implementing the platform. This will normally be specified in the contract or agreement between the implementing agency and the administrative authority responsible for that implementation.
After the implementation of a platform is complete, the implementing agency should cease to have access to data from the platform implementation, except to such extent as may be agreed between the implementing agency and the administrative authority responsible for that implementation.
In cases where the implementing agency performs the role of a support agency with respect to any platform implementation, the implementing agency may have access to data as specified for that role (see below).
For example, if a given state government signs a contract with ‘Company L’ to implement the DIGIT platform in that state, Company L is an IA.
For the purposes of this document, a ‘program owner’ is the entity responsible for the delivery of specific public goods, services, or social welfare. A program owner is usually a government entity (though they may contract private entities to perform some or all of these tasks on their behalf).
In the context of a platform implementation, the program owner is the primary client of the implementing agency, as they will use the platform implementation to perform their tasks.
A program owner (and its staff and contractors) will collect, store, process, use, and share such data as is necessary for performing their tasks, and/or which they are required to do under prevailing law.
A program owner has primary responsibility for ensuring that all relevant legal provisions and good practices with respect to data security, data protection, and privacy are being followed in its programs.
A program owner may perform the roles of implementing agency and/or support agency or may contract those roles out to third-party implementing agencies and support agencies respectively. If they are performing these roles, they will have access to such data as is specified for these roles.
A program owner is typically subordinate to the administrative authority in the official/administrative hierarchy. It is also possible that the administrative authority and program owner are the same entity.
For example, if a given state government has initiated a program to use the DIGIT platform to reform urban governance in that state, either each Urban Local Body in that state or the Housing and Urban Development Department of that state is a program owner.
For the purposes of this document, a ’Support Agency’ is one that provides support in any functional aspect required by the program owner with respect to that platform implementation (e.g. assistance in the maintenance of the platform, technical or operational problem-solving, bug/error resolution).
A support agency would normally be engaged once the platform has gone live, i.e. during stages 5 and/or 6 of platform implementation (see above).
A support agency will have access to such data as is necessary to perform their functions, and this shall normally be specified in the agreement/contract between the supporting agency and the program owner / administrative authority responsible for that platform implementation / the implementing agency responsible for that platform implementation (in cases where the implementing agency sub-contracts support functions to the supporting agency).
In cases where a platform owner, implementing agency, or program owner performs the role of a support agency, they may have access to such data as is specified for that role.
For example, if a given state government signs a contract with ‘Company L’ to implement the DIGIT platform in that state, and Company L sub-contracts support/maintenance/helpdesk activities to ‘Company M’, Company M is a support agency.
For the purposes of this document, an administrative authority (AA) is a government entity that has the authority to enter into contracts/agreements with the platform owner, implementing agency, and supporting agency.
An administrative authority has the power, under prevailing law, to permit other entities to access (collect, store, process, use, and/or share) data, including the PII of individuals within the territorial jurisdiction of that AA.
For the purposes of this document, a platform owner, implementing agency, or supporting agency cannot access data unless authorised to do so by the administrative authority. Such authorisation may be specified in an agreement/contract between the administrative authority and these entities. Such authorisation must be in keeping with prevailing laws and shall include such provisions and safeguards as are required under prevailing law.
An administrative authority may be a program owner, or may be a superior agency of a program owner in the administrative hierarchy.
For example, when a given state government decides to implement the DIGIT platform in that state, the specific department of that state government which signs MOUs and/or contracts with eGov Foundation (as platform owner), and with an IA to implement the DIGIT platform, is an AA.
This document focuses on identifying the personal identification information collected in each Rainmaker module, for data security and privacy purposes.
Target Audience: This document is intended for Engineering (tech team), Product Management and Implementation team to agree on requirements for data privacy and data security.
As a product provider to the government, we should be responsible for the data security of the individuals and organizations who use our products. The first step in data privacy and security is to identify personal identification information (PII), which then determines our approach to data security. The PII listed in this document was identified with the help of the ‘White Paper of the Committee of Experts on a Data Protection Framework for India’. The remarks and provisional views of the committee are given below:
Not all information about an individual is personal data. As stated earlier, the protection of identity is central to informational privacy, so the information must be such that the individual is either identified or identifiable from it. In statutes or instruments that use both terms, “identified or identifiable”, such as the EU GDPR, the terms refer to states in which the data can exist: data could be in a form where individuals stand identified, or it may be possible that they could be identified. Whether an individual is identifiable is a question of context and circumstances. For instance, a car registration number, by itself, does not reveal the identity of a person; however, combined with other information, an individual can be identified from it.
Provisional views of the committee on Personal Data:
It is data about/relating to an individual that may be the subject matter of protection under the law. Data in this context ought to include any kind of information including opinions or assessments irrespective of their accuracy.
Data from which an individual is identified or identifiable/reasonably identifiable may be considered to be personal data. The identifiability can be direct or indirect.
New technologies pose considerable challenges to this distinction based on identifiability. This standard may have to be backed up by codes of practice and guidance notes indicating the boundaries of personal information having regard to the state of technology.
On the basis of the above comments, the potential PII in the Rainmaker modules (i.e. PGR, PT and TL) was identified, and the storage of information in each module was analysed as below.
Primary PII: information from which an individual can be directly identified.
Secondary PII: information from which an individual cannot be identified directly, but from which they can be identified when it is combined with a piece of primary PII.
Independent PII: information from which an individual cannot be identified directly, but which can help the receiver identify an individual through other means, such as a search for a property tax record, trade licence, or electricity connection.
Sensitive info: passwords, gender, and bank account numbers are sensitive information that needs to be protected.
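To make the taxonomy above concrete, the four categories can be captured in a small lookup used when deciding how each field must be handled. The field names and category assignments below are illustrative examples drawn from the module tables later in this document, not a definitive schema.

```python
# Illustrative classification of data points into the four PII categories
# defined above. Field names are examples, not an exhaustive schema.
PII_CATEGORIES = {
    "name": "primary",              # directly identifies an individual
    "mobile_number": "primary",
    "city": "secondary",            # identifying only alongside a primary PII
    "dob": "secondary",
    "property_id": "independent",   # identifiable via an external search
    "pan_no": "independent",
    "password": "sensitive",        # must always be protected
    "gender": "sensitive",
    "bank_account_no": "sensitive",
}

def needs_masking(field: str) -> bool:
    """Primary and sensitive fields should never be displayed in raw form."""
    return PII_CATEGORIES.get(field) in {"primary", "sensitive"}
```

A lookup like this lets every display and export path apply one consistent rule instead of ad-hoc per-screen decisions.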
Module-wise data points that need to be secured are given below:
Decryption Service:
- Role-based decryption within the jurisdiction of the employee
- Service-based decryption for citizens. Example: billing and collection services
Bulk Search in every module:
- Bulk search should not be enabled for citizens
- Bulk search in any module should not show more than 10 entries at a time
- PII should be masked in search results
- Employees can request to view PII; in that case:
  - A declaration about ethical use should be made by the employee
  - The viewing should be audited, recording the name and mobile number of the employee
  - The viewer should be notified about the audit entry
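A minimal sketch of the bulk-search rules above, assuming hypothetical record and employee dictionaries: results are capped at 10 entries, mobile numbers are masked by default, and any unmasked view requires an ethical-use declaration and leaves an audit entry carrying the employee's name and mobile number. This is not DIGIT's actual implementation, only an illustration of the policy.

```python
AUDIT_LOG = []  # in practice: a persistent, tamper-evident audit store

def mask_mobile(number: str) -> str:
    """Mask all but the last two digits of a mobile number."""
    return "*" * (len(number) - 2) + number[-2:]

def bulk_search(records: list, employee: dict) -> list:
    """Return at most 10 results, with PII masked unless the employee
    has made an ethical-use declaration (which is then audited)."""
    results = []
    for record in records[:10]:  # never more than 10 entries at a time
        if employee.get("declared_ethical_use"):
            AUDIT_LOG.append({
                "employee_name": employee["name"],
                "employee_mobile": employee["mobile"],
                "record_viewed": record["id"],
            })
            results.append(record)
        else:
            results.append({**record, "mobile": mask_mobile(record["mobile"])})
    return results
```

Notification to the viewer about the audit entry would hang off the same `AUDIT_LOG` append, e.g. via an event handler.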
Governments raise revenues, create and manage public goods, deliver public services, and provide social welfare. Each function involves multiple offices and officials operating at various levels: national, subnational, and local. These entities have to follow multiple processes while coordinating with each other and with other stakeholders and individuals.
Local government bodies are obligated to fulfil crucial responsibilities, including urban planning, regulation of land use, and overseeing the construction of buildings, roads, and bridges. Additionally, they are tasked with providing essential services such as water supply, public health, sanitation, and solid waste management. Municipal government operations encompass a broad spectrum of functions, spanning more than two dozen areas. These include handling grievances, managing local taxes and fees, facilitating water and sewage connections, and maintaining records related to vital events such as births, deaths, marriages, and more.
Many of these functions can be optimally carried out using digital tools. eGov's DIGIT platform empowers government entities to enhance their capabilities, alleviate administrative burdens, and elevate governance by providing faster, more reliable, and efficient services to citizens.
In the process of implementing and utilizing digital tools like DIGIT, government entities and their contractors are prone to collecting, storing, processing, utilizing, and sharing data. This may include personally identifiable information (PII) of individuals.
This section offers insights into the regulations and best practices governing data protection and privacy. It includes guidelines for implementing these rules and practices, along with templates and draft documents that can be incorporated into data privacy and protection processes.
With the recent enactment of the Digital Personal Data Protection Act of 2023, it is now a legal obligation for governments to safeguard the personal data of citizens and uphold their right to privacy. This legislation underscores the imperative for robust measures and practices in managing and protecting individuals' personal information.
Click on the links below to browse through the details:
For example: a study of Utrecht emphasised the importance of transparency and of citizens' awareness of the information held about them by the local body of Utrecht.
Any entity using an individual’s data should publish and inform that individual, in a clear and easy-to-understand manner, why and how they are using that data and the potential uses of such data.
Have a privacy policy
Publish privacy policy (in a prominent place)
Make privacy policy clear, easy to understand, and short
Provide notice in case of changes to privacy policy
Assess awareness of privacy policy accessibility & visibility to users
Make data processing transparent (provide FAQs and contact person details for further queries)
Make privacy practices visible in the user interface
Any entity should:
As a ground rule collect and use only data that has a ‘purpose’ defined for it
Establish access controls/mechanisms to distinguish between the users/persons accessing and using such data
Control or restrict access based on identity / authorisation.
Take clear, informed and explicit consent of the individual.
Commonly held datasets by .
This principle should be read along with 'User control'.
Clearly display (via FAQs or an additional UI page) who has access to which datasets (especially sensitive datasets)
Create data flow of all forms of datasets accessed by an entity (parallelly map out the purpose of data collected)
Identify roles for access to restricted and private data (if accessed via log-in credentials, then create a log of the credentials used) (RBAC)
Create individual preferences/approval controls to allow individuals to choose the roles that have access (opt-in options)
Restricted / sensitive data should not be accessed in its raw form (anonymise, de-identify such data, use it only for defined purposes)
Create a data deletion policy to restrict access to historical data
All stakeholders are to be bound by the same contractual terms of access and use of data (all parties are bound by similar terms in the MoU on access and use of data policy)
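The role-based access controls (RBAC) described above can be sketched as a simple mapping from roles to the dataset categories they may read. The role names and categories here are assumptions for illustration; DIGIT's actual RBAC implementation is richer.

```python
# Hypothetical role -> dataset-category mapping, for illustration only.
ROLE_ACCESS = {
    "counter_employee": {"public"},
    "approver": {"public", "restricted"},
    "admin": {"public", "restricted", "sensitive"},
}

def can_access(role: str, category: str) -> bool:
    """Deny by default: unknown roles or categories get no access."""
    return category in ROLE_ACCESS.get(role, set())
```

The deny-by-default lookup is the key design choice: a new role or dataset category grants nothing until it is explicitly added to the mapping.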
Example of the PRE_EGOV Framework (suggests a PRE_EGOV framework after conducting a gap analysis on the privacy frameworks of the , , and , which ranked the highest in the for their performance. It concluded that all of the above frameworks lacked ownership rights management enabling the user to have control over their data, and failed to take cultural, political and social factors into consideration when framing the privacy frameworks for their respective e-governments.)
Entities should give individuals control over how their information is used, shared, viewed, and changed, providing individuals ownership of their data.
A choice to allow or disallow use of their data must be given to the users.
Users should be able to access their information to be able to review and verify it or to ask for deletion of the information from the entities systems.
Create a UI for individual privacy preference controls (adjust privacy settings)
Allow users to submit inquiries on why data is collected.
The entity should provide clear reasons for use.
Users should be able to challenge the reasoning and disallow further sharing if unsatisfied with the stated reason given by the entity.
Allow users to put in requests to anonymise/delete datasets (e.g. request intake via chatbot)
Provide users option to restrict sharing data further
The entity should retain data in a safe manner (de-identified/anonymised) and only for as long as the data serves a defined purpose.
Set frameworks for purposeful retention of data (including defining categories of purposes and corresponding retention periods)
Create a policy for deletion/retention of data (historical data: deletion)
Set retention periods
Run frequent checks to confirm that data is deleted; where data is retained, the purpose should be recorded.
Identify safe storage locations (e.g. cloud)
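A purpose-bound retention check like the one described above might look as follows. The purpose categories and retention periods are assumed values for illustration, not policy.

```python
from datetime import datetime, timedelta

# Assumed purpose -> retention period mapping (illustrative values only).
RETENTION_PERIODS = {
    "grievance": timedelta(days=365),
    "billing": timedelta(days=7 * 365),  # e.g. a statutory financial period
}

def is_expired(record: dict, now: datetime) -> bool:
    """Data without a defined purpose should not be retained at all."""
    period = RETENTION_PERIODS.get(record.get("purpose"))
    if period is None:
        return True
    return now - record["collected_at"] > period

def purge(records: list, now: datetime) -> list:
    """Keep only records still within their purpose-bound retention period."""
    return [r for r in records if not is_expired(r, now)]
```

Running `purge` on a schedule gives the "frequent checks" the practice above calls for, and the mapping itself documents the purpose behind each retained category.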
The entity should keep the data accurate, complete, up-to-date, adequate, and relevant for the purpose of a defined use.
The entity should ensure that the data is protected and secured from any unauthorized access or change
Accuracy:
Define a policy for the frequency and quality of data updates, with checks and balances on how updates are made
Allow users to raise requests regarding inaccurate or wrongful data
Safeguard:
Set a framework policy on categories of datasets (what nature of data falls into sensitive, restricted, or public data)
Only keep sensitive data in an anonymised/de-identified form
Avoid collecting and using restricted data
Create strict access controls for restricted/sensitive data
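One common way to keep sensitive identifiers out of raw form while still allowing datasets to be linked is keyed hashing (pseudonymisation). This is a minimal sketch under stated assumptions: the key handling shown is simplified, and true anonymisation requires more than hashing alone (e.g. removing quasi-identifiers).

```python
import hashlib
import hmac

# A per-deployment secret key; in practice it would come from a secrets
# manager, never be hard-coded, and be rotated per policy.
SECRET_KEY = b"example-deployment-key"

def pseudonymise(value: str) -> str:
    """Replace an identifier with a stable keyed hash, so datasets can be
    joined for analysis without exposing the raw value."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
```

Because the hash is keyed, the same mobile number always maps to the same token within a deployment, but the token cannot be reversed or recomputed without the secret key.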
The entity must comply with all the above principles, with the legal obligations established nationally and those followed by partners in other countries, and with internal policies, to avoid non-compliance and penalties.
The entity must provide a legal basis for actions taken on the data collected and used, to maintain legality and fairness.
Assess data pipelines and systems against legal compliances (currently eGov requires compliance with the legal obligations given here)
Before partnering with entities in other countries, an assessment of their legal requirements is a must (for example, partnering with any entity in the European Union requires reference to standard contractual clauses).
An assessment of users' potential privacy claims must be undertaken.
The entity should be held accountable for complying with data protection and privacy measures (following all the principles and measures mentioned above), to gain the trust of users.
Designate an org-wide person in charge of compliance checks and accountability for the DPP principles practised and implemented
Create awareness within one’s organisation for data privacy and protection practices (eg- conducting quarterly awareness sessions, hosting knowledge exchange sessions)
Make the policies more visible and regularly shown to internal teams for more visibility
Designate a senior managing committee that is actively involved in establishing the privacy protection measures within one’s organisation.
1. Principle of notice and awareness

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| N1 | Organization has a privacy policy or any other policy/law/regulation that can be used on its behalf | Does the organization have privacy policies or other related regulations to regulate information privacy protection? If yes, which ones? (please enclose a copy) | |
| N2 | Organization has a privacy notice | How is it communicated to users (citizens)? | |
| N3 | Organization assesses the awareness of the privacy notice | How does the organization ensure that users are aware of the privacy notice? | |
| N4 | Organization informs users about contact information collected and stored | Does the organization inform users about collected and stored contact information (email, phone, etc.)? If yes, how? What are the strategies/plans/instructions/procedures that address this task? | |
| N5 | Organization informs users about computer information collected and stored | Does the organization inform users about computer information collected and stored (IP address, browser type, OS, etc.)? If yes, how? What are the strategies/plans/instructions/procedures that address this task? | |
| N6 | Organization informs users about interaction information collected and stored (search history, browser behaviour, etc.) | Does the organization inform users about interaction information collected and stored (search history, browser behaviour, etc.)? If yes, how? What are the strategies/plans/instructions/procedures that address this task? | |
| N7 | Organization informs users about sensitive information collected and stored (criminal record, health status, etc.) | Does the organization inform users about sensitive information collected and stored (health status, criminal records, etc.)? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N8 | Organization informs users about geolocation information collected and stored | Does the organization inform users about geolocation information collected and stored? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N9 | Organization informs users about financial information collected and stored | Does the organization inform users about financial information collected and stored? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N10 | Organization informs users about the cookies it uses | Does the organization inform users about the cookies it uses? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N11 | Organization informs users about personal information that would be used internally | Does the organization identify recipients of shared information for citizens? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N12 | Organization informs users about personal information that would be shared for context-specific purposes | Does the organization inform users about personal information that would be shared for context-specific purposes (information shared in order to get the required service)? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| N13 | Organization identifies recipients of shared information for users | Does the organization inform users about the recipients of their shared information? If yes, how? What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
2. Principle of access

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| A1 | Affiliates, subsidiaries, and third parties are bound by the same organization privacy policy | Which bodies are bound by the organization's privacy policy? Please list them. | |
| A2 | Organization has contracts with third parties establishing how closed data can be used ("closed data" means data that can only be accessed by its subject, owner or holder) | Does the organization have contracts with third parties establishing how closed data can be used? (if yes, enclose a copy) | |
| A3 | Organization has identified the persons or employee categories who have access to a given user's information | Has the organization identified the individuals or employee categories who have access to a given user's information? If yes, how does it work? Provide examples (please enclose a copy of the related document) | |
| A4 | Organization depicts the flow of information within the organization | Does the organization depict the flow of information within the organization (i.e. a scheme showing how information circulates inside the organization)? If yes, how does it work? Provide examples (please enclose a copy of any related document) | |
| A5 | Organization establishes users' consent (approval or preference) mechanisms for sharing their information | Does the organization establish users' consent (approval or preference) mechanisms for sharing their information? If yes, what are those mechanisms? Please enclose a copy of the related documents. | |
3. Principle of users' control

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| C1 | Organization allows users to adjust privacy settings | Does the organization allow users to adjust privacy settings? If yes, what are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| C2 | Users are allowed to access the personal information collected | Are citizens allowed to access the personal information collected? If yes, how does it work? Provide a scenario as an example. What are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
| C3 | Users can request certain information to be deleted or anonymised | Can users/citizens request certain information to be deleted or anonymised? If yes, what are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
4. Principle of safeguard, security, and accuracy

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| S1 | Organization has procedures to ensure users' information accuracy | Does the organization have procedures to ensure users' information accuracy? If yes, what are those procedures? (please enclose a copy) | |
| S2 | Organization discloses protected users' information to comply with the law or prevent a crime | Can the organization disclose protected information to comply with the law or prevent a crime? If yes, in what circumstances? (provide examples) | |
| S3 | Organization reserves the right to disclose users' information to protect its own rights | Does the organization reserve the right to disclose personal information to protect its own rights? | |
| S4 | Organization uses privacy-enhancing technologies (PETs) | Does the organization identify means of privacy-enhancing technologies (e.g. encryption tools)? If yes, provide a list of the tools used. | |
5. Principle of storage limitation

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| SL1 | Organization has strategies regarding the management of a user's information when their account is closed | Does the organization have strategies (policies) regarding personal information when an account is closed? If yes, what are those procedures? (please enclose a copy) | |
| SL2 | Organization states a limit for data retention | Does the organization state a time limit for data retention? If yes, what are the strategies/plans/instructions/procedures that address this task? (please enclose a copy) | |
6. Principle of enforcement

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| E1 | Organization provides procedures for users' privacy concerns and complaints | Does the organization provide procedures for users' privacy concerns and complaints? If yes, how? What are those procedures? (please enclose a copy) | |
| E2 | Organization provides disclaimers for failures of privacy measures | Does the organization provide disclaimers for failures of privacy measures? | |
| E3 | Organization has a regulatory agency for users' privacy complaints | Does the organization have a regulatory agency for users' complaints? If yes, which agency? Does the organization provide contact information for that agency? How does the organization work with that agency to address users' complaints? | |
| E4 | Organization periodically assesses users' potential privacy concerns | Does the organization periodically assess users' potential privacy concerns? (if yes, enclose a copy of an example) | |
7. Principle of accountability

| # | Requirements for principles | Questions to ask / Practices to adopt | Does eGov/other entity satisfy this? (Yes / No / Partially) |
|---|---|---|---|
| AC1 | Organization has designated staff in charge of users' privacy protection | Has the organization designated a person in charge of users' privacy protection? If yes, identify the position and its description. Has the designated staff been given clear authority to oversee the organization's information-handling practices? | |
| AC2 | Organization arranges staff training on information privacy protection and awareness of the relevant policies regarding users' privacy protection | Are the organization's staff trained in the requirements for protecting personal information and aware of the relevant policies regarding users' privacy protection? | |
| AC3 | The senior management committee is actively involved in establishing privacy protection measures within the organization | Is senior management actively involved in establishing privacy protection measures within the organization? If yes, clearly describe their role. | |
| | Platform | Platform Implementation | Program |
|---|---|---|---|
| Definition | Refers to a software system that can be used by govt entities or contractors. The term ‘platform’ refers to the software (code) itself, and NOT to any implementation of that code. | Refers to each instance of a platform/software system that has been implemented. | Any delivery of govt services or other govt operations or reforms can be a program. In the context of this document, a program deploys and/or leverages a platform implementation. |
| Does it process data? | No | Yes | Yes |
| Example | DIGIT | MSeva, SUJOG, eChhawani | ULBs in Punjab / Odisha / Cantonment Boards delivering services using MSeva / SUJOG / eChhawani respectively. |
| # | Stage name | What happens at this stage | Does it process data? | Who handles data at this stage |
|---|---|---|---|---|
| Stage 0 | Program Set-up | Resources, budgets, procurement and infrastructure are identified and an implementation partner is onboarded. | No | None |
| Stage 1 | Program Kickoff | Implementation starts and data is collected from a few identified jurisdictions for testing. | Yes | IA, Prog |
| Stage 2 | Solution Design | State-specific configurations are made, with processes and workflows being designed. Policy decisions are made at this stage. | Yes | IA, Prog |
| Stage 3 | Customisations and Configurations | Adoption and performance of the program are measured. UAT (user acceptance testing) is conducted here. | Yes | IA, Prog |
| Stage 4 | UAT and Go-live | UAT is completed, feedback is received, and the final platform deployment is carried out at all identified jurisdictions. | Yes | IA, Prog |
| Stage 5 | Statewide Rollout | Phase-wise implementation of the platform begins. Troubleshooting and support are provided, and errors and critical bugs are fixed. | Yes | IA, Prog, SA |
| Stage 6 | Sustenance and Ongoing Improvement | Platform adoption teams are set up, adoption is tracked, and awareness sessions and reviews on adoption are conducted. | Yes | Prog, SA |
| | Platform Owner (PO) | Implementing Agency (IA) | Program Owner (Prog) | Support Agency (SA) |
|---|---|---|---|---|
| Definition | The entity that owns, governs, or controls the platform's codebase. | The entity that deploys and configures a platform for the AA / Prog. | The entity that is responsible for the delivery of specific public goods, services, or social welfare. | The entity that provides support in any technical/functional aspect required by the Prog, with respect to that platform implementation. |
| Access to Data | No, except to the extent agreed with Prog / AA | Yes, during implementation only | Yes | Yes, to the extent needed for support |
| Example | eGovernments Foundation | Systems integrators (e.g. PwC, Transerve) | State HUDD, each ULB in the state | Any third party contracted for IT/program support |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Citizen | Name | Primary | Identified | User Service |
| Citizen | Mobile Number | Primary | Identified | User Service |
| Citizen | City | Secondary | Identifiable | User Service |
| Citizen | Password | Sensitive info | Sensitive info | User Service |
| Citizen | Street Name/Locality | Secondary | Identifiable | Property Module |
| AO | Name | Primary | Identified | User Service |
| AO | Mobile Number | Primary | Identified | User Service |
| AO | City | Secondary | Identifiable | User Service |
| AO | Password | Sensitive info | Sensitive info | User Service |
| LME | Name | Primary | Identified | User Service |
| LME | Mobile Number | Primary | Identified | User Service |
| LME | City | Secondary | Identifiable | User Service |
| LME | Password | Sensitive info | Sensitive info | User Service |
| Admin | Name | Primary | Identified | User Service |
| Admin | Mobile Number | Primary | Identified | User Service |
| Admin | City | Secondary | Identifiable | User Service |
| Admin | Password | Sensitive info | Sensitive info | User Service |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Trade Location Details | Property ID | Independent | Identifiable | Property Module |
| Trade Location Details | Electricity Connection No. | Independent | Identifiable | TL Module |
| Owner Information | Name | Primary | Identified | User Service |
| Owner Information | Mobile No. | Primary | Identified | User Service |
| Owner Information | Father/Husband's Name | Primary | Identified | User Service |
| Owner Information | | Primary | Identified | User Service |
| Owner Information | PAN No. | Independent | Identifiable | User Service |
| Owner Information | Correspondence Address | Primary | Identifiable | User Service |
| Owner Information | DOB | Secondary | Identifiable | User Service |
| Documents | Owner's ID Proof | Independent | Identified | Files store |
| Documents | Ownership Proof | Independent | Identified | Files store |
| Documents | Owner’s photo | Independent | Identified | Files store |
| Payer Information | Name | Primary | Identified | Collections |
| Payer Information | Mobile No. | Primary | Identified | Collections |
| Counter Employee | Name | Primary | Identified | User Service |
| Counter Employee | Mobile Number | Primary | Identified | User Service |
| Counter Employee | City | Secondary | Identifiable | User Service |
| Counter Employee | Password | Sensitive info | Sensitive info | User Service |
| Approver | Name | Primary | Identified | User Service |
| Approver | Mobile Number | Primary | Identified | User Service |
| Approver | City | Secondary | Identifiable | User Service |
| Approver | Password | Sensitive info | Sensitive info | User Service |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Property Address | City | Secondary | Identifiable | Property Module |
| Property Address | House No | Secondary | Identifiable | Property Module |
| Property Address | Building Name | Secondary | Identifiable | Property Module |
| Property Address | Door No | Primary | Identifiable | Property Module |
| Property Address | Street Name Locality | Secondary | Identifiable | Property Module |
| Property Address | Pincode | Secondary | Identifiable | Property Module |
| Property Address | Existing Property ID | Independent | Identifiable | Property Module |
| Owner Information | Name | Primary | Identified | User Service |
| Owner Information | Gender | Sensitive info | Sensitive info | User Service |
| Owner Information | Mobile Number | Primary | Identified | User Service |
| Owner Information | Father/Husband’s Name | Primary | Identified | User Service |
| Owner Information | Relationship | Secondary | Identifiable | User Service |
| Owner Information | Special Category | Primary | Identifiable | User Service |
| Owner Information | ID of Document belonging to special category | Primary | Identified | User Service |
| Owner Information | Email ID | Primary | Identified | User Service |
| Owner Information | Correspondence Address | Secondary | Identifiable | User Service |
| Property Tax | Property unique ID | Independent | Identifiable | Property Module |
| Payer Information | Name | Primary | Identified | Collections |
| Payer Information | Mobile No. | Primary | Identified | Collections |
| Counter Employee | Name | Primary | Identified | User Service |
| Counter Employee | Mobile Number | Primary | Identified | User Service |
| Counter Employee | City | Secondary | Identifiable | User Service |
| Counter Employee | Password | Sensitive info | Sensitive info | User Service |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Property Address | City | Secondary | Identifiable | Property Module |
| Property Address | House No | Secondary | Identifiable | Property Module |
| Property Address | Building Name | Secondary | Identifiable | Property Module |
| Property Address | Plot/House/Survey No | Secondary | Identifiable | Property Module |
| Property Address | Property ID | Independent | Identifiable | Property Module |
| Property Address | Street Name Locality | Secondary | Identifiable | Property Module |
| Property Address | Pincode | Secondary | Identifiable | Property Module |
| Owner Information | Name | Primary | Identified | User Service |
| Owner Information | Gender | Sensitive info | Sensitive info | User Service |
| Owner Information | Mobile Number | Primary | Identified | User Service |
| Owner Information | Guardian Information | Primary | Identified | User Service |
| Owner Information | Relationship | Secondary | Identifiable | User Service |
| Owner Information | Owner Category | Secondary | Identifiable | User Service |
| Owner Information | Correspondence Address | Primary | Identified | ? |
| Owner Information | Email ID | Primary | Identified | User Service |
| Connection Details | Meter ID | Independent | Identifiable | W&S Module |
| Connection Details | Consumer Number (Old/New) | Independent | Identifiable | W&S Module |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Applicant info | Applicant name | Primary | Identified | User service |
| Applicant info | Applicant mobile number | Primary | Identified | User service |
| Owner Information | Name | Primary | Identified | User Service |
| Owner Information | Gender | Sensitive info | Sensitive info | User Service |
| Owner Information | Mobile Number | Primary | Identified | User Service |
| Owner Information | Father/Husband’s Name | Primary | Identified | User Service |
| Owner Information | Relationship | Secondary | Identifiable | User Service |
| Stakeholder registration | Name | Primary | Identified | User Service |
| Stakeholder registration | Gender | Sensitive info | Sensitive info | User Service |
| Stakeholder registration | DOB | Secondary | Identifiable | User Service |
| Stakeholder registration | Mobile Number | Primary | Identified | User Service |
| Stakeholder registration | Email ID | Primary | Identified | User Service |
| Stakeholder registration | PAN number | Independent | Identifiable | User Service |
| Stakeholder registration documents | ID Proof | Primary | Identified | File store |
| Stakeholder registration documents | Educational certificate | Primary | Identified | File store |
| Stakeholder registration documents | Experience certificate | Primary | Identified | File store |
| Stakeholder registration documents | Photograph | Independent | Identified | File store |
| Stakeholder registration documents | Income tax statement | Independent | Identified | File store |
| Stakeholder registration documents | License registration doc | Independent | Identified | File store |
| OBPS Application documents | Identity proof | Primary | Identified | File store |
| OBPS Application documents | Address proof | Primary | Identified | File store |
| OBPS Application documents | Land tax receipt | Primary | Identified | File store |
| OBPS Application documents | Property deed | Primary | Identified | File store |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Property Address | City | Secondary | Identifiable | Property Module |
| Property Address | House No | Secondary | Identifiable | Property Module |
| Property Address | Building Name | Secondary | Identifiable | Property Module |
| Property Address | Door No | Secondary | Identifiable | Property Module |
| Property Address | Street Name Locality/Mohalla | Secondary | Identifiable | Property Module |
| Property Address | Pincode | Secondary | Identifiable | Property Module |
| Property Address | Existing Property ID | Independent | Identifiable | Property Module |
| Owner Information | Name | Primary | Identified | User Service |
| Owner Information | Mobile Number | Primary | Identified | User Service |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Employee Information | Name | Primary | Identified | User Service |
| Employee Information | Mobile Number | Primary | Identified | User Service |
| Employee Information | Gender | Sensitive info | Sensitive info | User Service |
| Employee Information | Guardian’s name | Primary | Identified | User Service |
| Employee Information | Relationship | Secondary | Identifiable | User Service |
| Employee Information | Date of Birth | Secondary | Identifiable | User Service |
| Employee Information | Email ID | Primary | Identified | User Service |
| Employee Information | Correspondence Address | Primary | Identified | User Service |
| Segment | Data Point | Primary or Secondary PII | Identified / Identifiable | Information stored in |
|---|---|---|---|---|
| Employee | Name | Primary | Identified | User Service |
| Employee | City | Secondary | Identifiable | User Service |
| Contractor | Code | Secondary | Identifiable | Contractor Master |
| Contractor | Name | Primary | Identified | Contractor Master |
| Contractor | Correspondence Address | Primary | Identified | Contractor Master |
| Contractor | Permanent Address | Primary | Identified | Contractor Master |
| Contractor | Contact Person | Primary | Identified | Contractor Master |
| Contractor | | Primary | Identified | Contractor Master |
| Contractor | Mobile | Primary | Identified | Contractor Master |
| Contractor | GST/TIN No | Independent | Identifiable | Contractor Master |
| Contractor | Bank Account No | Secondary | Identifiable | Contractor Master |
| Contractor | PAN No | Independent | Identifiable | Contractor Master |
| Contractor | EPF No | Independent | Identifiable | Contractor Master |
| Contractor | ESI No | Independent | Identifiable | Contractor Master |
| Supplier | Code | Secondary | Identifiable | Supplier Master |
| Supplier | Name | Primary | Identified | Supplier Master |
| Supplier | Correspondence Address | Primary | Identified | Supplier Master |
| Supplier | Permanent Address | Primary | Identified | Supplier Master |
| Supplier | Contact Person | Primary | Identified | Supplier Master |
| Supplier | | Primary | Identified | Supplier Master |
| Supplier | Mobile | Primary | Identified | Supplier Master |
| Supplier | GST/TIN No | Independent | Identifiable | Supplier Master |
| Supplier | Bank Account No | Independent | Identifiable | Supplier Master |
| Supplier | PAN No | Independent | Identifiable | Supplier Master |
| Supplier | EPF No | Independent | Identifiable | Supplier Master |
| Supplier | ESI No | Independent | Identifiable | Supplier Master |