New release features, enhancements, and fixes
The DIGIT 2.8 release includes one new module, several new product features, functional and security enhancements, regression testing, bug fixes for previously released modules, and a few non-functional changes.
New Module
Citizen engagement - Survey service
Functional Changes
OBPS Tabular Reports in old UI/UX, Trade Licence Tabular Reports in old UI/UX
Notification based on channels - PGR, OBPS & mCollect
Help/FAQ section in Citizen UI - common framework - integration with all the modules
Bill Genie UI/UX revamp
Survey module
Water & Sewerage UI/UX Revamp, Bill amendment workflow update & UI/UX revamp, W&S disconnection
Water & Sewerage Privacy Exemplar, Standard Reports for W&S - Receipt Register, Collection Register, Defaulters Report - in old UI/UX
Birth & Death Tabular Reports - in old UI/UX
Enhancements to DSS, Finance Module State DSS, NSS Adaptor, and enhancements in the National Dashboard - FAQ & About page.
Non-functional Changes
Navigation re-direction from New UI to OLD UI
Bulk Bill PDF generation and performance testing
W&S Bulk Bill generation and performance testing
https://docs.google.com/document/d/1b4plZ0TFMJC53MDt72_sx-Wif2bdR4I5dnD9DTN40Ss/edit
Adoption of new features will be tracked on the basis of the following metrics:
Updated docs
2.8 release master migration document
W&S Bill Amendment workflow config migration
Follow the steps mentioned in the W&S Bill Amendment Workflow config migration document.
Verify whether the workflow name is hardcoded anywhere other than the edited parts of the UI and backend - checked in indices and reports; not present except in config.
Verify whether the workflow config is present at the tenant level or at the state level only.
This update has to be added as a patch to the first release of the amendment.
Note : Replace the existing persister files with the latest files linked above.
Water modify connection
Sewerage modify connection
Sewerage Disconnection
Water Disconnection
By default encryption & privacy is disabled for 2.8
To disable privacy for Property and W&S, make changes in the MDMS DataSecurity -> SecurityPolicy.json file.
For the models Property, PropertyDecrypDisabled, WnSConnection, WnSConnectionOwner, WnSConnectionDecrypDisabled, WnSConnectionOwnerDecrypDisabled, WnSConnectionPlumber and WnSConnectionPlumberDecrypDisabled, insert only one attribute in the attributeList as follows:
Insert a name and jsonPath which are not actually expected to be present in the RequestBody/ResponseBody.
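The snippet below is a minimal sketch of what such an entry might look like for one of the listed models, assuming the structure described above (a model with an attributeList holding a single name/jsonPath pair); the attribute name and path are deliberately dummy values that do not exist in the request/response body, and the exact schema should be copied from the linked SecurityPolicy.json rather than from this sketch.

```json
{
  "model": "WnSConnection",
  "attributeList": [
    {
      "name": "dummyAttribute",
      "jsonPath": "WaterConnection/*/dummyFieldNotPresent"
    }
  ]
}
```

Repeat the same single-attribute pattern for each of the models listed above.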
Files to be updated
Masking patterns - needed only if encryption is enabled and carried out for old data.
Once the above changes are done deploy the following builds:
Ws-services
Property-services
Sw-services
Inbox
Restart egov-user service
Impact of privacy framework
The privacy implementation masks the PII information based on the MDMS config (Security Policy)
The masking/unmasking of data is entirely dependent on the roles of the users in the system; adding or removing roles in the config enables or disables masking for each role.
There is no module-level control here, so if a W&S role is added to a TL employee then data will be masked for that employee; cross-use of roles should therefore be avoided.
The agreement with Manish is that privacy stays disabled until the redesign with the module filter is complete - Done.
The policy should be defined for adding or removing fields from the encryption process.
W&S privacy (Skip over if encryption is not carried out)
Deploy the latest builds provided for the 2.8 release {builds}
Add privacy configs to MDMS (SecurityPolicy.json)
Restart the MDMS service followed by the encryption service
Add the temporary index config to update the index with encrypted data.
This config fetches the encrypted old data and uploads it to the existing index so that the index is encrypted or masked as per the config.
Trigger the encryption of old data through the provided APIs
Validating the encryption of old data {Doc Link}
Disable & enable decryption privacy using the following flags:
water.decryption.abac.enabled=true
sewerage.decryption.abac.enabled=true
PT privacy (Skip over if encryption is not carried out)
Deploy the latest builds provided for the 2.8 release
Add privacy configs to MDMS (SecurityPolicy.json)
Restart the MDMS service followed by the encryption service
Add the temporary index config update index with encrypted data
Add the new index config for the fuzzy search index
Trigger the encryption of old data through the provided APIs
Validating the encryption of old data {Doc Link}
Disable & enable decryption privacy using the following flag:
property-decryption-abac-enabled: "true"
Bill genie
No changes required
Bulk Bill PDF Generation and Performance Testing
Enhancement is part of the new build release.
W&S Bulk Bill Generation and Performance Testing
Privacy integration - PT
Covered as part of privacy above
Notification based on channels - PGR
Localisation added as part of the release KIT
Notification based on channels - mCollect
Localisation added as part of the release KIT
OBPS/ BPA (Notifications only)
Localisation added as part of the release KIT
National Dashboard Adaptor Integration Testing
The docs are specific to the Punjab implementation; a generic product document is in the pipeline.
National Dashboard - 3 new KPIs
MasterDashboardConfig.json
Finance Module State DSS
MDMS & Configs -
Backward Compatibility Results -
UI Changes
Please add all MDMS changes before deploying the UI code.
W&S: W&S UI/UX Revamp
No changes are needed; it will work automatically once deployed. There are two enhancements in the W&S flow, i.e. edit by a normal user and edit by a config user, based on the MDMS configuration (Link). This is applicable to all 3 flows.
W&S disconnection
No changes needed, it will work automatically if we deploy.
Bill amendment W&S UI/UX Revamp
No changes needed, it will work automatically if we deploy.
Bill Genie UI/UX revamp
No changes needed, it will work automatically if we deploy.
OBPS Tabular Reports in old UI/UX
No changes needed, it will work automatically if we deploy. It is based on the configuration of the backend.
TL Tabular Reports
No changes are needed, it will work automatically if we deploy. It is based on the configuration of the backend.
Standard Reports for WnS
No changes are needed, it will work automatically if we deploy. It is based on the configuration of the backend.
Core UI Changes (Banner Images of Each module)
No changes are needed, it will work automatically if we deploy. It is based on the configuration of the MDMS (Link)
We need to add the below images to the S3 bucket. PGR: https://egov-uat-assets.s3.amazonaws.com/PGR.png
Navigation re-direction from New UI to Old UI. MDMS changes are required for this (Link).
Old UI
Only Fire NOC and tabular reports are PO-signed-off for the old UI in the current release; the rest of the modules are covered as part of the backward compatibility tests shared above.
This release provides UI/UX revamp of existing features of water and sewerage modules in addition to applying for disconnection, data privacy, and 3 standard reports.
UI/UX revamp of water and sewerage covers the features given below.
My Connections
My Applications
My Payments
My Bills
Search and Pay
Apply for a new connection
Apply for disconnection
View Connection Details
View Consumption Details
View Application Details
Pay Water and Sewerage Bill
Pay Application Fee
Water Inbox
Sewerage Inbox
Apply for a new connection
Modification of connection
Bill Amendment
Application Workflows
Search Connection
Search Application
View Connection Details
View Consumption Details
Add Meter Reading
View Application Details
Collect Application Fee
Collect Water and Sewerage Bill
Connection Details
Application Acknowledgement/ Form
Estimation Notice
Sanction Letter
Disconnection of water and sewerage connection.
Disconnection Notice PDF
Application acknowledgement/form PDF
Receipt Register
Collection Register
Defaulters Report
Masking of PII data
Option to unmask the PII data
Audit trail of the unmasking of PII data
None
OBPS Release notes for Urban DIGIT 2.8 release.
This release provides the UI/UX revamp of OBPS, workflow timeline changes and reports.
The functionality in OBPS remains the same - only the UI/UX has changed.
Two reports added for administrative purposes.
Changes made to the workflow timeline for clear representation of application status to the users.
Notification based on channels.
Stakeholder registration and employee approval flow
Architect flow (permit application creation) and employee approval flow
Architect occupancy certificate creation and employee approval flow
Fire NOC employee approval flow
Open link for stakeholder registration flow
Daily collection report
OBPS application status report
Workflow timeline changes in application flow - The illustration shows the old status view, where the application status shows pending even when the task is completed. As per the current release, the application displays only the current state where the action is marked as pending; the remaining states are marked as done, completed, or submitted.
Notification based on channels - The OBPS application allows the user to configure different messages for different channels (In app, mobile messages, email) for the same event in the application process flow.
None
Trade License release notes for Urban DIGIT 2.8 release.
A few reports are added to the TL module for administrative purposes.
TL Daily collection report
TL Application status report
Trade license registry
Trade license renewal pending report
None
DIGIT 2.8 release has a new module, new features, bug fixes and a few non-functional changes.
Punjab W&S is using the old UI, so the new W&S features, which use the new UI, cannot be used in Punjab.
The following features are part of the UPYOG Product Roadmap and will be released as part of UPYOG 2.0 by end of March 2023.
*As per current demand from NUDM-enabled states
Adoption of new features will be tracked on the basis of the following metrics:
This release provides PRDs of 3 property tax features and a few bug fixes.
PRDs of 3 property tax features and bug fixes.
Properties amalgamation
Property bifurcation
Capital value system
Property mutation bug fixes.
None
Bug
None
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Water & Sewerage UI/UX Revamp
Employee W&S UI/UX revamp
Citizen UI/UX revamp
Water & Sewerage disconnection
Added W&S disconnection feature:
Apply for disconnection (Permanent/Temporary)
Proceed with the workflow and Execute disconnection.
Events and Notifications for each step
Added new screens for disconnection workflow.
Privacy Exemplar -
(The feature is disabled in the 2.8 release)
Added Data Privacy for water & sewerage for masking/unmasking PII data.
W&S Bill Amendment Workflow update
Code change in billing service, workflow change and data update in workflow transition table.
New KPIs for National Dashboard
Added 3 New KPIs - 2 on the National Dashboard Overview screen (Total Non-Tax Collection, Total Non-Tax Revenue Contribution), 1 in Property Tax (Average Property Tax Collected)
Added handover document
Bill amendment UI/UX revamp
Enhancements in National Dashboard- FAQ & About
Updated - About, Purpose page for National Dashboard
B&D Tabular Reports - in old UI/UX
Added - Birth Count Report, Death Count Report, Birth Count Report by District, Death Count Report by District, Birth and Death Certificate Payment Report.
Bill amendment UI/UX revamp
Search Amendment
Create a new Amendment for W&S
Inbox screen for bill-amendment
Notification based on channels - PGR, OBP and mCollect
Enhanced PGR, OBPS and mCollect notifications to incorporate different notifications based on Channels.
Bulk Bill PDF Generation and Performance Testing
Updated _jobscheduler API for enhancing performance for bulk records.
TL Tabular Reports in old UI/UX
Enhanced TL Report: TLDailyCollectionReport, TLApplicationStatusReport, TradeLicenseRegistryReport
Navigation re-direction from new UI to old UI
Login screen navigation redirects to the new UI irrespective of the login screen used (old/new).
Standard Reports for W&S - Receipt Register, Collection Register, Defaulters Report - in old UI/UX
Added W&S reports: Receipt Register, Collection Register, Defaulters Report.
Bill Genie UI/UX revamp
Added:
Search bills
Group bills
Download bills
National Dashboard Adaptor Integration Testing
Airflow set up for all modules; all modules integrated with the adaptor.
OBPS Tabular reports in old UI/UX
Enhanced OBPS Report: OBPS Daily Collection Report
Survey
Create surveys, update surveys, view survey results, fill in survey forms.
Metric | Description | Estimate |
Time to Value | Time taken from Gate 2 to all features of DIGIT 2.8 to be released as part of UPYOG 2.0 | 1.5 months |
No. of states using the feature | No. of states implementing the feature/module within next 6 months | Refer Account Wise Feature Demand |
No. of ULBs using the feature | No. of ULBs implementing the feature/module within next 6 months | Refer Account Wise Feature Demand |
Frequency of Use(Survey) | No. of Surveys completed in next 6 months | Target: 3 |
Inbox | UI/UX revamp of employee inbox and service wise separate inboxes. |
View Application | UI/ UX revamp of the application details view. |
Doc Links | |
CITIZEN | |
Apply for new connection | UI/UX revamp and change in the application form to remove the information which is to be filled in by the employee. |
Search and Pay | UI/UX revamp; added search by door no. and owner's name. |
My Payments | UI/UX revamp of payment history into My Payments. |
My Bills | Added My Bills to list all the bills due for payment. |
My Connections | UI/UX revamp of Search Connection and card view of connection details. |
View Connection Details | UI/UX revamp of the complete view of connection details. |
View Consumption Details | UI/UX revamp of the complete view of consumption details. |
My Applications | UI/UX revamp of Search Application into My Applications and card view of application details. |
View Application Details | UI/UX revamp of the complete view of application details. |
Pay Application Fee | UI/UX revamp of the complete application fee payment flow. |
Apply for disconnection | A new feature added to allow the consumer of the service to apply for its disconnection. |
Data Privacy Audit Report | This report lists the users who have unmasked the logged-in user's PII data. |
EMPLOYEE | |
Employee Inbox | UI/UX revamp of the inbox, separating out the water and sewerage inboxes. |
Apply for new connection | UI/UX revamp and change in the application form to remove the information which is to be filled in by the Field Inspector. |
Search Applications | UI/UX revamp of Search Application. |
View Application Details | UI/UX revamp of Search Application details. |
Application Workflow | UI/UX revamp of all the states of the application flow, with the option for the FI to edit the application only to add additional details. |
Collect Application Fee | UI/UX revamp of Collect Application Fee. |
Search Connection | UI/UX revamp of Search Connection. |
Collect Water/ Sewerage Bills | UI/UX revamp of collecting water/sewerage bills. |
View Connection Details | UI/UX revamp of view connection details. |
View Consumption Details | UI/UX revamp of view consumption details. |
Add Meter Reading | UI/UX revamp of add meter reading. |
Modify Connection | UI/UX revamp of modify connection. |
Bill Amendment | UI/UX revamp of bill amendment. |
Apply for disconnection | New feature to apply for disconnection has been added. |
Data Privacy | Data privacy is added to mask the PII data of a consumer and to capture information upon unmasking of PII for generating the audit report. |
Reports | |
Doc Links | Description |
Sl No | Accounts | No. of ULBs | Current Version | Features to be used from DIGIT 2.8 | Expected Timeline |
---|---|---|---|---|---|
1 | UKD | 93 | DIGIT 2.5 | TL Reports, bulk bill generation | Q1 2023-24 |
2 | PUNJAB | 166 | DIGIT 2.2 | Finance rollout dashboard, TL Reports, bulk bill generation | Q1 2023-24 |
3 | ODISHA | 114 | DIGIT 2.3 | Finance rollout dashboard, TL Reports, bulk bill generation | Client decision on upgrade |
4 | NIUA - UPYOG | NA | DIGIT 2.7 | All features of DIGIT 2.8, build APK and publish | March 27th |
5 | NIUA - Kerala | 93 | DIGIT 2.7 | All features of DIGIT 2.8 | Client decision on upgrade |
6 | Partners (e.g. BEL) | NA | | As per demand | Partners' decision to update |
Module | Feature | Partner User* | Timeline |
B&D | B&D Tabular Reports - in old UI/UX | Kerala | To be decided by partner |
Data Privacy | PII, data masking | To be Decided | To be decided by partner |
OBPS | OBPS Tabular Reports in old UI/UX | Haryana, Chhattisgarh | To be decided by partner |
Platform | Bill Genie UI/UX revamp | | To be decided by partner |
Platform | Bill amendment UI/UX revamp | | To be decided by partner |
Platform | Help/FAQ section in Citizen UI - common framework | To be Decided | To be decided by partner |
Platform | Citizen survey feature | To be Decided | To be decided by partner |
TL | TL Tabular Reports in old UI/UX | Kerala | To be decided by partner |
W&S | Disconnection Notice (With Reason) | Chhattisgarh | To be decided by partner |
W&S | Closure (Permanent/Temporary) | Chhattisgarh | To be decided by partner |
W&S | Payment/ Billing History | Chhattisgarh | To be decided by partner |
W&S | W&S UI/UX Revamp | Chhattisgarh | To be decided by partner |
W&S | W&S disconnection | Chhattisgarh | To be decided by partner |
W&S | W&S Privacy Exemplar | Chhattisgarh | To be decided by partner |
W&S | Standard Reports for WnS - Receipt Register, Collection Register, Defaulters Report - in old UI/UX | Chhattisgarh | To be decided by partner |
Sl No | Checklist | Yes/No/Partially | Reference link | Owner | Date (mm/dd/yyyy) | Remarks |
1 | Product Release Notes | Yes | Kavi |
| We are not doing migration of data for PT and W&S module as privacy is disabled by default. |
2 | List of services that needs to be upgraded | Yes | Kavi |
| Verified. |
3 | List of SMS template changes | No |
| Kavi |
| Not provided |
4 | Verify Configs, MDMS, devOps Changes | Yes | Kavi |
| Verified |
5 | Verify test cases for all enhancements/updates | Yes |
| Gurjeet/Neha |
| No test case available for :
|
6 | Verify list of issues reported by partners |
| Vinoth/Pradeep |
| Not added in release doc. |
7 | Verify performance testing report | Yes |
| Gurjeet/Neha |
| No report found |
8 | Verify backward compatibility testing report | Yes | Gurjeet/Neha |
| More than 40% test cases failing from v2.6 and v2.7 |
9 | Frontend and Backend code merge | Yes |
| Vinoth/Chinmay/Gurjeet |
| Verified developer guide and airflow adapter code. |
10 | Increase semversion,Prepare builds from UPYOG Code base and Deploy |
| Vinoth/Chinmay/Gurjeet |
|
|
11 | Upgrade Upyog UAT plan | Yes | Vinoth/Chinmay/Gurjeet |
|
|
12 | Loading the localisation codes in UAT | Yes |
| Gurjeet/Neha |
| Prepared scripts |
13 | Backward compatibility testing passed |
|
| Navyashree/Kavi |
| Backward compatibility tested using API’s |
14 | List of bugs identified by Impl Team(Known issues) |
|
| Impl Team |
|
|
15 | Regression in UPYOG-UAT environment |
| Gurjeet/Neha |
|
|
16 | Functional Sign off on UPYOG UAT |
| NIUA Team /Ajay |
|
|
17 | FMS integration testing with UPYOG UAT |
| NIUA Team /Ajay |
|
|
18 | Take go ahead from the Program Team |
| Pradeep/Ajay |
|
|
19 | Confirmation for production code merge |
| Pradeep/Ajay |
|
|
20 | Verify Localization changes | Yes | Gurjeet/Neha |
|
|
21 | Deployment steps on UPYOG Production |
|
| Vinoth/Gurjeet |
|
|
22 | Upgrade Upyog documentations |
|
| Ajay |
|
|
23 | Testing/Sanity on UPYOG production |
| Gurjeet/Neha |
|
|
24 | Production Sign-off | Yes |
| NIUA Team /Ajay |
|
|
25 | State dashboard/national dashboard testing |
|
| Gurjeet/Neha |
|
|
26 | Bulk bill generation and performance testing |
| Kavi |
|
|
27 | Rollback Plan | Yes |
| Pradeep |
|
|
28 | Staging upgradation plan |
|
| Pradeep/Vinoth |
| After upyog upgradation, will start this task after getting confirmation from marketing team. |
29 | Compare code with security audit: Verify whether similar errors are possible because of new code changes |
|
| Kavi/Engineering team |
|
|
30 | Push images required by new UI changes to s3 bucket and update tenant.json |
|
| Pradeep/NIUA |
|
|
31 | Verify changes required in hardware/AWS/space issues for upgradation | Yes |
| Pradeep/Vinoth | Kavi | Increase memory in the PDF service to support bulk bill generation. Increase heap memory for the filestore service. Max file size allowed to be verified. Increase PDF service replicas based on client requirements. |
32 | APK referring to UAT to be published for the new UI. The same to be verified by the tester. |
|
| @JaganKumar |
|
|
33 | Register new SMS formats with SMS provider |
|
| NIUA |
|
|
Sl No | Checklist | Yes/No/Partially | Reference link | Owner | Reviewer | Remarks |
1 | Development is completed for all the features that are part of the release. | Yes |
| Kaviyarasan P | Sathish P | Code was frozen on 10-Feb-2023 |
2 | Test cases are documented by the QA team, reviewed by the product owners and test results are updated in the test cases sheet. | Yes | Navyashree G Keerthi Bhaskara | P. Sankar | Test cases documents are reviewed and updated |
3 | The incremental demo of the features showcased during the sprint showcase and feedback incorporated. If possible list out the JIRA tickets for feedback. | Yes |
Kaviyarasan P | P. Sankar Nirbhay Singh | Yes, incremental demos are shown during the development sprints. |
4 | UI/UX Audit review by UX Architect is completed along with feedback incorporation for any changes in UI/UX. | Yes | JaganKumar Antriksh Kumar Vamshikrishna Kole | P. Sankar | UI/UX audit is done and review comments are incorporated. |
5 | Incremental demos to the product owners are completed as part of the sprint and feedbacks are incorporated. | Yes | Kaviyarasan P | P. Sankar | Incremental demos are done as per sprint |
6 | QA signoff is completed by the QA team and communicated to the product owners. All the tickets QA signoff status is updated in the JIRA. | Yes |
| Navyashree G Keerthi Bhaskara | P. Sankar | QA signoff was completed. Incremental sign-off dates 3rd Nov 2022
13th Dec 2022
2nd Feb 2023
|
7 | UI, API Technical documents are updated for the release along with the configuration documents. | Yes | Kaviyarasan P Vamshikrishna Kole |
| Featured in the release page and master migration document |
8 | UAT promotion and regression testing from the QA team is completed. QA team has shared the UAT regression test cases with the product owners. | Yes | Navyashree KeerthiBhaskara-eGov |
P. Sankar |
9 | API Automation scripts are updated for new APIs or changes to any existing APIs for the release. API automation regression is completed on UAT, the automation test results are analyzed and necessary actions are taken to fix the failure cases. Publish the list of failure use cases with a reason for failure and the resolution taken to fix these failures for the release. | No |
| Navyashree G KeerthiBhaskara-eGov |
| Not picked up in this release due to lack of resources |
10 | The API backward compatibility testing is completed. | Yes | Navyashree G | Kaviyarasan P | Use cases ran with the existing scripts; no new API structure changes in the new release. New scripts for 2.8 will be taken up in upcoming sprints. |
11 | The communication is shared with the product owners for the completion of UAT promotion and regression by the QA team. The product owners have to give a Product signoff within one week of this communication. | Yes |
| Navyashree G Keerthi Bhaskara | P. Sankar | UAT sign-off was completed. Incremental sign-off dates 13th Dec 2022
6th Feb 2023
6th Jan 2023
12th Jan 2023
9th Feb 2023
|
12 | UAT Product Signoff communication is received from the Product owners along with the release notes and User guides (if applicable). | Yes | Kaviyarasan P Product Owners | P. Sankar | All products are signed off; release notes and user guides are updated.
|
13 | The GIT tags and releases are created for the code changes for the release. | Yes | Kaviyarasan P | Sathish P |
|
14 | Verify whether the Release notes are updated | Yes |
| Kaviyarasan P | Sathish P |
|
15 | Verify whether all UAT Builds are updated along with the GIT tag details. | Yes | Kaviyarasan P | Sathish P |
|
16 | Verify whether all MDMS, Configs, InfraOps configs updated. | Yes | Kaviyarasan P | Sathish P |
|
17 | Yes |
| Kaviyarasan P Anjoo Narayan | Sankar Sathish P |
|
18 | Verify whether all test cases are up to date and updated along with necessary permissions to view the test cases sheet. The test cases sheet is verified by the Test Lead. | Yes | Navyashree G Keerthi Bhaskara | Sankar |
|
19 | Verify whether the UAT credentials sheet is updated with the details of new Users and Roles if any | Yes | Navyashree G Keerthi Bhaskara | Sankar | UAT creds shared internally to the product & Impl teams |
20 | Verify whether all the localisation data was added in UAT including Hindi and updated in Release Kits. | Yes | Navyashree G Keerthi Bhaskara | Kavi |
|
21 | Verify whether the product release notes and user guides are updated and published | Yes | Nirbhay Singh Satish N Chandra Kiran Sankar Abhishek | P. Sankar | Release notes and user guides are published in gitbooks. |
22 | The Demo of released features is done by the product team as part of the Sprint/Release showcase. | Yes | Nirbhay Singh Satish N Chandra Kiran Sankar Abhishek | P. Sankar | Yes - all features of 2.8 are demoed to the eGov team on 15th Feb 2023 |
23 | Technical and Product workshops/demos are conducted by the Engineering and Product team to the implementation team (Implementation handover) | Yes |
| Kaviyarasan P | Pradeep Kumar | Migration process has been explained; all docs are added to the master migration doc based on feedback from the Impl team |
24 | Architect SignOff and Technical Quality Report | Yes |
| Kaviyarasan P | Kaviyarasan P | Sign off is given with privacy disabled and encryption process disabled for W&S, PT modules. Redesign of privacy will be taken up by DPG based on the shortcomings shared in Tech council. |
25 | Success Metrics and Product Roadmap | Yes | Nirbhay Singh P. Sankar | Sankar | Product roadmap post DIGIT 2.8 is not clear yet, as we have less visibility and are trimming down the operations/resources in the urban team. |
26 | Adoption Metrics | Yes | Ajay Rawat P. Sankar | Sankar |
|
27 | Program Roll-out Plan | Yes | Ajay Rawat | Chandar |
|
28 | Implementation checklist | Yes | Pradeep Kumar | Elzan |
|
29 | Implementation Roll-out plan | Yes | Pradeep Kumar | Elzan |
|
30 | Gate 2 | In-progress | Meeting scheduled on 20th Feb 2023 | Kaviyarasan P, Sathish P, Pradeep Kumar, Ajay Rawat, Sankar, Satish N, Nirbhay Singh, Anjoo Narayan | Approver will be added post meeting |
31 | The Internal release communication along with all the release artefacts are shared by the Engineering/Product team. | In-progress |
| Kaviyarasan P, Sankar P, Sathish P | Sankar | Communication will be shared once Gate 2 is passed |
32 | Plan for upgrading the staging/demo instance with the release product - within 2-4 weeks based on the period where no demos are planned from staging for the previous version of the released product. |
|
| Pradeep Kumar | Elzan | Staging will be upgraded after the release |
33 | The Release communication to partners is shared by the GTM team and the Webinar is arranged by the GTM team after the release communication - within 2-4 weeks of the release. |
|
| Vibhor Bansal |
|
|
This module gives citizens more information on a service delivery module than just links to access the service.
Product Specific Pages Module consists of the following:
For Citizen
For a module and tenant that is configured.
FAQs
List of FAQs for each module
How it works
List of user manuals, help videos etc (Also in local language)
Link to access service via WhatsApp
Helpline Numbers
Address of Service Centres
Link to navigate to google maps
Static data
Ex. Days to process applications/ Amount to pay while applying etc
Dynamic Data
Number of citizens applied for service in last n months/ Amount collected etc
This module allows employees to create surveys for citizens and lets citizens fill them out. Survey results can later be seen on the survey results dashboard.
The Survey module consists of the following
For Employees
Surveys section on Home Page
Surveys inbox
Create survey
Meta Data - Title, Description, Survey start & end date/time
Questions
Short Answer
Paragraph
Single Answer
Multiple choice questions
Date
Time
Survey Results
Aggregated view of all survey results in specific charts for each question type
Download the excel report with the survey results
Modify surveys
Survey questions
Extend the Survey date and time.
For Citizens
Notification on Survey creation.
Survey Filling
Expansion to other types of questions
More notifications to citizens prior to survey start, and survey end.
Making surveys open. (Right now citizens need to be logged in to fill out the survey. This needs to be made open)
Birth and death release note for Urban DIGIT 2.8.
Reports have been added to the birth and death module for administrative purposes.
Birth count report
Death count report
Birth and death certificate payment records
Birth count report
Death count report
Birth and death certificate payment report
https://digit-discuss.atlassian.net/browse/UM-5256
None
DIGIT is an open-source platform licensed under the MIT license (https://opensource.org/licenses/MIT) compliant with the NUIS digital blueprint.
Detailed mapping of DIGIT’s capabilities with the core requirements mentioned in the NUIS digital blueprint has been done below:
Data specifications/models are available for domain entities. DIGIT is designed as an API-first platform wherein data specs/models are created for all key entities thus ensuring interoperability through open APIs and open standards. Taxonomies are available for the key domain entities/registries. These can later be harmonised with standard taxonomies in the domain as and when they are made available.
DIGIT data models and APIs are published as Open APIs freely available to everyone in the ecosystem. In Punjab, the DIGIT module was easily integrated with 3rd party payment apps like Paytm, Airtel Money, BBPS, etc to increase citizen access and improve collections. At present, DIGIT provides at least 3 key distinct APIs for all domain entities - create, update and search.
Deactivation/Cancellation of key entities in DIGIT is achieved by updating their status to inactive as per their defined specification/API contracts. Given the API-first and micro-services-driven nature of DIGIT, current APIs and models can be quickly harmonised with national standards as and when they are made available. DIGIT strives to leverage established domain standards (national/international) wherever available.
Data privacy capabilities are available to mark and protect sensitive data. The core service layer of DIGIT includes signing and encryption service as one of the core services that provide capabilities to sign/encrypt/mask sensitive data. It is designed such that it can work against software key stores and can be extended to integrate with any kind of hardware key store to store and protect signing and encryption keys.
Encryption requirements can be defined and adhered to for the storage of sensitive data. DIGIT requires the User PII data to be stored in its User service which is by default enabled for encryption of sensitive data as User Data Vault. All other services in DIGIT are required to access PII data by explicitly calling the User service - which in turn audits all access to PII. In addition, individual services in DIGIT can leverage DIGIT’s signing and encryption service (which is what User Service leverages to create User Data Vault) to further protect additional sensitive data available with the services.
DIGIT provides the capability to define workflows for data modification that can be configured to have approval steps to get needed consent for any data modification activities. DIGIT currently provides RBAC (Role-Based Access Control) based access control for access (search) to data.
Appropriate access controls can be defined in the APIs to ensure authorised access to sensitive data. DIGIT is designed to handle authentication and authorisation as perimeter control at its API gateway layer to ensure unauthorised calls are not allowed to even contact the respective micro-services. DIGIT provides an RBAC (Role-Based Access Control) mechanism where users are explicitly provided access to relevant resources by assigning them appropriate roles. By default, DIGIT supports OAUTH-based authentication for individual users and APIs. However, the Authentication and Authorization filter on DIGIT is designed to be easily extendable to support any further Auth and Auth needs.
The perimeter security mechanism in DIGIT also helps developers in focusing on the functional developments in further services and offload the access control requirements for new resources and their APIs to the API gateway using simple configurations.
DIGIT also ensures that risks like the following are taken care of:
Privilege escalation – form field manipulation
Failure to restrict URL access
Insecure direct object references (IDOR)
Malicious file upload leads to cross-site scripting
Improper authentication
Missing account lockout
Request throttling attack
Weak encoding mechanism
Sensitive information in URL
Lack of automatic session expiration
Insecure banner implementation
Concurrent session
Clickjacking
Improper error handling
DIGIT has the capability to define key registries in OpenAPI 3.0 specs formats and easily achieve key APIs like create/update/search using its building blocks in core services mainly through configurations and using lightweight extensions on a needs basis.
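As an illustration of this registry-definition approach, the following is a minimal, hypothetical OpenAPI 3.0 snippet (in JSON) for a registry exposing create and search endpoints; the paths, names and response descriptions are assumptions for illustration only and not an actual DIGIT API contract.

```json
{
  "openapi": "3.0.0",
  "info": {
    "title": "Sample Registry API (illustrative sketch, not an actual DIGIT contract)",
    "version": "1.0.0"
  },
  "paths": {
    "/sample-registry/_create": {
      "post": {
        "summary": "Create a record in the hypothetical registry",
        "responses": {
          "200": { "description": "The created record is returned in the response body" }
        }
      }
    },
    "/sample-registry/_search": {
      "post": {
        "summary": "Search registry records by criteria",
        "responses": {
          "200": { "description": "Matching records are returned" }
        }
      }
    }
  }
}
```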
DIGIT has the capability to protect person-specific sensitive data by encrypting them in the user data vault (User Registry) which allows configuration-based protection of sensitive PII. DIGIT requires additional registries to reference PII using this mechanism. In addition, registries in DIGIT can leverage its data protection (Signing and Encryption) core service to provide additional protection to registry-specific attributes.
Registry data in DIGIT can be signed for tamper-proofing using its signing and encryption core service. A proof of concept for this has already been done on the ePass module that was built on the DIGIT platform. All key data modifications in DIGIT are access logged to provide an audit trail, which can be accessed through APIs. The upcoming version of DIGIT is planning to bring in the concept of immutable event logs to further strengthen this capability. DIGIT leverages open-source telemetry to provide the ability to gather telemetry data and extend it for the DIGIT-specific processing pipeline. This framework allows for additional event definitions and contextual extension of the telemetry processing pipeline thereby future-proofing this capability in DIGIT.
DIGIT platform is designed as a collection of more than 50+ atomic microservices which are bundled together in a given context to provide end solutions. Microservices in DIGIT can be mainly categorized in three categories: Data services (Registries, reference Master data management, etc.), Tech infrastructure services (Authentication, authorisation, notification engine etc.) and domain services (Assessment, NOC etc.). Citizen, employee and administrative interfaces in DIGIT use these microservices to achieve the needed functionality.
Data models and APIs in DIGIT are defined as OpenAPI 3.0 specifications and can be extended by using a combination of configuration and extension techniques. E.g. if the additional attributes are only needed to be stored with format validation, it can be a simple schema extension, while if the additional business checks/functionality need to be implemented using the extended attributes then it can be achieved using pre/post request filters or extending underlying microservices.
DIGIT allows the extension of existing capabilities without needing architectural interventions. As described above extension of existing functionality on DIGIT can be achieved using additional configurations, additional extension services or request/response filters.
Several partners have extended DIGIT modules to cater to new use cases. For instance, DIGIT mCollect module caters to the collection of fees for more than 50 services on the counter, but it did not have a citizen interface for payment of these services online. Directorate General Defence Estates (DGDE) wanted to introduce this interface for the citizens of cantonment boards in India and were able to easily enhance the mCollect module to include this capability. Similarly, Punjab has reused several DIGIT core services to develop new modules on the platform with minimum effort.
DIGIT supports single-instance multi-tenancy to enable sharing of the underlying infrastructure, also all DIGIT data models and services are designed to be multi-tenanted.
DIGIT uses API first approach in its design and development to ensure loose coupling between its various components. These APIs are clearly defined using OpenAPI 3.0 specifications to ensure clear documentation.
As described above, the extension of existing functionality on DIGIT can be achieved using additional configurations, additional extension services or request/response filters. Similarly, new functionality can be added by re-bundling existing building blocks in the context of new use cases and implementing only additionally required services without requiring any architectural overhaul. Additionally due to its loosely coupled API-driven design DIGIT allows for new components to be implemented in the technology that is most useful for that use case.
API-driven, the microservices-based architecture of DIGIT enables its components to evolve separately. On DIGIT Individual components can evolve separately to enable the heterogeneous evolution of the system.
DIGIT uses SemVer 2.0 for versioning its microservices and interfaces. Semantic versioning is a formal convention for specifying compatibility using a three-part version number: major version; minor version; and patch. More details on this can be found at this link: https://semver.org/.
DIGIT is designed to be horizontally scalable. The microservices-based architecture of DIGIT also enables it to scale only needed components/services, thereby providing resource efficiency. E.g. Billing and Collection services can be scaled separately during financial year closing if the load pattern indicates an increasing volume of bill payments during that period.
DIGIT is designed to be hardware agnostic and can be run on any hardware. It has been tested on multiple commercial clouds and state-sponsored bare metal infrastructure. Components of DIGIT that need to use underlying hardware have been carefully chosen (in cases where DIGIT is using other open-source components) or designed (DIGIT’s own components) to provide a layer of abstraction that can be extended for any type of hardware.
DIGIT is designed using API first approach, therefore enabling any user interface channel to leverage it. DIGIT’s own user interfaces (Web/mobile app, WhatsApp chatbot) are implemented using its APIs to ensure offered platform capabilities and data are accessible to any delivery channel based on configured policies. In states like Punjab and AP where DIGIT modules are being used, the citizens have been given multi-channel access - ULB counters, Web portals, Mobile App, WhatsApp Chatbot and 3rd party applications like Paytm, BBPS to avail local government services.
DIGIT’s access control mechanism can be configured to provide different levels of access based on channels and roles.
DIGIT platform and its user interfaces are completely open source. Also, all external components used in DIGIT are also Open Source. Due to its API-based and event-driven architecture DIGIT can be integrated with any existing stack. Wherever appropriate, DIGIT also provides out-of-the-box integrations with crucial stacks/platforms. The most common integrations are to payment gateways, SMS providers and SMTP email servers for a typical implementation.
More than 14 organizations have already partnered with us to implement DIGIT across multiple implementations in the country and have built more than 20 new solutions on top of the platform.
DIGIT also provides the capability to gather feedback from the ecosystem in a digital manner. Feedback capability in DIGIT can be looked at the following levels:
Service Delivery feedback on services offered through DIGIT - DIGIT provides a highly configurable and extensible Public Grievance module to enable this kind of feedback/redressal for functional users (Citizens, employees etc)
Service Usage feedback - DIGIT user interfaces include a telemetry SDK which is backed by telemetry infrastructure on the DIGIT platform. Coupled with API access logs, this enables DIGIT to gather user feedback through live action and can be used for fine-tuning interfaces and APIs
Design/Feature feedback - As an open-source project on GitHub, DIGIT provides a mechanism to provide comments/feedback on its various components using GitHub. This feedback can be leveraged to create a Point of View on the future roadmap for the platform.
DSS release note for Urban DIGIT 2.8
About page, FAQ and 3 new KPIs are added to the DSS page
An About page is added for the dashboard, which helps users understand its purpose, how data is secured, and from where the data is being pushed.
3 new KPIs were added for administrative purposes.
About page
FAQ
3 KPIs - Non-tax revenue collected, Non-tax revenue contribution % and Average PT collected.
DIGIT Infra and architecture details
DIGIT is India’s largest open-source platform for digital governance. It is built on OpenAPI (OAS 2.0) and provides API-based access to a variety of urban/municipal services enabling state governments and city administrators to provide citizen services with relevant new services and also integrate the existing system into the platform and run seamlessly on any commercial/on-prem cloud infrastructure with scale and speed.
DIGIT is a microservices-based platform that is built to scale. Microservices are small, autonomous and developer-friendly services that work together.
A big software or system can be broken down into multiple small components or services. These components can be designed, developed & deployed independently without compromising the integrity of the application.
Parallelism in development: Microservices architectures are mainly business-centric.
MicroServices have smart endpoints that process info and apply logic. They receive requests, process them, and generate a response accordingly.
Decentralized control between teams, so that its developers strive to produce useful tools that can then be used by others to solve the same problems.
MicroServices architecture allows its neighbouring services to function while it bows out of service. This architecture also scales to cater to its clients’ sudden spike in demand.
MicroService is ideal for evolutionary systems where it is difficult to anticipate the types of devices that may be accessing our application.
DIGIT follows a multilayer or n-tiered distributed architecture pattern. As seen in the illustration above, there are different horizontal layers with sets of components, e.g. Data Access Layer, Infra Services, Business Services, different module layers and client apps, plus some vertical adapters. Every layer consists of a set of microservices. Each layer of the layered architecture pattern has a specific role and responsibility within the application.
Layered architecture increases flexibility, maintainability, and scalability
Multiple applications can reuse the components
Parallelism
Different components of the application can be independently deployed, maintained, and updated, on different time schedules
Layered architecture also makes it possible to configure different levels of security to different components
Layered architecture also helps users test the components independent of each other
Refer to the linked documentation to learn more about the setup basics.
This section contains documents and information required to configure the DIGIT platform
Learn how to configure the DIGIT Urban platform. Partner with us to enhance and integrate more into the platform.
Summary of DIGIT open-source Git repos and their purpose. If you are a partner/contributor, you may choose to fork or clone depending on need and capacity.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
An Urban Local Body (ULB) is defined as a tenant. The information which describes the various attributes of a ULB is known as tenant information. This detail is required to add the ULB into the system.
S. No. | ULB Name* | ULB Code* | ULB Grade* | City Name* | City Local Name | District Name* | District Code* | Region Name | Region Code |
---|
Contact Number* | Address* | ULB Website* | Latitude | Longitude | Email Address | GIS Location Link | Call Center No. | Facebook Link | Twitter Link | Logo file Path* |
---|
Data given in the table is a sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|
Download the data template attached to this page.
Have it open and go through all the headers and understand the meaning as given in this document under section 'Data Definition'.
Make sure all the headers, its data type, field size and its definition/ description are understood properly.
In case of any doubt, please reach out to the person who has shared this document with you and discuss the same to clear out the doubts.
Start filling in the data starting from the first serial number, completing one record at a time. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist by taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity. There are no checklist activities specific to this entity.
RESTRICTED LINK
Verify whether all docs will be published by the Technical Writer as part of the release.
To see the common checklist, refer to the page consisting of all the activities to be followed to ensure the completeness and quality of data.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
DIGIT 2.8 release changes.
Feature | Service Name | Changes | Description |
---|---|---|---|
WNS | | | WS disconnection notification, WS plumber model, re-indexing WNS |
OBPS Report | | | Added OBPS report |
TL Reports | | | TL reports changes |
TL | | | Updated TL common fields |
Navigation re-direction from New UI to OLD UI | | | Corrected URL for video, PT: FAQ & how it works |
Product specific page | | | Updated the URL |
BnD tab reports | | | BnD tab reports changes |
Surveys | | | Survey changes |
Fire-NOC | | | Fire-NOC changes |
DIGIT 2.8 release changes.
Feature | Service Name | Changes | Description |
---|---|---|---|
TL reports | | | Rainmaker TL reports changes |
WNS | | | Rainmaker WNS changes |
OBPS reports | | | Rainmaker OBPS reports changes |
WNS Report | | | WNS report changes |
DSS | | | Updated chartApi config |
Finance DSS | | | Added finance DSS config |
PT reports | | | PT report bug fix |
BnD tab reports | | | Updated config for BnD queries and filters |
Surveys | | | Surveys changes |
PT | | | Rainmaker PT changes |
TL | | | Rainmaker TL changes |
Fire-NOC | | | Added change to persist Fire NOC address details |
Category
Services
GIT TAGS
Docker Artifact ID
Remarks
Frontend (old UI) v2.8
Citizen
citizen:v1.9.0-98adbd4488-17
Employee
employee:v1.9.0-98adbd4488-17
DSS Dashboard
dss-dashboard:v1.9.0-98adbd4488-1
Digit-UI v2.8
DIGIT UI
digit-ui:v1.6.0-98adbd4488-74
Core Services v2.8
Encryption
egov-enc-service-db:v1.1.4-6cfa52c1f9-1
xState Chatbot
xstate-chatbot-db:v1.1.1-96b24b0d72-1
Searcher
egov-searcher:v1.1.6-6cfa52c1f9-1
Payment Gateway
egov-pg-service-db:v1.2.4-6cfa52c1f9-1
Filestore
egov-filestore-db:v1.2.5-6cfa52c1f9-1
Zuul - API Gateway
zuul:v1.3.2-6cfa52c1f9-1
Mail Notification
egov-notification-mail:v1.1.3-6cfa52c1f9-1
SMS Notification
egov-notification-sms:v1.1.4-6cfa52c1f9-1
Localization
egov-localization-db:v1.1.4-6cfa52c1f9-1
Persist
egov-persister:v1.1.5-6cfa52c1f9-2
ID Gen
egov-idgen-db:v1.2.4-6cfa52c1f9-1
User
egov-user-db:v1.3.0-6cfa52c1f9-2
User Chatbot
egov-user-chatbot:v1.3.0-6cfa52c1f9-1
MDMS
egov-mdms-service:v1.3.3-6cfa52c1f9-1
URL Shortening
egov-url-shortening-db:v1.1.3-6cfa52c1f9-1
Indexer
egov-indexer-db:v1.1.8-6cfa52c1f9-1
Report
report:v1.3.5-6cfa52c1f9-1
Workflow
egov-workflow-v2-db:v1.2.2-6cfa52c1f9-1
PDF Generator
pdf-service-db:v1.2.1-6cfa52c1f9-1
Chatbot
chatbot-db:v1.1.7-6cfa52c1f9-1
Access Control
egov-accesscontrol:v1.1.4-6cfa52c1f9-1
Location
egov-location-db:v1.1.5-6cfa52c1f9-1
OTP
egov-otp-db:v1.2.3-6cfa52c1f9-1
User OTP
user-otp:v1.1.6-6cfa52c1f9-2
NLP Engine
nlp-engine:v1.0.0-fbea6fba-21
Egov Document-Uploader
egov-document-uploader-db:v1.1.1-6cfa52c1f9-4
National Dashboard Ingest
national-dashboard-ingest-db:v1.0.1-6cfa52c1f9-1
National Dashboard Kafka Pipeline
national-dashboard-kafka-pipeline:v1.0.1-6cfa52c1f9-1
Egov Survey Service
egov-survey-services-db:v0.0.1-a9d54c2543-25
Business Services v2.8
Apportion
egov-apportion-service-db:v1.1.5-d93a120c25-2
Collection
collection-services-db:v1.1.7-d93a120c25-2
Billing
billing-service-db:v1.3.5-3d7f744977-1
HRMS
egov-hrms-db:v1.2.5-d93a120c25-2
Dashboard Analytics
dashboard-analytics:v1.1.8-d93a120c25-2
Dashboard Ingest
dashboard-ingest:v1.1.4-d93a120c25-2
EGF Instrument
egf-instrument-db:v1.1.4-d93a120c25-2
EGF Master
egf-master-db:v1.1.3-d93a120c25-2
Finance Collection Voucher Consumer
finance-collections-voucher-consumer-db:v1.1.6-d93a120c25-2
Municipal Services v2.8
Trade License
tl-services-db:v1.1.8-3d7f744977-1
Trade License Calculator
tl-calculator-db:v1.1.5-3d7f744977-1
Fire NOC
firenoc-services-db:v1.3.2-b72211989e-13
Fire NOC Calculator
firenoc-calculator-db:v1.2.1-3d7f744977-1
Property Services
property-services-db:v1.2.0-bb91a22308-1
Property Tax Calculator
pt-calculator-v2:v1.1.5-32caf0d992-8
Property Tax
pt-services-v2:v1.0.0-48a03ad7bb-4
Water Charges
ws-services-db:v1.7.4-bb91a22308-1
Water Charges Calculator
ws-calculator-db:v1.4.3-3d7f744977-1
Sewerage Charges
sw-services-db:v1.7.4-bb91a22308-1
Sewerage Charges Calculator
sw-calculator-db:v1.4.3-b8de94a8b0-5
BPA Calculator
bpa-calculator:v1.1.1-3d7f744977-1
BPA Services
bpa-services-db:v1.1.6-3d7f744977-2
User Event
egov-user-event:v1.2.0-32caf0d992-25
PGR
rainmaker-pgr:v1.1.4-32caf0d992-4
PGR Service
pgr-services-db:v1.1.7-3d7f744977-1
Land Services
land-services-db:v1.0.4-3d7f744977-1
NOC Services
noc-services-db:v1.0.5-3d7f744977-5
FSM
fsm:v1.2.0-98a12c2748-224
FSM Calculator
fsm-calculator:v1.1.0-32caf0d992-41
Vehicle
vehicle:v1.2.0-180a328097-97
Vendor
vendor:v1.2.0-a28b192446-64
eChallan Services
echallan-services-db:v1.1.0-3d7f744977-1
eChallan Calculator
echallan-calculator:v1.0.2-3d7f744977-1
Inbox
inbox:v1.2.2-bb91a22308-1
Turn-IO
turn-io-adapter:v1.0.1-3d7f744977-1
Birth and Death Services
birth-death-services-db:v1.0.1-3d7f744977-1
Utilities Services v2.8
Custom Consumer
egov-custom-consumer:v1.1.1-d93a120c25-1
egov-pdf:v1.2.0-f731839045-1
eDCR v2.8
eDCR
egov-edcr:v2.1.1-d4e6df6b31-97
Finance v2.8
Finance
egov-finance:v3.0.2-0d0a8db8ff-28
Configs v2.8
MDMS v2.8
Localization v2.8
QA Automation v2.8
Key Feature | Description |
---|---|
Search Bill | UI/UX revamp of searching for a bill and downloading the bill PDF. The bill amount can also be collected. |
Cancel Bill | UI/UX revamp of cancelling an active bill. |
Group Bill | UI/UX revamp of grouping and merging multiple bills into a single PDF. |
Group W&S Bills | UI/UX revamp of grouping and merging related water and sewerage bills into a single PDF. |
Download Bill PDF | UI/UX revamp of downloading the bill PDF. |
Doc Links | |
Interoperability
DIGIT is designed as an API-first platform and with Open APIs & Open Standard interoperability is maintained.
Along with this, taxonomies are available for the key domain entities/registries on DIGIT.
Data privacy and security by design
Data privacy and security design are a very critical part of the design of DIGIT.
Core service layer of DIGIT includes a signing and encryption service that provides capabilities to sign/encrypt/mask sensitive data.
Appropriate access controls can be defined in the APIs to ensure authorised access to sensitive data
Transparency and Accountability through data
DIGIT has
The capability to define registries, preferably through standard specifications like OpenAPI 3.0
The capability to configure registry attributes for security and protection as per the configuration.
Mechanisms to verify data and its provenance through audit logs (access and changelogs), preferably through APIs.
Reusability and Extensibility
The DIGIT platform is designed as a collection of more than 55+ atomic microservices which are bundled together in a given context to provide an end solution.
DIGIT allows the extension of existing capabilities without needing architectural interventions.
Components are designed to be independently reusable without any tight coupling.
Evolvability and Scale
On DIGIT:
Capabilities can be added without needing overall system re-architecture.
Individual components can evolve separately to enable heterogeneous evolution of the system.
Scaling can be done horizontally to handle changes in request volumes.
Individual components can be scaled independent of each other, to enable efficient resource utilisation
Multi-channel access
DIGIT allows multiple channels of solution delivery - ULB counters, Web portals, Mobile App, WhatsApp Chatbot and 3rd party applications like Paytm, tablets, etc.
DIGIT’s access control mechanism can be configured to provide different levels of access based on channels and roles.
Ecosystem-driven
DIGIT leverages open source technologies to reduce the cost of solutions.
Leverages or integrates with or extends existing platforms/stacks like IndiaStack, IUDX, ICTRA infrastructure etc.
Provides the capability to gather feedback from the ecosystem in a digital manner.
Learn how to setup DIGIT master data.
DIGIT environment setup is conducted at two levels.
1 | Sonepur Nagar Panchayat | 47 | Corp | Sonepur | Sonepur | Banka | BN47 | Bihar | BBD47 |
98362532657 | Main Hall, Sonepur | 24.8874° N | 86.9198° E | snp@bihar.gov.in |
1 | ULB Name | Text | 256 | Yes | Name of ULB. E.g. Kannur Municipal Corporation/ Saptarishi Municipal Council |
2 | ULB Code | Alphanumeric | 64 | Yes | It is a unique identifier which is assigned to each ULB. LGD (Local Government Directory) has already assigned a code to each urban local body and the same is used here |
3 | ULB Grade | Alphanumeric | 64 | Yes | Grade of ULB. e.g. Corporation, Municipality, Nagar Panchayat etc |
4 | City Name | Text | 256 | Yes | Name of city/ town which is covered by the ULB. E.g. Kannur/ Saptarishi |
5 | City Local Name | Text | 256 | No | Name of the city in the local language. e.g Telugu, Hindi etc |
6 | District Name | Text | 256 | Yes | Name of the District where the city is situated |
7 | District Code | Alphanumeric | 64 | Yes | It is a unique identifier which is assigned to each district. LGD (Local Government Directory) has already assigned a code to districts and the same is used here |
8 | Region Name | Text | 256 | No | Name of the region the listed district belongs to |
9 | Region Code | Alphanumeric | 64 | No | Unique code of the region to uniquely identify it |
10 | Contact Number | Alphanumeric | 10 | Yes | Contact person phone no. of ULB |
11 | Address | Text | 256 | Yes | Postal address of the ULB for the correspondence |
12 | ULB Website | Alphanumeric | 256 | Yes | URL address of the website for the ULB |
13 | Email Address | Alphanumeric | 64 | No | Email address of the ULB where emails from citizens can be received |
14 | Latitude | Alphanumeric | 64 | No | Latitude part of coordinates of the centroid of the city |
15 | Longitude | Alphanumeric | 64 | No | Longitude part of coordinates of the centroid of the city |
16 | GIS Location Link | Text | NA | No | GIS Location link of the ULB |
17 | Call Center No | Alphanumeric | 10 | No | Call centre contact number of ULB |
18 | Facebook Link | Text | NA | No | Facebook page link of the ULB |
19 | Twitter Link | Text | NA | No | Twitter page link of the ULB |
20 | Logo file Path | Document | NA | Yes | URL of logo file path to download the logo of ULB |
Tenant represents a body in a system. In the municipal system, a state and its ULBs (Urban local bodies) are tenants. ULB represents a city or a town in a state. Tenant configuration is done in MDMS.
Before proceeding with the configuration, ensure the following prerequisites are met -
Knowledge of JSON and how to write JSON is required.
Knowledge of MDMS is required.
User with permission to edit the git repository where MDMS data is configured.
City name selection is required on the login page. Tenants added in MDMS appear in the city drop-down of the login page.
In reports and on the employee inbox page, the ULB details are displayed based on the ULB data added in MDMS.
Modules, i.e. TL, PT, MCS, can be enabled based on the requirements of the tenant.
After adding the new tenant, the MDMS service needs to be restarted to read the newly added data.
Tenants are added in tenant.json. In MDMS, the tenant.json file under the tenant folder holds the details of the state and the ULBs to be added in that state.
To enable tenants, the above data should be pushed to the tenant.json file. Here "ULB Grade" and City "Code" are important fields. ULB Grade can have a set of allowed values that determines the ULB type, e.g. Municipal Corporation (Nagar Nigam), Municipality (municipal council, municipal board, municipal committee) (Nagar Parishad), etc. City "Code" has to be unique to each tenant. This city-specific code is used in all transactions and it is not permissible to change it; if it is changed, the data of previously done transactions will be lost.
Naming Convention for Tenants Code
"code": "uk.citya" follows the pattern <StateTenantId>.<ULBTenantName>.
"logoId": "https://s3.ap-south-1.amazonaws.com/uk-egov-assets/uk.citya/logo.png", Here the last section of the path should be "/<tenantId>/logo.png". If we use anything else, logo will not be displayed on the UI. <tenantId> is the tenant code ie “uk.citya”.
Localization should be pushed for ULB grade and ULB name. The format is given below.
Localization for ULB Grade
Localization for ULB Name
The format of the localization code for the tenant name is <MDMS_State_Tenant_Folder_Name>_<Tenants_File_Name>_<Tenant_Code> (replace the dot with an underscore).
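As an illustration, assuming the MDMS tenant folder is named "tenant", the file is "tenants", and the tenant code is "uk.citya", the localization code would be TENANT_TENANTS_UK_CITYA. A message pushed to the localization service might then look roughly like the sketch below; the module and locale values shown are assumptions, not prescribed by this document.

```json
{
  "code": "TENANT_TENANTS_UK_CITYA",
  "message": "City A",
  "module": "rainmaker-common",
  "locale": "en_IN"
}
```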
Boundary data should be added for the new tenant.
Configuring master data for a new module requires creating a new module in the master config file and adding the master data. For better organization, create all the master data files belonging to the module in the same folder. Organizing them in the same folder is not mandatory; the grouping is based on the moduleName in the master data file.
Before you proceed with the configuration, make sure the following pre-requisites are met -
User with permission to edit the git repository where MDMS data is configured.
These data can be used to validate the incoming data.
After adding the new module data, the MDMS service needs to be restarted to read the newly added data.
The Master config file is structured as below. Each key in the Master config is a module and each key in the module is a master.
The new module can be added below the existing modules in the master config file.
Please check the following link on how to create a new master: Adding New Master.
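A minimal sketch of how a new module can sit alongside existing modules in the master config file is given below. The module and master names are placeholders, and the per-master attributes (such as uniqueKeys) are indicative only; refer to the actual master config in your MDMS repository for the exact attributes.

```json
{
  "BillingService": {
    "BusinessService": { "uniqueKeys": ["$.code"] },
    "TaxHeadMaster": { "uniqueKeys": ["$.code"] }
  },
  "NewModule": {
    "NewMaster": { "uniqueKeys": ["$.code"] }
  }
}
```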
For creating a new master in MDMS, create the JSON file with the master data and configure the newly created master in the master config file.
Before proceeding with the configuration, make sure the following pre-requisites are met -
User with permission to edit the git repository where MDMS data is configured.
After adding the new master, the MDMS service needs to be restarted to read the newly added data.
The new JSON file needs to contain 3 keys as shown in the below code snippet. The new master can be created either State-wise or ULB-wise. Tenant id and config in the master config file determine this.
The Master config file is structured as below. Each key in the Master config is a module and each key in the module is a master.
Each master contains the following data and the keys are self-explanatory
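A minimal sketch of a new master data file, using the three keys described above (tenantId, moduleName, and the master name holding the data array); the module name, master name, and records are placeholders:

```json
{
  "tenantId": "uk",
  "moduleName": "NewModule",
  "NewMaster": [
    { "code": "CODE1", "name": "First entry", "active": true },
    { "code": "CODE2", "name": "Second entry", "active": true }
  ]
}
```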
MDMS stands for Master Data Management Service. MDMS is one of the applications in the eGov DIGIT core group of services. This service aims to reduce the time spent by developers on writing code to store and fetch master data (primary data needed for module functionality) which does not have any business logic associated with it.
Before you proceed with the configuration, make sure the following pre-requisites are met -
Prior Knowledge of Java/J2EE.
Prior Knowledge of Spring Boot.
Prior Knowledge of REST APIs and related concepts like path parameters, headers, JSON, etc.
Prior knowledge of Git.
Advanced knowledge of working with JSON data would be an added advantage in understanding the service.
The MDMS service reads the data from a set of JSON files from a pre-specified location.
It can either be an online location (readable JSON files from online) or offline (JSON files stored in local memory).
The JSON files are in a prescribed format and the data is stored in a map, with the tenantId of the file serving as the key and a map of master data details as the value.
Once the data is stored in the map the same can be retrieved by making an API request to the MDMS service. Filters can be applied in the request to retrieve data based on the existing fields of JSON.
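For illustration, a search request to the MDMS service might look like the sketch below, assuming the commonly used /egov-mdms-service/v1/_search endpoint. The module, master, and filter shown are examples only; the filter follows the JSONPath-style syntax applied on the master data array.

```json
{
  "RequestInfo": { "apiId": "Rainmaker", "authToken": "<auth-token>" },
  "MdmsCriteria": {
    "tenantId": "uk",
    "moduleDetails": [
      {
        "moduleName": "BillingService",
        "masterDetails": [
          { "name": "BusinessService", "filter": "[?(@.active==true)]" }
        ]
      }
    ]
  }
}
```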
For deploying the changes in MDMS data, the service needs to be restarted.
The changes in MDMS data could be adding new data, updating existing data, or deletion.
The config JSON files to be written should follow the listed rules
The config files should have JSON extension
The file should mention the tenantId, module name, and the master name first before defining the data
Example Config JSON for “Billing Service”
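A minimal sketch of such a config JSON, assuming a BusinessService master under the BillingService module; the attribute names inside the array are illustrative and will differ per master:

```json
{
  "tenantId": "uk",
  "moduleName": "BillingService",
  "BusinessService": [
    {
      "code": "WS",
      "businessService": "WS",
      "collectionModesNotAllowed": ["DD"],
      "partPaymentAllowed": true,
      "isAdvanceAllowed": false
    }
  ]
}
```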
MDMS supports the configuration of data at different levels. While we enable a state there can be data that is common to all the ULBs of the state and data specific to each ULBs. The data further can be configured at each module level as state-specific or ULB’s specific.
Before you proceed with the configuration, make sure the following pre-requisites are met -
Prior Knowledge of Java/J2EE.
Prior Knowledge of Spring Boot.
Prior Knowledge of REST APIs and related concepts like path parameters, headers, JSON, etc.
Prior knowledge of Git.
Advanced knowledge of working with JSON data would be an added advantage in understanding the service.
State Level Masters are maintained in a common folder.
ULB Level Masters are maintained in separate folders named after the ULB.
Module Specific State Level Masters are maintained by a folder named after the specific module that is placed outside the common folder.
For deploying the changes(adding new data, updating existing data or deletion) in MDMS, the MDMS service needs to be restarted.
The common master data across all ULBs and modules like department, designation, etc are placed under the common-masters folder which is under the tenant folder of the MDMS repository.
ex: egov-mdms-data/data/pb/common-masters/ Here “pb” is the tenant folder name.
The common master data across all ULBs and are module-specific are placed in a folder named after each module. These folders are placed directly under the tenant folder.
ex: egov-mdms-data/data/pb/TradeLicense/ Here “pb” is the tenant folder name and “TradeLicense“ is the module name.
Module data that are specific to each ULB like boundary data, interest, penalty, etc are configured at the ULB level. There will be a folder per ULB under the tenant folder and all the ULB’s module-specific data are placed under this folder.
ex: egov-mdms-data/data/pb/amritsar/TradeLicense/ Here “amritsar“ is the ULB name and “TradeLicense“ is the module name. All the data specific to this module for the ULB are configured inside this folder.
SSL (Secure Sockets Layer) is an encryption-based network security protocol developed for the assurance of privacy, authenticity and data integrity in internet communications.
Ideally, the domain name configuration and the SSL certificate are obtained one after the other, without fail, from the state's IT team.
No data is needed from the state team for this.
Not Applicable
Not Applicable
Not Applicable
Not Applicable
Not Applicable
Not Applicable
Whenever an Android mobile app is developed, it has to be published on the Google Play Store so that users can avail its services. This page provides information about configuring the Google Play Store account to make DIGIT mobile apps available for easy download.
In order to start the configuration for the Google Play Store, the following would be required:
Sr. No | Email Id | Password |
---|---|---|
Data given in the table is sample data.
Sr. No. | Column Name | Data Type | Data Size | Mandatory | Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Ask the state team/client to create an email account on Gmail.
Ask the client to log in to the google play console here and make the required payment so that further tasks could be processed.
Ask the client to share the email id and password in the template.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
Key configurations at the state level include -
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
The content of the pages within this document is designed to help implementation parties and end-users provide the required data with minimal interaction and iterations, and to ensure the quality, consistency and shape of the data needed to configure the system.
This page is intended to help stakeholders as given below on data gathering activities.
State Team
eGov Onsite Team/ Implementation Team
ULB Team (Nodal and DEO)
Implementation Partners
The artefacts of this document are the data template of a configurable entity and a page defining the entity template and explaining how to fill the template with the required data.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
An email account of the client/state team has to be set up in order to receive/send the email notifications.
In order to achieve this functionality, an email account has to be set up on their server, since most states would refrain from creating an account with Gmail or another public server. Further, this email account has to be integrated with the various DIGIT modules.
In order to achieve the above functionality, we require the below-mentioned details
Sr. No. | Email ID | Your Name | Account Type | Incoming Mail Server | Outgoing Mail Server (SMTP) | Password | Incoming Server POP3 Port | Outgoing server SMTP Port | Encrypted Connection Type | Days after which the email should be removed from the server |
---|---|---|---|---|---|---|---|---|---|---|
The values mentioned here are sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|---|---|---|---|---|
Below steps could be followed in order to fill the template:
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Ask the state to gather all the data related to the technical configuration from the email server settings.
Get the attached template filled in by the state; sample data is provided in the data table section for reference.
The data would be available in the POP and IMAP account settings at the server level.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Not Applicable
The SMS service is a way of communicating necessary information/updates to the users on their various transactions on DIGIT applications.
In order to update the users, there are certain notification parameters that are system configured for various steps in the application process. These configurations can be changed/reconfigured based upon the ULB requirements.
We have the below-mentioned parameters which we use for configuration:
Sr. No. | Parameter | Value |
---|---|---|
The data given in the above table is sample data. The parameters and their values are specific to the SMS service provider and may vary accordingly.
For integrating the SMS service, the vendor more or less guides us on the steps to be followed; a few basic steps and the generic data definitions are mentioned below.
Below mentioned are the descriptions of the parameters which are needed for configuration:
Sr. No. | Column Name | Data Type | Data Size | Mandatory | Description |
---|---|---|---|---|---|
Parameter names could differ from vendor to vendor.
The SMS service is a vendor-delivered service, for which the below steps have to be followed:
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
The SMS vendor has to provide the data in the data template attached.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
This is the next step after collating all the boundary hierarchies which are being used in the state. In a hierarchy, there are certain types of boundary classification, and across the levels there is a mapping, which we can define as a parent-child mapping, to link the levels of the classification.
For example, a hierarchy could be:
Administration Hierarchy: City/ULB → Zone → Ward → Locality
In the above-mentioned hierarchy, a City/ULB is divided into zones, zones into wards, and wards into localities.
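For illustration, the collected data for such a hierarchy is eventually represented as a nested boundary tree in the MDMS boundary data; a minimal sketch is given below. The structure (TenantBoundary, hierarchyType, children) follows the usual boundary-data layout, but the tenant, codes, and names are placeholders.

```json
{
  "tenantId": "uk.citya",
  "moduleName": "egov-location",
  "TenantBoundary": [
    {
      "hierarchyType": { "code": "ADM", "name": "Admin" },
      "boundary": {
        "code": "CITYA01", "name": "City A", "label": "City",
        "children": [
          {
            "code": "Z1", "name": "Zone 1", "label": "Zone",
            "children": [
              {
                "code": "W1", "name": "Ward no.1", "label": "Ward",
                "children": [
                  { "code": "L1", "name": "Locality 1", "label": "Locality", "children": [] }
                ]
              }
            ]
          }
        ]
      }
    }
  ]
}
```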
Data has to be collected for every boundary hierarchy type and boundary type with a mapping between the boundary code and its parent boundary code. Following is the table which is to be used across all the hierarchy types.
Sr. No. | Boundary Code* | Boundary Name* (In English) | Boundary Name* ( In Local Language) | Parent Boundary Code* | Boundary Type* | Hierarchy Type Code* |
---|
Data given in the table is a sample data.
Following is the definition of the data columns which are being used in the template:
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|
Following are the steps which should be used to fill the template:
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
After Identifying all the boundary hierarchy, get the sub-classification of all the hierarchies.
Figure out the codes for all the sub-classification for a particular city/ULB.
Start filling the template from the top of the hierarchy in a drill-down approach.
A parent-child mapping code has to be created for every boundary level except for the top level.
Follow the steps until you reach the last sub-classification.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
The departments are defined as different sections within the ULB based on which the functions performed by ULBs and employees in ULB are grouped. The budget details of the ULBs are also defined by the department. It is suggested that the ULBs across the state adopt the same department naming terminology. This document will help you in filling the department details in the template provided.
Sr. No. | Department Code* | Department name (In English)* | Department Name (In Local Language)* |
---|
Data given in the table is a sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|
Download the data template attached to this page.
Have it open and go through all the headers and understand the meaning given in this document under section 'Data Definition'.
Make sure all the headers, its data type, field size and its definition/ description are understood properly.
In case of any doubt, please reach out to the person who has shared this document with you to discuss the same and clear out the doubts.
Identify all the departments in the ULB well before starting to fill them into the template.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
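Once collected, this department data is typically loaded into MDMS as a common master; a minimal sketch is given below, using sample codes from the data table. The file location (common-masters) and field names are assumptions following the usual MDMS pattern.

```json
{
  "tenantId": "uk",
  "moduleName": "common-masters",
  "Department": [
    { "code": "ACC", "name": "Accounts", "active": true },
    { "code": "REV", "name": "Revenue", "active": true }
  ]
}
```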
The checklist is a set of activities to be performed after the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity. There is no entity-specific checklist applicable for this entity.
DIGIT has modules which require the user to pay for the service that he/she is availing, for example property tax, trade license etc. In order to achieve this functionality, we have a common payment gateway developed which acts as a liaison between DIGIT apps and external payment gateways (which depend on the client requirements).
This module facilitates payments and lookup of transaction status.
Following are the details required from the payment gateway vendor in order to configure the payment gateway:
Sr. No | Integration Kit | API Documentation | Redirect Working Key | Merchant Id | Test credential of Debit Card/ Net banking |
---|
Data given in the table is a sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|
The payment gateway is a vendor-oriented service that is integrated with different modules in order to facilitate the transactions. Below mentioned are the steps which are followed:
The client has to finalize a payment gateway vendor (for example PAYU, Paytm, HDFC, AXIS etc.) depending upon the requirements.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
The vendor then provides the details/documents mentioned in the template.
These details are to be received separately for both production as well as UAT.
Get the IP address for UAT and Production environments whitelisted from the vendor.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
A Point of Sale (POS) machine is a machine that helps in handling transaction processing. This machine accepts and verifies the payments which are made by citizens for availing the services of DIGIT.
POS payments are facilitated by a middleware app developed to verify the payment process between the DIGIT module and the payment.
In this case, no data is required from the state team.
Not applicable.
Not applicable.
Not applicable.
Not applicable.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
The domain name is the address through which the internet users can access the website rather than entering the whole IP address in the search bar of the browser.
This domain name is ideally chosen by the state/client since it's a product which has to be used for/by them.
Following is the table through which the information can be shared.
Sr. No. | Domain Name | EXTERNAL-IP |
---|
Data given in the table is a sample data.
Since all state governments/clients prefer to host the websites on their servers, this activity is ideally done by them.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|
Following are the steps which are to be followed:
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
If the state agrees to host the website on their server, provide them with the 2 columns mentioned in the attached template.
If the state does not agree to host it on their server, then a domain name has to be purchased from an external vendor and the EXTERNAL-IP address has to be mapped to it.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity:
Description | Link |
---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Description | Link |
---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Description | Link |
---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Description | Link |
---|---|
Description | Link |
---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Description | Link |
---|---|
Description | Link |
---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Activity | Example |
---|
Sr. No. | Activity | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
To see the common checklist refer to the page consisting of all the activities which are to be followed to ensure complete and quality data.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sample Master file
Sample Master configuration
Title | Description |
---|---|
tenantId | Serves as a key |
moduleName | Name of the module to which the master data belongs |
MasterName | The Master Name will be substituted by the actual name of the master data. The array succeeding it will contain the actual data. |
Reference Doc Link 1 | MDMS-Service |
Reference Doc Link 2 | MDMS-Rewritten |
API Contract Reference | API Contract Reference |
1.
*******
1 | Email Id | Alphanumeric | NA | Yes | Gmail account id through which the app would be published on the Google Play Store |
2 | Password | Alphanumeric | NA | Yes | Password for the Gmail account |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Make sure that the email account is created on Gmail since the Play Store works on Google accounts only | - |
2 | Email Id and Password are required in order to log in to the Google Play Store for configuration | - |
1
Bihar
POP3
SMTP
SMTP
****
192.172.82.12
192.172.82.12
Auto
14
1 | Email ID | Alphanumeric | N/A | Yes | Email id which is being configured |
2 | Your Name | Text | 256 | Yes | The name on behalf of which the email would be sent in order to receive the updates |
3 | Account Type | Alphanumeric | 64 | Yes | The type of email account protocol which will be used to download messages |
4 | Incoming Mail Server | Numeric | (12,2) | Yes | The IP address of the email server through which messages would be received |
5 | Outgoing Mail Server (SMTP) | Numeric | (12,2) | Yes | The IP address of the email server through which messages would be sent |
6 | Password | Alphanumeric | 64 | Yes | The password of the email server |
7 | Incoming Server POP3 Port | Numeric | (12,2) | Yes | The port number through which the emails are received |
8 | Outgoing Server SMTP Port | Numeric | (12,2) | Yes | The port number through which the emails are to be sent |
9 | Encrypted Connection Type | Alphanumeric | 64 | Yes | The encryption type which is used for the connection |
10 | Days after which the email should be removed from the server | Numeric | (12,2) | Yes | The number of days after which the email should be deleted from the server (not from the local device) |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | sms.provider.url | www.xyz.com |
2 | sms.username.parameter | mnsbihar@001 |
3 | sms.username.value | *** |
1 | Parameter | Alphanumeric | 64 | Yes | The parameter required to be configured |
2 | Value | Alphanumeric | 64 | Yes | The corresponding value of the parameter |
1 | Make sure that each and every point in this reference list has been taken care of. |
1 | Make sure that the vendor supports multiple language functionality, especially the local language of the state. | - |
1 | Every boundary type of data should be filled separately | - |
1 | ACC | Accounts | लेखा |
2 | PHS | Public Health And Sanitation | सार्वजनिक स्वास्थ्य और स्वच्छता |
3 | REV | Revenue | राजस्व |
4 | TP | Town Planning | नगर नियोजन |
1 | Department Code* | Alphanumeric | 64 | Yes | Unique code for the department to identify a department |
2 | Department Name ( In English)* | Text | 256 | Yes | The name of the department in the ULB in English |
3 | Department Name (In Local Language)* | Text | 256 | Yes | The name of the department working in the ULB in local language e.g. Telugu, Hindi etc. whichever is applicable |
1. | File Name | File Name | XYZ#123 | UDDUK | File Name |
1 | Integration Kit | Document | NA | Yes | This is a document that is sent by the vendor which contains information on how to integrate the service |
2 | API Documentation | Document | NA | Yes | This is a separate document which is sent by the vendor and which ideally helps us to retrieve the transaction status |
3 | Redirect Working Key | Alphanumeric | 64 | Yes | The working key is provided by the vendor for the generation of the redirection URL |
4 | Merchant Id | Alphanumeric | 64 | Yes | Merchant id provided by the vendor |
5 | Test credential of Debit Card/ Net Banking | Document | NA | Yes | These are the details of the debit/credit card or net banking credentials which would help us test the gateway This contains the card number/Code/Account number etc. |
1 | While finalizing a payment gateway vendor make sure that the vendor should support transactions into multiple bank accounts based on the key( which would be tenantid) | - |
2 | Do get the details for both the environments separately i.e UAT and Production | - |
1. | No mistake should be made in providing the EXTERNAL-IP address | - |
2. | Only one domain name and its corresponding IP address have to be provided | - |
tenant json file
content
Sample Master config file
Sample Module folder
State Level Common-Master Data
State Level Module Specific Common-Master Data
ULB Specific Data
1 | W1 | Ward no.1 | वार्ड नंबर 1 | Z1 | Ward | ADM |
2 | W2 | Ward no.2 | वार्ड नंबर 2 | Z1 | Ward | ADM |
3 | W3 | Ward no.3 | वार्ड नंबर 3 | Z2 | Ward | ADM |
4 | W4 | Ward no.4 | वार्ड नंबर 4 | Z3 | Ward | ADM |
1 | Boundary Code | Alphanumeric | 64 | Yes | This is a code for the sub-classification for a particular boundary. Should be unique across all boundaries defined |
2 | Boundary Name (In English) | Text | 256 | Yes | The name of the boundary that is being defined in the English language |
3 | Boundary Name (In Local Language) | Text | 256 | Yes | The name of the boundary that is being defined in the local language of the state e.g. Telugu, Hindi etc. |
4 | Parent Boundary Code | Alphanumeric | 64 | Yes | This is the boundary code of the parent which identifies to which parent the child belongs to |
5 | Boundary Type | Text | 256 | Yes | The name of the boundary type i.e. Ward, Zone etc. |
6 | Hierarchy Type Code | Alphanumeric | 64 | Yes | The code of the boundary hierarchy type for which this particular boundary is defined |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Make sure that each and every point in this reference list has been taken care of |
192.78.98.12 |
Domain Name | Alphanumeric | 253 | Yes | The name/address of the website being used to access the website/ module |
EXTERNAL-IP | Alphanumeric | 32 | Yes | It is the IP address that has to be mapped to the domain name |
1 | Make sure that each and every point in this reference list has been taken care of. |
At times, in the different modules, there is a need to capture the address of the user's place of residence or of the place where the person is doing a trade, for which the user has to enter his/her full address, which is a tedious task. In order to simplify the process, we can have the Google Maps geolocation service in place, which would help us get the exact coordinates of the place on the map and help us identify the place.
This service is paid and the client has to purchase the below items:
Google Map API's
https://developers.google.com/maps/documentation/javascript/get-api-key - The "Maps JavaScript API", "Places API" and "Geolocation API" are needed. The first $200 of usage is free; once this is exceeded, the price per 1000 requests is as given below.
Maps JavaScript API (web-client) Return the location and accuracy radius of a device, based on Wi-Fi or cell towers. $5
Geolocation API Return the location and accuracy radius of a device, based on Wi-Fi or cell towers. $5
Places API for Web (web-server) Turn a phone number, address, or name into a place, and provide its name and address. $17
Sr. No | Google API URL* | API Key* |
---|---|---|
Note:
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|---|---|---|---|---|
The data provided is sample data
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Ask the client to purchase the APIs mentioned above in the Introduction section.
Get the details for the API URL and key from the client.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Not Applicable
ULB level setup involves the configuration of ULB specific data parameters such as ULB boundaries, ULB bank accounts, and hierarchy details.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
A ULB is divided into certain categories of boundaries by ULB administrative authorities in order to carry out ULB’s functions better. A ULB/City could be divided by a different set of delimitation of boundaries based on functions as given below.
Revenue - Delimitation of ULB into boundaries to perform the target setting and collection of revenue.
Administration - Delimitation of ULB into boundaries for the better administration of ULB.
Locality/ Location - Delimitation of ULB into boundaries based on the places known to citizen with names and easily identifiable by the common person.
All these authorities have designated certain levels of boundary classification for a certain ULB.
The below-mentioned table is used to collect data for the types of hierarchy being followed:
Sr. No. | Code* | Boundary Hierarchy Type* | Description |
---|---|---|---|
The above-mentioned data for the boundary hierarchy is sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Identify all the types of boundaries which are being used in the state in order to carry out various administrative/revenue functions.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Then fill up the hierarchy types and the codes in the respective columns in the template.
Code should be created for the type of boundary being classified.
A brief description of the boundary hierarchy type would be helpful.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
A designation is an act of pointing someone out with a name, a title or an assignment. For example, someone being named president of an organization. This document is to help to gather various designations data which are generally used in ULBs.
Sr. No. | Designation Code* | Designation Name* (In English ) | Designation Name* (In Local Language) |
---|---|---|---|
Data given in the table is a sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Have it open and go through all the headers and understand the meaning given in this document under section 'Data Definition'.
Make sure all the headers, its data type, field size and its definition/ description are understood properly.
In case of any doubt, please reach out to the person who has shared this document with you to discuss the same and clear out the doubts.
Identify all the designations that exist in the ULB; refer to the government's gazette to define the designations in ULBs.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed after the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
To see the common checklist refer to the Checklist page consisting of all the activities which are to be followed to ensure complete and quality data.
This checklist covers the activities which are specific to the entity. There is no entity-specific checklist applicable for this entity.
Localization is the practice of translating various UI-visible data into local wording according to the client's requirements. Localization is done for various clients so that it becomes easier for the people using the service to understand the common terminology and make the best use of the available system.
The following texts (but not limited to) on the web page can be localized:
Labels
Messages: Alert messages, success messages, validation messages and other notifications etc.
Help Texts
The module-specific master data would already have been made available in the localized form while collecting the data for the respective module-specific configuration.
Sr. No. | Code* | Module* | Message (In English)* | Message (In Local Language)* |
---|---|---|---|---|
Data mentioned in the data table is a sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Present to the client the full sheet of codes as well as the English text for which the localized texts are required.
Ask the client to fill the localized text in the last column, which is the Message (In Local Language) column (see the sketch after these steps).
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
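For illustration, each filled row of this template translates into localization message entries, one per locale; a sketch using a sample row from the data table is given below. The technical module name (rainmaker-tl) and the locale codes are assumptions, not part of this template.

```json
{
  "tenantId": "uk",
  "messages": [
    { "code": "ACTION_TEXT_APPLICATION", "module": "rainmaker-tl", "locale": "en_IN", "message": "Search Trade Licenses" },
    { "code": "ACTION_TEXT_APPLICATION", "module": "rainmaker-tl", "locale": "hi_IN", "message": "व्यापार लाइसेंस खोजें" }
  ]
}
```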
This checklist covers all the activities which are common across the entities.
Not Applicable
It is a ULB bank account which is operative, at a minimum, to receive or deposit the day-to-day revenue collection of the ULB. It is used by the online payment integrator to disburse into the ULB's account the amounts which have been collected through a payment gateway into a pool account managed by the payment gateway.
Below given data table represents the excel template attached. Data given in the table is a sample data.
Sr. No. | Code* | ULB Name* | Bank Name* | Branch Name* | Account Number* | Account Type* | IFSC* |
---|
Data given in the table is sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory | Definition/ Description |
---|
Download the data template attached to this page.
Have it open and go through all the headers and understand their meaning by referring to the 'Data Definition' section.
Make sure all the headers, its data type, field size and its definition/ description is understood properly. In case of any doubt, please reach out to the person who has shared this document with you to discuss the same and clear out the doubts.
Identify the bank account which is to be used to transfer the amount which is collected online for various services.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every checklist point/ activity mentioned in the checklist.
The checklist is a set of activities to be performed after the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
Human Resource Management System (HRMS) is a key module, a combination of systems and processes that connect human resource management and information technology through HR software. The HRMS module can be used for candidate recruiting, payroll management, leave approval, succession planning, attendance tracking, career progression, performance reviews, and the overall maintenance of employee information within an organization.
HRMS module enables users to -
Create User Roles
Create System Users
Employee Information Report
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
A ULB portal is a specially designed website for a ULB that serves as the single point of access for information. It can also be considered a library of personalized and categorized content. A ULB web portal helps in search navigation, personalization, notification and information integration, and often provides features like task management, collaboration, and business intelligence and application integration.
This section describes the template; the table given below represents the template. The full template to be filled with the portal content is attached to this page in the attachments section.
Sr. No. | Section Name | Section Content |
---|
Data given in the table is a sample data.
This section consists of information about the meaning of each and every section in the template, and then how to fill the template in a few easy steps.
The below table consists of the standard sections of any portal. Additional sections, as required, will have to be captured as part of customization.
Sr. No. | Section Name | Data Type | Data Size | Is Mandatory? | Description/ Definition |
---|
Download the data template attached to this page.
Have it open and go through all the headers and understand the meaning given in this document under section 'Data Definition'.
Make sure all the headers, its data type, field size and its definition/ description are understood properly.
In case of any doubt, please reach out to the person who has shared this document with you to discuss the same and clear out the doubts.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers all the activities which are specific to the entity.
This is the 3rd step that comes after the boundary data collection. Cross-hierarchy mapping happens in case a child has a relationship with more than one parent. This relationship between the child and parents could exist across different hierarchies as well.
For example: In Admin level boundary hierarchy a mohalla M1(child) could be a part of 2 Wards(parent) W1 and W2. In such a case a single Mohalla(child) has to be mapped to 2 Wards(parent).
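Conceptually, each such row in the template captures one child boundary with multiple parents. A purely illustrative sketch (not an actual DIGIT data format) of the relationship described in the example above:

```json
{
  "boundaryCode": "M1",
  "boundaryType": "Mohalla",
  "parentBoundaryCodes": ["W1", "W2"],
  "parentBoundaryType": "Ward"
}
```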
Below is the data table for the Boundary:
Hierarchy Type | Hierarchy Type 1* | Hierarchy Type 2* |
---|
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
First, identify all the child levels which have a relation with more than one parent boundary type, along with their hierarchy types.
Fill up the boundary hierarchy (names/ codes) types in place of boundary type 1/2.
Then along with the codes start filling in one by one with the proper mapping between every child and parent.
The Sr. No should be in an incremental order for every new child level.
Prepare a new table for every different parent-child relation.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity. There is no entity-specific checklist activity applicable here.
State Portal is a website for the state. Any content or information which is displayed on this site needs to be provided by the State.
This document is to define a template to collect the portal content and information. And to help in filling up the content into the template.
This section talks about the template; the table given below represents the template. The full template to be filled with the portal content is attached to this page in the attachments section.
This section consists of information about the meaning of each and every section in the template, and then how to fill the template in a few easy steps.
The below table consists of the standard sections of any portal. Additional sections, as required, will have to be captured as part of customization.
Download the data template attached to this page.
Have it open and go through all the headers and understand the meaning given in this document under section 'Data Definition'.
Make sure all the headers, its data type, field size and its definition/ description are understood properly.
In case of any doubt, please reach out to the person who has shared this document with you to discuss the same and clear out the doubts.
Start filling the data starting from serial no. and complete a record at once. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
States and ULBs can configure their web portal to deploy the DIGIT portal effectively. State-level and ULB level web portal configuration details are covered in this section.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
To see the common checklist refer to the page consisting of all the activities which are to be followed to ensure complete and quality data.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
1458-ASD785-987722
1. | Google API URL | Alphanumeric | 64 | Yes | The URL of the API that is being purchased |
2. | API Key | Alphanumeric | 64 | Yes | The key which Google would provide after the purchase of the API has been done |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | ADM | Administration | Administration level boundary classified on the basis of administrative functions such as scrutinizing certain rules and regulations |
2 | REV | Revenue | Revenue-based classification of a ULB is done on the basis of revenue collection |
3 | LOC | Locality | Location-based classification could be done in order to identify a certain place. For example, the locality of a citizen's house could follow the hierarchy: House no. → Mohalla → Area → Ward → City |
1 | Code | Alphabet | 64 | Yes | Code is used to identify a certain classification of the type of boundary hierarchy |
2 | Boundary Hierarchy Type | Alphanumeric | 256 | Yes | The meaningful name to define one group of boundaries defined to perform one function |
3 | Description | Alphanumeric | 256 | Yes | A brief description of the boundary hierarchy |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Make sure that the hierarchy types are uniform across all the ULBs/cities in the state | - |
2 | Only 3 types of boundary hierarchies are allowed | - |
1 | ACT | Accountant | अकाउंटेंट |
2 | AO | Accounts Officer | लेखा अधिकारी |
3 | AC | Additional Commissioner | अपर आयुक्त |
1 | Designation Code | Alphanumeric | 64 | Yes | Unique identifier for designation which is used as a reference for child configuration mapping |
2 | Designation Name (In English) | Text | 256 | Yes | Designation name in English |
3 | Designation Name (In Local Language) | Text | 256 | Yes | Designation name in the local language, e.g. Hindi, Telugu etc., whichever is applicable |
1 | ACTION_TEXT_APPLICATION | Trade License | Search Trade Licenses | व्यापार लाइसेंस खोजें |
2 | ACTION_TEST_TL_REPORTS | Trade License | Trade License Reports | ट्रेड लाइसेंस रिपोर्ट |
3 | CORE_COMMON_CITY | Property Tax | City | शहर |
1 | Code | Alphanumeric | 64 | Yes | The code for which the localized language is to be provided |
2 | Module | Alphanumeric | 64 | Yes | The module to which the code belongs |
3 | Message (In English) | Text | 256 | Yes | The English text that is displayed on the UI |
4 | Message (In Local Language) | Text | 256 | Yes | The text in the local language that the client wants to be displayed |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | dehradun | Dehradun Municipal Corporation | SBI | Rajpur | XXXX0082XX01 | Saving | SBIX0921 |
2 | haridwar | Haridwar Municipal Corporation | PNB | Chauk | XXXX9820XX9 | Saving | PNBX8320 |
1 | Code | Alphanumeric | 64 | Yes | Unique code is given to the bank detail record e.g. dehradun |
2 | ULB Name | Text | 256 | Yes | Name of Urban Local Body |
3 | Bank Name | Text | 256 | Yes | Name of the bank where the account exists |
4 | Branch Name | Text | 256 | Yes | Name of the bank branch where the account exists |
5 | Account Number | Alphanumeric | 64 | Yes | Bank account number to be used to transfer the amount |
6 | Account Type | Text | 256 | Yes | Account type. e.g. Saving, Current etc. |
7 | IFSC | Alphanumeric | 64 | Yes | IFS code of the branch as per RBI guidelines |
Sr. No. | Activity | Example |
1 | Code should not consist of any special characters | E.g. dehradun is allowed but dehradun@1 is not allowed |
2 | The account number should not consist of any special characters. | As issued by the bank |
1 | City Introduction | Kesariya Stupa is a Buddhist stupa in Kesariya, located at a distance of 110 kilometres (68 mi) from Patna, in the Champaran (east) district of Bihar, India. Kesaria Stupa has a circumference of almost 1,400 feet (430 m) and raises to a height of about 104 feet (32 m). |
2 | Mayor’s Message | It is with immense gratitude to the citizens of Kesaria for reposing their faith in me to serve them as Chairman of Kesaria Nagar Panchayat that I write this message. I shall endeavour to prove that they have made the right choice. |
. . . . |
22 | Contact Us | All details of the contact person should be added under this section. |
1 | ULB Logo | Document | N/A | Yes | Logo of resolution: 80 * 80 pixels of the ULB to be shown on the top of the website. |
2 | Slider Images | Document | N/A | Yes | Slider images of resolution 1280 * 450 pixels to be shown on the website. |
3 | City Introduction | Text | N/A | Yes | This section talks about the city hence introducing the city to be filled here to display it to the final audience/traffic onto the portal |
4 | City Map | Document | N/A | Yes | This section will have a map for the city mainly the area which the municipality/ panchayat takes care of and which indicates ULB boundary |
5 | Public Utility Services | Template | N/A | Yes | This section should include the infrastructure services provided to the citizen. E.g. Public Toilet, Govt School, Temples managed by Municipal Corporations/ Nagar Palika/ Panchayat etc. |
6 | Tourist Locations | Template | N/A | Yes | All tourist places in the city should be captured under this section. Tourist locations with pictures and other relevant information should be captured here |
7 | Mayor’s Message | Template | N/A | Yes | Message from the ULB chairman needs to be updated under this section |
8 | Commissioner’s Message | Template | N/A | Yes | Message from the ULB’s EO/commissioner needs to be updated under this section |
9 | ULB News | Template | N/A | Yes | Current news about the ULB will be added under this section |
10 | ULB Events | Template | N/A | Yes | Under this section, we will add the ongoing and upcoming events of the ULB |
11 | Recruitment Listing | Template | N/A | Yes | Recruitment listings/vacancies within the ULB need to be mentioned in this section |
12 | Projects Info | Template | N/A | Yes | The description of the govt. projects which the ULB takes care of needs to be updated here with all other relevant details |
13 | Recent Announcements | Template | N/A | Yes | Any announcements with a title and description which are in the public interest need to be uploaded under this section |
14 | Home screen flash Announcement | Template | N/A | Yes | Any announcements with a title, description and highlighted link which are in the public interest can be added under this section |
15 | Public Notice | Template | N/A | Yes | The notices announced by the ULB for the citizens, with description, rules and regulations, and timelines |
16 | Government Resolutions | Template | N/A | Yes | Directions, resolutions, and other legal instruction and acts issued by the department should be captured here |
17 | RTI listing | Template | N/A | Yes | All the RTI received by the ULBs shall be listed under this section |
18 | Help Documents for Online Services | Template | N/A | Yes | Under this section, we will add the Document or link with the title of online services for citizens |
19 | Required documents list for Online Services | Template | N/A | Yes | This section tells us about the list of required documents and data like old receipts or old transaction no for each service |
20 | Forms for services | Template | N/A | Yes | For services that are not online, offline forms can be uploaded here for users to download |
21 | Tender Listing | Template | N/A | No | All the tenders issued by the ULB need to be added under this section |
22 | Contact Us | Template | N/A | Yes | All details of the contact person should be added under this section |
1 | All the sections with data type ‘Template’, data to be filled into the section-wise template provided as an attachment | NA |
Sr. No | Section Name | Section Content |
1 | Government Logo |
2 | Chief Minister Message |
. . . . |
20 | About Website |
Sr. No. | Section Name | Data Type | Data Size | Is Mandatory? | Description / Definition |
1 | Government Logo | Document | N/A | Yes | Resolution: 80 * 80 pixels Logo of the state to be updated on the website |
2 | Governor’s Message | Template | N/A | Yes | Message from the governor of the state to the citizens needs to be updated under this section |
3 | Chief Minister Message | Template | N/A | Yes | Message from the chief minister needs to be updated under this section |
4 | State News | Template | N/A | Yes | Current news about the state will be added under this section |
5 | State Events | Template | N/A | Yes | Under this section, we will add the Ongoing and Upcoming Events in the state |
6 | Recruitment Listing | Template | N/A | Yes | Recruitment listings/vacancies within the state need to be mentioned in this section |
7 | Tender Listing | Template | N/A | Yes | All the tenders issued by the state government need to be added under this section |
8 | Project Info | Template | N/A | Yes | Information on upcoming or ongoing projects within the state should be added under this section |
9 | Recent Announcement | Template | N/A | Yes | Any announcements by the state government with a title and description which are in the public interest need to be uploaded under this section |
10 | Home Screen Flash Announcement | Template | N/A | Yes | Any announcements by the state government with a title, description and highlighted link which are in the public interest can be added under this section |
11 | Public Notice | Template | N/A | Yes | The notices announced by the state government for the citizens, with description, rules and regulations, and timelines |
12 | Government Resolution | Template | N/A | Yes | Directions, resolutions, and other legal instruction and acts issued by the department should be captured here |
13 | RTI Listing | Template | N/A | Yes | All the RTI received by the state government shall be listed under this section |
14 | Help Document for Online services | Template | N/A | Yes | Under this section, we will add the Document or link with the title of online services for citizens |
15 | Required documents list for Online Services | Template | N/A | Yes | This section tells us about the list of required documents and data like old receipt or old transaction no for each service |
16 | Forms for services | Template | N/A | Yes | For services that are not online, offline forms can be uploaded here for users to download |
17 | Contact Us | Template | N/A | Yes | All details of the contact person should be added under this section |
18 | List of ULBs (links to the ULB sites) | Template | N/A | Yes | Website links of all ULBs within the state should be added under this section |
19 | About Website | Template | N/A | Yes | This section describes the overall content available on the state website |
20 | Tourist Places | Template | N/A | Yes | Under this section, we will add all the tourist places in the state with details and images |
21 | Slider Images | Document | N/A | Yes | Slider images of resolution 1280 * 450 pixels to be shown on the website |
22 | State Map | Document | N/A | Yes | This section will have a map for the State |
1 | All the sections with data type ‘Template’, data to be filled into the section-wise template provided as an attachment | NA |
Q. What if the mandatory field value is not available and not filled?
Mandatory field values must be provided; without them, the template data cannot be accepted.
Q. What if the non-mandatory field value is not available and not filled?
It is fine not to provide the non-mandatory field values. These are the fields which are nice to have.
Q. What if the codes are not readily available for the records?
Codes must be provided. If codes are not readily available, a simple numeric sequence can be used as codes.
Q. What if the definition of column header is not clear?
Contact the person who has shared the template with you.
Q. Can the order of the columns be changed while filling in the data?
The order of columns must remain intact and should not be altered.
Q. What if the entities are supposed to be defined at the state level but can not be defined?
If an entity that is suggested to be defined at the state level does not work at that level, it can be defined at the ULB level. However, it cannot be moved to the state level once configured.
Q. What are the benefits of defining the state level?
The benefits of defining the entity at the state level are given below.
Decision Support System - State-level definition and consolidation of data make data analysis and decision-making easy.
Maintenance of such data is easy and correction can be performed quickly.
Avoids data duplication in the configuration for those values which are most common across the ULBs.
Support standardization of process rules across the ULBs.
1 | Make sure that each and every point in this reference list has been taken care of. |
Sr.No | Boundary Type* | Boundary Code* | Boundary Type* | Boundary Code* |
1 | Ward | W1 | Mohalla | M1 |
Ward | W2 | Mohalla | M1 |
2 | Ward | W3 | Mohalla | M2 |
Ward | W4 | Mohalla | M2 |
1 | Hierarchy Type 1 | Text | 256 | Yes |
2 | Hierarchy Type 2 | Text | 256 | Yes |
3 | Boundary Type | Text | 64 | Yes |
4 | Boundary Code | Alphanumeric | 64 | Yes |
5 | Boundary Type | Text | 64 | Yes |
6 | Boundary Code | Alphanumeric | 64 | Yes |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Make sure that each and every point in this reference list has been taken care of |
Master data templates allow users to configure the key parameters and details required for the effective functioning of the modules. This section offers comprehensive information on how to configure the master data templates for each module.
The individual master data templates for specific modules are available in the Product & Modules section of our docs. Click on the links given below to view the specific module setup details.
Property Tax Master Data Templates
Trade License Master Data Templates
Water Charges Master Data Templates
Sewerage Charges Master Data Templates
mCollect Master Data Templates
Fire NOC Master Data Templates
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Tax is levied by the government in certain brackets, i.e. there are certain components of a tax which sum up to make the final transactable amount. For example, a property tax could have a swachhata tax, fire cess, and certain other components which sum up to make the final amount.
Sr. No. | Code* | Service* | Category* | Name* | Is Debit* | Is Actual Demand* | Order* |
---|---|---|---|---|---|---|---|
Data given in the table is sample data for reference.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Get all the tax heads for a particular module and then proceed to the next module.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Not Applicable
The Billing and Payments module serves the billing requirements of various ULB departments. The module caters to fulfil the demands generated by the revenue collection needs of the business services.
The module enables ULBs to -
Generate bills
Search bills
Update bills
ULB Level
None
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
A user role defines permissions for users to perform a group of tasks. In a default application installation, there are some predefined roles with a predefined set of permissions. Each role is allowed to perform a certain set of tasks; these roles are Super Admin, Trade License Approver, Data Entry Admin, Trade License Document Verifier, etc.
Sr. No. | Code* | Name* | Description |
---|---|---|---|
Data given in the table is sample data for reference.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Open the template and go through all the headers, understanding their meaning by referring to the 'Data Definition' section.
Make sure all the headers, their data type, field size, and definition/description are understood properly. In case of any doubt, please reach out to the person who has shared this document with you to discuss and clear out the doubts.
Identify all the different types of user roles on the basis of the ULB’s functions.
Start filling in the data from serial no. 1, completing one record at a time. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
Bill format can be configured at the module level. A few components on the DIGIT sample bill can be configured at the state level and a few at the ULB level. Components that can be changed at the module level can be categorized as follows:
Important messages: Values can be configured on a module level - state level
Sr. No. | Business | Category | Particulars |
---|---|---|---|
Data given in the table is sample data for reference.
Sr. No. | Column Name | Data Type | Data Size | Mandatory | Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Get information about the bill format followed by state
Classify the components on the bill and place it under any category
Map the particulars under each category with DIGIT sample bill
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Entity Specific Checklist is not required separately.
Tax is levied by the government in certain brackets, i.e. there are certain components of a tax which sum up to make the final transactable amount. For example, a property tax could have a swachhata tax, fire cess, and certain other components which sum up to make the final amount.
Sr. No. | Code* | Service* | Category* | Name* | Is Debit* | Is Actual Demand* | Order* |
---|---|---|---|---|---|---|---|
Data given in the table is sample data for reference.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Get all the tax heads for a particular module and then proceed to the next module.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
The checklist is a set of activities to be performed once the data is filled into a template to ensure that the data type, size, and format of the data are as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Not Applicable
A system user is a person who uses the application service. A user often has a user account and is identified to the system by a username. A user is a person who accesses a particular application to perform a set of actions.
Each user has a certain set of tasks to perform. A user is allowed to perform a task by being assigned particular roles, such as Super Admin, Trade License Approver, Data Entry Admin, Trade License Document Verifier, etc.
Sl No. | Name* | Mobile No* | Father/Husband's Name * | Gender * | Date of Birth* | Correspondence Address * | ULB* | Role* | Employment Type * | Current Assignment | Status * | Hierarchy * | Boundary Type * | Boundary * | Assigned from Date* | Department* | Designation* | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Data given in the table is sample data for reference.
Sr No | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Open the template and go through all the headers, understanding their meaning by referring to the 'Data Definition' section.
Make sure all the headers, their data type, field size, and definition/description are understood properly. In case of any doubt, please reach out to the person who has shared this document with you to discuss and clear out the doubts.
Start filling in the data from serial no. 1, completing one record at a time. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity.
The workflow process is a set of steps through which information flows in sequence; workflow roles, which derive the actors, are assigned to a step to complete the work defined for that level. The state of each level is derived based on the information received from the previous step.
Sr. No. | Current State | Workflow Actions | Next State | Role Name | SLA |
---|---|---|---|---|---|
Data given in the above table is sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Open the template and go through all the headers, understanding their meaning by referring to the 'Data Definition' section.
Make sure all the headers, their data type, field size, and definition/description are understood properly. In case of any doubt, please reach out to the person who has shared this document with you to discuss and clear out the doubts.
Start filling in the data from serial no. 1, completing one record at a time. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Please discuss with a relevant department head before finalizing the workflow.
A workflow action is an activity performed by a workflow user on a service request/application during the workflow. All workflow actions are predefined and perform a well-defined job when executed.
By nature, actions are not configurable; only the localization of actions is permissible as a configuration.
S. No. | Action | Description | Module(s) |
---|---|---|---|
Actions are standard and are not configurable; hence the template, data definition, and standard procedure to fill the template are not needed. This page is created to provide information that helps in defining the workflow process.
Not applicable
Not applicable
Not applicable
Not applicable
The Decision Support System in DIGIT platform can be configured to provide customized insights and statistics on the dashboard. This section offers information on how to configure the DSS parameters for maximized efficiency.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Workflow levels are defined for a service with rights/roles to perform a set of workflow actions. There can be one or more levels involved in a workflow process. This page helps to understand and then define all the levels with their job descriptions and fill them into a standard template.
Sr. No. | Module | Service | Workflow Level | Task | Role |
---|---|---|---|---|---|
Data given in the above table is sample data.
Sr. No. | Column Name | Data Type | Data Size | Is Mandatory? | Definition/ Description |
---|---|---|---|---|---|
Download the data template attached to this page.
Open the template and go through all the headers, understanding their meaning by referring to the 'Data Definition' section.
Make sure all the headers, their data type, field size, and definition/description are understood properly. In case of any doubt, please reach out to the person who has shared this document with you to discuss and clear out the doubts.
Identify all the different types of services on the basis of the ULB’s functions to create a workflow.
Start filling in the data from serial no. 1, completing one record at a time. Repeat this exercise until the entire data is filled into the template.
Verify the data once again by going through the checklist and taking care of each and every point mentioned in the checklist.
The checklist is a set of activities to be performed once the data is filled into a template to ensure data type, size, and format of data is as per the expectation. These activities have been divided into 2 groups as given below.
This checklist covers all the activities which are common across the entities.
Please discuss with a relevant department head before finalizing the workflow.
The common configuration details required for all modules are available in this section. Refer to the documents required for processing applications across modules, find the guidelines for filling master data templates, and get the answers to common questions.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
This section contains the configuration documents related to the DIGIT service stack.
Click on the respective service link below to find its configuration details and additional information resources.
A workflow process is a series of sequential tasks that are carried out based on user-defined rules or conditions, to execute a business process. It is a collection of data, rules, and tasks that need to be completed to achieve a certain business outcome.
In DIGIT, workflow for a business process is divided into three units out of which two are completely configurable while the remaining is fixed and lays the foundation of the other two.
This is the first unit, which defines the actions and their nature as executed during the workflow process by the workflow actors. It lays the foundation for the other two units and is fixed in nature; only its localization can be configured as per ground needs.
This is the second unit, which defines the number of steps a workflow process may have and then triggers the creation of a role for each step with the appropriate rights to perform a set of actions at that step. It is completely configurable.
This is the third unit which defines the workflow process including the steps, roles with actions and the present, next and previous state of a step/level of the workflow process. It is completely configurable.
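To make these three units concrete, the minimal Java sketch below models a configurable level as a set of state transitions, using states, actions and roles taken from the sample workflow table later in this document. It is an illustration only, not the DIGIT workflow service code; all class and method names are hypothetical.

```java
import java.util.List;
import java.util.Optional;

public class WorkflowSketch {

    // One configurable step of the process: which action a role may take in a
    // given state, and which state the application moves to next.
    record Transition(String currentState, String action, String role, String nextState) {}

    // Illustrative configuration mirroring the sample workflow table on this page.
    static final List<Transition> CONFIG = List.of(
            new Transition("", "Create and Approve", "Assistant", "Approved"),
            new Transition("", "Forward", "Assistant", "Pending for Approval"),
            new Transition("Pending for Approval", "Verify and Approve", "Supervisor", "Approved"),
            new Transition("Pending for Approval", "Send Back to Assistant", "Supervisor", "Rejected for Review"),
            new Transition("Rejected for Review", "Forward", "Assistant", "Pending for Approval"));

    // Resolve the next state only if the role has the right to take the action at this step.
    static Optional<String> nextState(String currentState, String action, String role) {
        return CONFIG.stream()
                .filter(t -> t.currentState().equals(currentState)
                        && t.action().equals(action)
                        && t.role().equals(role))
                .map(Transition::nextState)
                .findFirst();
    }

    public static void main(String[] args) {
        System.out.println(nextState("Pending for Approval", "Verify and Approve", "Supervisor"));
        // Optional[Approved]
        System.out.println(nextState("Pending for Approval", "Verify and Approve", "Assistant"));
        // Optional.empty -- the Assistant role has no such right at this level
    }
}
```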
The checklist is the set of activities to be performed on completion of a task to ensure its completeness and quality.
The data type is an attribute of data, a particular kind of data item, as defined by the values it can take, the computer system used, or the operations that can be performed on it. In order to help to fill in the right kind of data for a data field/ column in Excel, the below-given table has different data types with its description.
S.No. | Data Type | Definition/Description |
---|
Sr. No. | Activity | Example |
---|
This section provides technical details about business service setup, configuration, deployment, and API integration.
Key Performance Indicators (KPIs) are a way of presenting insights from the available data to help key management authorities take important business decisions, improve and enhance business processes, and help people improve their way of functioning. This exercise is largely dependent on the data.
The insight could be shown in various available forms such as line graph, bar graph or a tabular format.
Sr. No. | Column Name | Data Type | Data Size | Mandatory | Description |
---|
Download the data template attached to this page.
Get a good understanding of all the headers in the template sheet, their data type, size, and definitions by referring to the ‘Data Definition’ section of this document.
In case of any doubt, please reach out to the person who has shared this template with you to discuss and clear your doubts.
Present the client with information about various available chart types.
Show the client how the various KPI’s will look on the web page by showing the reference page from the attachments.
After that, gather the information for the various chart types and the details each chart type has to display in the description column.
Verify the data once again by going through the checklist and making sure that each and every point mentioned in the checklist is covered.
This checklist covers all the activities which are common across the entities.
This checklist covers the activities which are specific to the entity:
The type of hierarchy 1 the boundary belongs to, which is to be mapped with other boundaries in hierarchy 2
The type of hierarchy 2 the boundary belongs to, which is to be mapped with other boundaries in hierarchy 1
This is the type of boundary from hierarchy 1
This is the code of the boundary for the boundary from hierarchy 1
This is the type of boundary from hierarchy 2
This is the code of the boundary for the boundary from hierarchy 2
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No | Checklist Parameter | Example |
---|---|---|
Sr. No. | Activity | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No | Checklist Parameter | Example |
---|---|---|
Sr. No. | Activity | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No | Checklist Parameter | Example |
---|---|---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Activity | Example |
---|
S.No. | Activity | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Sr. No. | Checklist Parameter | Example |
---|
Sr. No. | Checklist Parameter | Example |
---|
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
1 | PT_UNIT_PENALTY | PT | Penalty | PT Penalty | FALSE | FALSE | 1 |
2 | PT_UNIT_EXEMPTION | PT | Exemption | PT Exemption | TRUE | TRUE | 2 |
1 | Code | Alphanumeric | 64 | Yes | The code for the tax that is being levied |
2 | Service | Text | 256 | Yes | This is the module or the name of the service for which the tax head is being mentioned |
3 | Category | Text | 256 | Yes | The category to which the tax head belongs such as Penalty, Exemption or Cess |
4 | Name | Text | 256 | Yes | This is the name/description of the tax head |
5 | Is Debit | Text | NA | Yes | In case the tax head is an amount that needs to be added up to the property tax, then this needs to be TRUE else FALSE |
6 | Is Actual Demand | Text | NA | Yes | In case the tax head is an amount that needs to be subtracted from the property tax, then this needs to be TRUE else FALSE |
7 | Order | Integer | 5 | Yes | The order in which the mentioned tax head should appear on the screen |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | TL_APPROVER | TL Approver | Trade License Approver |
2 | GRO | Grievance Routing Officer | Grievance Routing Officer |
3 | CSR | Customer Support Representative | An employee who files and follows up complaints on behalf of the citizen |
1 | Code | Alphanumeric | 64 | Yes | A unique code that identifies the user role name |
2 | Name | Text | 256 | Yes | The name indicates the user role; while creating an employee, a role can be assigned to the individual employee |
3 | Description | Text | 256 | No | A short narration provided for the user role name |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | The Code should be alphanumeric and unique | TL_APPROVER, GRO |
2 | The Name should not contain any special characters | TL Approver : [Allowed]; #TL Approver! : [Not allowed] |
1 | Water Charges | Important messages | 5% rebate to be given on advance payment on the bills |
1 | Category | Text | 64 | Yes | To list out the components on the bill, every particular can be grouped into a category |
2 | Particulars | Alphanumeric | 256 | Yes | Each category can have multiple entries under it, i.e. particulars |
3 | Business | Text | 64 | Yes | The business for which the bill format is to be configured |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | PT_UNIT_PENALTY | PT | Penalty | PT Penalty | FALSE | FALSE | 1 |
2 | PT_UNIT_EXEMPTION | PT | Exemption | PT Exemption | TRUE | TRUE | 2 |
1 | Code | Alphanumeric | 64 | Yes | The code for the tax that is being levied |
2 | Service | Text | 256 | Yes | This is the module or the name of the service for which the tax head is being mentioned |
3 | Category | Text | 256 | Yes | The category to which the tax head belongs such as Penalty, Exemption or Cess |
4 | Name | Text | 256 | Yes | This is the name/description of the tax head |
5 | Is Debit | Text | NA | Yes | In case the tax head is an amount that needs to be added up to the property tax, then this needs to be TRUE else FALSE |
6 | Is Actual Demand | Text | NA | Yes | In case the tax head is an amount that needs to be subtracted from the property tax, then this needs to be TRUE else FALSE |
7 | Order | Integer | 5 | Yes | The order in which the mentioned tax head should appear on the screen |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Pooja | 9999999999 | Mr. Bala Chandra | FEMALE | 22/01/1987 | Nagar Nigam Haldwani-PIN CODE-263139 | Haldwani | Super User | PERMANENT | Yes | EMPLOYED | REVENUE | City | Haldwani | 05/10/2019 | Revenue | Tax Inspector |
2 | M.C. Joshi | 9999999999 | Late Jai Dutt Joshi | MALE | 04/08/1965 | Nagar Nigam Haldwani | Haridwar | TL Counter Employee | PERMANENT | Yes | EMPLOYED | REVENUE | City | Haldwani | 30/10/2019 | Revenue | Tax Collector |
1 | Name | Text | 256 | Yes | The name of the person to whom access to the system is provided, so he/she can use the application to perform the assigned role functions |
2 | Mobile Number | Alphanumeric | 10 | Yes | The mobile number of the person to whom access to the application is provided. The mobile number is relevant so the person can be contacted in an emergency |
3 | Father/Husband's Name | Text | 256 | Yes | The name of the father/husband of the person to whom access to the application is provided. This information is for internal records |
4 | Gender | Text | 64 | Yes | The gender of the individual person. This information is for internal records |
5 | Date of Birth | Date | 10 | Yes | The date of birth of the individual person. This information is for internal records |
6 | Email | Alphanumeric | 256 | No | The email id of the person; this email id is linked to receiving all the official communication from customers and other counterparts |
7 | Correspondence Address | Text | 256 | Yes | The address of the person; this information is saved for internal records |
8 | ULB | Text | 256 | Yes | A ULB to be assigned to the individual employee, so that the assigned role can perform his/her duty within that assigned ULB |
9 | Role | Text | 256 | Yes | A role is a permission for users to perform a group of tasks; a role is assigned to the user to perform a function within the application. A user can be assigned multiple roles. Click User Roles for the role master data |
10 | Employment Type | Text | 256 | Yes | The employment type indicates the type of contract which he/she holds with the organization, i.e. whether he/she is a permanent employee or a contract employee for a short period. One of the employment types “Permanent”, “Temporary”, “DailyWages” or “Contract” should be selected |
11 | Current Assignment | Text | 64 | Yes | The current assignment type indicates whether the employee is currently assigned to a particular department and designation. A user can also be given multiple assignments to perform his/her function |
12 | Status | Text | 256 | Yes | The status indicates whether he/she is employed or not within the organization |
13 | Hierarchy | Text | 256 | Yes | The hierarchy indicates the hierarchy type of the boundary to which he/she is assigned |
14 | Boundary Type | Text | 256 | Yes | The boundary type indicates assigning a city to his/her role within the organization. A user can be assigned multiple boundary types to perform different functions (Example: City, Zone, Block and Locality) |
15 | Boundary | Text | 256 | Yes | The boundary indicates assigning a particular city to his/her role wherein they perform the role function of the application for that particular city. A user can be assigned multiple boundaries to perform in different locations (Example: City Name and Tenant Zone) |
16 | Assigned from Date | Date | 10 | Yes | The assigned-from date indicates the date from which his/her role is assigned to perform the role function assigned |
17 | Department | Text | 256 | Yes | The department indicates the particular department to which his/her role is assigned |
18 | Designation | Text | 256 | Yes | The designation indicates the particular designation assigned to his/her role |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | The Name should not have any special character | Pooja : [Allowed]; #Pooja! : [Not allowed] |
2 | The date should be in DD/MM/YYYY format | DD/MM/YYYY : [Allowed]; YYYY/DD/MM : [Not allowed] |
3 | The Email ID should be a valid Id; the email Id should contain the company/firm name or an individual's personal name before the “@” and “XXXXX.com” after the “@” |
1 | | Create and Approve | Approved | Assistant | NA |
2 | | Forward | Pending for Approval | Assistant | NA |
3 | Pending for Approval | Verify and Approve | Approved | Supervisor | NA |
4 | Pending for Approval | Save | Pending for Approval | Supervisor | NA |
5 | Pending for Approval | Reject | Rejected | Supervisor | NA |
6 | Pending for Approval | Send Back to Assistant | Rejected for Review | Supervisor | NA |
7 | Rejected for Review | Forward | Pending for Approval | Assistant | NA |
8 | Rejected for Review | Cancel | Rejected | Assistant | NA |
1 | Current State | Text | 256 | Yes | The current state indicates the stage at which the workflow process is in progress |
2 | Action | Reference | 64 | Yes | The action indicates the activity that can be performed at the respective stage in the workflow. This refers to Workflow Actions |
3 | Next State | Text | 256 | Yes | The next state is the state to which the workflow gets updated at the respective stage on performing the action (Example: assigning the application for approval from one person to the next person) |
4 | Role Name | Reference | 64 | Yes | The role is the hierarchy of people with designations who are authorized to initiate, approve, or reject the process. It refers to Workflow Levels |
5 | SLA | Integer | 2 | No | The SLA indicates the time-frame within which the action is to be completed |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Initiate | This action will start the application for the citizen and CEMP | Trade Licenses, Property Tax, Building Plan Approval |
2 | Edit | Using this action the application can be opened in editable form and any changes can be performed | Trade Licenses, Property Tax, Building Plan Approval |
3 | Submit | This action will freeze the application from the citizen or CEMP and proceed further in the workflow | Trade Licenses, Property Tax, Building Plan Approval |
4 | Verify and Forward | This action will move the application to the next stage of the workflow process and also assigns tasks to the next user in the workflow (if needed) | Trade Licenses, Property Tax, Building Plan Approval |
5 | Pay | This action will help to pay the application fees | Trade Licenses, Property Tax, Building Plan Approval |
6 | Approve | This action is the last stage of the application workflow, which grants permission for a specific application | Trade Licenses, Property Tax, Building Plan Approval |
7 | Activate connection | This action will create a consumer no. against the application so that demand generation can start | Water and Sewerage Charges |
8 | Reject | This action will reject the application; a rejected application can’t be processed further and the citizen cannot re-apply through it. He/she has to start a new application next time | Trade Licenses, Property Tax, Building Plan Approval |
9 | Send Back | An actor can assign the application back to the previous state if any edits/changes are required | Trade Licenses, Property Tax, Building Plan Approval |
10 | Send Back to Citizen | An actor can assign the application back to the citizen if any edits/changes are required | Trade Licenses, Property Tax, Building Plan Approval |
11 | View | Anyone in the workflow can view the application and task details | Trade Licenses, Property Tax, Building Plan Approval |
12 | Comment | Comments can be recorded before any action is taken which can change the state of the application | Trade Licenses, Property Tax, Building Plan Approval |
13 | Download/Print | Download/print of any artefacts can be configured as per the requirement for application processing | All Modules |
14 | Forward | This action will not create any bill but will forward it to the next level for review and approval | Finance |
15 | Create and Approve | In this action, the user who initiates the action can create and approve the bill (a threshold amount should be set up here) | Finance |
16 | Save | In this action, the approver can save the bill before it is approved or rejected | Finance |
17 | Verify and Approve | This action will help the approver to approve the bill if he/she feels all the information is updated correctly | Finance |
18 | Reject | This action will help the approver to reject the bill if he/she feels the information is not correct and may need further clarification | Finance |
19 | Send back to Assistant | This action sends the bill back with a notification that it was rejected by the approver | Finance |
20 | Cancel | This action will help the approver to cancel the bill if he/she feels that the bill needs to be rejected | Finance |
1 | Finance | Bill Accounting | Level 1 | Create Bill | Accounts Clerk |
2 | Finance | Bill Accounting | Level 2 | Create and Approve | Accounts Clerk |
3 | Finance | Bill Accounting | Level 3 | Forward for Approval | Chief Accountant |
4 | Finance | Bill Accounting | Level 4 | Verify the Bill | Chief Accountant |
5 | Finance | Bill Accounting | Level 5 | Approval | Approver |
1 | Module | Text | 64 | Yes | The module indicates the specific module for which the user is mapped to perform the action |
2 | Service | Text | 64 | Yes | The service indicates the type of process which the user performs in a particular module |
3 | Workflow Level | Integer | 2 | Yes | The workflow level indicates, when a process is executed, the level at which the flow of the process is in progress |
4 | Task | Text | 64 | Yes | The task refers to the state in which the action is in progress during the workflow |
5 | Job Description | Text | 256 | Yes | A short description provided for the role (Example: designation of the role) |
1 | Make sure that each and every point in this reference list has been taken care of |
1 | Alphanumeric | It contains alphabets and numbers, and is generally used to define codes |
2 | Decimal | Floating point number with a fraction value up to 2 decimal places |
3 | Integer | Whole number without having a fraction part in it |
4 | Text | A string of alphabets, numbers, spaces, and symbols |
5 | Date | It represents a date and is captured in the format of ‘DD/MM/YYYY’ |
6 | Reference | It is a code of a record from the referred entity and having a related record in the prevailing entity |
7 | Document | It represents a document which is needed as an attachment with other relevant details in the template |
1 | The entity is to be decided to be defined at the state level and all the ULBs are agreed on the same. | NA |
2 | Data filled into templates should cater to the needs of each and every ULBs. | NA |
3 | Order of headers should remain unchanged in the template while filling the data. | NA |
4 | Value filled into the template doesn’t exceed the given data size limit. | NA |
5 | Codes filled in the template for all the records are unique. It means no 2 records in the template share the same code. | Below records in code, value pair is not acceptable. RES - Residential RES - Non-residential |
6 | All the columns marked with an asterisk must be filled with the values and not even a single record left without a value. | NA |
7 | Reference value in the template must also exist in the referred entity template. A value without being present in the referred entity template is invalid. | NA |
8 | None of the values filled in the template should have a character which is not allowed. | NA |
9 | Mobile Numbers filled into the template must be 10 digit valid mobile numbers without country code. | NA |
10 | Email id filled into template should be a valid email ID. | NA |
11 | Local language values should be Unicode charset only. | NA |
12 | Values of data type alphanumeric consist of the alphabet and numeric values only. All the entity code should follow this. | Allowed - ABC01 Not allowed - ABC#01 |
13 | Values of data type decimal must be a number having fraction part up to 2 places of decimal. | Allowed - 23.87 Not allowed - 12.0982 |
14 | Values of data type integer must be a number which is not a fraction but a whole number. | Allowed - 15 Not allowed - 12.01 |
15 | Values of data type text must be a string of alphabets, numbers, special characters, and spaces. | NA |
16 | Values of data type date must be a date in the format ‘DD/MM/YYYY’. Here DD means day, MM means month and YYYY means years. | Allowed - 31/12/2019 Not allowed - 12/31/2019, 31/12/19, etc. |
17 | Values of data type reference must be a reference to another entity referring to a value in that entity. Only the code of the referred record from the referred entity is provided as a value here. | NA |
18 | Value of the data type document must be a document which is to be provided separately as an attachment while submitting the data along with a filled data template. | NA |
1 | RC | Ration Card | ID Proof, Address Proof | This document is used as proof of identity if presented with photo and address proof. |
2 | AC | Aadhar Card | ID Proof, Address Proof | This document is used as proof of identity as well as address. |
3 | DL | Driving License | ID Proof, Address Proof | This document is used as proof of identity as well as address. |
4 | VC | Voter ID Card | ID Proof, Address Proof | This document is used as proof of identity as well as address. |
5 | PS | Passport | ID Proof, Address Proof | This document is used as proof of identity as well as address. |
6 | AL | Arms License | ID Proof, Address Proof | This document is used as proof of identity as well as address. |
7 | CC | Caste Certificate | ID Proof, Address Proof | This document is used as proof of identity if presented with photo and address proof. |
8 | DC | Domicile Certificate | ID Proof, Address Proof | This document is used as proof of identity if presented with photo and address proof. |
9 | PC | PAN Card | ID Proof | This document is used as proof of identity. |
10 | EB | Electricity Bill | Address Proof | This document is used as proof of address only. |
11 | TB | Telephone Bill | Address Proof | This document is used as proof of address only. |
12 | WB | Water Bill | Address Proof | This document is used as proof of address only. |
13 | RSA | Registered Sale Agreement | Address Proof | This document is used as proof of address only. |
14 | RLA | Registered Lease Agreement | Address Proof | This document is used as proof of address only. |
15 | VRC | Vehicle Registration Certificate | Address Proof | This document is used as proof of address only. |
16 | IAO | Income Tax Assessment Order | Address Proof | This document is used as proof of address only. |
17 | HT | House Tax Slip | Others | These are the documents which are specifically needed to avail a service. |
18 | FL | Food License | Others | These are the documents which are specifically needed to avail a service. |
19 | LL | Liquor Licence | Others | These are the documents which are specifically needed to avail a service. |
20 | GST | GST Registration | Others | These are the documents which are specifically needed to avail a service. |
Sr. No | Module* | KPI Chart Type* | Description* |
1. | PGR | Line Chart | Showing the status of closed complaints over a year month-wise |
Pie Chart | Showing the various type of complaints |
Metric | Showing the rate of different complaint status by percentage in a tabular format |
2. | Property Tax | Horizontal Bar Graph | Showing the various information about property application status month-wise over a year |
1 | Module Name | Text | 256 | Yes | The name of the module for which the KPI chart types have to be defined |
2 | KPI Chart Type | Text | 256 | Yes | The type of chart which has to display information |
3 | Description | Text | 256 | Yes | A brief description of the information that the chart has to display |
Steps to fill Data
Make sure that the chart types are chosen from the list of available chart types from the attachment section | - |
This section contains docs that walk you through the various steps required to configure DIGIT urban services.
1 | Make sure that each and every point in this reference list has been taken care of |
The main objective of the billing module is to generate the bill for all revenue-based business services. To serve the bill, the Billing-Service requires demand. Demands will be prepared by the revenue modules and stored by billing based on which it generates the Bill.
Prior Knowledge of Java/J2EE.
Prior Knowledge of Spring Boot.
Prior Knowledge of KAFKA
Prior Knowledge of REST APIs and related concepts like path parameters, headers, JSON, etc.
Prior knowledge of demand-based systems.
The following services should be up and running:
user
MDMS
Id-Gen
URL-Shortening
notification-sms
eGov billing service creates and maintains demands.
Generates bills based on demands.
Updates the demands from payment when the collection service takes a payment.
Deploy the latest image of the billing service available.
In the MDMS data configuration, the following master data is needed for the functionality of the billing.
MDMS
Business Service JSON
TAX-Head JSON
Tax-Period JSON
Billing service can be integrated with any organization or system that wants a demand-based payment system.
Easy to create and simple process of generating bills from demands
The amalgamation of bills period-wise for a single entity like PT or Water connection.
Amendment of bills in case of legal requirements.
Customers can create a demand using the /demand/_create endpoint.
Organizations or systems can search demands using the /demand/_search endpoint.
Once the demand is raised, the system can call the /demand/_update endpoint to update the demand as needed.
Bills can be generated using /bill/_fetchbill, a self-managing API that generates a new bill only when the old one expires (see the sketch below).
Bills can be searched using /bill/_search.
The amendment facility can be used in case of a legal issue to add values to existing demands using /amendment/_create; /amendment/_update can be used to cancel created amendments or update the workflow if configured.
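A rough client-side sketch of the fetch-bill call is given below. It assumes the service is reachable on a local host and port, that tenantId, consumerCode and businessService are passed as query parameters, and that the body carries a RequestInfo with an auth token; the exact path prefix and field names should be verified against the API list for your environment.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FetchBillExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical host, tenant and consumer code -- replace with real values.
        String url = "http://localhost:8081/bill/_fetchbill"
                + "?tenantId=pb.amritsar&consumerCode=PT-1001&businessService=PT";

        // Minimal RequestInfo body; most DIGIT endpoints expect an authToken here.
        String body = "{\"RequestInfo\": {\"apiId\": \"Rainmaker\", \"authToken\": \"<auth-token>\"}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The response carries the generated (or still-valid) bill for the consumer code.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```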
Interaction Diagram V1.1:
Doc Links
API List
What is apportioning?
Adjusting the receivable amount with the individual tax head.
Types of apportioning V1.1
Default order-based apportioning (based on the apportioning order, adjust the received amount against each tax head). V1.1
Types of apportioning V1.2: (TBD)
Proportionate-based apportioning (Adjust total receivable with all the tax heads equally)
Order & percentage-based apportioning (Adjust total receivable based on order and the percentage which is defined for each tax head).
Principle of apportioning
The basic principle of apportioning holds that if the full amount is paid for any bill then each individual tax head should get nullified with their corresponding adjusted amount.
Example: Case 1: When there are no arrears all tax heads belong to their current purpose.
Example: given below
Case 2: Apportioning with two years of arrear: Example: The apportioning details for the financial year 2014-15 are given below.
The table below illustrates the demand structure generated in case there are no payments for the specified financial year (2015-16).
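As a simplified illustration of the default order-based apportioning described above (not the billing service implementation, and ignoring advances, negative heads and rounding), the sketch below adjusts a paid amount against tax heads in their configured order so that a full payment nullifies every head. The tax head codes used are hypothetical.

```java
import java.math.BigDecimal;
import java.util.LinkedHashMap;
import java.util.Map;

public class OrderBasedApportioning {

    /**
     * Adjusts the paid amount against the tax heads in the given (ordered) map and
     * returns how much was apportioned to each head.
     */
    static Map<String, BigDecimal> apportion(Map<String, BigDecimal> dueByTaxHead, BigDecimal paidAmount) {
        Map<String, BigDecimal> apportioned = new LinkedHashMap<>();
        BigDecimal remaining = paidAmount;
        for (Map.Entry<String, BigDecimal> head : dueByTaxHead.entrySet()) {
            BigDecimal adjusted = remaining.min(head.getValue());
            apportioned.put(head.getKey(), adjusted);
            remaining = remaining.subtract(adjusted);
        }
        return apportioned;
    }

    public static void main(String[] args) {
        // Hypothetical dues in apportioning order: interest, penalty, then the tax itself.
        Map<String, BigDecimal> due = new LinkedHashMap<>();
        due.put("PT_TIME_INTEREST", new BigDecimal("100"));
        due.put("PT_TIME_PENALTY", new BigDecimal("50"));
        due.put("PT_TAX", new BigDecimal("1000"));

        // A full payment of 1150 nullifies every tax head; a partial payment of 120
        // clears the interest first and adjusts the remaining 20 against the penalty.
        System.out.println(apportion(due, new BigDecimal("1150")));
        System.out.println(apportion(due, new BigDecimal("120")));
    }
}
```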
The Collection service serves as a revenue collection platform for all the billing systems through cash, cheque, demand draft, and swipe machines. It enables payment for all services provided by the eGov platform at a single point, for citizens as well as for counter collection in the municipality.
Prior knowledge of Java/J2EE
Prior knowledge of SpringBoot
Prior knowledge of REST APIs and related concepts like path parameters, headers, JSON, etc
Prior knowledge of Kafka and related concepts like Producer, Consumer, Topic, etc.
The following services should be up and running:
egov-localization
egov-mdms
egov-idgen
egov-url-shortening
billing-service
Allows citizens to create a payment.
Allows employees to create the payment for the citizen indirectly.
Provides facilities to capture partial and advance payments based on configs.
Allows payment cancellation to help with scenarios of bad cheques and other failed payments.
Integrates with the billing service for demand back-update of payments.
Deploy the latest version of the collection-services docker builds.
The MDMS data configuration uses the same data updated by Billing-Service
Billing Service | Configuration-Details: Refer to the MDMS data config from here.
Following are the properties in the application.properties
Collection service can be integrated with any organization or system that wants a payment system to keep track of its payments. Organizations can customize part of the application or its functionality based on their requirements.
Easy payments and tracking of payments.
Configurable functionalities according to client requirements.
Customers can create a payment using the /payments/_create endpoint.
Actors on the system can keep track of payments using the /payments/_search endpoint (see the sketch below).
If a payment is made but encounters a technical issue outside the system, it can be cancelled with /payments/_workflow.
For employees to access the payments API, the respective module name should be appended to the payment API path, e.g. /payments/PT/_workflow, where PT refers to the Property Tax module.
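For example, an employee-side search for property tax payments could look like the sketch below; the host, port, path prefix and tenantId are placeholders, and appending the module name to the search path follows the same pattern described above. Check the payment API list for the exact contract.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PaymentSearchExample {
    public static void main(String[] args) throws Exception {
        // Module name (PT) is appended to the path for employee access, as described above.
        String url = "http://localhost:8055/payments/PT/_search?tenantId=pb.amritsar";

        // Most DIGIT search endpoints are POSTs carrying a RequestInfo in the body.
        String body = "{\"RequestInfo\": {\"authToken\": \"<auth-token>\"}}";

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder()
                        .uri(URI.create(url))
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(body))
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body()); // list of payments for the tenant/module
    }
}
```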
Port-forward the collection service in the environment where the IFSC code bank details data is to be migrated. Sample command: kubectl port-forward collection-services-76b775f976-xcbt2 8055:8080 -n egov
Import the Postman collection from the API list that refers to /preexistpayments/_update and run it against the same localhost port forwarded using the above command.
Expected result: in the EGCL_PAYMET table, for records where IFSC code data is present, the bank details in EGCL_PAYMET.ADDITIONALDETAILS are updated.
Example: for IFSC code UCBA0003047, the response from the API https://ifsc.razorpay.com/UCBA0003047 is updated in EGCL_PAYMET.ADDITIONALDETAILS as {"bankDetails": {"UPI": true, "BANK": "UCO Bank", "CITY": "BHIKHI", "IFSC": "UCBA0003047", "IMPS": true, "MICR": "151028452", "NEFT": true, "RTGS": true, "STATE": "PUNJAB", "SWIFT": "", "BRANCH": "BHIKHI", "CENTRE": "MANSA", "ADDRESS": "ADJOINING HP PETROL PUMP MANSA ROADDISTRICT MANSA", "BANKCODE": "UCBA", "DISTRICT": "MANSA", "CONTACT": "+918288822548"}}
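The bank details in the example above come from the public Razorpay IFSC API referenced earlier; a minimal sketch of that lookup step is shown below, with the migration API then merging the returned JSON into EGCL_PAYMET.ADDITIONALDETAILS.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class IfscLookupExample {
    public static void main(String[] args) throws Exception {
        String ifsc = "UCBA0003047"; // sample IFSC code from the example above

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder()
                        .uri(URI.create("https://ifsc.razorpay.com/" + ifsc))
                        .GET()
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        // JSON with BANK, BRANCH, ADDRESS, MICR and NEFT/RTGS/IMPS flags, as shown above.
        System.out.println(response.body());
    }
}
```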
Billing-Collection Integration: refer to the integration document for details and explanation.
The consumer sometimes needs additional amounts (Amendments) added to their bill due to reasons from outside of the system. The addition of amounts happens with respect to the consumer code of the entity in the product(PT, WS, etc..,), any unpaid demand in the system is a candidate for amendments.
Prior knowledge of billing-service in the DIGIT framework.
Amendment mainly works with two types of functionality as follows:
Amendment
Demand
Bill Amendment provides a separate flow that triggers the workflow for validating the process of adding additional amounts to existing demands. This validation was earlier available only to the respective modules. An amendment is allowed only when there is a need to add or reduce the amount from the existing bill belonging to an entity. The reasons for such cases could be:
Court case settlement
One time waiver
Write-offs
DCB correction (old demands in paid status)
Property tax remission
Criteria:
Below are certain prerequisites to creating an amendment,
presence of demand in the billing system
any one of the reasons listed above
valid document proof for the reason
there is no other amendment already in the workflow
Procedure:
The process of adding an amendment in specific scenarios is given below.
There are two scenarios on how an amendment is completed based on the paid status of the existing demands in the system.
1. when demand is unpaid/partially paid
Create a demand (Or an existing demand can be used) with demand detail → DD1.
Do not pay the bill or make a partial payment.
Create an amendment for the same consumer code (with demand detail → DD2).
Approve the amendment - the response should return an amendment with the status CONSUMED.
Search the demand or fetch the bill for the consumer code. The demand/bill should contain demand details of the demand and amendment together with DD1 and DD2 in the same demand/bill.
2. when demand is completely paid
Create demand and make complete payment or choose a consumer code which is fully paid.
Create amendment (with demand detail → DD1).
Approve amendment - the response should be APPROVED.
Create a new demand for the consumer code (with demand detail → DD3). The demand response should contain both demand details, DD3 and DD1, saved to the demand.
The amendment search returns CONSUMED status after the demand is created.
IMPACT: Does not impact any other functionality other than adding demand details to demands on APPROVAL.
IMPACTED BY: Existence of demands in the system.
Billing Service | Configuration-Details: Refer to billing-service config for MDMS data. The amendment makes use of the same data set.
WORKFLOW CONFIG:
Amendment integration helps organizations add additional value to the demand without any change in the system.
Easy to create and simple process of updating demands
Helps ease changes into the system which are not part of normal functionality - Amendment of bills in case of legal requirements.
This is integrated into the billing system by default.
The amendment facility can be used in case of a legal issue to add values to existing demands using /amendment/_create. The /amendment/_update endpoint is used to cancel created amendments or to update the workflow if configured.
API Definition
API List
V2 Technical Document for UI
This release for DSS focuses on improving user experience and the ability given to the user to get deeper insights using drill-through and comparison indicators in tables.
The release includes the following features:
Breadcrumbs for better navigation
Drill through options in tables and charts
Comparison indicators in Table
In addition to the left navigation panel, the addition of breadcrumbs is also useful to provide a better sense of the current page insight. It is also very much helpful for mobile navigation. The user can navigate using the breadcrumbs by clicking on the required parent menu.
Technical Implementation Details
It works based on the current route URL and the previous route URL.
File Details - https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/Breadcrumbs.js
DSS provides the ability to configure drill-throughs for the required options in tables as well as charts. Drill-through options are useful for configuring the required hierarchy of a data set, letting users go down up to 'N' levels to get deeper insights.
Technical Implementation Details:
Drill-down/drill-through in tables is based on drillDownChartId and filter.
The chart id is used for the subsequent call to fetch the next table, along with the applied/selected filters.
File Details - https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/components/Charts/TableChart.js
Drill throughs in piecharts:
It is similar to the drill-down in tables. Drill-through in pie charts is based on the drillDownChartId field in the parent pie chart.
File Details - https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/components/Charts/DonutChart.js
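As an illustration of the mechanism described above, the sketch below shows how a drill-through request can be derived from a clicked table row or pie slice. The request fields mirror the chart request (getChartV2) only loosely and are assumptions; the actual construction lives in the files linked above.

```typescript
// Sketch of building a drill-through request from drillDownChartId and filter.
// Field names of the chart request are assumptions for illustration.
interface DrillSource {
  drillDownChartId: string;          // id of the next chart/table to fetch
  filter?: Record<string, string>;   // filters carried from the clicked row/slice
}

function buildDrillDownRequest(source: DrillSource, current: any) {
  if (!source.drillDownChartId || source.drillDownChartId === "none") {
    return null; // nothing to drill into
  }
  return {
    ...current,                                        // keep dates, tenant filters, etc.
    visualizationCode: source.drillDownChartId,        // next chart to request
    filters: { ...current.filters, ...source.filter }, // merge the selected filters
  };
}
```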
To provide better insights into metric performance across different dimensions, a comparison indicator is required inside data tables, usually comparing against a different time range (last year/last month) and showing the percentage change over time.
Technical Implementation Details:
The comparison with the previous year's data in every table uses the same request object, with the time range changed to the previous year/month/week.
File Details - https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/components/Charts/TableChart.js
The following method, along with its parameters, is used to fetch the previous year's data.
After receiving last year's data, it is compared with the current year's data and the comparison is shown as insight data. The comparison logic is present in uiTable.js.
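A simplified sketch of this computation is given below: the time range is shifted back by one year and the percentage change is derived from the two values. It only illustrates the idea; the exact field names and logic are in uiTable.js.

```typescript
// Shift a millisecond time range back by one year for the comparison request.
function previousYearRange(startDate: number, endDate: number) {
  const shift = (ms: number) => {
    const d = new Date(ms);
    d.setFullYear(d.getFullYear() - 1);
    return d.getTime();
  };
  return { startDate: shift(startDate), endDate: shift(endDate) };
}

// Signed percentage change shown as the insight indicator.
function insightPercentage(current: number, previous: number): number | null {
  if (!previous) return null;                     // avoid divide-by-zero
  return ((current - previous) / previous) * 100;
}
```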
TimeFilter
The current time component is not very intuitive or user-friendly, so a new library, react-date-range, is used to enhance the time filter.
File Details - https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/components/common/DateRange/index.js
Event Duration Graphs
Ability to generate graphs showcasing time spent between multiple events like average turnaround time, complaint assigning time, etc.
A DSS_EVENT_DURATION_GRAPH is added in the PGR config
A decision support system (DSS) is a composite tool that collects, organizes and analyzes business data to facilitate quality decision-making for management, operations and planning. A well-designed DSS aids decision makers in compiling a variety of data from many sources: raw data, documents, personal knowledge from employees, management, executives and business models. DSS analysis helps organizations identify and solve problems, and make decisions.
Code Git Repos: https://github.com/egovernments/frontend/tree/master/web/dss-dashboard
State-Level Admin
Commissioner
Domain-Level Employee
There are three types of dashboards -
Home page (refer figure 1)
Overview page (refer figure 2)
Module level dashboard (refer figure 3)
The home page contains multiple cards; each card is clickable.
There are two types of cards, i.e., the overview card and the module-level card.
The overview and module-level cards are differentiated by vizType.
Overview card: Clicking on the overview card navigates to the overview page. The vizType for the overview card is collection.
Module-level card: Clicking on the module-level card navigates to the module-level dashboard. The vizType is module (i.e. Property Tax, Trade License etc.).
Request Payload for dashboardConfig
auth-token: authenticates the request; it is fetched from the local storage key “Employee.token”.
DashboardConfig API Response
roleName: the type of user.
Visualisations: This key contains all the configurations for displaying the visualisations, such as rows with charts etc. (refer to figure 1.3).
In figure 1.3, the vizType key defines the module UI, such as the collection chart and module chart (refer to figure 1).
In the dashboardConfig response, the visualisation key contains all the row and chart details (refer to figure 1.3). Each row contains visual details like name, vizType, noUnit, isCollapsible, charts etc.; a rough TypeScript sketch of this structure follows the vizType list below.
name - name of visualisation
vizType - type of visualisation like COLLECTION, MODULE, METRIC-COLLECTION, PERFORMING-METRIC, CHART
COLLECTION - The home page, contains the collection data (refer figure 1).
MODULE - The home page, contains the module-level data (refer figure 1).
METRIC-COLLECTION - In Overview/Module Level Page, contains the collection data (refer figure 2.1).
PERFORMING-METRIC -In Overview/Module Level Page, contains the top/bottom performing data (refer figure 2.2).
CHART - In Overview/Module Level Page, contains the below visualisations (refer figure 2.3 to figure 2.7).
PIE CHART (refer figure 2.3)
LINE CHART (refer figure 2.4)
BAR CHART (refer figure 2.5)
HORIZONTAL BAR CHART (refer figure 2.6)
TABLE CHART (refer figure 2.7)
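The structure described above can be modelled roughly as follows. This is a sketch based only on the fields named in this section; the optional markers and exact types are assumptions, and the authoritative shape is whatever the dashboardConfig API returns.

```typescript
// Rough model of the dashboardConfig visualization entries described above.
type VizType =
  | "COLLECTION" | "MODULE" | "METRIC-COLLECTION"
  | "PERFORMING-METRIC" | "CHART";

interface ChartConfig {
  id: string;
  name: string;   // localisation code used as the chart header
  code: string;   // visualization code sent to the chart API
  chartType: "bar" | "horizontalBar" | "line" | "donut" | "pie" | "metric" | "table";
}

interface Visualization {
  name: string;
  vizType: VizType;
  noUnit?: boolean;       // whether denomination units (Crore/Lakh/Unit) are hidden
  isCollapsible?: boolean;
  charts: ChartConfig[];
}

interface DashboardRow {
  row: number;            // row number in which the card is rendered
  vizArray: Visualization[];
}
```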
Visualisations
The ULB dashboard contains different filters, i.e. ULBs and Wards/Blocks. The data for these filters is loaded from the MDMS API below - https://dev.digit.org/egov-mdms-service/v1/_search
The ULB dashboard, overview dashboard and module-level pages contain different filters, identified by roleName in the configs API.
The Wards/Blocks filter is a dependent filter, which gets loaded on ULB selection.
In the ULB dashboard, the on-page ULB filter is applied across all the charts; for the performance chart, the default ULB filter is not applied.
The overview and all module-level pages have a ULB dashboard.
GLOBAL Filters (refer to figure 2.8)
Filters are loaded from the MDMS API - https://dev.digit.org/egov-mdms-service/v1/_search. Filters are loaded on the basis of roleName.
Admin role: On the Module level page, Date, DDR and ULB filters are loaded.
On the Overview level page, Date, DDR, ULB and Service filters are loaded.
Commissioner role: On the Module level page, Date, ULB and Wards/Blocks filters are loaded.
On the Overview page, Date, ULB and Service filters are loaded.
Denomination filter: The Denomination filter has three options to display the amount and number in a particular format.
Crore
Lakh
Unit
The denomination filter is not applied to percentage and text values (refer to figure 2.10). The type of data is identified by the symbol in the plots of the charts API.
Custom Date Filter
If duration < 15 days, it displays data day-wise
If duration <= 30 days, it displays data week-wise
If duration > 30 days, it displays data month-wise
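The bucketing rule above can be expressed as a small helper; this is a sketch of the stated rules only, not the dashboard's actual code.

```typescript
// Map the selected duration to the interval used for chart data requests.
type Interval = "day" | "week" | "month";

function intervalForRange(startDate: Date, endDate: Date): Interval {
  const days = (endDate.getTime() - startDate.getTime()) / (1000 * 60 * 60 * 24);
  if (days < 15) return "day";    // < 15 days: day-wise
  if (days <= 30) return "week";  // <= 30 days: week-wise
  return "month";                 // > 30 days: month-wise
}
```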
Tabs
Currently, the dashboard contains two types of tabs -
Revenue (refer figure: 4.1)
Service (refer figure: 4.1)
Tabs are identified by name in visualisations of config API.
Table Chart with drill-down
Table chart visualisations have normal material UI data table features like search, sort etc.
In the table response, if the filter key and drillDownChartId contain any value, users can drill down from the table.
Cards
Each card header is localised and has an info icon with a tooltip option that displays the header and can display a description.
The number of cards in a row and in a page is driven by the backend. The backend provides the row number to each card where it should be displayed.
Each card contains an options icon that enables users to download or share the card as an image.
Image download and share use the id from vizArray to differentiate each card on a page.
Download and Share (refer to figure 2.9)
Download offers two options - to download data as an image or a PDF.
Share: Share creates the image/PDF, uploads it to S3 using the API below, and returns a file id - https://mseva-uat.lgpunjab.gov.in/filestore/v1/files
The file URL is then fetched using the API - https://mseva-uat.lgpunjab.gov.in/filestore/v1/files/url
Each S3 URL is shortened using the API - https://mseva-uat.lgpunjab.gov.in/egov-url-shortening/shortener
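A minimal sketch of this upload → resolve URL → shorten sequence is shown below, assuming a browser context. The response shapes and the form/query parameter names (files, fileStoreIds, url, tenantId, module) are assumptions based on typical filestore usage; verify them against the services above.

```typescript
// Sketch of the share flow: upload the rendered card, resolve its URL, shorten it.
const FILESTORE = "https://mseva-uat.lgpunjab.gov.in/filestore/v1/files";
const SHORTENER = "https://mseva-uat.lgpunjab.gov.in/egov-url-shortening/shortener";

async function shareCard(blob: Blob, tenantId: string): Promise<string> {
  // 1. Upload the rendered image/PDF
  const form = new FormData();
  form.append("file", blob, "card.png");
  form.append("tenantId", tenantId);
  form.append("module", "DSS");
  const uploadRes = await fetch(FILESTORE, { method: "POST", body: form });
  const fileStoreId = (await uploadRes.json()).files?.[0]?.fileStoreId;

  // 2. Resolve the S3 URL for the uploaded file
  const urlRes = await fetch(
    `${FILESTORE}/url?tenantId=${tenantId}&fileStoreIds=${fileStoreId}`
  );
  const fileUrl = (await urlRes.json()).fileStoreIds?.[0]?.url;

  // 3. Shorten the URL before sharing over Email/WhatsApp
  const shortRes = await fetch(SHORTENER, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: fileUrl }),
  });
  return shortRes.text(); // the shortener returns the shortened URL
}
```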
Configurations
Github link for config: https://github.com/egovernments/frontend/blob/master/web/dss-dashboard/src/config/configs.js
BASE URL: End point of REST API for dashboard.
FILE Upload: End point of REST API for file upload.
FETCH FILE: End point of REST API for file fetch.
MDMS: End point of REST API for fetch MDMS Data.
SHORTEN URL: Endpoint of the REST API for the URL shortener, which is used for sharing via Email/WhatsApp.
CHART COLOR CODE: Color code object for all charts.
MODULE LEVEL: for global filters, which contains services name & filter key.
SERVICES: for global filter, service filter.
Upload Localisation Keys
code: pre-defined key for back-end.
message: message contains the value for the key.
module: rainmaker-dss
locale: contains locale data
for more details eGov team to be documented
Module name: rainmaker-dss
NPM Module Used - https://github.com/egovernments/DIGIT-OSS/blob/master/frontend/mono-ui/web/dss-dashboard/package.json
Steps to setup DSS in Local
Step 1: Run as independent, switch to dss-dashboard folder
Step 2: Get the below details from the environment website and update the localstorage in the browser.
Employee.tenant-id Employee.user-info Employee.token Employee.module Employee.locale localization_en_IN locale
Step 3: Run Yarn install and yarn start to start working on dss in local setup.
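For Step 2, the localStorage keys can be seeded from the browser console as sketched below. The values are placeholders to be copied from a logged-in session on the environment website, not real credentials.

```typescript
// Seed the browser localStorage with the keys listed in Step 2 before `yarn start`.
const seed: Record<string, string> = {
  "Employee.tenant-id": "pb.amritsar",       // example tenant
  "Employee.user-info": "<user info JSON>",  // copy from the environment website
  "Employee.token": "<auth token>",
  "Employee.module": "dss",
  "Employee.locale": "en_IN",
  "localization_en_IN": "<localisation JSON>",
  "locale": "en_IN",
};
Object.entries(seed).forEach(([key, value]) => localStorage.setItem(key, value));
```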
DSS Features Enhancements V2: DSS Features Enhancements V2 Technical Document for UI
DSS Backend Configuration Manual
DSS has two sides to it. One is the process in which the data is pooled to ElasticSearch and the other is the way it is fetched, aggregated, computed, transformed and sent across. DSS must be configurable since the entire process involves playing around with a variety of data sets. This ensures easy configuration of data sets in new scenarios.
This document explains the steps on how to define the configurations for both sides of DSS Analytics and Ingest Pipeline Services.
Ingest: Micro Service which runs as a pipeline and manages to validate, transform and enrich the incoming data and pushes the same to ElasticSearch Index
Analytics: Micro Service which is responsible for building, fetching, aggregating and computing the data on ElasticSearch into a consumable data response, which is later used for visualizations and graphical representations.
JOLT: JSON to JSON transformation library written in Java where the "specification" for the transform is itself a JSON document
Modules / Domain Level: These are the Services in this context. Each of the services, such as Property Tax, Trade License, Water and Sewerages are considered as Modules / Domains
Chart: Each individual graphical representation is considered as a Chart in specific. For example, a Metric of Total Collection is considered as a Chart.
Visualization: Group of different Charts is considered as a Visualization. For example, the group of Total Collection, Target Collection and Target Achieved is considered as a Metric Collection of Charts and thus it becomes a Visualization.
Discussed below are the ingest pipeline configuration details -
Topic Context Configurations
Validator Schema
JOLT Transformation Schema
Enrichment Domain Configuration
JOLT Domain Transformation Schema
Topic context configuration is an outline to define which data is received on which Kafka Topic.
Indexer Service and many other services send out data on different Kafka topics. When the Ingest Service receives the data and passes it through the pipeline, the context and the version of the data received have to be set. This configuration is used to identify which Kafka topic the data was consumed from, along with the mapping details.
Validator schema is a configuration schema library from Everit. Validating the data against this schema ensures that the data abides by the rules and requirements defined in the schema.
JOLT is a JSON to JSON transformation library. In order to change the structure of the data and transform it in a generic way, JOLT has been used.
While the transformation schemas are written for each data context, the data is transformed against the schema to obtain transformed data.
This configuration defines and directs the enrichment process that the data goes through.
For example, if the incoming data belongs to the collection module, the collection domain config is picked, and based on the specified business type the right config is chosen.
In order to enhance the data of collection, the domain index specified in the configuration is queried with the right arguments and the response data is obtained, transformed and set.
As part of the enrichment, once the domain-level object is obtained, the complete document may not be needed as-is in the end data product.
Only those parameters which should be or can be used for aggregation and representation are retained; the others are discarded.
To do this, JOLT is used again, with schemas that keep the required fields and discard the unwanted ones.
The above configuration is used to transform the data response in the enrichment layer.
Use case:- JOLT Transformation Schema for collection V2
JOLT transformation schema for payment-v1 has been taken as a use case to explain the context collection and context version v2. The payment records are processed/transformed with the schema. The schema supports splitting the billing records into an independent new record. So if there are 2 bill items in the collection/payment incoming data then this results in 2 collection records in turn.
Here, $i is the variable value that gets incremented for the number of paymentDetails records,
and $j is the variable value that gets incremented for the number of bill detail records.
Note: For Kafka Connect to work, direct push must be disabled in the ingest pipeline application properties or in the environment configuration.
es.push.direct=false
Below is the list of configurations
Chart API Configuration
Master Dashboard Configuration
Role Dashboard Mappings Configuration
Each Visualization has its own properties. Each Visualization comes from different data sources (Sometimes it is a combination of different data sources)
In order to configure each visualization and its properties, we have Chart API Configuration Document.
Here, the Visualization Code, which happens to be the key, has its properties configured as a part of the configuration, and they are easily changeable.
Master dashboard configuration defines the dashboards that are to be painted on the screen.
It includes all the visualizations, their groups, the charts and even their dimensions in terms of height and width.
Role dashboard mapping ensures that each role is mapped against the dashboards that they are authorized to see.
To add a new role, modify the RoleDashboardMappingsConf.json (roles node) configuration file as given below.
Note: Any number of roles & dashboards can be added
A sample for adding a new role object and a new dashboard object is shown below (Figure 9).
To add a new dashboard, modify the MasterDashboardConfig.json (dashboards node) as shown below in Figure 10.
Note: In the dashboards array, add a new dashboard as given below.
To add new visualisations, modify the MasterDashboardConfig.json (vizArray node) as shown in Figure 11.
Note: vizArray is used to hold multiple visualizations.
To add a new chart, chartApiConf.json has to be modified as shown below. A new chartid has to be added with the chart node object.
Metric chart Sample as shown in Figure 12.
Pie chart Sample as shown in Figure 13.
Line chart Sample as shown in Figure 14.
Table chart sample: This chart is of 2 types - table and xtable. The table type (as shown in Figure 15) allows the addition of aggregated fields as available in the query keys. To extract the values based on the key, aggregationPaths have to be added along with their data type in pathDataTypeMapping.
The xtable type (as shown in Figure 16) allows the addition of multiple computed fields along with the dynamically added aggregated fields.
To add multiple computed columns, define the following params within computedFields [] (an illustrative entry is sketched after this list):
actionName - (IComputedField<T> interface),
fields - [] names as existing in the query key,
newField - name to appear for the computation
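For illustration, a computedFields entry could look like the sketch below. The actionName and field names here are placeholders rather than values taken from the actual chartApiConf.json; use the computation classes and query keys that exist in your configuration.

```typescript
// Illustrative shape of one xtable computedFields entry (values are placeholders).
const computedField = {
  actionName: "PercentageComputedField",           // an implementation of IComputedField<T>
  fields: ["totalCollection", "targetCollection"], // keys existing in the query result
  newField: "targetAchievement",                   // name shown for the computed column
};
```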
Steps to create charts and visualise are:
Create/Add a chart in chartApiConf.json
Add a visualization for the existing dashboard in MasterDashboardConfig.json as defined above.
Or in order to create/add a new dashboard create the dashboard in MasterDashboardConfig.json and create a role in RoleDashboardConfig.json
Configuration Changes For DrillThroughs:
Example - drill through in ward table in the property tax dashboard.
wardDrillDown is the visualization code for the PT drill-down. The 'kind' attribute shows the type of visualization code. Apart from these two attributes, all the attributes are common.
Example - Drill through in the ComplaintList table in the PGR Dashboard.
complaintDrillDown is the visualization code for PGR Drill Down.
The above complaintDrillDown visualization code is called in the drill chart parameter.
v2 configuration details
The Collection Service serves as a revenue collection platform for all the billing systems through cash, cheque, demand drafts, or the swipe machine. It enables payment for all services provided by the eGov platform at a single point directly from the citizen or over-the-counter collection within municipalities.
Prior Knowledge of Java/J2EE
Prior Knowledge of SpringBoot
Prior Knowledge of REST APIs and related concepts like path parameters, headers, JSON, etc
Prior Knowledge of Kafka and related concepts like Producer, Consumer, Topic, etc.
The following services must be up and running:
egov-localization
egov-mdms
egov-idgen
egov-url-shortening
billing-service
Allows citizens to create a payment.
Allows employees to create the payment for the citizen indirectly.
Provides facilities to capture partial and advanced payments based on configs.
Allows payment cancellation to help with scenarios of bad checks and other failed payment scenarios.
Integrates with billing service for demand back-update of payment.
Deploy the latest version of the collection-services docker builds.
The MDMS data configuration uses the same data updated by the Billing-Service.
The table below lists the application properties.
Collection service can be integrated with any organization or system that requires a payment system to keep track of its payments. Organizations can customize part of the application or its functionalities based on their requirements.
Easy payments and tracking of payments.
Configurable functionalities according to client requirement
Customers can create a payment using the /payments/_create
Actors on the system can keep track of payments using the /payments/_search endpoint.
If a payment runs into a technical issue beyond the system after it is made, the payment can be cancelled with /payments/_workflow.
For employees to access the payments API the respective module name should be appended to the payment API path - /payments/PT/_workflow. Here PT refers to the property module.
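A minimal sketch of such a module-scoped search call is shown below. The collection-services prefix, tenant id and query parameters are illustrative assumptions; the authoritative contract is in the API list referenced below.

```typescript
// Sketch of searching payments for the PT module, following the path convention above.
async function searchPropertyPayments(host: string, authToken: string, consumerCode: string) {
  const url = `${host}/collection-services/payments/PT/_search` +
    `?tenantId=pb.amritsar&consumerCodes=${encodeURIComponent(consumerCode)}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ RequestInfo: { apiId: "Rainmaker", authToken } }),
  });
  return res.json(); // carries the Payments matching the criteria
}
```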
Doc Links
API List
Apportion service is used to apportion the amount paid against a bill among the different tax heads based on the implemented algorithm. The default algorithm uses the order of the tax head to apportion the tax head with the lowest order apportioned off first and the highest order tax head apportioned last.
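A simplified sketch of this order-based rule is shown below. It only illustrates the described behaviour (lowest order apportioned first, with rebates/exemptions as negative heads) and is not the service's actual implementation.

```typescript
// Illustration of the default apportioning rule: apply the paid amount to tax
// heads in ascending `order`, lowest order first.
interface TaxHeadDue {
  taxHeadCode: string;
  order: number;
  dueAmount: number; // outstanding amount for this head; negative for rebates/exemptions
}

function apportion(amountPaid: number, heads: TaxHeadDue[]) {
  let remaining = amountPaid;
  return [...heads]
    .sort((a, b) => a.order - b.order)
    .map((head) => {
      // Negative heads (rebates/exemptions) increase the remaining payable amount.
      const adjusted = Math.min(head.dueAmount, remaining);
      remaining -= adjusted;
      return { taxHeadCode: head.taxHeadCode, adjustedAmount: adjusted };
    });
}
```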
Before you proceed with the documentation, make sure the following pre-requisites are met -
Java 8
Kafka server is up and running
egov-persister service is running and has apportioned persister config path added to it
PSQL server is running and a database is created to store apportion audit data
Apportion payment in tax heads of bill
Apportion advance amount in tax heads of demand during demand creation
Environmental Variables | Description |
---|---|
Deploy the latest version of egov-apportion-service
Add apportion persister yaml path in persister configuration
No separate configuration is required. Only the TaxHead master configured in the billing service is used.
Any payment service which wants to divide the paid amount into different tax head buckets can integrate with apportion service.
Apportions amount in tax heads
To integrate, the host of egov-apportion-service should be overwritten in the helm chart
/apportion-service/v2/bill/_apportion should be called to apportion the bill
/apportion-service/v2/demand/_apportion should be called to apportion the advance amount in demands
(Note: All the APIs are in the same postman collection therefore the same link is added in each row)
Migration details from v1 to v2
The new collection service follows the payment structure for storing information about payments and payment details, so the old collection structure has to be migrated into the new payment structure.
In the old collection service, for every transaction, the receipt number is generated on the bill detail level. Since the bill contains multiple bill details each transaction is mapped to multiple receipt numbers. So after payment of a single bill, multiple receipt numbers are generated. The mapping of the transactions to the receipt number changed in the new collection service.
In the new collection service, the receipt number is generated at the bill level. For each bill transaction, one receipt number is generated. So every bill for a consumer code and business service has one receipt number.
The records from tables egcl_receiptheader, egcl_receiptdetails, egcl_instrument, egcl_instrumentheader need to be transferred into tables egcl_payment, egcl_paymentdetail, egcl_bill, egcl_billdetial, egcl_billaccountdetail.
For smooth data transactions, the record from the old receipt is mapped according to the payment structure. The new payment response can be formed with receipt data.
The table below provides the mapping between receipt and payment structure with some remarks.
After the creation of the payment response with receipt data, it is pushed into the Kafka topic “egov.collection.migration-batch”. The persister inserts the payment data into tables - egcl_payment, egcl_paymentdetail, egcl_bill, egcl_billdetial, egcl_billaccountdetail.
Indexer config for the legacy data index and new payments.
https://github.com/egovernments/configs/blob/master/egov-indexer/payment-indexer.yml
persister config -
These need to be promoted before initiating the migration process. Migration happens through an API call; add role-actions based on your requirement, otherwise port-forwarding will work.
Endpoint: /collection-services/payments/_migrate?batchSize=100&offset= Body: { "RequestInfo": { "apiId": "Rainmaker", "action": "", "did": 1, "key": "", "msgId": "20170310130900|en_IN", "ts": 0, "ver": ".01", "authToken": "a6ad2a1b-821c-4688-a70e-4322f6c34e54" }
In case of any failure and restarting migration, take the value of offset and tenantId printed in the logs and resume the migration process.
/collection-services/payments/_migrate?batchSize=100&offset=200&tenantId='pb.tenantId'
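A sketch of driving this migration in batches is shown below. It uses the endpoint and parameters above; the batch bounds are supplied by the caller, and the loop can be resumed from the offset (and tenantId, if printed) found in the service logs after a failure.

```typescript
// Sketch of a batch driver for the payments migration endpoint.
async function runMigration(host: string, authToken: string, startOffset: number, maxOffset: number) {
  const batchSize = 100;
  for (let offset = startOffset; offset <= maxOffset; offset += batchSize) {
    const url = `${host}/collection-services/payments/_migrate?batchSize=${batchSize}&offset=${offset}`;
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ RequestInfo: { apiId: "Rainmaker", authToken } }),
    });
    if (!res.ok) {
      console.error(`Migration batch failed at offset ${offset}; resume from this offset.`);
      break;
    }
  }
}
```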
Collection-service build:- collection-services-db:9-COLLECTION_MIGRATION-e9701c4
DIGIT is India's largest open-source platform for Urban Governance. It provides API-based access to government functions enabling the government to provide facilities via integration with relevant service players. This document provides the details of how system integrators enable bill collection facilities to customers using DIGIT as the governance platform. It outlines the integration approach with Billing and Collections services to enable fetching bill dues to citizens and recording their payments into the system.
DIGIT is completely API driven and allows for data exchange with disparate systems using REST API calls. Most functional APIs are protected resources that can be accessed after proper authentication with the platform. The platform also checks for the right level of access for given credentials. A bill collection flow -
Authenticate with DIGIT
Get citizen bill using a service-specific query
Record the payment details against the bill
Optional - Get payment API to fetch the receipt details
The in-field team of the system integrator makes the calls to the integrator's own system (or a standard system like BBPS). Integration with DIGIT follows a server-to-server approach where the backend system of the integrator makes these calls to the DIGIT platform as per requirement. The diagram below depicts the high-level flow of calls between on-field devices like PoS to the integrator backend (Integrator System) and from the backend of the DIGIT integrator to DIGIT (DIGIT Platform).
Note: The process of calling payment API results in a receipt creation.
DIGIT uses Swagger 2.0 as its API standard and all its APIs are documented in Swagger. Wherever needed this document provides a link to our API documentation online. An example of typical request/response snippets necessary for integration is provided below in the respective sections.
DIGIT is a multi-tenanted system - hence all APIs in DIGIT expect a tenantid, passed either as a query param or in the RequestBody (please refer to the detailed API documentation as indicated in the sections below). The tenantid represents the modular operating unit for the operation of an API; e.g. in a municipal governance use case, a tenantid represents one ULB. Your platform contact will help you access the configured list for your use case.
The Authentication API also expects a tenantid (your platform contact will help you identify the one to use). Based on your role as an integrator, the OAuth token in the response can be used for unit/ULB-level tenants in subsequent API calls (meaning you may not need one authentication per unit/ULB-level tenant).
Authentication
To ensure data privacy and security, transactional APIs in DIGIT are protected under authentication. System integrators are requested to contact the respective state authority to get the necessary OAUTH tokens required to access the APIs.
Note: Apart from the userid/password, the system may enforce IP-based access control in which case the integrator may be required to share the IP or range of IPs from which the request will originate.
Use the API below to generate the access token based on the credentials provided. Given below is an example of the request and response. The OAuth token to be used from the response is highlighted in bold.
Request Snippet
Response Snippet
2. Fetching Bill
DIGIT allows the integrators to fetch the bills for citizens using the consumer number of the respective service (e.g. Water charges, Property Service, Trade License).
Note: Different services may have different notions of consumer number, e.g. for Water Charges consumer number signifies the "Connection number" while for Property it is the "Property Id".
For some services, DIGIT also provides the facility to fetch bills by mobile number.
Note: A bill search by mobile number may return multiple bills across services and may not return bills from services that do not support mobile-number-based search.
To support the partial payment use case each bill in the response of the fetch bill API indicates whether it allows partial payment and if yes, the minimum amount to be paid.
To fetch a bill from DIGIT, make sure that the OAuth token is generated as per the Authentication section above. Post that, use the following API to fetch the bill (a request sketch follows the steps below) -
Choose Billing Service from the dropdown.
Go to the Bill section of BillingService.
Go to the Bill tab.
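A minimal sketch of a fetch-bill call is given below. The billing-service path prefix, the v2 _fetchbill endpoint and the query parameters are assumptions based on the API list later in this document; the authoritative definition is the Swagger documentation referenced above.

```typescript
// Sketch of fetching a bill by consumer code (parameter names are assumptions).
async function fetchBill(host: string, authToken: string, tenantId: string, consumerCode: string) {
  const url = `${host}/billing-service/bill/v2/_fetchbill` +
    `?tenantId=${tenantId}&businessService=WS&consumerCode=${encodeURIComponent(consumerCode)}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ RequestInfo: { apiId: "Rainmaker", authToken } }),
  });
  const body = await res.json();
  return body.Bill?.[0]; // the bill carries the partial-payment flag and minimum amount
}
```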
3. Make Payment
Once the bill is fetched from the DIGIT system, the system integrator is expected to relay it back to the Field Device. The integrator is expected to Initiate and collect the payment based on government preference indicated in the bill (can it be partially paid and if so the minimum amount etc.) and citizen's preference of payment instrument etc.
Once the payment is successfully done in the integrator's system, the integrator is expected to register the payment in DIGIT using the Payment Create API.
Note: A bill is considered unpaid/partially paid by DIGIT till appropriate receipts are created using this API - which means that a subsequent fetch of the bill, till this API is called, returns the original bill
DIGIT expects a receipt (result of calling payment API) to be created against the bill number returned in the fetch bill API.
Note: A receipt needs to be created for each bill. Therefore, if a total payment represents multiple bills - one receipt creation per bill is expected (DIGIT supports multiple receipt creation in a single call).
To create a receipt in DIGIT, make sure that the OAuth token is generated as per the Authentication section above. Post that, use the following API to create the receipt (a request sketch follows the steps below) -
Choose Collection Service from the dropdown.
Go to Payment.
Go to Make Payment.
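A minimal sketch of registering a payment against a fetched bill is given below. The Payment fields shown are illustrative assumptions; the exact schema is in the Collection Service Swagger documentation referenced above.

```typescript
// Sketch of the payment-create call (payload fields are assumptions).
async function createPayment(host: string, authToken: string, bill: any, amountPaid: number) {
  const res = await fetch(`${host}/collection-services/payments/_create`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      RequestInfo: { apiId: "Rainmaker", authToken },
      Payment: {
        tenantId: bill.tenantId,
        paymentMode: "CASH",              // instrument chosen by the citizen
        totalAmountPaid: amountPaid,
        payerName: bill.payerName,
        mobileNumber: bill.mobileNumber,
        paymentDetails: [
          { billId: bill.id, totalAmountPaid: amountPaid, businessService: bill.businessService },
        ],
      },
    }),
  });
  return res.json(); // contains the created receipt details
}
```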
The Swagger API for the backend is below
Swagger API for ingest
The target upload file template is given below -
Objective: The Reap Benefit system is one of the vendors that provide chatbot services to communicate with citizens through a chatbot. As part of the requirement, a complaint needs to be created in the DIGIT platform whenever a citizen raises a complaint through the Reap Benefit chatbot.
The turn-io-adapter service is a wrapper that transforms the Reap Benefit request format into the DIGIT PGR request format. This service has a transform API that constructs the required PGR request from the request message sent by the Reap Benefit system. The Reap Benefit system consumes the transform API to communicate with the DIGIT PGR module.
In this process, once a complaint is created, a WhatsApp message with a tracking link is sent to the citizen. Whenever some action is taken on the complaint by ULB employees, a WhatsApp message is sent to the citizen.
Before you proceed with the configuration, make sure the following pre-requisites are met -
Java 8
Rainmaker-PGR service is running
Complaints are generated on the DIGIT platform using the Reap Benefit system chatbot.
Messages are sent to citizen through WhatsApp when employees perform some action on the complaint.
Deploy the following builds
rainmaker-pgr-db:v1.1.3-bb2961cf-13
turn-io-adapter:v1.1.3-bb2961cf-19
egov-searcher:v1.1.3-d43c421c-5
nlp-engine:v1.0.0-c3889d14-10
Frontend commits
2) Add /turn-io-adapter/_transform in egov-mixed-mode-endpoints-whitelist configuration
3) Once you are done with the 2nd step, restart the zuul pod.
Add the name field in the complaint category master in PGR. Link for the data -
Push the localisation data for all the locality data with module as rainmaker-chatbot. Sample localisation object -
{ "code": "SC1", "message": "Azad Nagar - WARD_1", "module": "rainmaker-chatbot", "locale": "en_IN" }
NA
This is the sample request for the _transform API to create a complaint:
Turn-io-adapter is integrated with Rainmaker-pgr application. Turn-io-adapter application internally invokes the rainmaker-pgr service to generate complaints.
The Reap Benefit system calls turn-io-adapter/_transform to generate the complaint, and the data is taken from PGR.
DIGIT offers key municipal services such as Public Grievance & Redressal, Trade License, Water & Sewerage, Property Tax, Fire NOC, and Building Plan Approval.
The inbox service is an aggregation service which aggregates data of municipal services and workflow based on given complex search criteria and returns applications and workflow data in paginated views. The service also returns the total count matching the search criteria.
This service allows searching of both the module objects as well as processInstance
(Workflow record) based on the given criteria for any of the municipal services. It uses a module-specific configuration which is stored in application.properties as a key value map, where the key is the businessService name while the value is the configuration map. A sample configuration is attached below -
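Below is an illustrative reconstruction of such a configuration entry for the PT business service, using the search definition fields described next. The key, URL and values are assumptions for illustration; the actual map lives in the inbox service's application.properties.

```typescript
// Illustrative inbox-service search definition entry (values are assumptions).
const inboxSearchDefinition = {
  "PT.CREATE": {
    searchPath: "http://property-services.egov:8080/property-services/property/_search",
    dataRoot: "Properties",                // key holding module search results
    applNosParam: "acknowldgementNumber",  // joins module record to the processInstance
    businessIdProperty: "acknowldgementNumber",
    applsStatusParam: "status",            // module-level application status field
  },
};
```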
Here, the key of the config map is the PT module business service for which the inbox is to be configured. The search definition details are specified below -
searchPath
- Points to the search URL of the municipal module
dataRoot
- This is the search response key that we get from module search, e.g. in the property module, the search response returns response objects inside the “Properties” key.
applNosParam
- This is the parameter that calls the workflow search once the module objects are retrieved based on the search. This parameter is the field that joins the module table with the workflow process instance table, e.g. in the case of the Property module it is “acknowldgementNumber”.
businessIdProperty
- This is the parameter with which we search module objects in case of empty moduleSearchCriteria
by performing the workflow search first. Again, this parameter is the field that joins the module table and workflow process instance table, e.g. in the case of the Property module it is “acknowldgementNumber”.
applsStatusParam
- This is the application status field name for the module used for the search, e.g. in the case of the Property module, it is “status”.
To provide pagination and total count across multiple modules, the inbox service is integrated with the searcher. The searcher provides the list of ids and the total count of applications. The inbox service processes the count and the results are returned to the API. The sample configuration link for PT and TL modules is given below:
Technical documentation detailing migration steps
This specifies the migration steps which are specific to the payment index.
Step 1: Adding a target index - Add index name dss-payment_v2 as below:
In kibana, dev tools, apply the below command
Note: This name should be the same as the value present in the ingest es.index.name mapping.
Step 2: Optional changes required in Ingest application properties
The ingest pipeline application properties contain es.direct.push, which is supposed to be set to true for testing.
S.No. | Property Name | Value | Description |
---|
Step 3: Run the migration API, which migrates the data from the source index to the target index.
S.No. | Name | Description |
---|
Note: After migration, ensure dss-payment_v2 data has been populated and is available.
In kibana, dev tools verify using below command
Variable | Path | Description |
---|---|---|
Description | Link |
---|---|
Description | Link |
---|---|
Tax Head | Amount | Order | Full Payment (2000) | Partial Payment (1500) | Partial Payment (750) | Partial Payment With Rebate (500) |
---|---|---|---|---|---|---|
Tax Head | Amount | Tax Period From | Tax Period To | Order | Purpose |
---|---|---|---|---|---|
Tax Head | Amount | Tax Period From | Tax Period To | Order | Purpose |
---|---|---|---|---|---|
Property | Value | Remarks |
---|---|---|
Description | Link |
---|---|
Description | Link |
---|---|
Description | Link |
---|---|
API | Action ID | Roles |
---|---|---|
Parameter Name | Description |
---|
Parameter Name | Description |
---|
Parameter Name | Description |
---|
Parameter Name | Description |
---|
Parameter Name | Description |
---|
Refer to the full configuration for details.
Refer to the MDMS data configuration here.
Property | Value | Remarks |
---|
Refer to the integration details.
Description | Link |
---|
Description | Link |
---|
Description | Link |
---|---|
Description | Link |
---|---|
Field From Payments | Field from Receipts | Remark |
---|---|---|
Note: Please refer to the following url for nlp-engine technical documentation -
1) turn-io-adapter: "" (In service host configuration)
Property | Value | Description |
---|---|---|
bs.businesscode.demand.updateurl | | Each module's application calculator should provide its own update URL; if not present, a new bill is generated without making any changes to the demand. |
bs.bill.billnumber.format | BILLNO-{module}-[SEQ_egbs_billnumber{tenantid}] | IdGen format for the bill number |
bs.amendment.idbs.bill.billnumber.format | BILLNO-{module}-[SEQ_egbs_billnumber{tenantid}] | |
is.amendment.workflow.enabled | true/false | Enables or disables the workflow for bill amendment |
Id-Gen service
url-shortening
MDMS
/demand/_create, _update, _search
/bill/_fetchbill, _search
/amendment/_create, _update
Pt_tax
1000
6
1000
1000
750
750
AdjustedAmt
1000
-250
-750
-750
RemainingAMTfromPayableAMT
0
0
0
0
Penality
500
5
500
500
AdjustedAmt
500
-500
RemainingAMTfromPayableAMT
1000
250
Interest
500
4
500
500
AdjustedAmt
500
-500
RemainingAMTfromPayableAMT
1500
750
Cess
500
3
500
500
AdjustedAmt
500
-500
RemainingAMTfromPayableAMT
2000
1250
Exm
-250
1
-250
-250
AdjustedAmt
-250
250
RemainingAMTfromPayableAMT
2250
1750
Rebate
-250
2
-250
-250
AdjustedAmt
-250
250
RemainingAMTfromPayableAMT
2500
750
Pt_tax
1000
2014
2015
6
Current
AdjustedAmt
0
Penality
500
2014
2015
5
Current
AdjustedAmt
0
Interest
500
2014
2015
4
Current
AdjustedAmt
0
Cess
500
2014
2015
3
Current
AdjustedAmt
0
Exm
-250
2014
2015
1
Current
AdjustedAmt
0
Pt_tax
1000
2014
2015
6
Arrear
AdjustedAmt
0
Pt_tax
1500
2015
2016
6
Current
AdjustedAmt
0
Penalty
600
2014
2015
5
Arrear
AdjustedAmt
0
Penalty
500
2015
2016
5
Current
AdjustedAmt
0
Interest
500
2014
4
Arrear
AdjustedAmt
0
Cess
500
2014
3
Arrear
AdjustedAmt
0
Exm
-250
2014
1
Arrear
AdjustedAmt
0
Property | Value | Description |
---|---|---|
collection.receipts.search.paginate | true/false | When set to true, receipt search results are returned in pages containing a certain number of records. |
is.payment.search.uri.modulename.mandatory | TRUE/FALSE | Makes the module name in the URI path mandatory. |
collection.receipts.search.default.size | A number (say 30) | Returns 30 records at a time; the next 30 results are on the next page. |
collection.is.user.create.enabled | true/false | When set to true, enables the creation of a user along with receipt creation. |
receiptnumber.idname | | This property is used for the creation of the receipt number using the ID-GEN service. |
receiptnumber.servicebased | true/false | If set to false, the default state-level format is used for the receipt number; if set to true, the receipt number format has to be defined in MDMS. |
receiptnumber.state.level.format | [cy:MM]/[fy:yyyy-yy]/[SEQ_COLL_RCPT_NUM] | Default state-level format for the receipt number. |
collection.payments.search.paginate | true/false | When set to true, payment search results are returned in pages containing a certain number of records. |
egov.collection.payment-create | | The Kafka topic to which the record is pushed/pulled when a payment is created. |
egov.collection.payment-cancel | | The Kafka topic to which the record is pushed/pulled when a payment is cancelled. |
egov.collection.payment-update | | The Kafka topic to which the record is pushed/pulled when a payment is updated. |
Billing-service
Id-Gen service
url-shortening
MDMS
/payments/_create
/payments/_update
/payments/_workflow
/preexistpayments/_update
/amendment/_create, _update
egov.apportion.default.value.order | If set to true, the negative amount is apportioned first, irrespective of the tax head order |
Collection Service
Billing Service
API Swagger Documentation
/apportion-service/v2/bill/_apportion
/apportion-service/v2/demand/_apportion
Field From Payments | Field from Receipts | Remark |
---|---|---|
Payments.Id | --- | Set as UUID |
Payments.tenantId | Receipt.tenantId | |
Payments.totalDue | --- | Total due for payment is calculated by subtracting totalAmount from bill and amount from Receipt.instrument |
Payments.totalAmountPaid | Receipt.instrument.amount | |
Payments.transactionNumber | Receipt.instrument.transactionNumber | |
Payments.transactionDate | Receipt.receiptDate | |
Payments.paymentMode | Receipt.instrument.instrumnetType.name | |
Payments.instrumentDate | Receipt.instrument.instrumentDate | |
Payments.instrumentNumber | Receipt.instrument.instrumentNumber | |
Payments.instrumentStatus | Receipt.instrument.instrumentStatus | |
Payments.ifscCode | Receipt.instrument.ifscCode | |
Payments.additionalDetails | Receipt.Bill.additionalDetails | |
Payments.paidBy | Receipt.Bill.paidBy | |
Payments.mobileNumber | Receipt.Bill.mobileNumber | If mobileNumber from Receipt.Bill is null, it has to be set with some value, e.g. “NA”. Note: Payments.mobileNumber should not be null |
Payments.payerName | Receipt.Bill.payerName | |
Payments.payerAddress | Receipt.Bill.payerAddress | |
Payments.payerEmail | Receipt.Bill.payerEmail | |
Payments.payerId | Receipt.Bill.payerId | |
Payments.paymentStatus | -- | Based on paymentMode from Payment, the paymentStatus is set. If paymentMode is ONLINE or CARD, paymentStatus is set to DEPOSITED, otherwise it is set to NEW |
Payments.auditDetails.createdBy | Receipt.auditDetails.createdBy | |
Payments.auditDetails.createdTime | Receipt.auditDetails.createdTime | |
Payments.auditDetails.lastModifiedBy | Receipt.auditDetails.lastModifiedBy | |
Payments.auditDetails.lastModifiedTime | Receipt.auditDetails.lastModifiedTime | |
Payments.paymentDetails.Id | --- | Set as UUID |
Payments.paymentDetails.tenantId | Receipt.tenantId | |
Payments.paymentDetails.totalDue | --- | Total due for paymentDetails is calculated by subtracting totalAmount from bill and amount from Receipt.instrument |
Payments.paymentDetails.totalAmountPaid | Receipt.instrument.amount | |
Payments.paymentDetails.receiptNumber | Receipt.receiptNumber | |
Payments.paymentDetails.manualReceiptNumber | Receipt.Bill.billDetails.manualReceiptNumber | |
Payments.paymentDetails.manualReceiptDate | Receipt.Bill.billDetails.manualReceiptDate | |
Payments.paymentDetails.receiptDate | Receipt.receiptDate | |
Payments.paymentDetails.receiptType | Receipt.Bill.billDetails.receiptType | |
Payments.paymentDetails.businessService | Receipt.Bill.billDetails.businessService | |
Payments.paymentDetails.additionalDetail | Receipt.Bill.additionalDetail | |
Payments.paymentDetails.auditDetail | --- | auditDetail for paymentDetail is the same as the payment auditDetail |
Payments.paymentDetails.billId | --- | billId is extracted based on id in the egbs_billdetail_v1 table, where id in egbs_billdetail_v1 is Receipt.Bill.billDetails.billNumber |
Payments.paymentDetails.bill | --- | Based on the billId, tenantId and service, the bill is searched by calling the Billing service API and set to Payments.paymentDetails.bill |
Payments.paymentDetails.bill.billDetails.amountPaid | Receipt.instrument.amount | For each amountPaid in billDetails, its value is set from Receipt.instrument.amount |
API | Action ID | Roles |
---|---|---|
/localization/messages/v1/_search | 1531 | SUPERUSER,EMPLOYEE,CITIZEN,GRO,DGRO, |
/egov-mdms-service/v1/_search | 954 | LOA_CREATOR,SUPERUSER,WO_CREATOR,AE_CREATOR,WORKS_MASTER_CREATOR, |
/dashboard-analytics/dashboard/getDashboardConfig/propertytax | 1892 | STADMIN |
/dashboard-analytics/dashboard/getDashboardConfig/home | 1889 | STADMIN |
/dashboard-analytics/dashboard/getDashboardConfig/tradelicense | 1893 | STADMIN |
/dashboard-analytics/dashboard/getDashboardConfig/pgr | 1894 | STADMIN |
/dashboard-analytics/dashboard/getDashboardConfig/ws | 2010 | STADMIN |
/dashboard-analytics/dashboard/getChartV2 | 1890 | STADMIN, EMPLOYEE |
topic | Holds the name of the Kafka Topic on which the data is being received |
dataContext | Context Name which needs to be set for further actions in the pipeline |
dataContextVersion | Version of the Data Structure is set here as there might be different structured data at a different point in time |
id | Unique Identifier for the Configuration within the configuration document |
businessType | This defines as in which kind of Domain / Service is the data related to. Based on this business type, query and enhancements are decided |
indexName | Based on Business Type, Index Name is defined as to which index has to be queried to get the enhancements done from |
query | Query to execute to get the Domain Level Object is defined here. |
targetReferences sourceReference | Fields which are variables in order to get the domain level objects are defined here. The variables and where all the values has to be picked from are documented here |
Key (e.g: totalApplication) | This is the Visualization Code. This key will be referred to in further visualization configurations. This is the key that will be used by the client application to indicate which visualization is needed for display. |
chartName | The name of the Chart has to be used as a label on the Dashboard. The name of the Chart will be a detailed name. In this configuration, the Name of the Chart will be the code of Localization which will be used by Client Side |
queries | Some visualizations are derived from a specific data source. While some others are derived from different data sources and are combined together to get a meaningful representation. The queries of aggregation which are to be used to fetch out the right data in the right aggregated format are configured here. |
queries.module | The module / domain level, on which the query should be applied on. Property Tax is PT, Trade License is TL. If the query is applied across all modules, the module has to be defined as COMMON |
queries.indexName | The name of the index upon which the query has to be executed is configured here. |
queries.aggrQuery | The aggregation query in itself is added here. Based on the Module and the Index name specified, this query is attached to the filter part of the complete search request and then executed against that index |
queries.requestQueryMap | Client Request would carry certain fields which are to be filtered. The parameters specified in the Client Request are different from the parameters in each of these indexed documents. In order to map the parameters of the request to the parameters of the ElasticSearch Document, this mapping is maintained |
queries.dateRefField | Each of these modules has separate indexes. And all of them have their own date fields. When there is a date filter applied against these visualizations, each of them has to apply it against their own date reference fields. In order to maintain what is the date field in which index, we have this configured in this configuration parameter |
chartType | As there are different types of visualizations, this field defines as what is the type of chart / visualization that this data should be used to represent. Chart types available are: metric - this represents the aggregated amount/value for records filter by the aggregate es query pie - this represents the aggregated data on grouping. This is can be used to represent any line graph, bar graph, pie chart or donuts line - this graph/chart is data representation on date histograms or date groupings perform - this chart represents groping data as performance-wise. table - represents a form of plots and value with headers as grouped on and list of its key, values pairs. xtable - represents an advanced feature of the table, it has additional capabilities for dynamic adding header values. |
valueType | In any case of data, the values which are sent to plot might be a percentage, sometimes an amount and sometimes it is just a count. In order to represent them and differentiate the numbers from the amount from percentage, this field is used to indicate the type of value that this Visualization will be sending. |
action | Some of the visualizations are not just aggregating on data source. There might be some cases where we have to do a post aggregation computation. For Example, in the case of Top 3 Performing ULBs, the Target and Total Collection is obtained and then the percentage is calculated. In these kinds of cases, what is the action that has to be performed on that data obtained, is defined in this parameter. |
documentType | The type of document upon which the query has to be executed is defined here. |
drillChart | If there is a drill down on the visualization, then the code of the Drill Down Visualization is added here. This will be used by Client Service to manage drill-downs |
aggregationPaths | All the queries will be having Aggregation names in it. In order to fetch the value out of each Aggregation Responses, the name of the aggregation in the query will be an easy bet. These aggregation paths will have the names of Aggregation in it. |
_comment | In order to display information on the “i” symbol of each visualization, Visualization Information is maintained in this field. |
UI configuration docs for trade licence module
name | Name of the Dashboard which has to be displayed as Page Heading |
id | Unique Identifier of the Dashboard which should be used later for Querying each of these Visualizations |
isActive | Active Indicator which can be used to quickly disable a dashboard if required. |
style | Style of the Dashboard. Whether it should be a linear one or a tabbed one. This information is maintained in this parameter. |
visualizations | The list of visualizations that are to be displayed in the Dashboard is listed out here. |
visualizations.row | The row identifier for each Visualization are mentioned here |
The name of an individual visualization is added here |
visualizations.vizArray | The list of Charts within the Visualization is specified in this list. |
Group of Charts is given an ID to have a placement on the Dashboard. This unique identifier is maintained in this field. |
Group of Charts is given a name that can be displayed on the group on Dashboard in that row. |
visualizations.vizArray.dimensions | Each of these group of charts is given a dimension based on which they are placed in a specific row in a dashboard |
visualizations.vizArray.vizType | As there are multiple charts grouped into one visualization, the type of Visualization needs to be specified in order to indicate to the client application what goes inside each of these visualizations and charts inside them vizType used for any other dashboards:- metric-collection, chart, performing-metric metric-collection:- Used to specify the type as single or group of metric chart type 2. performing-metric:- Used perform chart type 3. chart:- Used chart type for pie, donut, table, bar, horizontal bar, line vizType used for the Home page:- collection, module collection: used in UI style as full width 2. module: used in UI style for specific width. |
visualizations.vizArray.noUnit visualizations.vizArray.isCollapsible visualizations.vizArray.ref | The value types of these charts are different. Some are numbers, some are amounts, some are percentage. In the case of amounts, there is a requirement to display in Lakhs, Crores and Units. In order to indicate the client application whether to display these units or not, we have this boolean to control that The value type is for card/visualisation collapsible as boolean values. This object contains url (as mandatory), logoUrl (optional), type(optional). |
visualizations.vizArray.charts | The list of individual charts inside a Visualization Group is maintained in this array list |
Individual Chart Number Identifier to indicate the uniqueness of Charts |
Name of the Chart which can be a header label for Charts within a Visualization |
visualizations.vizArray.charts.code | Code of the Chart is the indicator that has to be sent to Server Side to get the data for representing the Visualization. |
visualizations.vizArray.charts.chartType | Type of Chart which has to represent the data result set that is obtained is specified here chartType:- bar, horizontalBar, line, donut, pie, metric, table |
visualizations.vizArray.charts.filters | Filters that can be applied to the Visualization and what are the fields which are filterable are mentioned here. |
visualizations.vizArray.charts.headers | In some cases, there are headers which can be a title or additional information for the Chart Data which gets represented. This field is kept open to accommodate the information which can be sent along with the Chart Data in itself. |
roles | List of Roles that are available in the system |
roles._comment | Role Description and comment on why does this role has an entry in this configuration and sums up the summary as to what are the things that are to be enabled. |
roles.roleId | Unique Identifier of the Role for which Access is being given |
roles.roleName | Name of the Role for which the access is being given |
roles.isSuper | Boolean flag which defines whether the Role is a Super User or not |
roles.orgId | Organization to which the Role belongs to |
roles.dashboards | List of Dashboards that are enabled for the Role |
Name of the individual Dashboard which has been enabled |
Identifier of the individual Dashboard which has been enabled |
PUT dss-payment_v2 {} // add mapping file content here. mapping.json as attached below |
1. | es.direct.push | true | the transformed data will be pushed to ES index directly. |
2. | es.direct.push | false | the transformed data will be lying at egov-dss-ingest-enriched topic |
1. | Method / End Point / Body | POST {host}/dashboard-ingest/ingest/migrate/paymentsindex-v1/v2 with body {"RequestInfo":{"authToken":"2ba70924-1bba-4a9b-b55d-2e9471bf3081"}} |
2. | CURL |
This service is used to issue a license to the user after verification. The service is designed in such a way that it can be used to serve different types of licenses. It is currently used to issue trade licenses, register stakeholders and issue lockdown passes. The service is integrated with workflow, where the steps for approval of the application can be defined. Once the application is approved, the license is generated.
Before you proceed with the documentation, make sure the following pre-requisites are met -
Java 8
Kafka server is up and running
egov-persister service is running and has tl-services persister config path added in it
PSQL server is running and database is created
Used for license generation for trade licenses, stakeholder registration and lockdown passes
Defines roles for applicants on successful stakeholder registration so that they can access Building Plan Approval services
Generate application number and license number
Support workflows
Provide notification on various status changes for an application
Add MDMS configs required for Trade License and BPA stakeholder registration and restart MDMS service
Deploy the latest version of tl-services service
Add tl-service persister yaml path in persister configuration and restart persister service
Add Role-Action mapping for the APIs
Create businessService (workflow configuration) according to trade license and stakeholder registration
Add tl-service indexer yaml path in indexer service configuration and restart indexer service
Following application properties in the Trade License service are configurable.
The trade-license service is currently used to issue trade licenses, register stakeholders and issue lockdown passes.
Provides backend support for the different license registration processes.
Mseva and SMS notifications on application status changes.
Elasticsearch indexing for creating visualisations and dashboards.
BPA stakeholder registration provides new roles to the user to access the Building Plan Approval system.
Supports workflow which is configurable
To integrate, the host of the tl-services service should be overwritten in the helm chart.
{servicename}/_create should be added as the create endpoint for creating any license in the system.
{servicename}/_search should be added as the search endpoint. This method handles all requests to search existing records based on different search criteria.
{servicename}/_update should be added as the update endpoint. This method is used to update fields in existing records or to update the status of the application based on the workflow.
In all the endpoints below, if the service name is BPAREG the application is treated as a stakeholder registration application; if it is TL, or absent, the application is treated as a trade license application.
Stakeholder registration APIs:- https://www.getpostman.com/collections/d18b79ccfb69ee8bb526
Trade-License APIs:- https://www.getpostman.com/collections/99f98723c45f97024831
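As a rough illustration of the integration, the sketch below posts a create request to the {servicename}/_create endpoint; the "Licenses" payload key and the fields inside it are assumptions made for this sketch, and the exact request shapes are in the Postman collections linked above.

```javascript
// Illustrative sketch only; see the Postman collections above for the real request contracts.
const host = "https://<tl-services-host>";   // placeholder: host overridden in the helm chart

async function createLicense(authToken, license) {
  // "Licenses" as the payload key is an assumption for this sketch.
  const response = await fetch(`${host}/tl-services/v1/_create`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ RequestInfo: { authToken }, Licenses: [license] }),
  });
  return response.json();
}

// Example: a TL application vs a stakeholder registration is distinguished by the service name.
// createLicense("<token>", { tenantId: "<tenant-id>", businessService: "TL" /* or "BPAREG" */ });
```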
The Trade License Calculator service is used to calculate the trade license fees/renewal fees based on the defined billing slabs. This service enables TL admins to create billing slabs with different combinations of license type, trade type, structure type and accessory type. The service is designed so that it can serve different types of licenses.
Before you proceed with the configuration, make sure the following pre-requisites are met -
Java 8
Kafka server is up and running
egov-persister service is running and has tl-calculation-persister & tl-billing-slab-persister config path added in it
PSQL server is running and a database is created to store TL Application data
Following services should be up and running:
egov-perister
egov-mdms
tl-services
billing-service
TL Admin, a ULB employee with the TL_ADMIN role, can create and update billing slab(s)
ULB Employee with TL_CREATOR and TL_ADMIN can search billing slab(s)
TL service internally calls tl-calculator to generate demand.
Deploy the latest version of tl-service and tl-calculator
Add the tl-calculation-persister.yml & tl-billing-slab-persister.yml files to the config folder in Git and add that path to the persister. (The file path is added to the environment yaml file in the parameter called persist-yml-path.)
tl-calculator is integrated with tl-services. tl-services internally invokes the tl-calculator service to calculate and generate demand for the charges.
The tl-calculator application calculates the trade license fees based on the different billing slabs in the DB; this is why the calculation and demand generation logic is separated from the TL services. In future, if the calculation logic needs to be modified, the changes can be carried out for each implementation without modifying the TL services.
TL application to call /tl-calculator/v1/_calculate to calculate and generate the demand for the TL application
ULB Employee can create billing slab calling /tl-calculator/billingslab/_create
ULB Employee can update billing slab calling /tl-calculator/billingslab/_update
ULB Employee can search billing slab calling /tl-calculator/billingslab/_search
(Note: All the APIs are in the same postman collection therefore the same link is added in each row)
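For illustration, a minimal sketch of a billing slab search call is shown below; passing tenantId as a query parameter and reading a billingSlab array from the response are assumptions, and the Postman collection above has the exact contract.

```javascript
// Minimal sketch of a billing slab search (assumptions: tenantId query param, "billingSlab" in response).
const host = "https://<env-host>";           // placeholder: environment host

async function searchBillingSlabs(authToken, tenantId) {
  const response = await fetch(`${host}/tl-calculator/billingslab/_search?tenantId=${tenantId}`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ RequestInfo: { authToken } }),
  });
  const data = await response.json();
  return data.billingSlab;                   // assumption: array of configured billing slabs
}
```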
The Inbox page contains 4 react components:-
Application Links is a separate component that holds links to other pages of possible navigation from the inbox. This component is common in both mobile and desktop views. Links are conditionally rendered according to the user roles.
The Search Application component is a form-based component that controls the Table component and the search params for the inbox API. It uses the FormComposer HOC to render fields.
Validation of these fields is achieved by using controlled component rules
Any number of search fields can be added but by convention, only mobile numbers and application numbers are provided.
Filters contain input fields to filter the result of API, by sending search params to inbox API.
It contains 3 sections
Assigned to Me/ All - It is a radio component to send the assignedToMe param as true or false.
Locality - Filter result according to the selected locality by sending locality code in module search params in inbox API.
Status - Status filters are achieved by sending the id received from the inbox API response and mapping the name of businessService, status name and count
The table is a react component which uses the React-Table plugin, used in multiple modules
However, the mobile view uses cards to list all the applications, without pagination support.
On the Inbox page, {env}/inbox/v1/_search?_=1627374959930 is the only API that is called.
API CURL -
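As an illustration of the request shape, a minimal sketch of the inbox call is given below; the criteria field names mirror the filters described above (locality, assignedToMe, pagination) and are assumptions rather than the authoritative contract.

```javascript
// Illustrative sketch of the inbox search request; field names inside "inbox" are assumptions.
const host = "https://<env-host>";           // placeholder: environment host

async function searchInbox(authToken, tenantId) {
  const response = await fetch(`${host}/inbox/v1/_search`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      RequestInfo: { authToken },
      inbox: {
        tenantId,
        moduleSearchCriteria: { locality: "<locality-code>", assignedToMe: false }, // assumed filter names
        limit: 10,
        offset: 0,
      },
    }),
  });
  return response.json();
}
```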
This feature allows the user to renew any trade license application which has either expired or has to be renewed for the current financial year (Approved and Paid). It is also integrated with the payment component in order to complete the renewal flow end to end.
Renewal can be of two types:
DIRECT RENEWAL
EDIT RENEWAL
Once the user clicks on Renew Trade License button on the home page, it will redirect to the renewal list page which will display all the applications eligible for renewal corresponding to the mobile number on which the user has logged in. It will show the Trade name, License Number, Owner Name and status whether active or expired.
Once the user clicks on the Renew button, they are redirected to the summary page, just like in edit trade license, with all the values pre-populated from the search API. An info card explains how to proceed with either direct renewal or edit renewal.
If the user just wants to renew the same application without updating any data, they can cross-verify all the values on the summary page and click on the Submit button at the end of the page. This moves the application to the Pending for Payment status, and the user can go through the payment flow from the acknowledgement screen by clicking on the Make Payment button.
If the user needs to update the application before renewal, they can do so by clicking on the Change button on the summary screen. This switches the flow from direct to edit renewal, and the user needs to follow the same apply flow in order to complete the edit. The values from the application are pre-populated in the respective screens; for details about the edit flow, refer to Send Back Flow - Edit.
Once the user clicks on the Submit button, the current action of the application changes to Pending for Document Verification as the data has been updated.
Renewal Trade main index can be found in the below-given link:
In this index, the trade license search API is called to fetch all the applications, which are then sorted to determine which applications are eligible for renewal for the current financial year.
The hook used for the API is:
The data is then filtered to the applications which do not have any open renewal application for the current financial year, or which have a status of Approved or Expired. The significant method used to get the renewal list of applications is mentioned below:
From here the Trade License List component is called, which displays the list of renewal applications.
The main functionality of converting the License object received from the API into the object structure used as form data for the apply flow is done in the same way as in edit trade, and the same method is used to convert the response object; for more details please refer to Send Back Flow - Edit.
Once the user has completed the flow as required, or clicked the Submit button directly, the method convertToEditTrade is called, which re-arranges the data into the request body for the update API /tl-services/v1/_update.
If it is a direct renewal, only one update API call is made, which updates the financial year only.
If it is an edit renewal, the update API is called twice: after the first successful call the application status changes to Initiated, and after the second call it changes to Applied, with the next action Pending for Document Verification.
The code for these can be found in the utils folder index; please refer to the link below:
https://github.com/egovernments/digit-ui-internals/blob/main/packages/modules/tl/src/utils/index.js
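The sketch below is a conceptual summary of that renewal logic, not the actual convertToEditTrade code from the utils folder linked above; updateLicense is a hypothetical wrapper over /tl-services/v1/_update.

```javascript
// Conceptual sketch of the renewal update flow (updateLicense is a hypothetical wrapper
// over /tl-services/v1/_update; it is not part of the actual utils code linked above).
async function renewLicense(license, isEditRenewal, updateLicense) {
  if (!isEditRenewal) {
    // Direct renewal: a single update call that only moves the financial year forward.
    return updateLicense(license);
  }
  // Edit renewal: two update calls. The first moves the application to Initiated,
  // the second submits the edited data and moves it to Applied
  // (next action: Pending for Document Verification).
  const initiated = await updateLicense(license);
  return updateLicense(initiated);
}
```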
MDMS data which is being used here is the same as the Apply flow only, as the flow structure used for edit renew trade is the same as the Apply for Trade License. Please refer to the link for detailed MDMS information.
For Renew Trade also, the localization keys are added in the ‘rainmaker-tl’ locale module. Any change, update or addition of a new localization key is done in the same locale module only.
This feature allows the user to edit an application already created under their mobile number. After verification, the employee can send the application back to the citizen with remarks on any changes that need to be made; the citizen can then edit the application using this flow.
On the Application Details page, on the employee side, if the application is marked with “Send Back to Citizen”, the edit option appears dynamically at the end of the Application Details page, which the user can navigate to through My Applications.
On clicking the button, the user can edit the trade license details by going through the Create flow again. First, the user lands on the Summary page, where each section has a Change button. On clicking the Change button, the user is redirected to the corresponding content; the only difference is that the values are pre-populated from the License object received from the Trade License search API. On completing the flow, the update API is called and the license application is successfully updated.
Edit Trade License main index can be found in the link given below:
The main code here consists of the function that transforms the License object received from the search API into the object structure suitable for the citizen apply flow (owner details, units, accessories etc.), as the user needs to go through the apply flow again with pre-populated details and update any values accordingly. It also contains the routing for the pages in the apply flow.
The getTradeEditDetails() function is used to convert the License object received from the trade search API into the apply-flow structure so that the values can be pre-populated for the user's convenience; on completing the flow, the application is updated. The link for the same can be found below:
Similarly, for owners, the method used to form the new request param array is gettradeownerarray
It is similar to Accessories and Units; the only difference is in the UI. Users cannot select multiple owners directly; depending on the ownership category selected, they either add a single owner or add more than one owner and proceed. After the successful update, the application's next action will be “Pending for Document Verification” as there is an update in the data.
On completing the flow, the same object structure used earlier in the flow is converted into the request body structure for the update API /tl-services/v1/_update. The method used for this, convertToResubmitTrade, is declared inside the Utils folder and it can be found in the below link:
MDMS data that is being used here is the same as the Apply flow only, as the flow structure used for edit trade is the same as the Apply for Trade License. Please refer to the link for detailed MDMS information.
For Edit Trade also, the localization keys are added to the ‘rainmaker-tl’ locale module. Any change, update or addition of a new localization key is done in the same locale module only.
Search Application and Search License pages are used for searching any application/ license that may or may not be relevant to the workflow action of the logged-in users.
Search Application has 2 components.
A search field component is a form which takes inputs and passes them into tl-search API params. It utilizes SearchForm and SearchField components to create and arrange the form.
The Result Table uses the Table react component; the result from the API is adapted to the table config using a custom hook inside the common parent wrapper, and the response is passed to individual components.
Search License has a fixed param where the status of the application is “APPROVED”; apart from that, it only differs in the table config.
The API end point for searching trade licenses is {env}/tl-services/v1/_search
API CURL -
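As an illustration, a minimal sketch of the search call is given below, using search parameters (tenantId, applicationNumber) from the allowed-search-params configuration listed later in this document; reading a Licenses array from the response is an assumption.

```javascript
// Minimal sketch of a trade licence search (assumption: response carries a "Licenses" array).
const host = "https://<env-host>";           // placeholder: environment host

async function searchTradeLicenses(authToken, tenantId, applicationNumber) {
  const query = new URLSearchParams({ tenantId, applicationNumber });
  const response = await fetch(`${host}/tl-services/v1/_search?${query}`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ RequestInfo: { authToken } }),
  });
  const data = await response.json();
  return data.Licenses;
}
```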
Provides workflow actions for employees.
The same screen is used for both application details and trade details.
The details shown here depend on conditions.
Example: if the application is not in an approved state and the business service is New TL, then application details are shown; otherwise trade details are shown.
For workflow action details, please refer to the file below.
The workflow is the same as Old UI only, please refer to the documentation link below.
Users can review the list of applications and their status registered under their mobile numbers in the My Applications tab. Each Application for the initial view displays the Application No, Service Category, Owner Name (Multiple with a comma), status, SLA, and Trade Name with the View Details option. If the status is pending for payment the View Details & Pay button is available that enables the users to look up more details about the application.
Once the user clicks on the View Details or View Details & Pay button, the Application Details Page is displayed with all the necessary information about the application.
The user can download the Application Acknowledgement Form (status - pending for payment ) or TL Certificate or the payment receipt using the Download Link button available at the top right corner of the page.
If the status is Pending for Payment for the application or Action required by a citizen (discussed elaborately here), a button will be visible to pay or edit at the end of the page respectively. On clicking on the Make Payment button it will redirect to the common pay screen through which the user can make the payment.
Timeline Component - The timeline component is present at the end of the application details and shows the current status and history of the application: Initiated, Applied, Pending for Document Verification, Pending for Field Inspection, Pending Approval, Pending Payment, Approved etc.
The link to the Applications and Application Details main code is given below; it can be used to understand the working of the code. Below is the folder link.
The template for My Application List is present under https://github.com/egovernments/digit-ui-internals/blob/main/packages/modules/tl/src/pages/citizen/Applications/Application.js and Application Details page is present inside - https://github.com/egovernments/digit-ui-internals/blob/main/packages/modules/tl/src/pages/citizen/Applications/ApplicationDetails.js .
All the application lists are retrieved by calling the search API /tl-services/v1/_search. If the view is set as “bills”, all the applications are loaded using the useFetchBill hook, which calls the /billing-service/bill/v2/_fetchbill API. The SLA value in the Application List screen is calculated from the data received from the workflow API /egov-workflow-v2/egov-wf/process/_search.
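A rough sketch of the fetch-bill call behind useFetchBill is shown below; the consumerCode and businessService parameter names are assumptions made for illustration, and the billing-service contract is the source of truth.

```javascript
// Illustrative sketch of the fetch-bill call; query parameter names are assumptions.
const host = "https://<env-host>";           // placeholder: environment host

async function fetchBill(authToken, tenantId, applicationNumber) {
  const query = new URLSearchParams({
    tenantId,
    consumerCode: applicationNumber,         // assumption: application number as consumer code
    businessService: "TL",                   // assumption: TL billing business service code
  });
  const response = await fetch(`${host}/billing-service/bill/v2/_fetchbill?${query}`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ RequestInfo: { authToken } }),
  });
  return response.json();
}
```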
Following is the hook used for the trade search API.
To get the application details in key-value format, and to make it more compatible, the following hook is used; it is a common service used across modules.
No MDMS data is used here; all the data is loaded from the search API/fetch bill API.
For My Applications also, the localization keys are added in the ‘rainmaker-tl’ locale module, the same as other parts of the TL module. Any change, update or addition of a new localization key is done in the same locale module only.
Objective: To provide the facility for the user to create a trade license application for the current year by citizen users or counter employees.
Users can apply for a trade license application by clicking on the Apply for Trade License button. Users can add all the information as per the questions asked across the workflow. The summary screen at the end of the flow displays all details for review. Users can click on the submit button after review. The application for a trade license is created for the current financial year.
Apply Flow - The trade license registration screen is displayed after login which helps users identify the documents required to apply for a trade license. A citizen info card at the bottom of the page displays any additional information about the maximum size of the file that can be uploaded.
Trade License Details/Assessment Flow - This flow captures the trade-specific information required for registering the trade.
Trade Name - The user provides the name of the trade. An info card is displayed at the bottom of the screen stating that the license will be issued for the current financial year. The financial year value is retrieved from the MDMS.
Structure Type - The users can select yes or no based on whether the trade has mobility or not. If yes, the next screen prompts you to enter the vehicle type. If no, the next screen moves to the building type page.
Vehicle Type / Building Type - The options for vehicle and building type are fetched from the MDMS. The Building Type screen displays an information card about the pucca or kuccha options.
Commencement Date - It defines the date on which the trade started or will start in the future.
Trade Units - Users must provide the trade category as either goods or services. Based on the selected option the Trade Type is loaded from the MDMS as a drop-down list. The trade sub-type options are loaded based on the selected trade type. The unit of measure and UOM value get pre-populated from the MDMS as per the options selected above.
Users must enter at least one unit to move forward. Clicking on Add More Unit option enables users to enter additional units. Clicking on the delete icon on the top right corner of the unit card removes the unit.
Accessories - The Accessory page inquires if there are any accessories required for the business. Accessories may not be compulsory for all trades. If yes, it will move to the accessory details page. If not, it will skip it altogether and will load the address flow.
Accessory details - The options for accessories are retrieved from the MDMS. The Unit of Measure or UOM is pre-populated and cannot be edited. The users can edit the UOM value and the accessory count. In some cases, these are pre-populated from the MDMS. Clicking on Add More Trade Accessory button allows users to add multiple accessories.
If the citizen selects Movable as the structure type in the previous screens, then the flow will jump to the owner details flow. Here the Same as Property Owner’s check box will not be visible.
If the citizen selects Immovable as the structure type then the user is allowed to add the property details. Once the property is added, the flow will redirect to the owner details flow where the Same as Property Owner check box is displayed. If the user checks it, the following details get auto-populated and the screen skips to the proof of Identity page.
Common PT integration with TL: After entering the trade details, users have the option to either search and integrate the already created property or create new lightweight property data for the trade license. This step can be skipped and users can proceed with the normal address details flow.
Once a property is selected user can see the details of the property on the property details page. Refer to the Common PT document for more details.
Address Details Flow - In the next flow, users have to enter the trade address details. This flow is straightforward, without any conditional routing.
Users can pinpoint the location on the geo-location map, according to which the pin code, city and locality are auto-filled.
Owner Details Flow - Finally, the users need to enter the trade owner details. Ownership can be single or multiple owners, according to which the details are filled.
In the case of single/multiple owners, the following screen is displayed. The remaining flows remain the same.
Users can add multiple owners by clicking on the add owner button - a similar functionality as in trade units and accessories. The Add Owner button is not visible in case the user selects a single owner on the previous page.
The user must provide the owner's primary address and upload three documents that include address proof, owner identity and owner photograph.
Check Page and Acknowledgement Screen - Users can cross-verify the data entered throughout the flow in the Check page. Clicking on the change option adjacent to the data fields allows users to make any changes or updates to the data. The user is redirected back to a corresponding information page and the entire flow is repeated once again to submit the application.
On applying, the Trade License create API is called. Create API endpoint: "/tl-services/v1/_create"
If the API response is successful, then the Acknowledgement Screen is displayed, otherwise Failed Acknowledgement Screen is displayed.
Clicking on the Download Acknowledgement Form button downloads the PDF copy of the acknowledgement.
On the Trade Units page, values for trade type and trade subtype have been loaded by the following MDMS call:
The following validations have been added for the same :
When users select a trade type and then a subtype, it is compared with the available billing slabs. The hook for this is given below. If no billing slab is available, users are not allowed to move forward and an error message is displayed.
Once the correct trade type and subtype are added and a matching billing slab exists, the UOM value is validated. This checks that the value lies in the range mentioned in the billing slab object and displays an error if the value is outside the range.
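A minimal sketch of that range check is shown below; the fromUom/toUom field names on the billing slab object are assumptions used only to show the logic.

```javascript
// Minimal sketch of the UOM value validation (fromUom/toUom field names are assumptions).
function validateUomValue(billingSlab, uomValue) {
  const { fromUom, toUom } = billingSlab;    // assumed range fields on the matched slab
  if (uomValue < fromUom || uomValue > toUom) {
    return { valid: false, message: "UOM value is outside the range allowed by the billing slab" };
  }
  return { valid: true };
}
```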
All screens are developed using the new-UI structure followed previously in FSM, PGR and PT, except for multi-component.
The link for the Apply Trade License Main Index is given here and it helps understand the starting point of the flow: https://github.com/egovernments/DIGIT-Dev/blob/master/frontend/micro-ui/web/micro-ui-internals/packages/modules/tl/src/pages/citizen/Create/index.js
The TL (Trade License) module is segregated into a specified structure. The screen configuration is inside the PageComponent folder, and the configuration for routing of the pages is mentioned under the config folder, which is common to both citizen users and employees. Below is the snippet for the folder structure and routing configuration.
The pages folder is where the high-level configuration for controlling the whole flow is mentioned, for citizens and employees. Citizen flows include Create, Edit Trade, Renewal, Applications and Search Trade. The index or the starting point of the entire flow is available in this folder.
In the Accessory-details page, the Billing slab search API "/tl-calculator/billingslab/_search"
is called. This returns the array list of all the accessories for which the billing slab has been configured. If the response returns an empty array then the options are curated from the MDMS API mentioned in the MDMS data section.
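That fallback can be summarised with the small sketch below, which simply prefers the billing slab results and otherwise falls back to the TradeLicense/AccessoriesCategory MDMS master (a sketch of the logic, not the module's actual code).

```javascript
// Sketch of the accessory-options fallback: billing slab results first, MDMS master otherwise.
function getAccessoryOptions(billingSlabs, mdmsAccessoryCategories) {
  if (Array.isArray(billingSlabs) && billingSlabs.length > 0) {
    return billingSlabs;
  }
  return mdmsAccessoryCategories;            // options from TradeLicense/AccessoriesCategory in MDMS
}
```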
After completing the flow the user can download the acknowledgement PDF form of the License created. PDF generation config link: https://github.com/egovernments/digit-ui-internals/blob/main/packages/modules/tl/src/utils/getTLAcknowledgementData.js
The Utils folder basically contains all the methods used throughout the TL module. Additional common methods can be imported and added to this folder.
For creating an application the Create API from Trade License is called using the React hooks. This is declared in the hooks/elements/TL as TLService.
There are multiple pages within the workflows where data is imported from the MDMS. The table below lists the .js files for the distinct page components.
React Hooks are used to call MDMS data that is shared across the modules. Below is the code snippet for the MDMS call.
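As an illustration only, the sketch below shows the underlying MDMS _search request that such a hook wraps, using module and master names from the table above (it is not the digit-ui-internals hook itself).

```javascript
// Illustrative sketch of the MDMS search behind the hook (not the digit-ui-internals hook itself).
const host = "https://<env-host>";           // placeholder: environment host

async function fetchTlMasters(authToken, tenantId) {
  const response = await fetch(`${host}/egov-mdms-service/v1/_search`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      RequestInfo: { authToken },
      MdmsCriteria: {
        tenantId,
        moduleDetails: [
          { moduleName: "TradeLicense", masterDetails: [{ name: "TradeType" }, { name: "AccessoriesCategory" }] },
          { moduleName: "common-masters", masterDetails: [{ name: "StructureType" }, { name: "OwnerShipCategory" }] },
        ],
      },
    }),
  });
  return response.json();
}
```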
Localization keys are added to the ‘rainmaker-tl’ locale module. In future, if any new labels are implemented in Trade License (Citizen), they are pushed to the locale DB in the rainmaker-tl locale module. Below is an example of a few locale labels.
The trade license 'apply' is the major feature in TL Module. It allows Citizens or Counter Employees to create TL Applications for the current financial year.
Every application is a part of the workflow. Once the user logs in with the TL_CEMP role, the user gets the option to create a New TL Application in the TL card as well as in the inbox.
Clicking on New Application navigates to the New Trade License Application screen.
The initial MDMS call is made on page load, as in the old UI.
Data fetch, load and render
On clicking the Submit button, the tl-services/v1/_create API is called to create the application; after a success response, the update API tl-services/v1/_update is called.
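Conceptually, the create-then-update sequence looks like the sketch below; tlCreate and tlUpdate stand in for the calls to tl-services/v1/_create and tl-services/v1/_update respectively (hypothetical wrappers, not the module's actual service object).

```javascript
// Conceptual sketch of the employee "new application" submission: create, then update.
async function submitNewTlApplication(license, { tlCreate, tlUpdate }) {
  // Step 1: create the application via tl-services/v1/_create.
  const created = await tlCreate(license);
  // Step 2: on success, call tl-services/v1/_update to move the application forward.
  const updated = await tlUpdate(created);
  // The UI then routes to the acknowledgement screen with this result.
  return updated;
}
```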
Acknowledgement Screen
After the create and update calls succeed, the user is routed to the acknowledgement screen.
```
curl -X POST -H 'cache-control: no-cache' -H 'content-type: application/json' -H 'postman-token: d83fc136-116d-265f-3b83-ea41e3d5bb57' -d '{"RequestInfo":{"authToken":"2ba70924-1bba-4a9b-b55d-2e9471bf3081"}}'
```
Users can delete and add as many accessories or units as needed, but there should be at least one unit to complete the application. To add a new unit or accessory, the “Add” button located at the end of the page is used. A new array is formed with all the updated details or with the old unit/accessory; when the flow is completed, this new array is compared with the old array of accessories and units, and a new resulting array object is formed for the request body. The respective code can be found in the methods gettradeupdateaccessories & gettradeupdateunits, available in the link below: digit-ui-internals/index.js at main · egovernments/digit-ui-internals
It is common for all modules, find the path here: digit-ui-internals/ApplicationDetailsContent.js at main · egovernments/digit-ui-internals
File path: digit-ui-internals/TLCard.js at main · egovernments/digit-ui-internals and digit-ui-internals/DesktopInbox.js at main · egovernments/digit-ui-internals
Route: mSeva
Structure Type and Sub Structure Type field data is fetched from egov-mdms-data/StructureType.json at master · egovernments/egov-mdms-data
Trade Category, Trade Type, Trade Sub Type field data is fetched from egov-mdms-data/TradeType.json at master · egovernments/egov-mdms-data
Mohalla Data - egov-mdms-data/boundary-data.json at master · egovernments/egov-mdms-data (For Amritsar)
Type of Ownership and Type of Sub ownership - egov-mdms-data/OwnerShipCategory.json at master · egovernments/egov-mdms-data egov-mdms-data/OwnerType.json at master · egovernments/egov-mdms-data
On loading the page, the /tl-calculator/billingslab/_search API is called to show the licence type options (file path: digit-ui-internals/TLTradeDetailsEmployee.js at main · egovernments/digit-ui-internals) and the accessories options (file path: digit-ui-internals/TLAccessoriesEmployee.js at main · egovernments/digit-ui-internals).
Property | Value | Remarks |
---|---|---|
egov.idgen.tl.applicationNum.format | PB-TL-[cy:yyyy-MM-dd]-[SEQ_EG_TL_APL] | The format of the TL application number |
egov.idgen.tl.licensenumber.format | PB-TL-[cy:yyyy-MM-dd]-[SEQ_EG_PT_LN] | The format of the TL license number |
egov.idgen.bpa.applicationNum.format | PB-SK-[cy:yyyy-MM-dd]-[SEQ_EG_TL_APL] | The format of the stakeholder application number |
egov.idgen.bpa.licensenumber.format | PB-SK-[cy:yyyy-MM-dd]-[SEQ_EG_PT_LN] | The format of the stakeholder license number |
egov.tl.max.limit | 100 | Max number of records to be returned |
citizen.allowed.search.params | tenantId, applicationNumber, limit, offset, licenseNumbers | The search parameters on which a citizen can search |
employee.allowed.search.params | tenantId, applicationNumber, applicationType, status, mobileNumber, fromDate, toDate, licenseNumbers, oldLicenseNumber, limit, offset | The search parameters on which an employee can search |
persister.save.tradelicense.topic | save-tl-tradelicense | The name of the Kafka topic on which the create request is published |
persister.update.tradelicense.topic | update-tl-tradelicense | The name of the Kafka topic on which the update request is published |
persister.update.tradelicense.workflow.topic | update-tl-workflow | The name of the Kafka topic on which the update request is published |
Local Setup
API Swagger Documentation (Trade License)
API | Description |
---|---|
{servicename}/_create | This API is used to create an application for the license in the system. Whenever an application is created, an application number is generated and assigned to the application for future reference. |
{servicename}/_search | This API is used to search the applications in the system based on various search parameters like mobile number, application number, status etc. |
{servicename}/_update | The _update API is used to update the application information or to forward the application from one state to another. In the case of stakeholder registration, if the application reaches the last stage, the role depending on the license type is given to the user. |
{servicename}/{jobname}/_batch | Searches trade licenses that are expiring and sends a reminder SMS to the owners of the licenses. |
tl-calculator/billingslab/_create
tl-calculator/billingslab/_search
tl-calculator/billingslab/_update
tl-calculator/v1/_calculate
S.No. | API | Roles | Action ID |
---|---|---|---|
1 | egov-mdms-service/v1/_search | CR_PT | 954 |
2 | /tl-services/v1/_update | TL_APPROVER, TL_CEMP, EMPLOYEE, TL_DOC_VERIFIER, TL_FIELD_INSPECTOR | 2029 |
3 | /egov-workflow-v2/egov-wf/process/_search | EMPLOYEE | 1730 |
4 | /tl-services/v1/_search | EMPLOYEE, TL_APPROVER, TL_CEMP | 1687 |
5 | /egov-hrms/employees/_search | TL_APPROVER, TL_CEMP, EMPLOYEE, TL_DOC_VERIFIER, TL_FIELD_INSPECTOR | 1752 |
PageComponent | MDMS Detail | Module Details Name | Master-Detail Name |
---|---|---|---|
TradeLicense | List of documents required for registration | TradeLicense | Documents |
SelectTradeName | Current financial year | egf-master | FinancialYear |
SelectVehicleType | Type of mobility trade | common-masters | StructureType |
SelectBuildingType | Type of steady trade | common-masters | StructureType |
SelectTradeUnits | List of trade categories and their corresponding types and sub-types | TradeLicense | TradeType |
SelectAccessoriesDetails | List of accessories and their unit of measure and UOM value | TradeLicense | AccessoriesCategory |
TLSelectAddress | List of cities and their corresponding localities | TradeLicense | tenants |
SelectOwnerShipDetails | Categories imported are single and multiple owner | common-masters | OwnerShipCategory |
SelectOwnerDetails | List of gender options | common-masters | GenderType |
API | Action ID | Roles |
---|---|---|
/egov-mdms-service/v1/_search | 954 | CITIZEN |
/tl-services/v1/_create | 1685 | CITIZEN |
/filestore/v1/files/url | 1528 | CITIZEN |
/billing-service/bill/v2/_fetchbill | 1862 | CITIZEN |
/tl-calculator/billingslab/_search | 1684 | CITIZEN |
/tl-services/v1/_update | 1686 | CITIZEN |
/localization/messages/v1/_search | 1531 | CITIZEN |
S.No. | API | Roles | Action ID |
---|---|---|---|
1 | /egov-mdms-service/v1/_search | TL_CEMP | 954 |
2 | /tl-services/v1/_create | TL_CEMP | 1685 |
3 | /tl-services/v1/_update | TL_CEMP | 1686 |
4 | /tl-calculator/billingslab/_search | TL_CEMP | 1684 |
API Swagger Contract
Trade License Document