Introduced three new services: PQM Service, PQM Anomaly Service, and PQM Scheduler Service.
Create, update, and search for process quality monitoring tests.
Record test results incrementally for a test and submit it for evaluation once all the test results are available.
Test values are evaluated against benchmarks configured in MDMS to produce a test result status (PASS/FAIL).
Test results undergo anomaly analysis for comprehensive insights.
Plant performance related charts.
Actively monitors anomalies in process quality.
Added notifications to relevant user groups for proactive anomaly management.
Automation of test scheduling based on environment configuration and MDMS test standard configurations.
The PQM Scheduler triggers the schedule APIs to generate tests at the defined frequency.
Runs tests at specified, configurable intervals.
A new worker registry concept has been introduced, covering the creation of a worker, updating of details, and searching for and tagging a worker for different operations in sanitation programmes. Sanitation workers' create, update, and search operations are performed using the individual registry.
Added functionality for creating, updating, and searching for sanitation workers.
Deprecated the drivers tab and the driver concept in the FSM registry.
Introduced the sanitation workers tab to the FSM registry.
Added functionality for assigning a sanitation worker to a vendor.
Migration of existing drivers to sanitation workers.
Deprecated the delinking of a driver from vendors that was introduced in v1.3.
Tag sanitation workers (drivers and helpers) to an FSM application for capturing the workers' activities.
The inbox for FSM has been upgraded from V1 to V2.
No core services have been modified. Workflow details, the flow diagram, and API details are available on Confluence.
Introducing New Services for Enhanced Process Quality Management
Process Quality Management (PQM) Service.
Process Quality Management (PQM) Anomaly Finder Service.
PQM Scheduler (CronJob Scheduler).
Enhancements
Sanitation Worker Welfare feature
Driver-Individual migrate feature
FSM inbox V2
The platform is built and designed as a Digital Public Good.
The source code for the platform is hosted .
To set up the platform, follow the installation steps listed .
The following MDMS changes were done as part of the FSM v1.4 release:
Feature | Service name | Changes | Description |
---|---|---|---|
For MDMS-V2 changes, refer to the below table for the sequence in which MDMS schema and data needs to be added for the PQM service:
MasterName | Schema Link | Required Fields | Unique Fields |
---|---|---|---|
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json:
UI Technical Documents | Backend Service Documents |
---|---|
Make sure TQM and FSM are enabled in this master:
data/pg/common-masters/howItWorks.json
data/pg/FSM/SanitationWorkerSkills.json
data/pg/FSM/SanitationWorkerEmploymentType.json
data/pg/FSM/SanitationWorkerEmployer.json
data/pg/FSM/SanitationWorkerFunctionalRoles.json
Field | Definition |
---|---|
Service | API | Description | New/Updated |
---|---|---|---|
PQM Service | /pqm-service/v1/_create | Ad hoc test creation | New |
PQM Service | /pqm-service/v1/_update | Update a test with results; save a test with partial results as a draft | New |
PQM Service | /pqm-service/v1/_search | Search for PQM tests | New |
PQM Service | /pqm-service/v1/_scheduler | Schedule tests based on the test standards in MDMS | New |
PQM Service | /pqm-service/v1/_plainsearch | API for re-indexing the data | New |
PQM Service | pqm-service/plant/user/v1/_create | Add a plant-user mapping | New |
PQM Service | pqm-service/plant/user/v1/_update | Update a plant-user mapping | New |
PQM Service | pqm-service/plant/user/v1/_search | Search for plant-user mappings | New |
PQM Anomaly Finder Service | /pqm-anomaly-finder/v1/_search | Search for anomalies | New |
PQM Anomaly Finder Service | /pqm-anomaly-finder/v1/_plainsearch | Re-index the anomaly data | New |
FSM | /fsm/v1/_update | Added a list-of-workers object for tagging workers to the FSM application | Updated |
Vendor | /vendor/v1/_update | Added a list-of-workers object to assign sanitation workers to a vendor | Updated |
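As an illustration of how the `_create` endpoint above might be driven, the sketch below builds an ad hoc test create request. The payload field names (`tests`, `plantCode`, `testType`, and so on) are assumptions for illustration, not the confirmed request contract.

```python
# Hypothetical sketch of an ad hoc PQM test create request body.
# All payload field names below are illustrative assumptions.
def build_test_create_request(tenant_id, plant_code, process_code,
                              stage_code, material_code, quality_criteria):
    return {
        "RequestInfo": {"apiId": "pqm-service", "msgId": "create-test"},
        "tests": [{
            "tenantId": tenant_id,
            "plantCode": plant_code,
            "processCode": process_code,
            "stageCode": stage_code,
            "materialCode": material_code,
            "qualityCriteria": quality_criteria,  # criteria codes to evaluate
            "testType": "ADHOC",                  # ad hoc, as opposed to scheduled
        }],
    }

req = build_test_create_request(
    "pg.citya", "PLANT_1", "TREATMENT", "INLET", "SLUDGE", ["PH", "BOD"])
```

This body would then be POSTed to `/pqm-service/v1/_create`.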
Process Quality Management (PQM) Service
Create, update, and search for process quality monitoring tests.
Evaluate test values against benchmarks, producing a result (FAIL/PASS) status.
Test results undergo anomaly analysis for comprehensive insights.
Plant performance related charts.
Process Quality Management (PQM) Anomaly Finder Service
Actively monitors anomalies in process quality.
Notifies relevant user groups promptly for proactive anomaly management.
PQM Scheduler (CronJob Scheduler)
Automates test scheduling based on environment configuration and MDMS test standard configurations.
Triggers the scheduler API of the PQM Service to generate tests.
Runs tests at specified, configurable intervals.
Sanitation worker welfare feature
A new worker registry concept has been created.
It covers creating a worker, updating worker details, and searching for and tagging a worker for different operations in sanitation programmes.
Driver-individual migration feature
The driver-individual migration script fetches all existing drivers in the system and creates corresponding individual records in the system.
FSM Inbox V2
The inbox for FSM has been upgraded from V1 to V2.
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
manualTestPendingEscalationDays | The number of days after which a scheduled test that is still pending requires escalation. |
pendingTestsToDisplayWithinDays | The number of days within which pending tests, assessments, or evaluations should be displayed or considered. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
input | Materials provided as input to a stage. |
output | Materials produced as output from a stage. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
type | Defines the type of process. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
stages | A list of stages that come under a particular process. |
wasteType | The classification of waste materials based on their characteristics or origin. |
code | Alphanumeric or numeric representation assigned to uniquely identify the record. |
name | Textual or human-readable identity given to a record. |
description | Details or explanation for a record. |
plantType | The classification of plants based on their processing. |
wasteType | The classification of waste materials based on their characteristics or origin |
address | Location details for a particular plant. |
processes | A list of processes that happen under a particular plant. |
plantConfig | Configuration details for a particular plant. |
ULBs | Comma-separated list of ULBs that have operational access to the plant. |
PlusCode | Plus Code address of the plant. |
Latitude | Latitude of the plant location. |
Longitude | Longitude of the plant location. |
PlantLocation | Location of the plant. |
PlantOperationalTimings | Operational timings of the plant. |
PlantOperationalCapcityKLD | Maximum operating capacity of the plant in KLD. |
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
parameter | Anything that is measurable as an input/output for a particular stage. |
unit | The unit for measuring this particular parameter. |
benchmarkRule | The rules according to which a test value should be tested (For example, greater than, less than, equals to). |
benchmarkValues | Specific numbers on which the benchmark rule is applied for a test value. |
allowedDeviation | The acceptable difference from the benchmark values. |
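The benchmarkRule/benchmarkValues/allowedDeviation trio above is what a test value is evaluated against to produce the PASS/FAIL status. A minimal sketch of that evaluation, assuming rule codes such as GREATER_THAN, LESS_THAN, and EQUALS (the actual codes live in the PQM.BenchmarkRule master):

```python
def evaluate(value, rule, benchmarks, allowed_deviation=0.0):
    """Return "PASS" or "FAIL" for a single test value.

    rule: assumed rule codes "GREATER_THAN", "LESS_THAN", "EQUALS";
    benchmarks: benchmark value(s) the rule applies to;
    allowed_deviation: acceptable slack around the benchmark.
    """
    b = benchmarks[0]
    if rule == "GREATER_THAN":
        ok = value > b - allowed_deviation
    elif rule == "LESS_THAN":
        ok = value < b + allowed_deviation
    elif rule == "EQUALS":
        ok = abs(value - b) <= allowed_deviation
    else:
        raise ValueError(f"unknown rule: {rule}")
    return "PASS" if ok else "FAIL"
```

For example, a pH reading of 7.2 against an EQUALS benchmark of 7.0 with a deviation of 0.5 passes, while a BOD reading above its LESS_THAN benchmark plus deviation fails.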
code | Alphanumeric or numeric representation assigned to uniquely identify the field. |
plant | Plant code for which this test standard is registered. |
process | Process code for which this test standard is registered. |
stage | Stage code for which this test standard is registered. |
material | Material code for which this test standard is registered. |
qualityCriteria | The quality criteria which is applicable for the unique combination of plant, process, stage, and material. |
frequency | The frequency at which this test standard should be scheduled. |
sourceType | The origin of this particular test standard. |
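Putting the TestStandard fields together, here is an illustrative record plus a toy next-due computation for the frequency field. The code value, the ISO-8601-style frequency format, and the tiny parser are assumptions for illustration only.

```python
# Illustrative TestStandard entry combining the fields described above;
# the authoritative schema lives in MDMS (PQM.TestStandard).
test_standard = {
    "code": "TS_PLANT1_TREATMENT_INLET_SLUDGE",
    "plant": "PLANT_1",
    "process": "TREATMENT",
    "stage": "INLET",
    "material": "SLUDGE",
    "qualityCriteria": ["PH", "BOD"],
    "frequency": "P7D",   # assumed ISO-8601-style duration: every 7 days
    "sourceType": "LAB",
}

def next_due_epoch_ms(last_run_ms, frequency):
    # minimal parser for "P<n>D"-style frequencies (an assumption)
    days = int(frequency.strip("PD"))
    return last_run_ms + days * 24 * 60 * 60 * 1000
```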
This page contains the changes related to Process Quality Management-related services (PQM) and Sanitation Worker Welfare along with the MDMS, DevOps and configuration setups required to accommodate these features.
The PQM service is required to create, update, search and evaluate tests against benchmarks. Follow the steps given below:
Add the PQM-Service.
Add the new persister file: pqm-service-persister.
Add the Helm chart.
Configure the build.
Make the role action-mapping changes in the MDMS.
Master Data for PQM
Click on the Job-builder once the above steps are complete.
Restart the following services: egov-accesscontrol, egov-mdms-service, egov-persister
Add the PQM Scheduler configuration.
Add the Helm chart.
Configure the build.
Make the role action-mapping changes in the MDMS.
Click on the Job-builder once the above steps are complete.
Restart the following services: egov-accesscontrol, egov-mdms-service.
Add the PQM Anomaly Finder service.
Add the Helm chart.
Configure the build.
Add mdms changes, refer to this link.
Add config changes, refer to this link.
Add the Helm chart, refer to this link.
Deploy the latest build of vendor and FSM, refer to this link.
The sanitation worker feature depends on the individual service. Refer to this link.
Deploy the latest build of the individual service. Refer to this link.
Migrate all existing drivers to individuals using the migration script.
Make sure UI MDMS changes are added from #ui-related-mdms-files.
Refer to the technical documentation for the TQM UI to run the TQM UI.
Refer to the technical documentation for the Sanitation Worker UI.
PQM Business Service Request
Instructions for Production execution:
Replace the tenant ID for the production environment.
Replace the RequestInfo object with production user info details.
A new business service, PQM, has been created for PQM.
Create the businessService (workflow configuration) using the /businessservice/_create API.
The following is the product configuration for PQM:
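Since the workflow configuration itself is not reproduced here, the following is a hedged sketch of what a /businessservice/_create payload for the new PQM business service could look like, modelled on common egov-workflow-v2 conventions. The state names, actions, and roles are illustrative assumptions, not the released configuration.

```python
# Hypothetical sketch of a workflow (businessService) create payload.
# State, action, and role names are assumptions, not the shipped config.
def build_pqm_workflow(tenant_id):
    return {
        "RequestInfo": {"apiId": "Rainmaker"},
        "BusinessServices": [{
            "tenantId": tenant_id,
            "businessService": "PQM",
            "business": "pqm-service",
            "states": [
                {"state": None, "isStartState": True,
                 "actions": [{"action": "CREATE", "nextState": "PENDINGRESULTS",
                              "roles": ["PQM_TP_OPERATOR"]}]},
                {"state": "PENDINGRESULTS",
                 "actions": [{"action": "SUBMIT", "nextState": "SUBMITTED",
                              "roles": ["PQM_TP_OPERATOR"]}]},
                {"state": "SUBMITTED", "isTerminateState": True, "actions": []},
            ],
        }],
    }

wf = build_pqm_workflow("pg")
```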
The driver-individual migration script is responsible for fetching all existing drivers in the system and creating corresponding individuals in the system.
The data migration has been implemented in Vendor Service with latest stable build -
File Path -
File Path -
Please use FSM_ADMIN credentials for accessing this API.
PQM inbox | MDMS for inbox | Added MDMS configuration for inbox-v2 integration. |
Role Action | MDMS | Added role-action mapping for all APIs of the PQM service. |
PQM.BenchmarkRule | id, code, name | code |
PQM.QualityTestLab | code | code |
PQM.Material | code, name | code |
PQM.Parameter | code, name | code |
PQM.PlantConfig | code, pendingTestsToDisplayWithinDays, pendingTestsToDisplayWithinDaysInbox, pendingTestsToDisplayWithinDaysForULB, iotAnomalyDetectionDays, manualTestPendingEscalationDays | code |
PQM.PlantType | code, name | code |
PQM.ProcessType | code, name | code |
PQM.Unit | code, name | code |
PQM.WasteType | code, name | code |
PQM.SourceType | code, name | code |
PQM.Stage | code, name, output | code |
PQM.Process | code, type, name, stages, wasteType | code |
PQM.Plant | code, plantType, active | code |
PQM.QualityCriteria | code, name, plantType, wasteType, address, processes | code |
PQM.TestStandard | code, plant, process, stage, material, qualityCriteria, frequency, sourceType | code |
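The sequence in the table matters because later masters reference earlier ones (for example, PQM.Plant references PQM.PlantType, and PQM.TestStandard references plant, process, stage, and material codes). A small helper sketch that keeps any subset of masters in the required load order:

```python
# Required creation order for PQM MDMS-v2 schemas and data, taken from the
# table above. The helper itself is illustrative.
MASTER_SEQUENCE = [
    "PQM.BenchmarkRule", "PQM.QualityTestLab", "PQM.Material", "PQM.Parameter",
    "PQM.PlantConfig", "PQM.PlantType", "PQM.ProcessType", "PQM.Unit",
    "PQM.WasteType", "PQM.SourceType", "PQM.Stage", "PQM.Process",
    "PQM.Plant", "PQM.QualityCriteria", "PQM.TestStandard",
]

def load_order(masters):
    """Return the given masters sorted by their position in the sequence."""
    return sorted(masters, key=MASTER_SEQUENCE.index)
```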
The Process Quality Management (PQM) anomaly finder service helps in monitoring anomalies in process quality and notifies the concerned user groups.
The following masters need to be created as part of this module:
Plant
PlantConfig
List of services that PQM service depends on:
The user-event service API receives the notification for the anomaly.
Deploy the latest build if the service is not present in the environment: egov-user-event-db:v1.2.0-32caf0d992-25.
Add the required changes. For this, kindly refer to the PR below.
Path: deploy-as-code/helm/charts/core-services/egov-user-event/Chart.yaml
The alert notification format needs to be added in localisation.
Find Anomaly
Learn how to set up and configure the FSM service
Faecal sludge management (FSM) is a system that enables citizens to raise a request for septic tank cleaning with their urban local bodies (ULBs) directly or by reaching out to the ULB counter. Citizens can track the application, make a payment for the charges, and rate the service. This document contains details on how to set up FSM, and describes the functionalities it provides. It contains the details about the feature enhancements being released as part of FSM v1.3.
As part of FSM v1.4, a new worker registry concept is being introduced. It covers the creation of a worker, updating of details, and searching for and tagging a worker for different operations in sanitation programmes. While assigning a vehicle for the application, the functionality also provides the ability to assign multiple sanitation workers to the application.
The FSM service enables mapping plant codes to FSTP employees, ensuring that each employee is only authorised to manage their own plant account, enhancing control and accountability. The mapping process needs to check the following:
PlantCode: the unique plant code configured in MDMS.
employeeUuid: the unique user ID of the employee with the user type fsm-fstp.
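A sketch of a plant-user mapping create payload built from the two checks above; apart from the plant code and employee UUID, the surrounding field names are assumptions for illustration.

```python
# Hypothetical plant-user mapping payload for pqm-service/plant/user/v1/_create.
# Only plantCode and employeeUuid come from the documented checks; the rest
# of the field names are illustrative.
def build_plant_user_mapping(tenant_id, plant_code, employee_uuid):
    return {
        "RequestInfo": {"apiId": "pqm-service"},
        "plantUsers": [{
            "tenantId": tenant_id,
            "plantCode": plant_code,      # unique plant code configured in MDMS
            "employeeUuid": employee_uuid, # user ID of the fsm-fstp type employee
            "isActive": True,
        }],
    }

mapping = build_plant_user_mapping("pg.citya", "PLANT_1", "0a1b2c3d")
```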
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Kafka server is up and running
egov-persister service is running and has fsm-persister config path added in it
PSQL server is running and a database is created to store FSM Application data
(Optional) Indexer config for FSM is added in egov-indexer yaml paths to index the generated data. An index is required for data visualisation in kibana or in DSS.
Following services should be up and running:
egov-user
egov-workflow-v2
egov-persister
egov-localisation
egov-notification-sms
egov-mdms
egov-idgen
egov-url-shortening
vehicle
vendor
fsm-calculator
billing-service
collection-services
individual-services
Citizens can file, track, and rate the application for cleaning the septic tank.
A ULB employee can file application for cleaning the septic tank on behalf of a citizen.
A ULB employee can assign a DSO to the given application with a possible service date.
A DSO can accept or reject the application.
A DSO or a ULB employee can complete the FSM application after cleaning the septic tank.
The FSM admin in a ULB can cancel the application at any stage before completing the application.
A ULB employee or an admin can view the audit log of the given application.
Capture citizen gender information if not present or pre-populate the gender information when a citizen is creating the FSM application.
Add citizen's choice for payment.
Introducing pre-pay and post-pay service.
Post-pay service: Workflow changes (Desludging application and vehicle trip).
Post-pay service: Employee flow enhancements.
Add payment selection for DSO.
Post-pay service: The number of trips is editable, and the price calculation is now based on the number of trips entered by the DSO.
Capture DSO and FSTPO gender.
Show citizen gender on FSM DSS.
Select vehicle capacity instead of vehicle make.
Citizen Notifications | Payment Options | Timeline Enhancements.
FSTPO vehicle log inbox enhancements.
FSTPO can decline the vehicle trip.
Add owner attribute for vehicles.
Add ULB contact details in the FSM application flow.
DSO can edit pit and property usage details.
Show vehicle trip status in employee inbox along with the desludging application.
Unrestricted assignment of service requests to a single vehicle.
Vehicle logging at FSTP decoupled from the FSM module.
Photo and attachment view in the application of the ULB employee UI.
Dashboard enhancement.
Advance pay service: Employee flow enhancements.
Introduced two new workflows in the system: FSM_ADVANCE_PAY_SERVICE and FSM_ZERO_PAY_SERVICE.
Advance pay service: The number of trips is editable (it can be increased or decreased as required), and the price calculation is now based on the number of trips entered by the DSO or ULB.
Part payment is allowed while creating the application.
The ULB and DSO are allowed to decrease the number of trips if not required, provided full payment has not been made.
The ULB and DSO are allowed to increase or decrease the number of trips any number of times.
With the updated number of trips, an updated bill is generated.
Delinked the payment from the DSO in-progress state.
Zero pay service: Employee flow enhancements.
Zero pay service: The system now skips collection and does not generate a demand for zero-price applications.
Demand generation process: A demand is generated every time the trip count is updated.
Demand generation process: Added validation so that the application cannot be completed from the ULB side before all payments are complete.
Enhancement of FSM receipt.
Enhancement of assigning a sanitation worker to an FSM application.
Enhancement of assigning a driver to an FSM application.
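The trip-based demand behaviour described in the enhancements above (regenerate the demand on every trip update; block completion until fully paid) can be sketched as follows; the function and field names are illustrative, not the service's actual API.

```python
# Illustrative sketch of trip-based demand recalculation: each trip update
# regenerates the demand, and completion is blocked until nothing is due.
def recalculate_demand(trips, price_per_trip, amount_paid):
    total = trips * price_per_trip
    due = max(total - amount_paid, 0)
    return {"totalAmount": total, "amountDue": due, "canComplete": due == 0}
```

For example, increasing an application from two to three trips at 500 per trip, with 1000 already paid, leaves 500 due and keeps the application incompletable.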
As part of FSM v1.4, for the FSTP plant-user mapping configuration, PlantInfo has been moved from MDMS v1 to MDMS v2. Follow the steps below for the changes.
For a new environment, these changes are required:
Create plantInfo in the PQM.Plant schema.
Use the existing collection to create, update, and search.
Users
A user with userType employee and the FSM_CREATOR_EMP role.
FSM can be integrated with any ULB or system which wants to track the FSM application. The organisations can customise the workflow depending on their product requirements.
Easy tracking and resolution of the FSM application.
Configurable workflow according to client requirement.
Citizens/ULB employees can file an application request using the /fsm/v1/_create endpoint.
An organisation or system can search FSM applications using the /fsm/v1/_search endpoint.
Once the application is filed, the organisation or system can call the /fsm/v1/_update endpoint to move the application further in the workflow until it gets resolved.
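The three integration calls above follow a fixed order: create, then search to track, then update until resolution. A trivial sketch of that sequencing (only the endpoint paths come from the text; the helper is illustrative):

```python
# Integration flow sketch: the endpoint paths are documented; the sequencing
# helper around them is illustrative.
LIFECYCLE = [
    ("/fsm/v1/_create", "file a new desludging application"),
    ("/fsm/v1/_search", "track the application"),
    ("/fsm/v1/_update", "move the application through the workflow"),
]

def next_endpoint(last_called):
    """Given the endpoint last called, return the endpoint to call next.

    _update repeats until the application is resolved.
    """
    paths = [p for p, _ in LIFECYCLE]
    i = paths.index(last_called)
    return paths[i + 1] if i + 1 < len(paths) else "/fsm/v1/_update"
```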
Introduced new inbox service to get the FSM applications in registered ULB employee inbox. With this, a ULB employee can track the application or perform the actions based on employee role.
ULB employees can also apply the filter to check the particular state or applications or any other filter as required.
Inbox V2 provides the same features as Inbox V1.
Integration
Add the MDMS changes in the given path and restart the MDMS service.
Path: data/pg/inbox-v2/InboxConfiguration.json
Add the indexer file in the given path and restart the services.
Path: sanitation/egov-indexer/fsm-inbox-indexer.yml
Currently, FSM is provided as an ad hoc service, which means a user has to create an FSM request every time. To avoid this, after a configured number of days the system itself will create the same FSM application, and if the user wants the service, he/she will pay the amount.
As mentioned above, we need to define a time parameter in order to create a periodic application. For that, we added the periodic service master, where we configure the time limit and whether the scheduler is enabled or not. Find the configuration and location below.
The cronjob reads the cron jobs configured in cronjobapiconfig.json and, based on the scheduled time, calls the API that is configured. Find the configuration and file location below:
We are using the fsm/v1/_schedular API. This API reads the master data for each tenant and, based on the time limit configured for that tenant, gets all eligible applications and creates periodic applications for those FSM applications.
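The eligibility check performed by the scheduler API described above can be sketched as: completed applications whose last service date is older than the tenant's configured time limit become candidates for periodic creation. The field names here are assumptions for illustration.

```python
# Illustrative eligibility filter for periodic application creation.
# Status and date field names are assumptions, not the service's schema.
def eligible_for_periodic(applications, time_limit_days, today_epoch_day):
    return [
        app for app in applications
        if app["status"] == "COMPLETED"
        and today_epoch_day - app["lastServiceEpochDay"] >= time_limit_days
    ]
```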
data/pb/FSM/AdvancePayment.json
master-config.json
Infra changes
We added a new chart called mdms-read-cronjob. Find the chart location below:
Details for setting up the FSM calculator service
FSM calculator is a system that enables the FSM admin to create billing slabs for the FSM application(s) with different combinations of property type, slum, tank, and capacity. It generates the demand after calculating the charges for the given application using the billing slab already configured.
This document contains the details on how to set up the FSM calculator service, describes the functionalities it provides, and details the enhancements made to the FSM calculator service.
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Kafka server is up and running
egov-persister service is running and has fsm-calculator-persister config path added in it
PSQL server is running and database is created to store FSM Application data
The following services should be up and running-
- egov-persister
- egov-mdms
- fsm
- billing-service
EXISTING
The FSM admin, a ULB employee with the FSM admin role, can create and update billing slab(s).
A ULB employee with the FSM_CREATOR or FSM_EDITOR role can search billing slab(s).
A citizen or ULB employee can file, track, and rate the application for cleaning a septic tank.
A ULB employee can get the estimate for the FSM application.
The FSM service internally calls fsm-calculator to generate a demand.
The vehicle type check has been removed from the calculator service, and the bill amount is calculated based on the number of trips entered while submitting the FSM application.
ENHANCEMENT
Bill amount is calculated based on the number of trips entered while updating the number of trips in the FSM application.
Added validation for advance payment with the configuration.
Added validation for maximum total advance payment.
Added cancellation charges for canceling the application.
Validation before completing the request with the payment.
The minimum part payment is configurable, that is, either a fixed amount or a percentage, and the calculation is done based on the MDMS config value.
The minimum cancellation fee is configurable, that is, either a fixed amount or a percentage, and the calculation is done based on the MDMS config value.
Demand generation process: A demand is generated every time the trip count is updated.
Demand generation process: Added validation so that the application cannot be completed from the ULB side before the payment is complete.
Sample Curl
The FSM-calculator will be integrated with the FSM application. The FSM application internally will invoke the fsm-calculator service to calculate and generate demand for the charges.
The calculation and demand generation logic will be separated from the FSM service. For each implementation, the calculation implementation can be changed, if required, without modifying the FSM service.
The FSM application calls fsm-calculator/v1/_calculate to calculate and generate the demand for the FSM application.
A ULB employee can call fsm-calculator/v1/_estimate to get the estimates for the FSM application.
A ULB employee can create a billing slab by calling fsm-calculator/v1/billingSlab/_create.
A ULB employee can update a billing slab by calling fsm-calculator/v1/billingSlab/_update.
A ULB employee can search billing slabs by calling fsm-calculator/v1/billingSlab/_search.
The FSM application calls fsm-calculator/v1/_cancellationFee to calculate the cancellation charge based on the configuration data, that is, either a fixed amount or a percentage.
The FSM application calls fsm-calculator/v1/_advanceBalanceCalculate to calculate the advance charge based on the configuration data, that is, either a fixed amount or a percentage.
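The "fixed or percentage" configuration mentioned for both the cancellation fee and the advance amount can be expressed as a single resolution helper; the calculationType and amount keys are assumed names for the MDMS config values.

```python
# Illustrative resolver for fixed-vs-percentage charge configuration.
# The config keys ("calculationType", "amount") are assumptions.
def resolve_amount(config, total_amount):
    if config["calculationType"] == "FIXED":
        return config["amount"]                       # flat charge
    if config["calculationType"] == "PERCENTAGE":
        return total_amount * config["amount"] / 100.0  # share of the bill
    raise ValueError("unknown calculationType")
```

For a 1500 bill, a fixed config of 100 yields 100, while a 10% config yields 150.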
TBD
The release provides the functionality of tagging the gram panchayat (GP) and villages via urban local bodies (ULBs). The ULB will be servicing desludging requests for households in the respective GP.
The functionality included are:
Adding GPs while creating an application.
Make the amount per trip field editable for GPs.
FSM uses several core components, and is built on top of these reusable building blocks to deliver citizen services as part of Faecal Sludge Management (FSM).
To develop a new service, one needs to create a micro-service and make it available through the API gateway. The API gateway calls the User Service for authentication and the Access Service for authorisation. The service developer can configure the roles and map the roles and actions using the Master Data Management Service.
Citizen Dashboard: The service user interface can be developed as part of the Citizen Dashboard or can be an independent solution. The citizen can log in using a mobile number and OTP. They can apply for a new service using the service UI. The File Store Service allows users to upload relevant documentation.
The Persister Service stores the submitted application asynchronously into the registry. The PII data is encrypted using the Encryption Service before storing. All changes are digitally signed and logged using the Signed Audit Service (ongoing). The Indexer Service transforms and enriches the application data. The Indexer Service also strips the PII data and sends it to the Analytics Data Store(ElasticSearch).
The Dashboard Backend Service executes configured queries on the stripped data and makes the aggregated data available to the Dashboard Frontend for the administrator or employee view. The views are in accordance with the user role access which is also configurable.
The Billing Service generates a demand based on the calculation logic for the given service delivery. Based on this demand, a bill is generated against which the payment has to be made. The citizen can either make an online payment or can pay at the counter (offline payment). The Payment Gateway Service is called for online payments and this is integrated with third-party service providers. The service routes the citizen to the service provider's website and then back to the citizen's UI once the payment is successful.
The Collection Service is the payment registry and records all the successful payments. For offline payments, a record is made in the collection service after the collection of the Cash/Cheque/DD/RTGS. The allowed payment modes are configurable. The PDF service is used to generate receipts based on a configurable template. The service then triggers the Workflow Service to assign tasks for verification and approval. Workflows can be configured.
The Employee Service allows employee registrations and enables them to log in using the Employee Dashboard. The dashboard displays the list of pending applications as per the employee's role. The employee can perform actions on these applications using the employee UI for the service. As the status changes or actions are taken on the applications, the relevant SMS and Email Notification Service sends the updates to the applicant. Once the application is approved, the applicant can download the final certificate which is generated using PDF Service.
DIGIT has multi-tenant configuration capabilities. The Location Service allows the configuration of the tenant hierarchy and maps multiple tenants like country, state, district, or state, department, and sub-department. Each tenant can have their own configurations for service, roles, workflows etc. This allows for variations across different agencies in tune with the local context. The Language Service facilitates the support of multiple languages in DIGIT. This service stores the translations in multiple languages.
Faecal sludge management (FSM) is a system that enables citizens to raise a request for septic tank cleaning with their urban local bodies (ULBs) directly or by reaching out to the ULB counter. Citizens can track the application, make a payment for the charges, and rate the service.
The FSM calculator is a system that enables the FSM administrator to create billing slabs for the FSM application(s) with different combinations of property type, slum, tank and capacity. It generates the demand after calculating the charges for the given application using the billing slab already configured.
The vendor registry is a system that enables urban local body (ULB) employees to create and search for a vendor, that is, the desludging operator (DSO) and driver entities with appropriate vehicle entities for the FSM application.
The vehicle registry is a system that enables urban local body (ULB) employees to create and search vehicle entities, schedule vehicle trips for FSM applications, and track vehicle trips.
Process Quality Management (PQM) is a service responsible for treatment quality monitoring. It improves the quality of processes such as waste treatment and water treatment. We are building these services on top of the existing building blocks of DIGIT.
To develop a new service, one needs to create a micro-service and make it available through the API gateway. The API gateway calls the User Service for authentication and the Access Service for authorisation.
Implementation users and service developers can configure roles and map them to actions. Masters such as plants, treatment processes, stages, and testing standards are created using the Master Data Management Service (MDMS).
Citizen Dashboard: The service user interface can be developed as part of the Citizen Dashboard or can be an independent solution. The citizen can log in using a mobile number and OTP. They can apply for a new service using the service UI. The File Store Service allows users to upload relevant documentation.
The Persister Service stores the submitted application asynchronously into the registry. The PII data is encrypted using the Encryption Service before storing. All changes are digitally signed and logged using the Signed Audit Service (ongoing). The Indexer Service transforms and enriches the application data. The Indexer Service also strips the PII data and sends it to the Analytics Data Store(ElasticSearch).
The Dashboard Backend Service executes configured queries on the stripped data and makes the aggregated data available to the Dashboard Frontend for the administrator or employee view. The views are in accordance with the user role access which is also configurable.
The Employee Service allows employee registrations and enables them to log in using the employee dashboard. The dashboard displays the list of pending applications as per the employee's role. The employee can perform actions on these applications using the employee UI for the service. As the status changes or actions are taken on the applications, the relevant SMS and Email Notification Service sends the updates to the plant operator and lab collectors. Once the test is completed, the lab/urban local body (ULB) employee can upload and download the test results submitted.
The PQM Service schedules and manages recurring tests to evaluate the quality of a process. It can read sensor devices assigned to a process stage, and it updates the treatment quality workflow status based on results submitted manually or collected through IoT devices. It provides APIs to create ad-hoc tests, update results, search for results, and retrieve identified anomalies.
The PQM Anomaly Finder service finds anomalies in the process quality and raises notifications/alerts based on the defined anomaly check configurations.
The Process Quality Management (PQM) service will help users to create, update, and search for process quality monitoring tests. The service will evaluate the uploaded test values against benchmarks and produce a result status (PASS/FAIL). Test results will be further processed for anomaly analysis. The service can perform two types of tests: Manual Test (Lab) and Automatic Test (IoT-based).
The service enables mapping plant codes to operators and ULB admins, ensuring that each is only authorised to manage plants linked to their account, enhancing control and accountability. The mapping process must validate the following attributes:
PlantCode: Unique plant code configured in MDMS.
PlantUserType: The user type, either plant-operator or ulb-admin.
PlantUserUuid: Unique user ID of the employee, who must have the ulb-admin or plant-operator type.
List of services that PQM service depends on:
Localisation
The following masters need to be created as part of this module:
Plant
PlantType
PlantConfig
Process
ProcessType
ProcessSubType
Stage
Material
Parameter
Unit
BenchmarkRule
TestingStandard
QualityCriteria
Device
DeviceDetails
DeviceConfig
Create New
Update - new
Search
Here are the articles in this section:
The PQM scheduler is a cronjob for scheduling tests. It runs based on the environment configuration and triggers the schedule API from PQM-Service to generate tests based on the Test Standards present in MDMS.
MDMS
User
PQM-Service
It creates tests based on MDMS Test Standards using the pqm-service /v1/_scheduler API.
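The scheduling decision can be sketched as follows, assuming a hypothetical frequency-in-days field on the Test Standard; the real schedule derivation in the service may differ:

```python
from datetime import datetime, timedelta

def is_test_due(last_test: datetime, frequency_days: int, now: datetime) -> bool:
    """A new test is generated once the configured frequency (from the
    MDMS Test Standard) has elapsed since the last generated test."""
    return now - last_test >= timedelta(days=frequency_days)
```

With a 3-day frequency, a test last generated on 1 January is due again on 4 January but not on 3 January.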
Create Lab Test (CronJob)
Here are the articles in this section:
Details for registering new vendors
The vendor registry is a system that enables urban local body (ULB) employees to create and search for a vendor, that is, the desludging operator (DSO) and driver entities with appropriate vehicle entities for the FSM application. This document contains the details about how to set up the vendor service and describes the functionalities provided.
As part of the worker welfare v1.0, a new worker registry concept is being introduced. The creation of a worker, updation of details, and searching and tagging a worker for different operations on sanitation programmes will be covered. We will leverage the Individual Registry for storing and querying details about a worker.
The individual service is an enhanced version of the user service that houses data about individuals. The individual service is being re-used from .
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Kafka server is up and running.
egov-persister service is running and has a vendor-persister config path added in it.
PSQL server is running and database is created to store FSM application data.
Following services should be up and running:
- egov-mdms-service
- egov-user-service
- boundary-service
- vehicle-service
EXISTING
Added payment preference and agency attributes for DSO.
Added gender attribute in the create and update APIs for vendor.
Updated the vendor search API to add vehicleCapacity in the search parameter to search all vendors matching the vehicle capacity specified in the search parameter.
Introduced the vendor tab.
Option to add/remove/update vendors individually.
Users can add vehicles and drivers.
Search for the list of all vehicles not associated with any vendors.
Users can enable or disable the vendor.
Partial search by vendor name in the FSM Registry vendor tab.
Updating vendor information, such as gender, mobile number, and locality/mohalla, in the FSM Registry.
ENHANCEMENT
Changes from version 1.3.1 to 1.4.0:
Change from driver concept to worker.
Deprecation of the driver table.
Backward compatibility for existing drivers (converting a driver user into an individual and mapping/backfilling to vendors).
Introducing worker vendor mapping.
Creation of workers directly using Individual registry APIs.
The DSO for the FSM system is a vendor. For every city/ULB, a DSO should be created with the representative details as owner, associated vehicles and drivers.
Sample Curl
Any system or DIGIT module can be integrated with the vendor service. It helps manage the vendor along with its vehicles, drivers, and owner (representative), and enables the representative/owner to log into the system to carry out role-specific operations.
Validation of DSO/vendor availability.
Fetch the vehicle assigned to the DSO.
Fetch the drivers assigned to the DSO.
FSM can call vendor/v1/_search to fetch the DSOs and the respective vehicles and drivers.
File:
Note: For the test not submitted notification, the notification will only be sent when the frequency of a is greater than the manualTestPendingEscalationDays from
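The rule in the note above can be sketched as a simple predicate; the parameter names are illustrative:

```python
def should_notify_pending(frequency_days: int, manual_test_pending_escalation_days: int) -> bool:
    """Per the note above, the test-not-submitted notification fires only
    when the test's frequency exceeds the configured escalation window."""
    return frequency_days > manual_test_pending_escalation_days
```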
Link to
Path: deploy-as-code/helm/charts/sanitation/fsm/values.yaml -
| Description | Topic Name |
|---|---|
File: the file that needs to be added
Create a billing slab with a combination of PropertyType (refer values from ), Slum (YES/NO), and capacityFrom/capacityTo, which refer to the vehicle tank capacity.
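A billing-slab lookup over such a configuration can be sketched like this; the slab values are illustrative, not a real MDMS configuration:

```python
# Illustrative billing slabs: each slab matches a property type, slum flag,
# and a tank-capacity band, and carries a per-trip price.
SLABS = [
    {"propertyType": "RESIDENTIAL", "slum": "NO", "capacityFrom": 0,    "capacityTo": 3000, "price": 1000},
    {"propertyType": "RESIDENTIAL", "slum": "NO", "capacityFrom": 3001, "capacityTo": 6000, "price": 1500},
]

def find_slab(property_type: str, slum: str, tank_capacity: int):
    """Return the first slab whose property type, slum flag, and capacity
    band match the application, or None if no slab applies."""
    for slab in SLABS:
        if (slab["propertyType"] == property_type
                and slab["slum"] == slum
                and slab["capacityFrom"] <= tank_capacity <= slab["capacityTo"]):
            return slab
    return None
```

The calculator then derives the demand from the matched slab's price and the number of trips.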
| Description | Topic Name |
|---|---|
PQM-
Plant User Mapping-
| Description | Topic |
|---|---|
Adding GPs while creating an application
The user can add a GP and the villages under that GP while creating an application. The GPs are tagged to the ULBs closest to them. The ULBs will be servicing the desludging requests from the GPs and the villages falling under that GP.
Make the amount per trip field editable
Once the application is created with a selection of GPs, the amount per trip field will become editable for the user.
| Category | Services | GIT TAGS | Docker artifact ID | Remarks |
|---|---|---|---|---|
| FSM v1.4 | FSM | | fsm:v1.4.0-4ca02ac299-75 | |
| | FSM calculator | | fsm-calculator:v1.2.0-01dd6c9b3a-14 | |
| | Vehicle | | vehicle:v1.3.0-5682061fd3-18 | |
| | Vendor | | vendor-db:v1.3.0-ece6fa00d1-35 | |
| | PQM | | pqm-service:v1.0.0-7c253cb947-169 | |
| | PQM Anomaly | | pqm-anomaly-finder-db:v1.0.0-4eb854b58b-52 | |
| | PQM Scheduler | | pqm-scheduler:vNA-7201d5466b-13 | |
| Digit-UI FSM v1.4 | DIGIT UI | | sanitation-ui:v1.5.0-99fe703c60-449 | |
DIGIT dependency builds
The FSM release is bundled with the DIGIT 2.8 release. Hence, the release builds for DIGIT 2.8 release can be accessed here.
Configs v1.4
MDMS v1.4
Localisation v1.4
Devops v1.4
| Category | Services | GIT TAGS | Docker Artifact ID | Remarks |
|---|---|---|---|---|
| Frontend (old UI) v2.9 | Citizen | | citizen:v1.8.0-b078fa041d-97 | |
| | Employee | | employee:v1.8.0-2ac8314b2f-116 | |
| | DSS dashboard | | dss-dashboard:v1.8.0-0d70d60e63-53 | |
| Core Services v2.9 | Encryption | | egov-enc-service:v1.1.3-44558a0-3 | |
| | xState chatbot | | xstate-chatbot:v1.1.1-44558a0-2 | |
| | Searcher | | egov-searcher:v1.1.5-72f8a8f87b-16 | |
| | Payment gateway | | egov-pg-service:v1.2.3-ffbb7a6-4 | |
| | Filestore | | egov-filestore-db:v1.3.0-72d8393-4 | |
| | Zuul - API gateway | | zuul:v1.3.1-76bf31f-5 | |
| | Mail notification | | egov-notification-mail:v1.2.0-9fde481-3 | |
| | SMS notification | | egov-notification-sms:v1.1.3-48a03ad7bb-10 | |
| | Localisation | | egov-notification-sms:v1.2.0-9fde481-3 | |
| | Persist | | egov-persister:v1.1.5-3371bc2-5 | |
| | ID gen | | egov-idgen:v1.2.3-44558a0-3 | |
| | User | | egov-user:v1.2.8-9fde481-19 | |
| | User chatbot | | egov-user-chatbot:v1.3.0-6cfa52c1f9-1 | |
| | MDMS | | egov-mdms-service:v1.3.2-44558a0-3 | |
| | URL shortening | | egov-url-shortening:v1.1.2-1715164454-3 | |
| | Indexer | | egov-indexer:v1.1.7-44558a0-3 | |
| | Report | | report:v1.3.4-96b24b0d72-16 | |
| | Workflow | | egov-workflow-v2:v1.3.0-fbea797-11 | |
| | PDF generator | | pdf-service:v1.1.6-96b24b0d72-22 | |
| | Chatbot | | chatbot:v1.1.6-72f8a8f87b-8 | Deprecated |
| | Access control | | egov-accesscontrol:v1.1.3-72f8a8f87b-24 | |
| | Location | | egov-location:v1.1.5-fbea797-5 | |
| | OTP | | egov-otp:v1.2.3-9fde481-3 | |
| | User OTP | | user-otp:v1.2.0-9fde481-8 | |
| | NLP engine | | nlp-engine:v1.0.0-fbea6fba-21 | No changes in the current release. |
| | Egov document uploader | | egov-document-uploader:v1.1.1-6cfa52c1f9-4 | |
| | National dashboard ingest | | national-dashboard-ingest:v1.0.1-44558a0-3 | New service |
| | National dashboard Kafka pipeline | | national-dashboard-kafka-pipeline:v1.0.1-44558a0-3 | New service |
| Business Services v2.9 | Apportion | | egov-apportion-service:v1.1.5-72f8a8f87b-5 | |
| | Collection | | collection-services:v1.1.6-c856353983-29 | |
| | Billing | | billing-service:v1.3.4-72f8a8f87b-39 | |
| | HRMS | | egov-hrms-db:v1.2.6-116d8db-9 | |
| | Dashboard analytics | | dashboard-analytics:v1.1.7-1ffb5fa2fd-49 | |
| | Dashboard ingest | | dashboard-ingest:v1.1.4-72f8a8f87b-10 | |
| | EGF instrument | | egf-instrument:v1.1.4-72f8a8f87b-4 | |
| | EGF master | | egf-master:v1.1.3-72f8a8f87b-15 | |
| | Finance collection voucher consumer | | finance-collections-voucher-consumer:v1.1.6-96b24b0d72-18 | |
| Municipal Services v2.9 | Individual | | | |
| | User event | | egov-user-event:v1.2.0-c1e1e8ce24-21 | |
| | Inbox | | inbox:v1.3.1-733167672c-14 | |
| Utilities Services v2.9 | Custom consumer | | egov-custom-consumer:v1.1.1-72f8a8f87b-3 | |
| | | | egov-pdf:v1.1.2-344ffc814a-37 | |
| eDCR v2.9 | eDCR | | egov-edcr:v2.1.1-1815083c26-25 | |
| Finance v2.9 | Finance | | egov-finance:v3.0.2-0d0a8db8ff-28 | |
| Config | Service | Remarks |
|---|---|---|
| pqm-persister | PQM | Added persister config for the pqm service. |
| pqm-anomaly-persister | PQM Anomaly | Added persister file for the pqm-anomaly service. |
| pqm-indexer | PQM | Added indexer file for the pqm service. |
| pqm-anomaly-indexer | PQM Anomaly | Added indexer file for the pqm-anomaly service. |
| pqm-inbox-indexer | PQM | Added inbox indexer file for the pqm service. |
| Vendor-persister | Vendor | Updated vendor-persister file for the sanitation worker feature. |
| FSM-persister | FSM | Updated fsm-persister file for the sanitation worker feature. |
| TQM chart API changes | PQM | Added chart API configs for the TQM plant operator landing page. |
| PQM download PDF | PQM | PDF download configuration for the test results summary. |
PQM

| Description | Topic Name |
|---|---|
| Save test on creation | save-test-application |
| Update test | update-test-application |
| Update test workflow | update-workflow-test-application |
| Save test topic for inbox | save-test-event-application |
| Update test topic for inbox | update-test-event-application |
| Anomaly topic for when a test fails | create-pqm-anomaly-finder |
| Anomaly topic for when a test result has not been submitted | testResultNotSubmitted-anomaly-topic |
| Creation of plant-user mapping | save-plant-user-mapping |
| Updation of plant-user mapping | update-plant-user-mapping |

PQM Anomaly Finder

| Description | Topic Name |
|---|---|
| Anomaly topic for when a test fails | save-pqm-test-anomaly-details |
| Anomaly topic for when a test result has not been submitted | testResultNotSubmitted-anomaly-topic |
| Save anomaly topic | create-pqm-anomaly-finder |
| Event (in-app) notification | persist-user-events-async |

FSM

| Description | Topic Name |
|---|---|
| Save new FSM application | save-fsm-application |
| Update FSM application | update-fsm-application |
| Update FSM application workflow | update-fsm-workflow-application |
| Update vehicle trip details | update-vehicle-trip-Details |
| Save plant user mapping | save-plant-map-application |
| Update plant user mapping | update-plant-map-application |
| Receipts get pushed to this topic | egov.collection.payment-create |
| SMS notification topic | egov.core.notification.sms |
| Event (in-app) notification topic | persist-user-events-async |

Vendor

| Description | Topic Name |
|---|---|
| Save vendor topic | save-vendor-application |
| Update vendor topic | update-vendor-application |
| Save driver topic | save-driver-application |
| Update driver topic | update-driver-application |
| Update vendor-driver and vendor-vehicle relationship | save-vendordrivervehicle-application |
Click here to know more.
Click here for more details.
Here are the articles in this section:
The Process Quality Management (PQM) anomaly finder service helps in monitoring anomalies in process quality and notifies the concerned user groups.
Add the MDMS changes for user-event, and then restart mdms-service.
Path: data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json - file link
Path: data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json - file link
Add the below file in the given path, and then restart the persister service in the respective environment.
Path: egov-persister/pqm-anomaly-finder-persister.yaml - Persister file link
Add the below file in the given path, and then restart the indexer service in the respective environment.
Path: egov-indexer/pqm-anomaly-finder-indexer.yml - Indexer file link
Add the file from the below PR in the given path, and then start the deployment.
Path: deploy-as-code/helm/charts/sanitation/pqm-anomaly-finder
Path: build/build-config.yml - file link
For re-indexing, refer to this link.
Click here to know more.
| # | Checklist | Status | Owner / Remarks |
|---|---|---|---|
| 1 | Development is completed for all the features that are part of the release. | Yes | @ Pritish.Rath: Code was frozen by 10-01-2024. |
| 2 | Test cases are documented by the QA team, reviewed by the product owners, and test results are updated in the test cases sheet. | Yes | All test cases are reviewed by the product owner and results have been updated in the sheet. QA sign-off is given accordingly. |
| 3 | The incremental demo of the features is showcased during the sprint showcase and feedback is incorporated. If possible, list out the JIRA tickets for feedback. | Yes | Demo given on 20-12-2023 for PQM, and 02-01-2024 for Sanitation Worker. |
| 4 | UI/UX audit review by the UX architect is completed along with feedback incorporation for any changes in UI/UX. | Yes | @ AndrewJones: UI/UX audit is done and review comments are incorporated. |
| 5 | Incremental demos to the product owners are completed as part of the sprint, and feedback is incorporated. | Yes | Demo given on 20-12-2023 for PQM, and 02-01-2024 for Sanitation Worker. |
| 6 | QA sign-off is completed by the QA team and communicated to the product owners. All the tickets' QA sign-off status is updated in JIRA. | Yes | All test cases are reviewed by the product owner and QA sign-off is given accordingly. |
| 7 | UI and API technical documents are updated for the release along with the configuration documents. | Yes | |
| 8 | UAT promotion and regression testing from the QA team is completed. The QA team has shared the UAT regression test cases with the product owners. | Yes | All test cases are reviewed by the product owner. |
| 9 | API automation scripts are updated for new APIs or changes to any existing APIs for the release. API automation regression is completed on UAT, the automation test results are analysed, and necessary actions are taken to fix the failure cases. Publish the list of failure use cases with a reason for failure and the resolution taken to fix these failures for the release. | Yes | ULB user: search, create, update. PQM service: ad-hoc create, search, update, PQM quality criteria. Automation script is reviewed by the tech lead. |
| 10 | The API backward compatibility testing is completed. | Yes | Tested on 05-Jan-2024. |
| 11 | The communication is shared with the product owners for the completion of UAT promotion and regression by the QA team. The product owners have to give a product sign-off within one week of this communication. | Yes | UAT sign-off was completed. Sign-off date: 10th Jan 2024 for PQM and Sanitation Worker Welfare. |
| 12 | The UAT product sign-off communication is received from the product owners along with the release notes and user guides (if applicable). | Yes | UAT sign-off was completed. Sign-off date: 10th April 2023 for PQM and Sanitation Worker Welfare. |
| 13 | The GIT tags and releases are created for the code changes for the release. | Yes | Added a new git tag V1.4. |
| 14 | Verify whether the release notes are updated. | Yes | |
| 15 | Verify whether all UAT builds are updated along with the GIT tag details. | Yes | All the builds are added with release tag V1.4. |
| 16 | Verify whether all MDMS, configs, and InfraOps configs are updated. | Yes | |
| 17 | Verify whether all docs will be published to http://sanitation.digit.org by the Technical Writer as part of the release. | Yes | |
| 18 | Verify whether all test cases are up to date and updated along with necessary permissions to view the test cases sheet. The test cases sheet is verified by the Test Lead. | Yes | |
| 19 | Verify whether the UAT credentials sheet is updated with the details of new users and roles, if any. | Yes | |
| 20 | Verify whether all the localisation data was added in UAT, including Hindi, and updated in the release kits. | Yes | All localisations added as part of the release. |
| 21 | Verify whether the product release notes and user guides are updated and published. | Yes | |
| 22 | The demo of the released features is done by the product team as part of the sprint/release showcase. | Yes | |
| 23 | Technical and product workshops/demos are conducted by the engineering and product teams for the implementation team (implementation handover). | Yes | Technical and product handover to the implementation team conducted on 24-11-2023 and 04-01-2024. Sign-off on handover given on 11-01-2024. |
| 24 | Architect sign-off and technical quality report. | Yes | Signed off on 02-Jan-2024. |
| 25 | Success metrics and product roadmap. | Yes | |
| 26 | Adoption metrics. | Yes | |
| 27 | Programme roll-out plan. | Yes | |
| 28 | Implementation checklist. | Yes | |
| 29 | Implementation roll-out plan. | Yes | |
| 30 | Gate 2 | | |
| 31 | The internal release communication along with all the release artefacts is shared by the engineering/product teams. | | |
| 32 | Plan for upgrading the staging/demo instance with the released product, within 2-4 weeks, based on the period where no demos are planned from staging for the previous version of the released product. | | |
| 33 | The release communication to partners is shared by the GTM team, and the webinar is arranged by the GTM team after the release communication, within 2-4 weeks of the release. | | |
| Title | Link |
|---|---|
| Workflow Technical Document | |
| User Technical Document | |
| MDMS Technical Document | |
| IDGen Technical Document | |
| Localisation Technical Document | |
| Persister Technical Document | |

| Link |
|---|
| /pqm-service/v1/_create |
| /pqm-service/v1/_update |
| /pqm-service/v1/_search |
| /pqm-service/v1/_create (Adhoc) |
| /pqm-anomaly-finder/v1/_plainsearch |
| /pqm-service/plant/user/v1/_create |
| /pqm-service/plant/user/v1/_update |
| /pqm-service/plant/user/v1/_search |
| /inbox/v2/_search |
| Title | Link |
|---|---|
| User Technical Document | |
| MDMS Technical Document | |
| Localisation Technical Document | |
| Persister Technical Document | |
| API Contract | |
| Postman Collection | |

| Link |
|---|
| /pqm-anomaly-finder/v1/_search |
| /pqm-anomaly-finder/v1/_plainsearch |
| User | Role | Description | How to create |
|---|---|---|---|
| FSM Creator | FSM_CREATOR_EMP | | Through HRMS with role. |
| FSM Editor | FSM_EDITOR_EMP | | Through HRMS with role. |
| FSM Viewer | FSM_VIEW_EMP | Can view FSM application. | Through HRMS with role. |
| FSM Admin | FSM_ADMIN | | Through HRMS with role. |
| DSO | FSM_DSO | | |
| FSTP Operator | FSM_EMP_FSTPO | | Through HRMS with role. |
| Collector | FSM_COLLECTOR | | Through HRMS with role. |
| FSM Report Viewer | FSM_REPORT_VIEWER | Can view FSM reports. | Through HRMS with role. |
| FSM Driver | FSM_DRIVER | Driver role for FSM. | Through HRMS with role. |
| Title | Link |
|---|---|
| Workflow Technical Document | |
| User Technical Document | |
| MDMS Technical Document | NEEDS TO BE UPDATED |
| IDGen Technical Document | NEEDS TO BE UPDATED |
| Localisation Technical Document | NEEDS TO BE UPDATED |
| Persister Technical Document | NEEDS TO BE UPDATED |
| SMS Notification Technical Document | NEEDS TO BE UPDATED |
| HRMS Technical Document | NEEDS TO BE UPDATED |
| API Contract | |
| Postman Collection | |
| Link | Deprecation Status |
|---|---|
| /fsm/v1/_create | False |
| /fsm/v1/_update | False |
| /fsm/v1/_search | False |
| /fsm/v1/_audit | False |
| /fsm/v1/_plainsearch | False |
| /fsm/plantmap/v1/_create | False |
| /fsm/plantmap/v1/_update | False |
| /fsm/plantmap/v1/_search | False |
| /inbox/v1/_search | True |
| /fsm/v1/_schedular | False |
| /inbox/v2/_search | False |
| Title | Link |
|---|---|
| Workflow Technical Document | |
| User technical document | |
| MDMS technical document | NEEDS TO BE UPDATED |
| IDGen technical document | NEEDS TO BE UPDATED |
| Localisation technical document | NEEDS TO BE UPDATED |
| Persister technical document | NEEDS TO BE UPDATED |
| SMS notification technical document | NEEDS TO BE UPDATED |
| API contract | |
| Postman scripts | |
| Title | Link | Deprecation Status |
|---|---|---|
| | /vendor/v1/_create | False |
| | /vendor/v1/_search | False |
| | /vendor/v1/_plainsearch | False |
| | /vendor/v1/_update | False |
| | /vendor/driver/v1/_create | True |
| | /vendor/driver/v1/_update | True |
| | /vendor/driver/v1/_search | True |
| | /individual/v1/_create | False |
| | /individual/v1/_update | False |
| | /individual/v1/_search | False |
| Title | Link |
|---|---|
| Workflow Technical Document | |
| User Technical Document | |
| MDMS Technical Document | NEEDS TO BE UPDATED |
| IDGen Technical Document | NEEDS TO BE UPDATED |
| Localization Technical Document | NEEDS TO BE UPDATED |
| Persister Technical Document | NEEDS TO BE UPDATED |
| SMS Notification Technical Document | NEEDS TO BE UPDATED |
| API Contract | |
| Postman Scripts | |
| Title | Link |
|---|---|
| fsm-calculator/v1/_calculate | |
| fsm-calculator/v1/_estimate | |
| fsm-calculator/v1/billingSlab/_create | |
| fsm-calculator/v1/billingSlab/_update | |
| fsm-calculator/v1/billingSlab/_search | |
| fsm-calculator/v1/_cancellationfee | |
| fsm-calculator/v1/_advancebalancecalculate | |
Configuration and setup details on registering vehicles in FSM module
Vehicle registry is a system that enables urban local body (ULB) employees to create and search vehicle entities, schedule vehicle trips for the FSM application, and track vehicle trips. This document details the new enhancements made to the vehicle service, describes how to set it up, and outlines the functionalities provided.
Before you proceed with the configuration, make sure the following prerequisites are met:
Java 8
Kafka server is up and running.
egov-persister service is running and has vehicle-persister config path added in it.
PSQL server is running and database is created to store FSM Application data.
Following services should be up and running:
- egov-persister
- egov-mdms-service
- egov-workflow-v2
- egov-idgen
EXISTING
DSO or ULB can create multiple vehicle trips based on the number of trips entered while submitting the FSM application.
FSTPO can decline the vehicle trip with appropriate reason.
Owner attribute has been added to the vehicle.
FSTPO vehicle log inbox enhancements include an application number search filter so that the FSTPO can view all the vehicle trips associated with an application.
FSTPO vehicle log API upgraded to show trip numbers in case of a multi-trip application.
Introduced Vehicle Tab.
Option to add/remove/update vehicle individually.
Admin can enable or disable the vehicle.
Functionality to add/remove vehicles to vendor.
ENHANCEMENT
Part Search: The Vehicle tab now supports a partial search by vehicle number. Users can enter part of a vehicle number and retrieve all results that contain that portion. For example, if the vehicle number is "AA 77 JJ 3324", searching for "AA", "77", or "JJ" returns it.
Updating Registry Information: In the Vehicle tab, the admin can update certain vehicle information, such as the owner name and phone number. New columns for gender, DOB, and email address have been added and are updatable.
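The partial search described above amounts to a case-insensitive substring match on the registration number; a minimal sketch, ignoring spaces so "AA 77" and "AA77" both match:

```python
def part_search(vehicles: list[str], query: str) -> list[str]:
    """Return every vehicle whose registration number contains the query,
    ignoring case and spaces (mirrors the part-search behaviour above)."""
    q = query.replace(" ", "").lower()
    return [v for v in vehicles if q in v.replace(" ", "").lower()]
```

In the service itself this would typically be an `ILIKE '%query%'`-style database filter rather than an in-memory scan.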
Create a vehicle with one of the vehicle types available in the VehicleMakeModel MDMS.
Sample Curl
Integrated with the application through REST APIs to create and search vehicles. Any module that requires vehicle trips can integrate the trip/v1/ create, update, and search REST APIs.
Vehicle management would become easy.
Trip management would become easy.
The FSM application can call vehicle/v1/_search to validate the FSM vehicle assignment.
The FSM application calls vehicle/trip/v1/_create on assigning a vehicle to the application.
FSTP operators can mark the vehicleTrip as DISPOSED.
The Amazon Elastic Kubernetes Service (EKS) is one of the AWS services for deploying, managing, and scaling any distributed and containerised workloads. Here we provision the EKS cluster on AWS from the ground up in an automated way (infra-as-code) using Terraform, and then deploy the DIGIT services config-as-code using Helm.
Know about EKS: https://www.youtube.com/watch?v=SsUnPWp5ilc.
Know what Terraform is: https://youtu.be/h970ZBgKINg.
AWS account with the admin access to provision EKS service. You can always subscribe to a free AWS account to learn the basics and try, but there is a limit to what is offered as free. For this demo, you need to have a commercial subscription to the EKS service, if you want to try out for a day or two, it might cost you about Rs 500 - 1000. (Note: Post the demo, for the internal folks, eGov will provide a 2-3 hours time-bound access to eGov's AWS account based on the request and the available number of slots per day).
Install Kubectl on your local machine that helps you interact with the Kubernetes cluster.
Install Helm, which helps you package the services along with the configurations, envs, secrets, etc., into Kubernetes manifests.
Install Terraform (version 0.14.10) for Infra-as-Code (IaC) to provision cloud resources as code with the desired resource graph; it also helps destroy the cluster in one go.
Install AWS CLI on your local machine so that you can use AWS CLI commands to provision and manage cloud resources on your account.
Install AWS IAM Authenticator that helps you authenticate your connection from your local machine so that you should be able to deploy DIGIT services.
Use the AWS IAM User credentials provided for the Terraform (Infra-as-code) to connect with your AWS account and provision cloud resources.
You will get a Secret Access Key and Access Key ID. Save them.
Open the terminal and run the following command. You have already installed the AWS CLI, and you have the credentials saved. (Provide the credentials. You can leave the region and output format blank).
The above command creates the following credentials file on your machine (typically ~/.aws/credentials).
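For reference, the generated credentials file is a small INI document; the keys below are AWS's standard documentation placeholders, not real credentials:

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```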
Before we provision cloud resources, we must understand what resources need to be provisioned by Terraform to deploy DIGIT.
The following picture shows the key components (EKS, worker nodes, Postgres DB, EBS volumes, load balancer).
Considering the above deployment architecture, the following is the resource graph we will provision using Terraform in a standard way so that every time and for every env, it will have the same infra.
EKS control plane (Kubernetes master)
Worker node group (VMs with the estimated number of vCPUs and memory)
EBS volumes (Persistent volumes)
RDS (Postgres)
VPCs (Private network)
Users to access, deploy, and read-only
Ideally, one would write the Terraform script from scratch using this doc.
Here, we have already written the Terraform script that provisions the production-grade DIGIT infra and can be customised with the specified configuration.
Let's clone the DIGIT-DevOps GitHub repo where the Terraform script to provision the EKS cluster is available. Below is the structure of the files:
Example:
VPC Resources:
VPC
Subnets
Internet Gateway
Route Table
EKS Cluster Resources:
IAM Role to allow EKS service to manage other AWS services.
EC2 Security Group to allow networking traffic with the EKS cluster.
EKS Cluster.
EKS Worker Nodes Resources:
IAM role allowing Kubernetes actions to access other AWS services.
EC2 Security Group to allow networking traffic.
Data source to fetch the latest EKS worker AMI.
AutoScaling Launch Configuration to configure worker instances.
AutoScaling Group to launch worker instances.
Database
Configuration in this directory creates a set of RDS resources, including DB instance, DB subnet group, and DB parameter group.
Storage Module
Configuration in this directory creates EBS volume and attaches it together.
The following main.tf creates an s3 bucket that stores the state of every execution for tracking.
iFix-DevOps/Infra-as-code/terraform/sample-aws/remote-state
main.tf.
The following main.tf contains the detailed resource definitions that need to be provisioned.
Dir: iFix-DevOps/Infra-as-code/terraform/sample-aws
You can define your configurations in variables.tf and provide the env-specific cloud requirements so that using the same Terraform template, you can customise the configurations.
The values below need to be provided in the following files; any left blank will be prompted for during execution.
variables.tf
Use https://keybase.io/ to create a PGP key pair (public and private keys) on your machine. Upload the public key to the keybase account you have created, give it a name, and reference it in your Terraform configuration. This allows Terraform to encrypt sensitive information.
Example: in eGov's case, the keybase user "egovterraform" was created and its public key uploaded to https://keybase.io/egovterraform/pgp_keys.asc
You can use this portal to decrypt your secret key. To decrypt a PGP message, upload the PGP message, PGP private key, and passphrase.
Now that we know what the Terraform script does, the resources graph that it provisions, and what custom values should be given to your env, let us begin to run the Terraform scripts to provision infra required for deploying DIGIT on AWS.
First, cd into the following directory, run the commands below one by one, and watch the output closely.
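The sequence is the standard Terraform trio, sketched here for review (the directory path is the sample-aws path from this doc; the commands assume terraform is installed and the AWS credentials set up earlier are in place):

```shell
# The three standard Terraform steps for this setup, collected into
# variables so they can be printed and reviewed before running them
# one by one from the Terraform directory.
TF_DIR="DIGIT-DevOps/Infra-as-code/terraform/sample-aws"
TF_STEPS="terraform init
terraform plan
terraform apply"
printf 'cd %s\n%s\n' "$TF_DIR" "$TF_STEPS"
# terraform init   initialises providers and the S3 remote-state backend
# terraform plan   previews the resource graph before creating anything
# terraform apply  provisions VPC, EKS, node groups, RDS and EBS volumes
```

Review the `terraform plan` output carefully before applying; `apply` is the step that actually creates billable cloud resources.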
Upon successful execution, the following resources get created, which can be verified by the command "terraform output".
s3 bucket: to store terraform state.
Network: VPC, security groups.
IAM user auth: admin, deployer, and read-only users created using keybase (see the PGP key setup described above).
EKS cluster: with master(s) & worker node(s).
Storage(s): for es-master, es-data-v1, es-master-infra, es-data-infra-v1, zookeeper, kafka, kafka-infra.
Use this link to get the kubeconfig file from EKS and connect to the cluster from your local machine. This enables you to deploy DIGIT services to the cluster.
Finally, verify that you can connect to the cluster by running the following command.
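A sketch of the connection check (the region and cluster name below are placeholders for your environment; `aws eks update-kubeconfig` is the standard AWS CLI way to fetch the kubeconfig):

```shell
# Hypothetical region and cluster name; substitute your own values.
# The AWS CLI writes a kubeconfig entry that kubectl then uses.
REGION="ap-south-1"
CLUSTER="digit-dev"
KUBECONFIG_CMD="aws eks update-kubeconfig --region $REGION --name $CLUSTER"
echo "$KUBECONFIG_CMD"
echo "kubectl get nodes   # should list the worker node group VMs once connected"
```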
You are all set to deploy the product.
Know the basics of Kubernetes: https://www.youtube.com/watch?v=PH-2FfFD2PU&t=3s.
Know the basics of kubectl commands.
Know Kubernetes resources via YAML: Deployments, ReplicaSets, and Pods: https://www.youtube.com/watch?v=ohSUtEfDefc.
Know how to manage env values, secrets of any service deployed in Kubernetes: https://www.youtube.com/watch?v=OW244LxB4oI.
Know how to port forward to a pod running inside k8s cluster and work locally https://www.youtube.com/watch?v=TT3nd5n5Yus.
Know sops to secure your keys/credentials: https://www.youtube.com/watch?v=DWzJ87KbwxA.
Choose your cloud and follow the instructions to set up a Kubernetes cluster before moving on to the deployment.
Finally, clean up the cluster setup if you wish, using the following command. This will delete the entire cluster and other cloud resources that were provisioned for the DIGIT setup.
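The tear-down is Terraform's destroy step, run from the same directory used to provision; shown here as a sketch since it is irreversible:

```shell
# Irreversible: deletes the EKS cluster and every other cloud resource
# Terraform created for this environment. Run from the same sample-aws
# directory used for `terraform apply`.
DESTROY_CMD="terraform destroy"
echo "$DESTROY_CMD"
```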
We have successfully created the infrastructure on the cloud and deployed DIGIT in the cluster.
On this page, you will find a set of standard configuration steps that should be applied consistently across all services. Please adhere to these steps within the context of each service, making necessary replacements only as instructed by the respective service's guidelines.
Steps:
Deploying a service encompasses three key aspects:
Service Image Deployment: This entails deploying a published Docker image of the service within the DIGIT environment.
Helm Charts Requirement: Helm charts play a crucial role in service deployment as they configure environment variables tailored to the specific Kubernetes cluster. You can deploy a service either through CI/CD pipelines or directly by utilizing Helm commands from your system. All helm charts for PQM services are available in this repository.
Service Configuration: To ensure the service functions seamlessly, it is essential to configure it correctly. This includes setting up MDMS, IDGen, Workflow, and other masters as necessary, all of which can be done on GitHub.
In summary, deploying a service involves these three fundamental steps, each contributing to the successful deployment and operation of the service within the DIGIT environment.
Click here to find detailed information on MDMS configuration.
All modules expose certain actions (APIs), roles (actors) and role-action mappings (who can access which resource). Role-action mappings are used for access control.
Each service documentation has a role-action table that identifies the actors that can access the resource. Follow the outline below replacing specific actions/roles for each module.
Actions, roles and role-action mapping are defined within a master tenant in folders. The folders have the same name as the module name for easy identification.
Example:
In the above image, "pg" is the state level tenant. The three folders highlighted in orange contain the masters for actions, roleactions and roles respectively.
Folder structures are only for categorisation and easy navigation of master files. The MDMS service retrieves data only through module and master names. Make sure that these are correct.
Add all the APIs exposed by the service (refer to the service documentation for the actual APIs) to the actions.json file in MDMS.
Keep appending new entries to the bottom of the file.
Make sure the id field is unique. Best practice is to increment the id by one when adding a new entry. This id field will be used in the role-action mapping.
Module name: ACCESSCONTROL-ACTIONS-TEST
Master name: actions-test
In case 403s are encountered despite configuration, double check the actions.json file to make sure the API in question has a unique ID. In case of duplicate IDs, a 403 will be thrown by Zuul.
A sample entry is given below:
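As an illustration of the shape of an entry (the url and names below are placeholders; the id 9 matches the estimate-create example referenced later on this page — verify the exact fields against your deployment's actions-test.json):

```json
{
  "id": 9,
  "name": "Create Estimate",
  "url": "/estimate/v1/_create",
  "displayName": "Create Estimate",
  "enabled": true
}
```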
Configure roles based on the roles column (refer to service documentation) in the roles.json file. Make sure the role does not exist already. Append new roles to the bottom of the file.
Module name: ACCESSCONTROL-ROLES
Master name: roles
A sample entry is given below:
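An illustrative entry, using the ESTIMATE_CREATOR role mentioned later on this page (the name and description texts are placeholders):

```json
{
  "code": "ESTIMATE_CREATOR",
  "name": "Estimate Creator",
  "description": "Role that can create estimates"
}
```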
Role-action mapping should be configured as per the role-action table defined. Add new entries to the bottom of the roleactions.json file.
Identify the action id (from the actions.json file) and map roles to that id. If multiple roles are mapped to an API, then each of them becomes a unique entry in the roleactions.json file.
Module name: ACCESSCONTROL-ROLEACTIONS
Master name: roleactions
A sample set of role-action entries is shown below. Each of the actionid fields needs to match a corresponding API from the actions.json file.
In the example below, the ESTIMATE_CREATOR role is given access to API actionid 9. This maps to the estimate create API in our repository.
Note that the actionid and tenantId might differ from implementation to implementation.
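A sketch of such an entry, using the ESTIMATE_CREATOR/actionid 9 example above (field names follow the usual MDMS roleactions shape; verify them against your deployment's roleactions.json):

```json
[
  {
    "rolecode": "ESTIMATE_CREATOR",
    "actionid": 9,
    "actioncode": "",
    "tenantId": "pg"
  }
]
```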
Each service has a persister.yaml file which needs to be stored in the configs repository. The actual file will be mentioned in the service documentation.
Please add that yaml file under the configs repository if not present already.
Make sure to restart MDMS and the persister service after adding the file at the above location.
Each service has an indexer.yaml file which needs to be stored in the configs repository. The actual file will be mentioned in the service documentation.
Please add that yaml file under the configs repository if not present already.
Make sure to restart MDMS and the indexer service after adding the file at the above location.
The Process Quality Management (PQM) service helps users create, update, and search for process quality monitoring tests. The service evaluates the uploaded test values against benchmarks and produces a result (FAIL/PASS) status. Test results are further processed for anomaly analysis. The service supports two types of tests: Manual (Lab) and Automatic (IoT-based).
Functional Specification
Make changes in config accordingly and restart the egov-persister-service and egov-indexer-service.
egov-persister/pqm-persister.yaml - file
egov-indexer/pqm-service-indexer.yml - file
Add path of indexer in the devOps (pqm-service-indexer.yml) - file
Add path of persister in the devOps (pqm-persister.yaml) - file
Idgen config - file
Add path of pdf data and format config files - PR |
Path: data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
Path: data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json
Role.json
Add the MDMS changes in the given path and restart the MDMS service.
Path: data/pg/inbox-v2/InboxConfiguration.json
Add the indexer file in the given path and restart the services.
Path: egov-indexer/egov-pqm-service.yml
Deploy the latest version of PQM.
Add the pqm-persister.yml file to the config folder in Git and add its path to the persister configuration (the file path goes in the environment yaml file under the param persist-yml-path), then restart the egov-persister service.
If indexes are to be created, add the indexer config path to the indexer service (the file path goes in the environment yaml file under the param egov-indexer-yaml-repo-path), then restart the egov-indexer service.
DevOps setup for a new environment
Path: deploy-as-code/helm/charts/sanitation/pqm-service
Path: build/build-config.yml
https://github.com/egovernments/SANITATION/commit/6e4db6ca9d4a52df1df5d4cb67a2df84a0bed420
Path: deploy-as-code/helm/charts/sanitation/pqm-anomaly-finder
https://github.com/egovernments/DIGIT-DevOps/tree/unified-env/deploy-as-code/helm/charts/sanitation/pqm-anomaly-finder
Path: build/build-config.yml
https://github.com/egovernments/SANITATION/commit/576b1bb9531a2080e039d17af9d3c6a6c63d76e7
Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for PQM:
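For orientation, a /businessservice/_create request follows the egov-workflow-v2 shape sketched below; every state, action, and role name here is a placeholder, not the actual PQM product configuration:

```json
{
  "RequestInfo": { "authToken": "{AUTH_TOKEN}" },
  "BusinessServices": [
    {
      "tenantId": "pg",
      "businessService": "PQM",
      "business": "pqm-service",
      "states": [
        {
          "state": null,
          "applicationStatus": null,
          "isStartState": true,
          "actions": [
            {
              "action": "CREATE",
              "nextState": "PENDINGRESULTS",
              "roles": ["PQM_TP_OPERATOR"]
            }
          ]
        }
      ]
    }
  ]
}
```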
This guide offers a systematic view of how to create the application screens on DIGIT.
Developer code: Download the UI code from the link here .
Technical pre-requisites
Knowledge of the DIGIT UI framework
Prior knowledge of React JS
Prior knowledge of Redux/Hooks
Prior knowledge of SCSS/React StoryBook
Prior knowledge of Git
Prior knowledge of Queries and Mutations
This section of the guide enables developers to create their own front-end citizen module from scratch which they can deploy on top of DIGIT UI. The new module will be visible as a ‘card’ in the DIGIT citizen portal.
Create Project Structure:
Project Structure
Front-end module project structure
Follow the steps detailed in this doc.
Create Project Structure
Go to micro-ui-internals → packages → modules.
Inside the module, create a folder and provide a name for the service. For instance, the service name here is birth registration.
Create folder fsm (you can give any name).
Add the package.json to the created folder. Mention the module name and other dependencies here.
After creating the package.json for FSM, the project structure (as seen in the image below) is created:
Install Dependency:
Add MDMS (Master Data Management Service) Configuration.
When creating a new module, the module needs to be enabled in citymodule.json. A sample module is available for reference here: .
For the purpose of illustration here, add the following module (birth registration) as given below:
The FSM UI module needs to be registered in three places so that it will be available for the developer at run-time as well as at the time of deployment.
Below are the three places where the module needs to be registered:
micro-ui/web/micro-ui-internals/package.json
micro-ui/web/micro-ui-internals/example/package.json
micro-ui/web/package.json
Micro-ui-internals:
Open the micro-ui-internals package.json file and add this module as a dependency.
"dev:fsm": "cd packages/modules/fsm && yarn start",
"build:fsm": "cd packages/modules/fsm && yarn build",
In the example/package.json, add the following line:
"@egovernments/digit-ui-module-fsm": "1.6.1",
In the web/package.json, add the following line:
"@egovernments/digit-ui-module-fsm": "1.6.1",
Import Required Components:
DIGIT comes with common re-usable libraries that can be imported for use in a new module. CSS, libraries, common modules, and React components can be imported by adding them to package.json.
"@egovernments/digit-ui-css": "1.5.3",
"@egovernments/digit-ui-libraries":"1.5.3",
"@egovernments/digit-ui-module-common":"1.5.3",
"@egovernments/digit-ui-module-fsm":"1.5.0",
"@egovernments/digit-ui-module-dss":"1.5.3",
"@egovernments/digit-ui-react-components":"1.5.3",
Common components live in micro-ui/web/micro-ui-internals/packages/react-components and micro-ui/web/micro-ui-internals/packages/css. DIGIT components that we are reusing:
FormComposer
ActionBar
Banner
Card
CardText
Loader
SubmitBar
AppContainer
BackButton
PrivateRoute
Icons
CitizenHomecard
This section will walk you through the code that needs to be developed for the application. Detailed user screen wireframes should be available at this point for development.
Following are the steps:
1. Create Application Form:
We need to create a form where users can enter all required information and submit the form. Create a file called index.js in the path below:
/web/micro-ui/internals/packages/module/br/src/pages/citizen/create/index.js
index.js will import the FormComposer. Inside it, add the heading, label, and form components. The configuration file containing the actual form schema is mapped in the following two lines; the newConfig.json file details are covered in the sections below:
import { newConfig } from "../../../components/config/config";
const configs = newConfig;
2. Filling in config.js:
Create a file called config.js under the following path:
/micro-ui/web/micro-ui-internals/packages/modules/br/src/components/config.js
This file defines the form meta-data and structure. The form heading goes into the "head" field. Components inside the form go into the body field. This form config has already been mapped in the index.js file, and therefore, will be rendered onto the screen:
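As a rough sketch of that structure (field names and values below are illustrative, not the actual product schema):

```json
{
  "head": "Birth Registration Details",
  "body": [
    {
      "label": "Applicant Name",
      "type": "text",
      "isMandatory": true,
      "populators": { "name": "applicantName" }
    }
  ]
}
```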
Components used in the newConfig.js:
3. Routing:
After adding config.js and create/index.js, add routing for the birth registration form. Create index.js at fsm/src/pages/citizen/index.js, where the private route is added. In index.js, specify the path and the name of the component to render when that route is hit.
4. Filling in module.js:
module.js is the entry point of every module, so one needs to register all the components, links, code, etc.
5. Enable Module in the UI framework:
After registering all components, links and module code, enable it in two places:
Web/Src/app.js : In app.js we import the FSMModule, initFSMComponents, and FSMLinks and enable the FSM module.
web/micro-ui-internals/example/src/index.js:
In index.js, we will import the FSMModule, initFSMComponents, and FSMLinks and enable the FSM module.
Once the link is added to the homepage, one can see the FSM module on our Digit-UI Homepage. Next, add the homepage card for the citizen module.
6. Integration with Backend API:
This section will explain how to integrate the UI part for the citizen module with the backend API.
In the request, pass the URL, which should be added to service/atoms/url.js.
Hooks
Once FSMService is created with all requests, create a hook and that hook will be used in the code to pass the data to the backend.
After creating the Service and Hooks, register them in packages/libraries/src/index.js.
Once the backend is set up, the hooks or service will be used to send the data to the backend after submitting the form. Add the onSubmit function in this file: (fsm/src/pages/citizen/create/index.js) and in that function, pass the user’s entered data to the BRService that has been created.
Once the integration is done, the data will be saved into the database.
Local Development Setup
The following tools have to be installed before development. Make sure to install the specific versions provided below. If no version is provided, it is assumed that the latest version can be installed.
Install Visual Studio Code. VS Code Extensions to be installed from the marketplace:
Install NodeJS 14.20.0
Install Yarn version 1.22.19
Clone the DIGIT-OSS repository locally from your organization's umbrella. This contains the frontend code under the frontend folder.
Build & Deploy
Go to the Jenkins build page. Click on SANITATION under the folder path mentioned below: frontend/sanitation-ui
Click on Build with parameter. Select the feature branch name by searching for it in the search box on the right side of the screen. Click on Build.
Once the build is successful, open the console output and find the docker image that has been built. Copy the docker image ID.
Deploy
Copy the docker image IDs from the previous step and paste in the above box. Click on ‘Build’. Once the image is deployed, you will see a message as shown below:
Run the Application
After the application is built and deployed, run and test it in the local environment.
Configure Environment File - Citizen: To run the application in the local environment, add the .env file in the example folder. If the user is a citizen, configure the .env file as shown below:
SKIP_PREFLIGHT_CHECK=true
REACT_APP_USER_TYPE=CITIZEN
REACT_APP_EMPLOYEE_TOKEN=c835932f-2ad4-4d05-83d6-49e0b8c59f8a
REACT_APP_CITIZEN_TOKEN=7cd58aae-30b3-41ed-a1b3-3417107a993c
REACT_APP_PROXY_API=https://dev.companyname.org
REACT_APP_PROXY_ASSETS=https://dev.companyname.org
REACT_APP_GLOBAL=https://path/to/public/s3/bucket/globalConfigs.js
REACT_APP_CENTRAL_GLOBAL=https://path/to/public/s3/bucket/statebglobalConfigs.js
REACT_APP_QA_GLOBAL=https://path/to/public/s3/bucket/egov-dev-assets/globalConfigs.js
REACT_APP_UAT_GLOBAL=https://path/to/public/s3/bucket/egov-uat-assets/globalConfigs.js
REACT_APP_STATEB_GLOBAL=https://path/to/public/s3/bucket/statebglobalConfigs.js
staging=https://staging.companyname.org
Configure Environment File - Employee
SKIP_PREFLIGHT_CHECK=true
REACT_APP_USER_TYPE=EMPLOYEE
REACT_APP_EMPLOYEE_TOKEN=c835932f-2ad4-4d05-83d6-49e0b8c59f8a
REACT_APP_CITIZEN_TOKEN=7cd58aae-30b3-41ed-a1b3-3417107a993c
REACT_APP_PROXY_API=https://dev.companyname.org
REACT_APP_PROXY_ASSETS=https://dev.companyname.org
REACT_APP_GLOBAL=https://path/to/public/s3/bucket/globalConfigs.js
REACT_APP_CENTRAL_GLOBAL=https://path/to/public/s3/bucket/statebglobalConfigs.js
REACT_APP_QA_GLOBAL=https://path/to/public/s3/bucket/egov-dev-assets/globalConfigs.js
REACT_APP_UAT_GLOBAL=https://path/to/public/s3/bucket/egov-uat-assets/globalConfigs.js
REACT_APP_STATEB_GLOBAL=https://path/to/public/s3/bucket/statebglobalConfigs.js
Open Terminal in micro-UI-internals and run the following command:
yarn start
The application will start after you run the command.
Login
There are two types of login:
Employee:
Homepage Employee: After the login is successful for employees, users are redirected to the employee home page.
On the homepage, users can see the cards for FSM. These cards need to be added; go through the link to create an employee card.
Homepage Citizen: After a successful citizen login, users are redirected to the citizen homepage.
Please add Test Standards in MDMS v2 under your ULBs (pg.citya, pg.cityb).
Add RoleAction Mapping for the scheduler API
File Path -
File Path -
Make sure that ULBs are configured in tenants.json file
Create a SYSTEM user with the PQM_CRONJOB_SCHEDULER and SYSTEM roles. Find the curl below.
The same username, PQM_SERVICE_CRONJOB, will be used to generate bills; it is defined in the environment config.
curl --location 'http://localhost:8082/user/users/_createnovalidate' \
--header 'Content-Type: application/json' \
--data-raw '{
  "RequestInfo": {
    "api_id": "1", "ver": "1", "ts": null, "action": "create",
    "did": "", "key": "", "msg_id": "", "requester_id": "",
    "userInfo": {
      "userName": "BillCreator", "name": "BillCreator", "gender": "male",
      "mobileNumber": "9999999999", "active": true, "type": "EMPLOYEE",
      "tenantId": "{STATE_TENANT_ID}", "password": "eGov@123",
      "roles": [ { "code": "SUPERUSER", "tenantId": "{STATE_TENANT_ID}" } ]
    }
  },
  "User": {
    "userName": "PQM_SERVICE_CRONJOB", "name": "PQM Service Cronjob",
    "gender": "male", "mobileNumber": "9999999999", "active": true,
    "type": "SYSTEM", "tenantId": "pg", "password": "eGov@123",
    "roles": [
      { "code": "SYSTEM", "tenantId": "pg" },
      { "code": "PQM_CRONJOB_SCHEDULER", "name": "PQM_CRONJOB_SCHEDULER", "tenantId": "pg" }
    ]
  }
}'
There are two ways to update the configuration of the scheduler:
Add the config in the DevOps environment file and restart pqm-service. The scheduler will then trigger based on the updated environment configuration.
Pqm-scheduler:
Use the commands given below:
Product showcase done on 10-01-2024 by
@Pritish.Rath
Through vendor service, use the create DSO Request from .
File:
File: the file which needs to be added
For re-indexing, refer to this .
For re-indexing, refer to this .
File Path- After adding the below changes, please restart dashboard-analytics.
Download the UI code from the link here .
Link:
staging=
File Path -
Create a role in the ACCESSCONTROL-ROLES/roles.json MDMS file, like the following.
Cron job duration will be configured using environment variables from
Workflow Technical Document
User technical document
MDMS technical document
NEEDS TO BE UPDATED
IDGen technical document
NEEDS TO BE UPDATED
Localisation technical document
NEEDS TO BE UPDATED
Persister technical document
NEEDS TO BE UPDATED
SMS notification technical document
NEEDS TO BE UPDATED
API contract
Postman scripts
Title
Link
vehicle/v1/_create
vehicle/v1/_search
/vehicle/v1/_plainsearch
/vehicle/trip/v1/_create
vehicle/trip/v1/_update
vehicle/trip/v1/_search
/vehicle/trip/v1/_plainsearch
vehicle/v1/_update
PQM Service
PQM Anomaly Finder
PQM Scheduler
Update vehicle-persister.yaml · egovernments/configs@32a7e1b
Update vehicle-persister.yaml · egovernments/configs@23237c8
Added the audit logging for vehicle and vendor · egovernments/configs@482185f
SAN-1047 - Added the query map for update vehicle and vendor topics. · egovernments/configs@56da639
History for egov-persister/vehicle-persister.yaml - egovernments/configs
configs/vendor-persister.yaml at DEV · egovernments/configs
Add master data in MDMS service with module name as vehicle and restart egov-mdms-service. Following are some sample master data for:
SuctionType
VehicleOwner
VehicleMakeModel
FSTPO Rejection Reason (Vehicle decline reason codes)
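As an illustration of the usual MDMS master shape for one of these, say SuctionType (the codes and names below are placeholders, not the actual product data):

```json
{
  "tenantId": "pg",
  "moduleName": "Vehicle",
  "SuctionType": [
    {
      "code": "SUCTION_CUM_JETTING",
      "name": "Suction cum jetting machine",
      "active": true
    }
  ]
}
```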
SAN-1049: Added role actions for Driver APIs. · egovernments/egov-mdms-data@fb8e530
SAN-1063: Added the permissiosn for Vehicle trip creation · egovernments/egov-mdms-data@632ee94
Search the FSM_VEHICLE_TRIP workflow by the given search API.
/egov-workflow-v2/egov-wf/businessservice/_search? tenantId=pb.amritsar&businessServices=FSM_VEHICLE_TRIP
2. Update the action given below at the "null" state (line no. 20) for FSM_VEHICLE_TRIP in the workflow below, and restart the workflow service.
Actions
Role Action Mapping
Make changes in config accordingly and restart the pdf-services.
1. pdf-service/format-config/fsm-receipt.json
2. pdf-service/data-config/fsm-receipt.json
egov-persister/fsm-persister.yaml
Add master data in the MDMS service with module name as FSM and restart the egov-mdms-service. Following is a sample master data for Application Channel (Source):
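A sketch of what such a master looks like (the codes below are placeholders, not the product data):

```json
{
  "tenantId": "pg",
  "moduleName": "FSM",
  "ApplicationChannel": [
    { "code": "ONLINE", "name": "Online", "active": true },
    { "code": "COUNTER", "name": "Counter", "active": true }
  ]
}
```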
Checklist (To be answered by a citizen while rating):
Configuration (At the application level):
FSTP Plant Information (For each city):
Pit Type (Type of pit):
Property Type:
Slums (Mapped to the locality of the city):
PaymentType (Payment preference type):
data/pg/FSM/ReceivedPaymentType.json
data/pg/FSM/CommonFieldsConfig.json
FSM Persister YML
data/pb/BillingService/BusinessService.json
data/pb/DIGIT-UI/RoleStatusMapping.json
data/pb/BillingService/BusinessService.json
data/pb/amritsar/FSM/ZeroPricing.json
data/pb/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
Following are the changes that need to be integrated in dashboard-analytics; restart the "dashboard-analytics" service afterwards.
egov-dss-dashboards/dashboard-analytics/ChartApiConfig.json
egov-dss-dashboards/dashboard-analytics/MasterDashboardConfig.json
egov-indexer/egov-vehicle.yaml
inbox v2 mdms changes
Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for FSM:
For post-pay new business service, FSM_POST_PAY_SERVICE has been created. Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for FSM_POST_PAY_SERVICE:
In the system, the FSM_POST_PAY_SERVICE and FSM (that is, the above two business services) are removed and we have introduced a new business service for advance payment application and zero price application.
For advance payment, a new business service, FSM_ADVANCE_PAY_SERVICE, has been created.
Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for FSM_ADVANCE_PAY_SERVICE:
For the pay-later flow, a new business service, PAY_LATER_SERVICE, has been created.
Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for PAY_LATER_SERVICE:
For zero-price applications, a new business service, FSM_ZERO_PAY_SERVICE, has been created.
Create businessService (workflow configuration) using the /businessservice/_create. Following is the product configuration for FSM_ZERO_PAY_SERVICE:
Using /localisation/messages/v1/_upsert, add localisation templates for the notification messages to be sent. Following are the product notification templates:
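The _upsert payload generally takes a tenantId and a list of messages; the code, module, and message text below are placeholders to show the shape, not the actual product templates:

```json
{
  "RequestInfo": { "authToken": "{AUTH_TOKEN}" },
  "tenantId": "pg",
  "messages": [
    {
      "code": "FSM_SMS_APPLICATION_CREATED",
      "message": "Hi {name}, your application {applicationNo} has been created.",
      "module": "rainmaker-fsm",
      "locale": "en_IN"
    }
  ]
}
```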
Add role-action mapping for the APIs in MDMS. Following are the required entries. They should be mapped to both CITIZEN and the appropriate employee roles.
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
Plant user mapping
data/pg/ACCESSCONTROL-ACTIONS-TEST/actions-test.json
data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json
data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json
data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json
Plant user mapping
data/pg/ACCESSCONTROL-ROLEACTIONS/roleactions.json
For a new environment, these changes are required.
Path: deploy-as-code/helm/charts/sanitation/fsm/values.yaml
Deploy the latest version of the vendor.
Add the vendor-persister.yml file to the config folder in Git and add its path to the persister configuration (the file path goes in the environment yaml file under the param persist-yml-path), then restart egov-persister-service.
Integrate the following changes in vendor-persister.yml.
Configurations that can be managed through the vehicle values.yml in the infra-ops repo are listed below. The values.yml for the vehicle service is available below.
Sample configurations in values.yml
In this document, we will learn how to legacy index/re-index the fsm index.
Kubectl access to the required environment in which you want to run the re-indexing
playground pod access
Legacy index mapping/configuration done in the respective indexer config (in this case, the legacy index configuration for FSM is done; a similar one exists for VehicleTrip)
Postman collection to re-index the data for FSM, VehicleTrip, Vehicle, Vendor Services can be downloaded
After importing the postman collection downloaded from the above section, you will find two requests:
fsm-legacy: This request fetches data from the fsm/plainsearch API and pushes it to the fsm-enriched topic via the indexer service.
fsm-legacy-kafkaconnector: This request creates a connector that listens to the fsm-enriched topic and pushes data to Elasticsearch under the new index fsm-enriched.
Run the fsm-legacy-kafkaconnector request in the playground pod. This creates a connector that in turn starts listening to the topic fsm-enriched-sink.
Run the fsm-legacy request in the playground pod. This calls the indexer service to initiate fetching the data from plainsearch, prepare it according to the legacy-index mapping, and push it to the fsm-enriched-sink topic.
The whole process takes some time; meanwhile, you can search for the data in the fsm-enriched index in Elasticsearch.
You can go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod:
curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/fsm-enriched-es-sink'
Once re-indexing is completed, verify the counts in the fsm index and the fsm-enriched index, then delete the fsm index and create an alias for the fsm-enriched index as fsm. Use the below command for creating the alias.
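A sketch of the alias call using Elasticsearch's standard _aliases API (the ES host below is a placeholder for your cluster's Elasticsearch service; the same pattern applies to the vehicletrip, fsm-application, and pqm indices by swapping the index/alias names):

```shell
# Build the _aliases request body that points the old index name ("fsm")
# at the new enriched index, then POST it to Elasticsearch.
ALIAS_BODY='{"actions":[{"add":{"index":"fsm-enriched","alias":"fsm"}}]}'
echo "$ALIAS_BODY"
# curl -X POST 'http://elasticsearch-data-v1.es-cluster:9200/_aliases' \
#      -H 'Content-Type: application/json' -d "$ALIAS_BODY"
```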
After importing the postman collection downloaded from the above section, you will find two requests:
vehicleTrip-legacy: This request fetches data from the vehicletrip/plainsearch API and pushes it to the vehicletrip-enriched topic via the indexer service.
vehicle-trip-legacy-kafkaconnector: This request creates a connector that listens to the vehicletrip-enriched topic and pushes data to Elasticsearch under the new index vehicletrip-enriched.
Run the vehicletrip-legacy-kafkaconnector request in the playground pod. This creates a connector that in turn starts listening to the topic vehicletrip-enriched-sink.
Run the vehicletrip-legacy request in the playground pod. This calls the indexer service to initiate fetching the data from plainsearch, prepare it according to the legacy-index mapping, and push it to the vehicletrip-enriched-sink topic.
The whole process takes some time; meanwhile, you can search for the data in the vehicletrip-enriched index in Elasticsearch.
You can go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod:
curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/vehicletrip-enriched-es-sink'
Once re-indexing is completed, verify the counts in the vehicletrip index and the vehicletrip-enriched index, then delete the vehicletrip index and create an alias for the vehicletrip-enriched index as vehicletrip. Use the below command for creating the alias.
Kubectl access to the required environment in which you want to run the re-indexing
playground pod access
After importing the postman collection downloaded from above section, you can find two request
fsm-legacy inbox : This request helps to get the data from fsm/plainsearch api and push data to fsm-application-enriched topic by indexer service
inbox-kafka-connector : This is the request to create a connector which can listen to the fsm-application-enriched topic and push data to the elastic search with the new index fsm-application-enriched
Run the inbox-kafka-connector request in the playground pod, which would create a connector which would intern start listening to the topic fsm-application-enriched-sink
Run the fsm-legacy-inbox request in the playground pod, which would call indexer service to intiate the process of fetching the data from plainsearch and push the data prepared according to the legacy-index mapping and push the data to the fsm-application-enriched-sink topic
The whole process takes some time; meanwhile, you can search for the data in the fsm-application-enriched index in Elasticsearch.
Go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod: curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/fsm-application-enriched-es-sink'
Once reindexing is completed, verify that the document counts in the fsm-application and fsm-application-enriched indexes match, then delete the fsm-application index and create an alias named fsm-application for the fsm-application-enriched index. Use the below command for alias creation.
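The verification and alias steps above can be sketched with the standard Elasticsearch APIs. `<es-host>` is a placeholder for your cluster's Elasticsearch endpoint, not a value from this document.

```shell
# Verify document counts in both indexes (they should match)
curl -s 'http://<es-host>:9200/fsm-application/_count'
curl -s 'http://<es-host>:9200/fsm-application-enriched/_count'

# Delete the old index
curl -s -X DELETE 'http://<es-host>:9200/fsm-application'

# Point the alias "fsm-application" at the enriched index
curl -s -X POST 'http://<es-host>:9200/_aliases' \
  -H 'Content-Type: application/json' \
  -d '{"actions":[{"add":{"index":"fsm-application-enriched","alias":"fsm-application"}}]}'
```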
In this document, we will learn how to legacy index/re-index the PQM index.
Prerequisites
Kubectl access to the required environment in which you want to run the re-indexing
playground pod access
Legacy index mapping/configuration done in the respective indexer config (in this case, the legacy index configuration for PQM is done)
Postman collection to re-index the data for PQM can be downloaded.
After importing the postman collection downloaded from the above section, you can find two requests:
pqm-services-legacy: This request fetches the data from the pqm/plainsearch API and pushes it to the pqm-services-enriched topic via the indexer service.
pqm-services-legacy-kafkaconnector: This request creates a connector that listens to the pqm-application topic and pushes data to Elasticsearch under the new index pqm-application-enriched.
Run the pqm-services-legacy-kafkaconnector request in the playground pod. This creates a connector which in turn starts listening to the topic pqm-application-legacyindex-enriched-sink.
Run the pqm-services-legacy request in the playground pod. This calls the indexer service to initiate fetching the data from plainsearch, prepare it according to the legacy-index mapping, and push it to the pqm-application-legacyindex-enriched-sink topic.
The whole process takes some time; meanwhile, you can search for the data in the pqm-application-legacyindex-enriched index in Elasticsearch.
Go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod: curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/pqm-application-enriched-es-sink'
Once reindexing is completed, verify that the document counts in the pqm-application and pqm-application-legacyindex-enriched indexes match, then copy the pqm-application-legacyindex-enriched index to pqm-application and delete the pqm-application-legacyindex-enriched index. Use the below command for copying.
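The copy described above can be sketched with the standard Elasticsearch `_reindex` API. `<es-host>` is a placeholder for your cluster's Elasticsearch endpoint, not a value from this document.

```shell
# Copy documents from the enriched index back into pqm-application
curl -s -X POST 'http://<es-host>:9200/_reindex' \
  -H 'Content-Type: application/json' \
  -d '{"source":{"index":"pqm-application-legacyindex-enriched"},"dest":{"index":"pqm-application"}}'

# Once the counts match, delete the intermediate enriched index
curl -s -X DELETE 'http://<es-host>:9200/pqm-application-legacyindex-enriched'
```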
Kubectl access to the required environment in which you want to run the re-indexing
playground pod access
After importing the postman collection downloaded from the above section, you can find two requests:
pqm-services-legacy: This request fetches the data from the pqm/plainsearch API and pushes it to the pqm-services-enriched topic via the indexer service.
pqm-services-legacy-kafkaconnector: This request creates a connector that listens to the pqm-application topic and pushes data to Elasticsearch under the new index pqm-application-enriched.
Run the pqm-services-legacy-kafkaconnector request in the playground pod. This creates a connector which in turn starts listening to the topic pqm-service-index-enriched-sink.
Run the pqm-services-legacy request in the playground pod. This calls the indexer service to initiate fetching the data from plainsearch, prepare it according to the legacy-index mapping, and push it to the pqm-service-index-enriched-sink topic.
The whole process takes some time; meanwhile, you can search for the data in the pqm-service-legacyindex-enriched index in Elasticsearch.
Go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod: curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/pqm-service-index-enriched-es-sink'
Once reindexing is completed, verify that the document counts in the pqm-service and pqm-service-index-enriched indexes match, then copy the pqm-service-index-enriched index to pqm-service and delete the pqm-service-index-enriched index. Use the below command for copying.
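The copy described above can be sketched with the standard Elasticsearch `_reindex` API. `<es-host>` is a placeholder for your cluster's Elasticsearch endpoint, not a value from this document.

```shell
# Copy documents from the enriched index back into pqm-service
curl -s -X POST 'http://<es-host>:9200/_reindex' \
  -H 'Content-Type: application/json' \
  -d '{"source":{"index":"pqm-service-index-enriched"},"dest":{"index":"pqm-service"}}'

# Once the counts match, delete the intermediate enriched index
curl -s -X DELETE 'http://<es-host>:9200/pqm-service-index-enriched'
```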
Kubectl access to the required environment in which you want to run the re-indexing
playground pod access
Postman Collection
After importing the postman collection downloaded from the above section, you can find two requests:
pqm-Anomaly-services-legacy: This request fetches the data from the pqm-Anomaly/plainsearch API and pushes it to the pqm-Anomaly-services-enriched topic via the indexer service.
pqm-anomaly-services-legacy-kafkaconnector: This request creates a connector that listens to the pqm-anomaly-finder topic and pushes data to Elasticsearch under the new index pqm-anomaly-finder-enriched.
Run the pqm-anomaly-services-legacy-kafkaconnector request in the playground pod. This creates a connector which in turn starts listening to the topic pqm-anomaly-finder-enriched-sink.
Run the pqm-Anomaly-services-legacy request in the playground pod. This calls the indexer service to initiate fetching the data from plainsearch, prepare it according to the legacy-index mapping, and push it to the pqm-Anomaly-application-enriched-sink topic.
The whole process takes some time; meanwhile, you can search for the data in the pqm-anomaly-finder-enriched index in Elasticsearch.
Go through the logs of the indexer pod to confirm that the job is done.
Once the job is done, delete the Kafka connector by running the below curl in the playground pod: curl --location --request DELETE 'http://kafka-connect.kafka-cluster:8083/connectors/pqm-Anomaly-services-legacy-enriched-es-sink'
Once reindexing is completed, verify that the document counts in the pqm-anomaly-finder and pqm-anomaly-finder-enriched indexes match, then copy the pqm-anomaly-finder-enriched index to pqm-anomaly-finder and delete the pqm-anomaly-finder-enriched index. Use the below command for copying.
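The copy described above can be sketched with the standard Elasticsearch `_reindex` API. `<es-host>` is a placeholder for your cluster's Elasticsearch endpoint, not a value from this document.

```shell
# Copy documents from the enriched index back into pqm-anomaly-finder
curl -s -X POST 'http://<es-host>:9200/_reindex' \
  -H 'Content-Type: application/json' \
  -d '{"source":{"index":"pqm-anomaly-finder-enriched"},"dest":{"index":"pqm-anomaly-finder"}}'

# Once the counts match, delete the intermediate enriched index
curl -s -X DELETE 'http://<es-host>:9200/pqm-anomaly-finder-enriched'
```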
Deploy the latest version of the vehicle service.
Add the vehicle-persister.yml file to the config folder in Git and add that path in the persister (the file path is to be added in the environment YAML file in the param called persist-yml-path), then restart the egov-persister service.
Integrate the following changes in vehicle-persister.yml
Configurations that we can manage through the vehicle values.yml in the infraops repo are listed below. The values.yml for the vehicle service can be found.
Sample configurations in values.yml
Deploy the latest version of FSM.
Add the fsm-calculator-persister.yml file to the config folder in Git, and add that path in the persister (the file path is to be added in the environment YAML file in the param called persist-yml-path):
Configurations that we can manage through the fsm-calculator values.yml in the infraops repo are as follows. The values.yml for fsm-calculator can be found.
Sample configurations in values.yml
Deploy the latest version of FSM.
Add the fsm-persister.yml file to the config folder in Git, add that path in the persister (the file path is to be added in the environment YAML file in the param called persist-yml-path), and restart the egov-persister service.
If indexes are to be created, add the indexer config path in the indexer service (the file path is to be added in the environment YAML file in the param called egov-indexer-yaml-repo-path), and restart the egov-indexer service.
Path: deploy-as-code/helm/charts/sanitation/fsm/values.yaml
Configurations that we can manage through the fsm values.yml in the infraops repo are as follows:
Sample values.yml
sanitation/egov-persister/fsm-persister.yaml - file
sanitation/egov-persister/vendor-persister.yaml - file
Integrate the following changes in fsm-persister.yml
Dashboard Analytics Configuration
data/pg/FSM/SanitationWorkerEmployer.json - file
data/pg/FSM/SanitationWorkerEmploymentType.json - file
data/pg/FSM/SanitationWorkerFunctionalRoles.json - file
data/pg/FSM/SanitationWorkerSkills.json - file
Legacy index mapping/configuration done in the respective indexer config (in this case, the legacy index configuration for FSM Inbox is done)
Postman collection to re-index the data for FSM inbox can be downloaded.
Legacy index mapping/configuration done in the respective indexer config (in this case, the legacy index configuration for PQM is done)
Postman collection to re-index the data for PQM can be downloaded.
Legacy index mapping/configuration done in the respective indexer config (in this case, the legacy index configuration for pqm-Anomaly is done)
Postman collection to re-index the data for pqm-Anomaly can be downloaded.
| Description | Name in values.yml | Current value |
| --- | --- | --- |
| Kafka consumer group | SPRING_KAFKA_CONSUMER_GROUP_ID | egov-vendor-services |
| Kafka topic to which the service pushes data to save a new vendor | PERSISTER_SAVE_VENDOR_TOPIC | save-vendor-application |
| MDMS service host | EGOV_MDMS_HOST | egov-mdms-service from egov-service-host |
| Vehicle service host | EGOV_VEHICLE_HOST | vehicle from egov-service-host |
| User service host | EGOV_USER_HOST | egov-user-service from egov-service-host |
| Location service host | EGOV_LOCATION_HOST | egov-location from egov-service-host |
| Description | Name in values.yml | Current value |
| --- | --- | --- |
| id-gen host, to generate the application number | EGOV_IDGEN_HOST | egov-idgen from egov-service-host |
| MDMS service host | EGOV_MDMS_HOST | egov-mdms-service from egov-service-host |
| Workflow v2 service host | WORKFLOW_CONTEXT_PATH | egov-workflow-v2 from egov-service-host |
| User service host, to get the locale data | EGOV_USER_HOST | egov-user from egov-service-host |
| Kafka consumer group | SPRING_KAFKA_CONSUMER_GROUP_ID | egov-vehicle-services |
| Kafka topic to which the service pushes data to save a new vehicle application | PERSISTER_SAVE_VEHICLE_TOPIC | save-vehicle-application |
| Kafka topic to which the service pushes vehicleTrip data to save | PERSISTER_SAVE_VEHICLE_TRIP_TOPIC | save-vehicle-trip |
| Kafka topic to which the service pushes vehicleTrip data to update | PERSISTER_UPDATE_VEHICLE_TRIP_TOPIC | update-vehicle-trip |
| Kafka topic to which the service pushes vehicleTrip data to update the status | PERSISTER_UPDATE_VEHICLE_TRIP_WORKFLOW_TOPIC | update-workflow-vehicle-trip |
| VehicleTrip application number format | egov.idgen.vehicle.trip.applicationNum.format | "[CITY.CODE]-VT-[cy:yyyy-MM-dd]-[SEQ_EGOV_VEHICLETRIP]" |
| Description | Name in values.yml | Current value |
| --- | --- | --- |
| Context path of the APIs | SERVER_CONTEXTPATH | /fsm-calculator |
| Kafka consumer group | SPRING_KAFKA_CONSUMER_GROUP_ID | fsm-calculator |
| Kafka topic to which the service pushes data to save a new billing slab | PERSISTER_SAVE_BILLING_SLAB_TOPIC | save-fsm-billing-slab |
| Kafka topic to which the service pushes data to update an existing billing slab | PERSISTER_UPDATE_BILLING_SLAB_TOPIC | update-fsm-billing-slab |
| MDMS service host | EGOV_MDMS_HOST | egov-mdms-service from egov-service-host |
| Billing service host | EGOV_BILLINGSERVICE_HOST | billing-service from egov-service-host |
| FSM service host | EGOV_FSM_HOST | fsm from egov-service-host |
| Description | Name in values.yml | Current value |
| --- | --- | --- |
| id-gen host, to generate the application number | EGOV_IDGEN_HOST | egov-idgen from egov-service-host |
| Kafka consumer group | SPRING_KAFKA_CONSUMER_GROUP_ID | egov-fsm-service |
| Kafka topic to which the service pushes data to save a new FSM application | PERSISTER_SAVE_FSM_TOPIC | save-fsm-application |
| Kafka topic to which the service pushes data to save the workflow status | PERSISTER_UPDATE_FSM_WORKFLOW_TOPIC | update-fsm-workflow-application |
| Kafka topic to which the service pushes data to update an existing FSM application | PERSISTER_UPDATE_FSM_TOPIC | update-fsm-application |
| MDMS service host | EGOV_MDMS_HOST | egov-mdms-service from egov-service-host |
| Billing service host | EGOV_BILLINGSERVICE_HOST | billing-service from egov-service-host |
| fsm-calculator service host | EGOV_FSM_CALCULATOR_HOST | fsm-calculator from egov-service-host |
| Workflow v2 service host | WORKFLOW_CONTEXT_PATH | egov-workflow-v2 from egov-service-host |
| UI host, to send the URL of the new application in SMS notifications | EGOV_UI_APP_HOST | egov-services-fqdn-name from egov-service-host |
| Vendor service host, to get DSO details | EGOV_VENDOR_HOST | vendor from egov-service-host |
| Vehicle service host, to get vehicle details and manage vehicleTrip | EGOV_VEHICLE_HOST | vehicle from egov-service-host |
| Collection service host, to get the payment details | EGOV_COLLECTION_SERVICE_HOST | collection-services from egov-service-host |
| Localization service host, to get the locale data | EGOV_LOCALIZATION_HOST | egov-localization from egov-service-host |
| User service host, to get user data | EGOV_USER_HOST | egov-user from egov-service-host |
| PDF service host, to generate PDFs | EGOV_PDF_HOST | pdf-service from egov-service-host |
| URL shortening service host, to get short URLs for long ones | EGOV_URL_SHORTNER_HOST | egov-url-shortening from egov-service-host |
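As an illustration of how the "Name in values.yml" entries above are typically wired into a service's environment, here is a hypothetical Helm values fragment for the FSM service. The exact key names and nesting in the infraops values.yaml may differ; treat this structure as an assumption, not the actual file.

```yaml
# Hypothetical sketch -- the real infraops values.yaml structure may differ
fsm:
  env:
    SPRING_KAFKA_CONSUMER_GROUP_ID: egov-fsm-service
    PERSISTER_SAVE_FSM_TOPIC: save-fsm-application
    PERSISTER_UPDATE_FSM_TOPIC: update-fsm-application
    PERSISTER_UPDATE_FSM_WORKFLOW_TOPIC: update-fsm-workflow-application
    EGOV_IDGEN_HOST: egov-idgen
    EGOV_MDMS_HOST: egov-mdms-service
    WORKFLOW_CONTEXT_PATH: egov-workflow-v2
```

Changing a value here and redeploying the chart updates the corresponding environment variable in the service pod.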