S.No
Section Heading
Chart Heading
Subheading
Definitions (This will appear on the dashboard whenever a user hovers on the metric, wherever applicable).
Chart Type
X-Axis
Y-Axis
Value
Columns
How to calculate
Boundary
Drilldown/Toggle
Comparison KPIs, if any
Show comparison in
Specific to State/ULB/TRP/all
Tooltip on Hover on Data Point
Input Fields
1
Input
Total incoming sludge
NA
The total incoming sludge from registered and unregistered vehicles.
KPI
NA
NA
Total incoming sludge.
NA
Total incoming sludge = (Volume of waste disposed of for registered vehicles) + (Volume of waste disposed of for unregistered vehicles).
State, Plant, ULB
NA
NA
NA
NA
2
Input
Number of incoming trips
NA
The number of trips disposed of at the treatment plant.
KPI
NA
NA
The count of trips to the treatment plant from registered and unregistered vehicles.
NA
Number of trips disposed = Count (Distinct trip ID).
State, Plant, ULB
NA
NA
NA
NA
3
Treatment Quality
Overall quality
NA
The percentage of tests where all parameters are as per the benchmarks.
KPI
NA
NA
The percentage of test results meeting the benchmarks.
NA
Overall quality = (Number of tests where all parameters meet benchmarks / Total number of tests) * 100.
State, Plant, ULB
NA
NA
NA
NA
4
Treatment Quality
Compliance
NA
The percentage of tests where results have been recorded.
KPI
NA
NA
The percentage of tests with the status as submitted out of the total tests.
Compliance percentage = (Count of test IDs with status 'Submitted' / Count (distinct test ID)) * 100.
State, Plant, ULB
NA
NA
NA
NA
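The two percentage formulas above (overall quality in S.No 3 and compliance in S.No 4) can be sketched as small Python helpers. The field names `all_params_pass` and `status` are illustrative assumptions, not the actual data schema:

```python
def overall_quality(tests):
    """Percentage of tests where every parameter met its benchmark.

    `tests` is a list of dicts with a boolean `all_params_pass` flag
    (illustrative field name; the real schema may differ).
    """
    if not tests:
        return 0.0
    passed = sum(1 for t in tests if t["all_params_pass"])
    return passed / len(tests) * 100


def compliance(tests):
    """Percentage of tests with status 'Submitted' out of all tests."""
    if not tests:
        return 0.0
    submitted = sum(1 for t in tests if t["status"] == "Submitted")
    return submitted / len(tests) * 100
```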
5
Alerts
Total alerts
NA
The total alerts raised by the system in the following categories: 1) Test results not as per the benchmark, 2) No reading from the IoT device, 3) Lab results and IoT results not matching.
KPI
NA
NA
Total alerts.
Count (Distinct alert ID).
State, Plant, ULB
NA
NA
NA
NA
6
Treatment Quality Plants
Total plants
NA
NA
NA
NA
NA
Count of plants
Count (Distinct plant ID).
State, Plant, ULB
NA
NA
NA
NA
7
Treatment Quality Plants
Treatment quality passed
NA
Treatment quality is considered passed if all parameters of both biosolids and effluents are as per the benchmarks for the output of the treatment process in the last test recorded.
NA
NA
NA
The count of plants with treatment quality passed.
Treatment quality for the output type = IF(COUNTIF(All parameters meet benchmarks, FALSE) = 0, "Treatment quality for output type passed", "Treatment quality for output type failed"). Treatment quality for the plant = IF(COUNTIF(Treatment quality for output type, "Failed") = 0, "Treatment quality passed", "Treatment quality failed").
State, Plant, ULB
NA
NA
NA
NA
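The pass/fail logic for S.No 7 and S.No 8 can be sketched in Python: an output type passes only if every parameter in its last test met the benchmark, and a plant passes only if every output type passed. The data structures here are illustrative assumptions:

```python
def output_type_passed(parameter_results):
    """An output type (biosolids or effluents) passes only if every
    parameter in the last recorded test met its benchmark."""
    return all(parameter_results)


def plant_quality_passed(results_by_output_type):
    """A plant passes only if every output type passed.

    `results_by_output_type` maps an output type to the list of
    per-parameter pass/fail booleans from its last test
    (illustrative structure; the real data model may differ).
    """
    return all(output_type_passed(v) for v in results_by_output_type.values())
```

"Treatment quality failed" for the chart is then simply the count of all plants minus the count of plants for which `plant_quality_passed` is true, matching the formula in S.No 8.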
8
Treatment Quality Plants
Treatment quality failed
NA
Treatment quality is considered failed when one or more parameters of biosolids or effluents are not as per the benchmarks for the output of the treatment process in the last test recorded.
NA
NA
NA
The count of plants with treatment quality failed.
Count (Distinct plant ID) - Treatment quality passed.
State, Plant, ULB
NA
NA
NA
NA
9
Treatment Quality Plants
NA
NA
NA
Map
NA
NA
Point = Geolocation of plant. Plant icon colour is green if the treatment quality passed, and red if it failed.
State, Plant, ULB
NA
NA
NA
NA
Name of the plant
10
Treatment Quality Plants
NA
NA
NA
Table
NA
NA
NA
Plant Name, Test Result, Compliance %
Test result same as S.No 7 and S.No 8
State, Plant, ULB
NA
NA
NA
NA
11
Treatment Quality Plants
NA
NA
NA
Table
NA
NA
NA
Stage, Output Type, Parameters 1...n, Compliance %
Mentioned above
State, Plant, ULB
NA
Compliance percentage.
The percentage from the last month.
NA
12
Trend in [Parameter Name] Readings
NA
NA
NA
Multi-line chart
Test dates
Parameter Value
- Value of device reading.
- Value of lab result.
NA
NA
Plant
NA
NA
NA
NA
Date; Lab result: X; Device reading: Y
This user story aims to enable the state and the urban local body (ULB) admin to define the input/output quality testing requirements, including testing parameters, benchmarks, and frequencies for each treatment process and stage in the plant. It also allows them to specify different standards for manual and IoT-based testing for particular input/output types and stages. No UI is required for the same.
- State admin
- ULB admin
- Set up plants, treatment processes and stages, and map plants to treatment processes and stages.
- Plants may contain one or more treatment processes, each with a treatment process type and capacity.
- A treatment process will contain multiple stages.
- The state or ULB admin can define input/output testing requirements for each treatment process type and stage.
- They can enable or disable testing for specific input/output types and stages.
- They can define testing parameters, benchmarks, and frequencies at the national/state level.
- They can adjust testing frequencies for plants based on their testing results.
- They can specify different testing standards for manual and IoT-based testing for specific input/output types and stages.
Treatment process
Plants
Stages:
Testing Parameters
This feature can be managed through backend configurations and databases, allowing administrators to make changes easily.
- A state or ULB admin can define the input/output testing requirements for each treatment process and stage.
- They can enable or disable testing for specific input/output types and stages.
- They can define and edit testing parameters, benchmarks, and frequencies at the national/state level.
- They can adjust testing frequencies for plants based on their testing results.
- They can specify different testing standards for manual and IoT-based testing for specific input/output types and stages.
No specific notifications are required for this user story.
N/A
1. A state and ULB admin can define input/output testing requirements for each treatment process and stage.
2. They can enable or disable testing for specific input/output types and stages.
3. Testing parameters, benchmarks, and frequencies can be defined and managed at the national/state level.
4. Administrators can adjust testing frequencies for plants based on their testing results.
5. Different testing standards can be specified for manual and IoT-based testing for specific input/output types and stages.
This user story aims to automate the generation of schedules for tests based on the frequency of testing for various parameters. The generated schedule will be used for manual and IoT tests to display upcoming tests, generate alerts, and facilitate escalation in case of non-adherence to the test schedule.
DIGIT Sanitation
As a system, I want to automatically generate schedules for tests based on the frequency of testing for various parameters. This will help in displaying upcoming tests to the plant operator and stakeholders, generate alerts for upcoming tests, and escalate in case of non-adherence to the test schedule.
This feature can be managed through backend configurations and databases, allowing administrators to make changes easily.
- The system will automatically generate schedules for tests based on the frequency of testing for various parameters.
- Plant operators and stakeholders can view the upcoming tests on the schedule.
- Alerts will be automatically generated for upcoming tests to notify the relevant parties.
- In case of manual testing, escalations will be triggered if test results are pending beyond the specified days as per the test schedule.
- Alerts will be sent to the plant operator and stakeholders [X] days prior to the test date.
- Escalations will be triggered for pending test results as per the test schedule.
N/A
1. The system should automatically generate schedules for tests based on the frequency of testing for each parameter.
2. Upcoming tests should be displayed to the plant operator and stakeholders.
3. Alerts should be generated for upcoming tests to notify the relevant parties.
4. Escalations should be triggered in case of non-adherence to the test schedule for manual testing.
5. The implementation of this feature should allow for easy configuration and management through backend settings.
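The schedule-generation rule described above can be sketched as follows. The function signature and the idea of materialising dates within a window are assumptions for illustration, not the actual service API; the real implementation would persist these dates and drive the alert/escalation logic from them:

```python
from datetime import date, timedelta


def generate_schedule(start, end, frequency_days):
    """Generate scheduled test dates between `start` and `end` (inclusive)
    for a parameter tested every `frequency_days` days."""
    dates = []
    current = start
    while current <= end:
        dates.append(current)
        current += timedelta(days=frequency_days)
    return dates
```

For example, a weekly parameter starting 1 January 2023 yields test dates on the 1st, 8th, 15th, 22nd, and 29th of that month.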
This user story aims to implement an anomaly detection system that generates alerts in case of the following anomalies:
1. Lab results not as per the benchmark.
2. IoT device results not as per the benchmark.
3. Lab results and device results do not match.
- Test uploader
- IoT system
- DIGIT Sanitation
As a system, I want to detect anomalies in the test results and generate alerts for the following scenarios:
Lab results not as per the benchmark
This is to be generated when the manual test results uploaded by the test uploader are not as per the benchmarks defined (adjusted for deviations, configurable at plant level).
IoT results not as per the benchmark
This is to be generated when the IoT test results recorded via the integration are not as per the benchmarks defined for [X] days (adjusted for deviations defined while setting testing parameters).
Generation of alerts: Device results and lab results do not match
If the data recorded by the sensor does not match the data in the lab test result, an alert will be generated automatically.
- Anomaly types, benchmark values, and allowed deviations should be specified for each scenario.
- The matching date for comparing IoT results should be calculated based on the sample collection date and the closest available IoT result date.
- Anomalies should be detected and alerts generated only if the deviation from the benchmark exceeds the allowed deviation.
- The anomaly detection system should be configurable to define benchmark values, allowed deviations, and matching date logic.
- The test uploader uploads manual lab test results.
- The integration system records IoT test results.
- DIGIT Sanitation continuously monitors the test results.
- Alerts are automatically generated if anomalies are detected.
- Alerts will be generated automatically when any of the specified anomalies are detected.
- The alert message will indicate the type of anomaly and the details of the test results.
1. The system should be able to detect and generate alerts for the specified anomalies, including lab results not meeting benchmarks, IoT results not meeting benchmarks, and lab and IoT results not matching.
2. The allowed deviation for each anomaly type should be configurable.
3. The anomaly detection should be based on the benchmark values set for each type of test result.
4. Alerts should be generated with the appropriate message indicating the type of anomaly detected and the details of the test results.
5. The implementation should allow for easy configuration and management through backend settings.
This user story aims to implement an automated alert generation system for cases where no reading is received from a sensor based on the scheduled frequency.
- Integration system
- DIGIT Sanitation
As a system, I want to automatically generate an alert in case no reading is received from a sensor based on the scheduled frequency.
No reading received from device
- The system should check if a reading was expected from the sensor on the scheduled date.
- An alert should be generated only if no reading is received from the sensor on the scheduled date.
- The alert generation system should be configured to specify the frequency of expected readings from each sensor.
- The integration system monitors the scheduled frequency of readings from the sensors.
- The alert generation system automatically detects cases where no reading is received from a sensor based on the scheduled date.
- An alert will be generated automatically if no reading is received from a sensor based on the scheduled date.
- The alert message will indicate the sensor ID and the scheduled date for the next reading.
1. The system should automatically detect cases where no reading is received from a sensor on the scheduled date.
2. Alerts should be generated with the appropriate message indicating the sensor ID and the scheduled date for the next reading.
3. The alert generation should be based on the configured frequency of expected readings from each sensor.
4. The implementation should allow for easy configuration and management through backend settings.
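The missed-reading check above reduces to comparing today against the next scheduled reading date. This is a sketch under assumed inputs (last reading date plus a per-sensor frequency); the alert wording is illustrative, though the story requires it to carry the sensor ID and the scheduled date:

```python
from datetime import date, timedelta


def missed_reading_alert(sensor_id, last_reading_date, frequency_days, today):
    """Return an alert message if no reading has arrived by the scheduled
    date, else None."""
    due = last_reading_date + timedelta(days=frequency_days)
    if today > due:
        return (f"No reading received from sensor {sensor_id}; "
                f"reading was due on {due.isoformat()}")
    return None
```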
As a plant operator, I want to access the home page after logging in to the system. The landing page should provide an overview of relevant modules, pending tasks, and options to navigate to the specific sections.
Plant operator
Upon successful login, the plant operator will be directed to the home page. The home page will display the following elements:
Plant name: A visible label located on the top right-hand corner of the screen, indicating the name of the plant or facility the operator is associated with.
Help button: A clickable button available on every page to provide a guided view and assistance for users.
Modules cards: The home page will show cards for the following modules:
a) Vehicle Log Module: Allows the user to record incoming vehicles.
b) Treatment Quality Monitoring module: Provides access to information related to treatment quality.
c) View Dashboard: Navigates the user to a comprehensive dashboard.
List of pending tasks: This section will display a list of tests and tasks pending for the plant operator within the next [X] days. Alongside each pending task, the next action item in the workflow will be displayed, enabling the plant operator to take prompt action.
View all pending tasks button: A clickable button that redirects the user to a page displaying all pending tasks.
N/A
N/A
N/A
The plant operator can click on the cards representing different modules to navigate to their respective home pages for detailed information and functionalities.
The plant operator can click on the "View All Pending Tasks" button to view all pending tasks in the system.
N/A
The plant operator should be able to log in successfully and land on the home page.
The home page should display the plant name, help button, and cards for the different modules.
The list of pending tasks should be visible and filtered based on the defined time frame [X].
Each pending task should display the next action item required.
Clicking on the cards should redirect the user to the respective module's home page.
Clicking on the "View All Pending Tasks" button should redirect the user to a page displaying all pending tasks.
As a plant operator, I want to access the Treatment Quality Monitoring (TQM) home page by clicking on the treatment quality card. The TQM home page should provide access to upcoming tests, past test results, IoT readings, sensor monitoring, treatment quality dashboard, and performance metrics related to treatment quality.
Plant operator
After clicking on the treatment quality card, the plant operator will be redirected to the TQM home page, which will offer the following functionalities:
Inbox: The inbox will display the upcoming tests, and the plant operator can take necessary actions related to these tests. A count of the upcoming tests for quick reference will be displayed in brackets.
View past test results: This section will present the past results from both lab and IoT devices, allowing the plant operator to review historical data.
View IoT readings: The user can access records of IoT readings, enabling them to monitor IoT devices' data.
Sensor monitoring: A list of IoT devices and their status will be available for the plant operator to monitor and ensure smooth operations.
View dashboard: Clicking on this option will direct the plant operator to the treatment quality dashboard, providing comprehensive insights and visualisations.
View performance: This widget will display key performance indicators (KPIs) related to treatment quality, including:
a. Test compliance: The percentage of plant compliance with treatment quality standards, compared to the state-level compliance percentage.
b. Last treatment quality result: Indicates whether the last test result was a pass or fail, along with the date of the test.
c. Count of alerts raised in the past 30 days: Shows the number of alerts generated within the last 30 days.
d. Distribution of alerts based on the alert category: Presents a breakdown of alerts by their respective categories. (Note: For details about the calculation of these metrics, refer to the dashboard user stories).
Go back to the landing page: The plant operator can use the back button to return to the landing page.
Help button: A clickable button available on every page to provide a guided view and assistance for users.
N/A
N/A
N/A
The plant operator can view upcoming tests in the inbox and take necessary actions related to them.
The plant operator can review past test results from both Lab and IoT devices.
The plant operator can access records of IoT readings for monitoring purposes.
The plant operator can check the status of IoT devices through sensor monitoring.
The plant operator can navigate to the treatment quality dashboard for comprehensive insights.
The plant operator can view the performance metrics related to the treatment quality.
The plant operator can use the back button to return to the landing page.
N/A
The plant operator should be able to access the TQM home page by clicking on the treatment quality card.
The TQM home page should display the inbox with the upcoming tests and their count.
The plant operator should be able to view the past test results from lab and IoT devices.
The plant operator should be able to access the records of IoT readings.
The plant operator should be able to monitor IoT devices and their statuses.
Clicking on "View Dashboard" should redirect the plant operator to the treatment quality dashboard.
The performance metrics should accurately display test compliance, last treatment quality result, count of alerts raised in the past 30 days, and distribution of alerts based on their categories.
The back button should allow the plant operator to return to the landing page.
The help button should be available and functional on the TQM home page.
As a plant operator, I want to view a list of the upcoming tests by clicking on the 'Inbox' option. The list should display only lab tests, and I should be able to perform various tasks, such as filtering, sorting, and accessing the test details.
Plant operator
Upon clicking on the 'Inbox', the plant operator will be redirected to the list of upcoming lab tests. The following functionalities will be available:
Total count of the upcoming tests: The total count of the upcoming tests will be displayed beside the inbox, enclosed within brackets.
View list of the upcoming tests: The list will contain the following fields:
- Test ID
- Treatment process (if there is only one treatment process configured for the plant, this field will not be displayed)
- Stage: Indicates the process stage where the sample is to be collected from.
- Output type: Specifies whether the test is for biosolids or effluents.
- Pending date: The scheduled test date.
- Status: The current status of the test.
- SLA: Displays the difference between the test due date and today.
N/A
The plant operator can view a list of the upcoming tests with the relevant details.
The plant operator can use filters to refine the view based on the treatment process, output type, status, and date range.
The plant operator can use sorting options to arrange the tests based on "Pending Date."
The plant operator can access test details for further actions.
N/A
The plant operator should be able to view the list of the upcoming lab tests by clicking on 'Inbox'.
The list should only display the lab tests.
The list of the upcoming tests should contain relevant fields as described in the user story.
The action item displayed should be based on the next status in the treatment quality workflow for each test.
The filters and sorting options should work correctly, and the data should be updated based on the selected criteria.
The user should be able to navigate back to the landing page using the back button.
The help button should be available and functional on the list of upcoming tests page.
As a plant operator, I want to view the test details for the upcoming tests either via the pending tasks or the inbox. The test details page will provide comprehensive information about the test and allow me to perform specific actions based on the test's status.
Plant operator
View test details via the pending tasks:
The list of pending tasks can be accessed via the landing page for the Treatment Quality Monitoring (TQM).
The list will display the tests pending within the next [X] days.
The next action item in the task workflow will be displayed as a button beside each pending task for the plant operator to take prompt action.
Clicking on the action item button will redirect the user to the test details page.
View test details via the inbox:
A list of tests can be accessed from the Inbox.
An action item will be displayed based on the next status in the treatment quality workflow for each test:
- For tests in the 'Scheduled' stage, an "Update Status" button will be shown.
- For tests in the "Pending Results" stage, an "Update Results" button will be displayed.
Clicking on the action item button will redirect the user to the test details page.
Test details page:
It will consist of two cards. First card - Test Information: The first card will display the following fields:
Test ID
Treatment process
Stage
Output type
Pending date
Status
Parameters to be tested along with their unit of measurement
SLA (Displayed in red or green based on the SLA: if today is past the pending date, it will be shown in red; if today is before the pending date, it will be shown in green.)
Second card - Test status specific action: The second card will vary based on the test status:
For tests in the 'Scheduled' status, the user will be asked to select the lab.
For tests in the "Pending Results" status, the user will be asked to add test results.
Go back using the back button:
The user can go back using the back button, and the redirection will be based on the page from which the user accessed the test details page (either pending tasks or inbox).
Help button:
A clickable button available on every page to provide a guided view and assistance for users.
N/A
The plant operator can view test details either via the pending tasks or the Inbox.
The plant operator can perform specific actions based on the test's status, such as updating the status or adding test results.
The plant operator should be able to view the test details via the pending tasks or the inbox.
The test details page should display accurate information, including test ID, treatment process, stage, output type, pending date, status, parameters, unit of measurement, and SLA.
The second card on the test details page should prompt the plant operator to take the appropriate action based on the test's status ("Select Lab" for tests in the 'Scheduled' status and "Add Test Results" for tests in the "Pending Results" status).
The plant operator should be able to navigate back using the back button, and the redirection should be appropriate based on the page from which the test details page was accessed.
The help button should be available and functional on the test details page.
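The SLA colour rule from the test information card (red when today is past the pending date, green otherwise) reduces to a one-line check; this is a sketch, not the actual UI code:

```python
from datetime import date


def sla_colour(pending_date, today):
    """Red when the test is overdue, green otherwise."""
    return "red" if today > pending_date else "green"
```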
As a plant operator, I want to update the test details from the test details page. The test details page will display the next action item based on the workflow status of the test. I should be able to perform different actions depending on whether the test is in the 'Scheduled' status or the "Pending Results" status.
- Plant operator
Updating tests with the workflow status 'Scheduled':
- The test details page will prompt the plant operator to confirm if the sample has been submitted to the lab for testing.
- The user can perform the following actions:
- Select lab: The plant operator can choose a lab from a dropdown list configured in the Master Data Management System (MDMS).
- Update the status of the test: The button to update the test status will be deactivated until the lab is selected. Once the lab is chosen, the button will be activated.
- After clicking on "Update Status," the plant operator will be redirected back to the page from which the test details were accessed. A snack bar will confirm the status update, and the action item button will show the updated next step in the workflow.
- In case the update of the status fails, the plant operator will remain on the same page, and a failure message will be displayed.
Updating tests with the workflow status "Pending Results":
- The test details page will prompt the plant operator to fill in the test results.
- The user can perform the following actions:
- Update parameter readings (mandatory fields): Only numerical values will be allowed. In case non-numerical values are entered, an error message will be displayed, stating "Only numeric values allowed. Please input in the required format."
- Attach documents (non-mandatory): Only files in the formats .png, .jpg, and .pdf will be supported. If a file of an unsupported format is selected, an error message will be displayed, stating: "The file type is not supported. Please upload in the following formats: .pdf, .png, .jpg." The file size should be within the permissible limit (X MB), and if the file is larger, an error message will be displayed, stating: "The file size is too large. Please upload a file below X MB."
- Submit the test results: The button to submit the test results will be deactivated until all mandatory fields are filled. Once all required fields are completed, the button will be activated.
- On clicking the 'Submit' button, a pop-up will be displayed to the user to confirm the submission.
- The following actions will be available to the user in the pop-up:
- Confirm submission: The plant operator can confirm the submission by clicking on the 'Submit' button.
- Go back to the test details page: The plant operator can go back to the test details page by clicking on the "Go back" button.
- In case the submission of test results fails, the plant operator will remain on the same page, and a failure message will be displayed.
After the successful submission of the test results:
- Upon successful submission, the plant operator will be redirected to the summary page, and a snack bar will confirm the submission.
- The summary page will display the test results and whether they have passed or failed based on a comparison between the values entered by the user and the benchmarks.
- If all values are as per the benchmarks, the test results will be displayed as 'Pass'. All values will be shown in green, and the plant operator will receive feedback that all results are as per the benchmarks. The plant operator can go back to the home page by clicking on the back button.
- If one or more values are not as per the benchmarks, the test results will be displayed as 'Fail'. Values that meet the benchmarks will be shown in green, while values not meeting the benchmarks will be shown in red. The plant operator will be informed that the test results are not as per the benchmark. The plant operator can go back to the home page by clicking on the back button.
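The input validations and error messages described above can be sketched as follows. The messages are copied from the story; the 5 MB constant is a placeholder assumption for the configurable [X] MB limit, and the function names are illustrative:

```python
ALLOWED_EXTENSIONS = {".png", ".jpg", ".pdf"}  # formats allowed per the story
MAX_FILE_MB = 5  # placeholder for the configurable [X] MB limit


def validate_reading(raw):
    """Accept only numeric parameter readings; return (value, error)."""
    try:
        return float(raw), None
    except ValueError:
        return None, ("Only numeric values allowed. "
                      "Please input in the required format.")


def validate_attachment(filename, size_mb):
    """Check attachment format and size; return an error message or None."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        return ("The file type is not supported. Please upload in the "
                "following formats: .pdf, .png, .jpg.")
    if size_mb > MAX_FILE_MB:
        return (f"The file size is too large. Please upload a file "
                f"below {MAX_FILE_MB} MB.")
    return None
```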
N/A
- The plant operator can update the test details, including lab selection and test status for tests in the 'Scheduled' status.
- The plant operator can fill in the test results for tests in the "Pending Results" status, including numerical values for parameter readings and optional document attachments.
- The plant operator can submit the test results and confirm the submission.
- The plant operator can go back to the test details page or the home page.
N/A
- The plant operator should be able to update the test details for tests in both 'Scheduled' and "Pending Results" status.
- The test details page should prompt the plant operator with appropriate actions based on the test's workflow status.
- Validations for numerical values and file formats/sizes should work as described.
- Successful submission of the test results should redirect the plant operator to the summary page and display the results as 'Pass' or 'Fail' based on the benchmarks.
- The plant operator should be able to go back to the test details page or the home page using the appropriate buttons.
- The help button should be available and functional on the test details and summary pages.
As a plant operator, I want to view past test results (both IoT and lab) by accessing the "Past Tests" section from the Treatment Quality Monitoring (TQM) landing page. I should be able to view a list of past tests, filter and sort them, and access detailed test results for each test.
- Plant operator
View past test results:
- Past test results (both IoT and lab) can be accessed via the TQM landing page by clicking on the "Past Tests" option.
- Clicking on "Past Tests" will redirect the user to the list of past tests.
- The user can perform the following tasks:
1. View the list of past tests:
The list will display the following fields for each test:
- Test ID
- Treatment process (if there is only one treatment process configured for the plant, this field will not be displayed).
- Stage: Indicates the process stage where the sample was collected from.
- Output type: Specifies whether the test is for biosolids or effluents.
- Pending date: The test date as per the schedule.
- Test result: Indicates whether the test result is 'Pass' or 'Fail'.
- View test details: The user can view detailed test results by clicking on the "View Results" button on each card.
2. Filter tests:
Clicking on 'Filter' will open a pop-up with the following filter options:
- Treatment process (if there is only one treatment process configured for the plant, this field will not be displayed): A dropdown with values for treatment processes configured for the plant. The selected treatment process will be displayed upon selection.
- Output type: A dropdown with values for the output types configured for the plant. The selected output type will be displayed upon selection.
- Test type: A dropdown with values for test types (IoT/lab). The selected test type will be displayed upon selection.
- Date range: A calendar view to select a date range. The selected date range will be displayed upon selection.
After selecting the filter values, the user can click on 'Filter' to apply the filters to the list of past tests. To clear filters, the user can click on "Clear All." To close the pop-up, the user can click on the cross on the top right-hand corner of the screen. The selected filter will be displayed on the screen, and clicking the cross button near the displayed filter will remove the filter.
3. Sort:
Clicking on 'Sort' will open a pop-up allowing tests to be sorted by "Pending Date" with the following options:
- Date (latest first).
- Date (oldest first).
After selecting the sort criteria, the user can click on 'Sort' to apply the sorting to the list of past tests. To clear the sort, the user can click on "Clear All." To close the pop-up, the user can click on the cross on the top right-hand corner of the screen.
4. Go back to the landing page:
The plant operator can use the back button to return to the landing page.
5. Help button:
A clickable button available on every page to provide a guided view and assistance for users.
The user can download the list of tests, filtered by selection, in Excel and PDF formats.
Test details:
- The test summary page will consist of two cards.
First card - Test information:
The first card will display the following fields:
- Test ID
- Treatment process
- Stage
- Output type
- Test type
- Lab name/Device ID: This will show lab name/device ID based on the test type.
- Test submission on
- Test results: Indicates whether the test result is 'Pass' or 'Fail'.
Second card - Parameter details:
The second card will display the following fields:
- Parameters, their unit of measurement, and the recorded values.
- The values will be shown in green or red based on whether they meet the benchmarks.
- The plant operator can view past test results by accessing the "Past Tests" section.
- The plant operator can filter and sort the list of past tests based on treatment process, output type, test type, and date range.
- The plant operator can view detailed test results by clicking on the "View Results" button for each test.
- The plant operator can navigate back to the landing page or the list of past tests as needed.
- The help button is available and functional on the test summary page.
NA
N/A
N/A
- The plant operator should be able to view past test results by clicking on "Past Tests" from the TQM landing page.
- The list of past tests should display the relevant fields as described in the user story.
- The plant operator should be able to filter and sort the list of past tests based on treatment process, output type, test type, and date range.
- Clicking on the "View Results" button for each test should redirect the plant operator to the test summary page.
- The test summary page should display accurate information, including test ID, treatment process, stage, output type, test type, lab name/device ID, test submission on, test results.
- The user should be able to download the list of past tests (filtered) by clicking on the download button.
As a plant operator, I want to view IoT readings from the Treatment Quality Monitoring (TQM) landing page by clicking on "View IoT Readings." I should be able to access the view tests page and filter the results to view only IoT readings.
- Plant operator
- The functionality of the view tests page for IoT readings remains the same as the view past tests page.
N/A
NA
N/A
View IoT readings:
- IoT readings can be accessed via the TQM landing page by clicking on "View IoT Readings."
- Clicking on "View IoT Readings" will redirect the user to the view tests page, with the filter pre-set to show only IoT readings.
N/A
- The plant operator should be able to view IoT readings by clicking on "View IoT Readings" from the TQM landing page.
- The view tests page should display a list of IoT readings with the relevant fields as described in the user story.
- The plant operator should be able to filter and sort the list of IoT readings based on treatment process, output type, and date range.
- Clicking on the "View Results" button for each IoT reading should redirect the plant operator to the test summary page.
- The test summary page for IoT readings should display accurate information, including test ID, treatment process, stage, output type, test type, lab name/device ID, test submission on, test results.
- The plant operator should be able to navigate back to the landing page or the list of IoT readings as needed.
- The help button should be available and functional on the view tests page for IoT readings.
As a plant operator, I want to access sensor monitoring from the Treatment Quality Monitoring (TQM) landing page. I should be able to view a list of IoT devices and their details. Additionally, I should be able to filter the devices based on various criteria and perform a search based on the device ID.
- Plant operator
Sensor monitoring:
- Sensor monitoring can be accessed by clicking on the "Sensor Monitoring" link on the TQM landing page.
- Clicking on "Sensor Monitoring" will display the list of IoT devices.
- The page will display the total number of IoT devices beside the page heading in brackets.
- For each device, a card will be available, displaying the following details:
- Device ID.
- Treatment process (if there is only one treatment process configured for the plant, this field will not be displayed).
- Stage: Indicates the process stage where the device is used.
- Output type: Specifies whether the device monitors biosolids or effluents.
- Last calibrated date: The date when the device was last calibrated.
- Device Status: Indicates whether the device is 'Active' or 'Inactive'.
- Verification status: Indicates the verification status of the device.
- Last verification date: The date of the last verification.
- Parameters: The parameters that the device monitors.
- Filter devices:
Clicking on 'Filter' will open a pop-up with the following filter options:
- Treatment process (if there is only one treatment process configured for the plant, this field will not be displayed): A dropdown with values for treatment processes configured for the plant. The selected treatment process will be displayed upon selection.
- Output type: A dropdown with values for the output types configured for the plant. The selected output type will be displayed upon selection.
- Device status: A radio button showing options 'Active' and 'Inactive' to filter devices based on their status.
- Parameters: Multi-select displaying all parameters configured on the backend. The plant operator can select multiple parameters to filter devices.
After selecting filter values, the user can click on 'Filter' to apply the filters to the list of IoT devices. To clear filters, the user can click on "Clear All." To close the pop-up, the user can click on the cross on the top right-hand corner of the screen. The selected filter will be displayed on the screen, and clicking the cross button near the displayed filter will remove the filter.
- Search:
Clicking on 'Search' will open a pop-up for the user to search for a device by the device ID. The search should support partial (substring) matching so that the plant operator can find devices quickly based on the device ID.
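As a rough illustration, the partial search described above amounts to a case-insensitive substring match. This is a sketch only; the `deviceId` key is an assumed field name, not the actual API schema.

```python
def search_devices(devices: list[dict], query: str) -> list[dict]:
    """Return devices whose ID contains the query, case-insensitively.

    `devices` is assumed to be a list of dicts with a 'deviceId' key
    (an illustrative field name).
    """
    q = query.strip().lower()
    return [d for d in devices if q in d["deviceId"].lower()]
```

For example, searching "iot" would match both "IOT-001" and "IOT-002", while searching "001" would match only the first.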
- Go back to the landing page:
The plant operator can use the back button to return to the landing page.
- Help button:
A clickable button available on every page to provide a guided view and assistance for users.
NA
N/A
N/A
- The plant operator should be able to access sensor monitoring by clicking on the "Sensor Monitoring" link from the TQM landing page.
- The page should display the list of IoT devices with relevant fields as described in the user story.
- The plant operator should be able to filter devices based on treatment process, output type, device status, and parameters.
- The plant operator should be able to perform a search based on the device ID with partial search support.
- The plant operator should be able to navigate back to the landing page or the list of IoT devices as needed.
- The help button should be available and functional on the sensor monitoring page.
As a plant operator, I want to access the dashboards from the Treatment Quality Monitoring (TQM) landing page. I should be able to view different dashboards specific to the treatment process types. Additionally, I should be able to filter the dashboard based on a date range and perform actions like sharing and downloading the dashboard and its charts/tables.
- Plant operator
Dashboards:
- Dashboards can be accessed by clicking on the "View Dashboards" link on the TQM landing page.
Navigation:
- On landing on the dashboard, the user can navigate across the treatment process types to view the dashboard specific to the selected treatment process type.
Filters:
- Date range: The user should be able to filter the dashboard based on a date range to view relevant data for the selected time period.
Share:
- The user should be able to share the filtered dashboard over WhatsApp in the image format.
- The user should be able to share filtered charts/tables over WhatsApp in the image format.
Download:
- The user should be able to download the filtered dashboard in PDF and image formats.
- The user should be able to download filtered charts/tables in PDF and image formats.
Metrics:
- Overall KPIs:
- The dashboard will display the following KPIs:
- Total incoming sludge: The sum of the total sludge disposed of at the plant for the selected time period.
- Number of trips: The count of the total incoming vehicles at the treatment plant for the selected time period.
- Overall quality: The number of tests where all parameters are as per the benchmarks, as a percentage of the total number of test results recorded.
- Compliance percentage: The percentage of tests where results have been recorded.
- Total alerts: The count of total alerts raised for the following types:
1. Test results not as per the benchmark.
2. No reading from the IoT device.
3. Lab results and IoT results not matching.
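As a rough sketch, the "Overall quality" and "Compliance percentage" KPIs above can be computed as follows. The status label "Submitted" and the field names are assumptions for illustration, not the actual data model.

```python
from dataclasses import dataclass

@dataclass
class TestRecord:
    test_id: str
    status: str   # workflow status, e.g. "Submitted" (assumed label)
    passed: bool  # True if all parameters met their benchmarks

def overall_quality(tests: list[TestRecord]) -> float:
    """Percentage of tests where all parameters meet the benchmarks."""
    if not tests:
        return 0.0
    return 100.0 * sum(t.passed for t in tests) / len(tests)

def compliance_percentage(tests: list[TestRecord], trip_ids: list[str]) -> float:
    """Tests with recorded results ('Submitted') as a share of distinct trips,
    per the KPI definition (count of submitted tests / distinct trip IDs)."""
    distinct = len(set(trip_ids))
    if distinct == 0:
        return 0.0
    submitted = sum(1 for t in tests if t.status == "Submitted")
    return 100.0 * submitted / distinct
```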
- N/A
NA
- N/A
- N/A
<To be updated>
- The plant operator should be able to access the dashboards by clicking on the "View Dashboards" link from the TQM landing page.
- The user should be able to navigate across the treatment process types and view specific dashboards accordingly.
- The user should be able to filter the dashboard based on a date range to view relevant data for the selected time period.
- The user should be able to share the filtered dashboard and charts/tables over WhatsApp in the image format.
- The user should be able to download the filtered dashboard and charts/tables in PDF and image formats.
- The dashboard should display the specified KPIs for the plant operator to monitor and analyse treatment quality effectively.
As a plant operator, I want to view a dashboard card named "Treatment Quality Overview" on the dashboards page in the Treatment Quality Monitoring (TQM). The card should display the key performance indicators (KPIs) related to treatment quality, a table with relevant fields, and options to filter the data, share the dashboard, and download the data.
- Plant operator
Treatment quality overview dashboard card:
- The "Treatment Quality Overview" card will be available on the dashboards page in TQM.
- KPIs:
- The card will display the following KPIs:
- Total tests: The count of total tests for the filtered date range.
- The count of tests that have passed treatment quality.
- The count of tests that have failed treatment quality.
- Table:
- Heading: Name of the plant
- The table will display the following fields:
- Stage
- Output type
- Value of Parameters
- Compliance percentage
- Filter:
- The user should be able to filter the data displayed in the card based on a date range. This will allow the plant operator to view data for a specific time period.
- Share:
- The user should be able to share the dashboard card "Treatment Quality Overview" with others over WhatsApp in the image format. This will enable easy sharing of the treatment quality data with relevant stakeholders.
- Download:
- The user should be able to download the data displayed in the "Treatment Quality Overview" card. This should include the KPIs and table data. The download options should include PDF and image formats. This will allow the plant operator to keep a record of the treatment quality data for further analysis and reporting.
- View trends:
- For each stage in the table, a button will be available to "View Trends". Clicking on this button will redirect the user to view trends specific to the selected stage. This will help the plant operator analyse the historical performance of the treatment quality for a particular stage.
- Toggle IoT readings and lab results:
- The user should be able to toggle between viewing the IoT readings and the lab results in the "Treatment Quality Overview" card. This toggle will allow the plant operator to switch between different data sources and gain insights into the treatment quality from different perspectives.
- N/A
NA
- N/A
- N/A
<To be updated>
- The plant operator should be able to view the "Treatment Quality Overview" card on the dashboards page in TQM.
- The card should display the specified KPIs for the filtered date range.
- The table should show relevant fields, including stage, output type, value of parameters, and compliance percentage.
- The plant operator should be able to filter the data based on a date range to view data for a specific time period.
- The plant operator should be able to share the "Treatment Quality Overview" card with others over WhatsApp in the image format.
- The plant operator should be able to download the data from the "Treatment Quality Overview" card in PDF and image formats.
- The plant operator should be able to view trends for specific stages by clicking on the "View Trends" button in the table.
- The plant operator should be able to toggle between viewing the IoT readings and the lab results in the "Treatment Quality Overview" card.
As a plant operator, I want to view the trends of parameter readings over time for a stage in the treatment quality overview table. The chart should provide a comparison with the benchmark and a toggle to navigate between different parameters.
- Plant operator
Trends of parameter readings:
- The chart will be available once the plant operator clicks on the "View Trend" button in the treatment quality overview table.
- Chart features:
- Parameter trend over time:
- The chart will display the trend of one selected parameter over time.
- The x-axis of the chart will represent time, showing the data points over a specific time period.
- The y-axis will represent the parameter values recorded during that time period.
- Benchmark comparison:
- The chart will include a benchmark line to provide a comparison with the benchmark value for the selected parameter.
- The benchmark line will be displayed on the chart, helping the plant operator visualise how the parameter readings compare to the expected standard.
- Toggle for parameters:
- A toggle will be available to navigate between different parameters for which the trends are available.
- The plant operator can select a different parameter from the toggle to view the trend of that particular parameter.
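The chart described above is essentially a per-parameter time series with a constant benchmark line. A minimal sketch of shaping the data for one selected parameter (the reading field names are illustrative assumptions):

```python
from datetime import date

def trend_series(readings: list[dict], parameter: str) -> list[tuple]:
    """Return (date, value) points for one parameter, sorted by date.

    Each reading is assumed to look like
    {"date": date(...), "parameter": "BOD", "value": 12.1}.
    The benchmark line is a constant and can be drawn separately.
    """
    points = [(r["date"], r["value"])
              for r in readings if r["parameter"] == parameter]
    return sorted(points)
```

Switching the parameter toggle would simply re-run this extraction for the newly selected parameter.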
- N/A
NA
- N/A
- N/A
<To be updated>
- The plant operator should be able to view the trends of parameter readings after clicking on the "View Trend" button in the treatment quality overview table.
- The chart should accurately display the trend of the selected parameter over time.
- The benchmark line should be visible and appropriately represent the expected standard for the selected parameter.
- The plant operator should be able to toggle between different parameters to view their trends.
As a ULB employee, I want to have a card for Treatment Quality Monitoring (TQM) on the landing page. The card should provide an overview of pending tests and tests nearing SLA, allow me to view upcoming tests using the inbox, access past test results, IoT readings, and sensor monitoring. Additionally, I should be able to view the treatment quality dashboard, and receive alerts related to TQM, which I can view in detail or dismiss.
ULB employee
Landing page: ULB employee
- Treatment Quality Monitoring card:
- The landing page will include a card for Treatment Quality Monitoring.
- Overview:
- The card will provide an overview of the total pending tests and the count of tests nearing SLA.
- View the upcoming tests using the inbox:
- The card will display the count of the upcoming tests beside the inbox in brackets.
- The ULB employee can click on the inbox to view the upcoming tests.
- View the past test results:
- The ULB employee can click on a link to view the past results from both lab and IoT devices.
- View IoT readings:
- The ULB employee can click on a link to access the record of IoT readings.
- Sensor monitoring:
- The ULB employee can click on a link to access a list of IoT devices along with their status.
- View dashboard:
- The ULB employee can click on a link to be directed to the treatment quality dashboard.
- Alerts:
- The card will display a list of alerts regarding Treatment Quality Monitoring.
- Specifically, it will display tests that have breached the SLA by more than 7 days.
- The ULB employee can view the details of each test by clicking on the "View Details" button.
- The ULB employee can dismiss notifications by clicking on the cross button.
- Other functionality:
- The rest of the functionality on the landing page will remain the same as the current ULB employee landing page.
- N/A
NA
- N/A
- N/A
- The ULB employee should be able to view the Treatment Quality Monitoring card on the landing page.
- The card should display an overview of the total pending tests and tests nearing SLA.
- The ULB employee should be able to click on the inbox to view the upcoming tests.
- The ULB employee should be able to click on links to view past test results, IoT readings, sensor monitoring, and the treatment quality dashboard.
- The card should display alerts related to TQM and allow the ULB employee to view the details or dismiss them.
- The rest of the functionality on the landing page should remain unaffected.
As a user, I want to view a list of the upcoming lab tests in the Treatment Quality Monitoring (TQM) module by clicking on the inbox. The list should be sorted by the pending date, displaying the test with the highest SLA first. I should be able to see the test ID, plant name, treatment process, pending date, status, and SLA. Additionally, I should be able to filter the tests, search for specific tests by test ID or plant name, and redirect to other pages in the TQM module using the provided links.
ULB employee/state employee
View the list of upcoming tests:
- Inbox navigation:
- The user can click on the inbox to be redirected to the list of upcoming lab tests.
- Total count of upcoming tests:
- The total count of upcoming tests will be displayed beside the inbox in brackets.
- Sorting:
- The list of upcoming tests will be sorted by the pending date, where the test with the highest SLA is displayed first.
- Fields displayed:
- The list of upcoming tests will display the following fields:
- Test ID
- Plant Name
- Treatment Process
- Pending Date: This is the test date as per schedule
- Status: Status of the test
- SLA: The difference between the test due date and today. This will be displayed in red if the test due date is earlier than today, and in green if today is before the test due date.
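The SLA colour rule above can be sketched as a small helper. This is a hypothetical illustration of the stated rule, not the actual implementation.

```python
from datetime import date

def sla_display(due_date: date, today: date) -> tuple[int, str]:
    """Days until the test due date, with the inbox colour rule:
    red when the due date is earlier than today (SLA breached),
    green otherwise."""
    days = (due_date - today).days
    colour = "red" if due_date < today else "green"
    return days, colour
```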
- View test details:
- The user can view detailed information about a specific test by clicking on the test ID.
- Filters:
- Filters are displayed on the left-hand panel of the screen.
- The following filters are available:
- Treatment process: A multi-select showing values for the treatment processes configured for the urban local body (ULB). The selected treatment processes will be displayed as ticks on the multi-select box. If not selected, it is left blank.
- Stages: A dropdown showing values for the stages configured for the plant. The selected stage is displayed here on selection. If not selected, the field is left blank.
- Status: A multi-select showing values for the status in the treatment quality workflow.
- The user can select values for the filters and click on filter to filter the inbox accordingly.
- To clear the filters, the user can click on the refresh icon on the top right of the filter panel.
- Search:
- The user can search for specific tests using the following:
- Test ID
- Plant name
- Part search is enabled for both fields.
- The user can fill in either the test ID or the plant name or both and click on the search button.
- The user can clear the search by clicking on the clear search link.
- Retaining filters, sort, and search:
- If filters, sort, or search are applied and the user navigates to the test details page, then goes back, the applied filter, sort, and search values should be retained.
- Redirecting to other Links:
- Users can redirect to other pages in the TQM module via the links provided on the top left of the page.
- The following links will be displayed:
- View past results
- View IoT results
- Sensor monitoring
- View dashboard
- N/A
NA
- N/A
- N/A
- The user should be able to view the list of the upcoming lab tests by clicking on the inbox.
- The list of upcoming tests should be sorted correctly by the pending date, showing the test with the highest SLA first.
- The user should be able to see the test ID, plant name, treatment process, pending date, status, and SLA in the list of upcoming tests.
- The user should be able to view detailed information about a specific test by clicking on the test ID.
- The user should be able to filter the tests based on the treatment process, stages, and status.
- The user should be able to search for specific tests by the test ID or the plant name.
- The user should be able to redirect to other pages in the TQM module using the provided links.
As a user, I want to access a test details page by clicking on the test ID in the inbox. The test details page should display detailed information about the test, including test ID, plant name, treatment process, stage, output type, test type, test scheduled date, status, lab name, sample submission date, test results submission date, SLA, a table with parameter details, overall test results (pass/fail), attached documents (if any), and the test timeline. Additionally, I should be able to go back using the breadcrumbs of the page and download the test report.
ULB employee/State employee
- Accessing the test details page:
- The user can access the test details page by clicking on the test ID in the inbox.
- Fields displayed:
- The following information will be displayed on the test details page:
- Test ID
- Plant Name
- Treatment Process
- Stage
- Output Type
- Test Type
- Test Scheduled Date
- Status
- Lab Name
- Sample Submission Date
- Test Results Submission Date
- SLA: This will be displayed in red/green based on the SLA. For open tests, if today is later than the pending date, it will be displayed in red; if today is earlier than the pending date, in green. For closed tests, the SLA value will be displayed.
If the information for any field is not yet available (for example, the lab name or parameter values, depending on the status of the test), the field will be displayed as "To be Updated".
- Table with parameter details:
- The page will include a table with the following details:
- S.No
- Parameter
- Unit of Measurement (UoM)
- Benchmark
- Value Recorded: The value will be displayed in red/green based on comparison to the benchmark.
- Overall Test Results: Pass/Fail
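The red/green colouring of a recorded value against its benchmark can be sketched as below. Whether a parameter must stay below or above its benchmark is configuration-dependent; the `lower_is_better` flag is an assumption of this sketch.

```python
def value_colour(value: float, benchmark: float,
                 lower_is_better: bool = True) -> str:
    """Green when the recorded value meets the benchmark, red otherwise.

    `lower_is_better` is an assumed per-parameter flag: some parameters
    (e.g. contaminant levels) must stay at or below the benchmark,
    others must stay at or above it.
    """
    meets = value <= benchmark if lower_is_better else value >= benchmark
    return "green" if meets else "red"
```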
- Attached documents:
- The user should be able to view attached documents (if any) by clicking on the document icon. No icon will be present if the documents are not attached.
- Test timeline:
- The test details page will display the test timeline, showing the various stages and dates involved in the test process.
- Go back:
- The user can go back to the previous page using the breadcrumbs of the page.
- Download test report:
- The user can download the test report by clicking on the download button.
N/A
N/A
N/A
N/A
- The user should be able to access the test details page by clicking on the test ID in the inbox.
- The test details page should display detailed information about the test, including Test ID, Plant Name, Treatment Process, Stage, Output Type, Test Type, Test Scheduled Date, Status, Lab Name, Sample Submission Date, Test Results Submission Date, and SLA.
- The page should include a table with parameter details, showing S.No, Parameter, UoM, Benchmark, Value Recorded, and Overall Test Results (Pass/Fail).
- The user should be able to view attached documents (if any) by clicking on the document icon.
- The test details page should display the test timeline, showing the various stages and dates involved in the test process.
- The user should be able to go back using the breadcrumbs of the page.
- The user should be able to download the test report by clicking on the download button.
As a user, I want to view past test results (both IoT and lab) via the TQM landing page by clicking on the "Past Tests" link. The Past Test Results page will display a list of past tests with the ability to view test details, search tests based on various criteria, sort the results, and download test results in Excel and PDF formats. Additionally, I should be able to go back using the breadcrumbs on the top of the page and clicking on the Test ID will redirect me to the test details page.
ULB employee/State employee
Past Test Results Page:
- Accessing Past Test Results Page:
- The user can access the Past Test Results page by clicking on the "Past Tests" link on the TQM landing page.
- Fields Displayed:
- The page will display a list of past tests sorted on the test date, with the following fields for each test:
- Test ID
- Plant
- Treatment Process (if there is only one treatment process for the plant, this field will not be displayed)
- Test Type
- Test Date: This is the date the test results are updated
- Test Result: Pass/Fail
- View Test Details:
- The user can view test details by clicking on the "Test ID" link on each row, which will redirect the user to the test details page (same as redirection from Inbox).
- Search Tests:
- The user can search for past tests based on the following criteria:
- Test ID: Input Text field, Part search should be enabled.
- Plant: Dropdown of a list of plants in the ULB.
- Treatment Process: Dropdown of a list of treatment processes in the ULB.
- Test Type: This will be a dropdown showing values for Test Type (IoT/lab). Selected test type is displayed here on selection. If not, the field is left blank.
- Date range: Selection of date range (calendar view): Selected date range is displayed here on selection. If not, the field is left blank.
- Sort:
- Tests can be sorted by the Test Date by clicking on the date column.
- Clear Search:
- To clear the search and view all past tests, the user can click on the "Clear Search" button.
- Download Test Results:
- The user can download the list of test results in Excel and PDF formats using the download button.
- Go Back:
- The user can go back to the previous page using the breadcrumbs on the top of the page. If the user has navigated to the Test details page from the Past Test results list, clicking back will redirect the user to the Past Test Results page.
N/A
N/A
N/A
N/A
- The user should be able to access the Past Test Results page by clicking on the "Past Tests" link on the TQM landing page.
- The Past Test Results page should display a list of past tests sorted on the test date, with relevant fields such as Test ID, Plant, Treatment Process, Test Type, Test Date, and Test Result (Pass/Fail).
- The user should be able to view test details by clicking on the "Test ID" link on each row, redirecting the user to the Test details page (same as redirection from Inbox).
- The user should be able to search for past tests based on Test ID, Plant, Treatment Process, Test Type, and Date range criteria.
- The user should be able to sort past tests by the Test Date.
- The user should be able to clear the search and view all past tests.
- The user should be able to download test results in Excel and PDF formats.
- The user should be able to go back using the breadcrumbs on the top of the page and clicking on the Test ID will redirect the user to the test details page.
As a user, I want to view IoT readings via the TQM landing page by clicking on the "View IoT Readings" link. On clicking the link, the user should be redirected to the list of past tests, with the search on Test Type selected as "IoT" and the results filtered for IoT readings only. All other functionality of the page should remain the same.
ULB employee/State employee
- Details:
- Accessing "View IoT Readings" Page:
- The user can access the "View IoT Readings" page by clicking on the "View IoT Readings" link on the TQM Landing Page.
- Redirected Page:
- On clicking "View IoT Readings," the user is redirected to the list of past tests, with the search on Test Type selected as "IoT," and the results filtered for IoT readings only.
- Other Functionality:
- All other functionality available on the page (e.g., viewing past test results, searching tests, sorting, downloading test results) will remain the same.
N/A
N/A
N/A
N/A
N/A
- The user should be able to access the "View IoT Readings" page by clicking on the "View IoT Readings" link on the TQM landing page.
- On clicking the link, the user should be redirected to the list of past tests, with the search on Test Type selected as "IoT," and the results filtered for IoT readings only.
- All other functionality available on the page should remain the same, allowing the user to view past test results, search tests, sort, and download test results as before.
As a user, I want to view the list of devices via the TQM landing page by clicking on the "Sensor Monitoring" link. On clicking the link, the user should be redirected to the list of devices. The page should display the total number of IoT devices, and each device should have specific details such as Device ID, Plant, Treatment Process, Stage, Output Type, Device Status, and the parameters it is monitoring. The user should be able to search and filter devices based on various criteria.
ULB employee/State employee
- Accessing "Sensor Monitoring" Page:
- The user can access the "Sensor Monitoring" page by clicking on the "Sensor Monitoring" link on the TQM Landing Page.
- Redirected Page:
- On clicking "Sensor Monitoring," the user is redirected to the list of devices.
- List of Devices:
- The page will display the total number of IoT devices beside the page heading in brackets.
- Each device will have a row displaying the following details:
- Device ID
- Plant
- Treatment Process
- Stage
- Output Type
- Device Status
- Parameters: One or multiple parameters that the device is monitoring.
- Filters:
- Search Devices:
- On clicking "Filter," a pop-up will be displayed with the following filters:
- Device ID: Allows part search for Device ID.
- Plant: Dropdown with values based on plants configured in the MDMS.
- Treatment Process: Dropdown with values based on the Treatment process type.
- Stage: Dropdown with values based on the Stage of the selected Treatment process.
- Output Type: Dropdown with values based on Output types configured for the plant.
- Device Status: Dropdown with options for Active/Inactive.
- The user can select values for the filters above and click on "Search" to filter the list of devices based on the selected criteria.
- To clear the search and view all devices, the user can click on "Clear All."
N/A
N/A
- The user should be able to access the "Sensor Monitoring" page by clicking on the "Sensor Monitoring" link on the TQM landing page.
- On clicking the link, the user should be redirected to the list of devices, displaying the total number of IoT devices and specific details for each device, such as Device ID, Plant, Treatment Process, Stage, Output Type, Device Status, and the parameters it is monitoring.
- The user should be able to search and filter devices based on various criteria using the available filters.
- The user can perform the search and view the filtered devices based on the selected criteria.
- The user can clear the search and view all devices if needed.
As a user, I want to be able to record test results without a schedule for ad hoc tests. A provision will be provided for the user to record test results by clicking on the "Add Test Result" link on the card. Clicking on the link should redirect the user to the "Add Test Result" page where the user can enter the required fields, including Plant Name, Treatment Process, Treatment Stage, Output Type, values against parameters, and any attachments. After submitting the test results, the user should be redirected to the test results page with specific changes to the display compared to the View Test Results page.
ULB employee/State employee
- Accessing "Add Test Result" Page:
- The user can access the "Add Test Result" page by clicking on the "Add Test Result" link on the Card.
- Redirected Page:
- Clicking on "Add Test Result" will redirect the user to the "Add Test Result" page.
- Fields to Enter:
- The user needs to enter the following fields:
- Plant Name: A dropdown based on the list of plants available in the system. For a state-level user, this should display all plants in the state. For a ULB user, it should display the plants tagged to the ULB.
- Treatment Process: A dropdown based on the list of Treatment Processes in the selected plant.
- Treatment Stage: A dropdown based on the list of Stages in the selected Treatment Process.
- Output Type: A dropdown based on the Output types available in the selected stage.
- Values against parameters: The user should fill in at least one parameter for the Submit button to be enabled. If no parameter is filled in and the user clicks the Submit button, an error message is displayed as a snack bar.
- Attachments: If any attachments are required, the user can add them.
- Submitting Test Results:
- Once the user clicks on the Submit button, the test results page is displayed.
- Changes to "View Test Results" Page:
- The "View Test Results" page with the following changes:
- Test Type will be displayed as "Lab."
- Status, Lab Name, and SLA fields are not displayed.
- Workflow will not be displayed.
<To be updated>
N/A
N/A
- The user should be able to access the "Add Test Result" page by clicking on the "Add Test Result" link on the card.
- The "Add Test Result" page should allow the user to enter required fields, including Plant Name, Treatment Process, Treatment Stage, Output Type, values against parameters, and any attachments.
- After submitting the test results, the user should be redirected to the test results page with specific changes to the display compared to the "View Test Results" page.
The TQM Dashboard will be available to both ULB employees and state employees. It can be accessed by clicking on the 'Dashboard' link on the landing page. Upon clicking, the user will be directed to the dashboard view. The access to data in the dashboard will be based on the roles, where the ULB admin can view the dashboard for all plants in the ULB, and the state admin can view the dashboard for all plants in the state.
- ULB Employee
- State Employee
- Accessing "Dashboard" View:
- The user can access the Dashboard by clicking on the "Dashboard" link on the landing page.
- Redirected to Dashboard:
- Clicking on "Dashboard" will direct the user to the Dashboard view.
- Navigation:
- On the Dashboard view, the user can navigate across Treatment Process Types to view specific dashboards for each Treatment Process Type.
- Filters:
- Date Range: The user should be able to filter the dashboard based on the selected date range.
- ULB: For ULB employees, the ULB is automatically set to the ULB to which the plant and employee are tagged. For state users, all ULBs should be available in the dropdown.
- Plant: For ULB Employees, plants tagged to the ULB to which the employee belongs should be available in the dropdown. For State users, all plants should be available.
- Share Functionality:
- User should be able to share the filtered dashboard over WhatsApp in image format.
- User should be able to share filtered charts/tables over WhatsApp in image format.
- Download Functionality:
- User should be able to download the filtered dashboard in PDF and image format.
- User should be able to download filtered charts/tables in PDF and image format.
- Metrics - Overall KPIs:
- The Dashboard will display the following KPIs:
- Total Incoming Sludge: Sum of the total sludge that is disposed of at the plant for the selected time period.
- # of trips: Count of total incoming vehicles at the Treatment Plant for the selected time period.
- Overall Quality: Number of tests where all parameters are as per benchmarks compared to the total number of test results recorded.
- Compliance %: % of tests where results have been recorded.
- Total Alerts: Count of total alerts raised of the following types:
1) Test Results not as per benchmark.
2) No reading from IoT device.
3) Lab results and IoT results not matching.
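The KPI formulas above can be sketched as follows. This is an illustrative sketch only: the record shapes (`Trip`, `Test`) and field names are assumptions, not the actual data model.

```python
# Illustrative sketch of the overall dashboard KPI formulas; data shapes assumed.
from dataclasses import dataclass

@dataclass
class Trip:
    trip_id: str
    volume_kl: float  # volume of sludge disposed of on this trip

@dataclass
class Test:
    test_id: str
    status: str                        # "Scheduled", "Pending Results", "Submitted"
    all_params_meet_benchmarks: bool = False

def total_incoming_sludge(registered: list[Trip], unregistered: list[Trip]) -> float:
    # Total incoming sludge = registered volume + unregistered volume
    return sum(t.volume_kl for t in registered) + sum(t.volume_kl for t in unregistered)

def number_of_trips(trips: list[Trip]) -> int:
    # Count of distinct trip IDs
    return len({t.trip_id for t in trips})

def overall_quality_pct(tests: list[Test]) -> float:
    # % of recorded tests where every parameter meets its benchmark
    recorded = [t for t in tests if t.status == "Submitted"]
    if not recorded:
        return 0.0
    return sum(t.all_params_meet_benchmarks for t in recorded) / len(recorded) * 100

def compliance_pct(tests: list[Test]) -> float:
    # % of all tests whose results have been recorded (status "Submitted")
    if not tests:
        return 0.0
    return sum(t.status == "Submitted" for t in tests) / len(tests) * 100
```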
N/A
The user should have appropriate access rights to view the dashboard.
N/A
N/A
- The user should be able to access the dashboard by clicking on the 'Dashboard' link on the landing page.
- The Dashboard view should allow the user to navigate across Treatment Process Types to view specific dashboards for each Treatment Process Type.
- The user should be able to filter the dashboard based on the selected date range, ULB, and plant.
- The user should be able to share the filtered dashboard and charts/tables over WhatsApp in the image format.
- The user should be able to download the filtered dashboard and charts/tables in PDF and image formats.
- The Dashboard should display the overall KPIs, including Total Incoming Sludge, the number of trips, Overall Quality, Compliance percentage, and Total Alerts.
The Treatment Quality Overview provides an overview of the Treatment Quality for a particular Treatment Process. It includes KPIs, a map view, and a table of plant-wise details of test results (Pass/Fail) and compliance percentage.
ULB Employee/State Employee
- KPIs:
- Total Plants: The count of unique plants for the particular Treatment Process.
- Count of Plants Passed Treatment Quality: The count of plants that have passed Treatment Quality as per the last recorded test. Treatment Quality is said to have passed if all parameters for final output(s) of a Treatment process are as per benchmarks.
- Count of Plants Failed Treatment Quality: The count of plants that have failed Treatment Quality as per the last recorded test. Treatment Quality is said to have failed if one or more parameters for final output(s) of a Treatment process are not as per benchmarks.
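The pass/fail classification above can be sketched as below. The input shape (a mapping from parameter name to whether it met its benchmark in the last recorded test) is an assumption for illustration.

```python
# Illustrative sketch: a plant passes Treatment Quality only if every parameter
# of its final output(s) met the benchmark in the last recorded test.
def plant_quality_status(last_test: dict[str, bool]) -> str:
    """last_test maps parameter name -> whether it met the benchmark."""
    return "Passed" if all(last_test.values()) else "Failed"

def plant_kpis(last_tests_by_plant: dict[str, dict[str, bool]]) -> dict[str, int]:
    # Total Plants / Passed / Failed counts for one Treatment Process
    statuses = [plant_quality_status(t) for t in last_tests_by_plant.values()]
    return {
        "total_plants": len(last_tests_by_plant),
        "passed": statuses.count("Passed"),
        "failed": statuses.count("Failed"),
    }
```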
- Map View:
- The dashboard will display a map view showing the location of each plant. Plants will be colour-coded based on whether they passed or failed Treatment Quality in the last output quality test. (Red = Failed, Green = Passed)
- Table:
- The table will show plant-wise details of Test Results (Pass/Fail) and Compliance Percentage based on the last recorded test. The user will also be able to see the change in compliance percentage compared to the last month.
- Drill down functionality will be available for a plant via this table.
- For Treatment Plant Users (TRP) and ULBs where only one plant is tagged for the process type, the drilled-down table is visible automatically.
- Drill Down:
- When a user drills down on a specific plant from the table, the following information will be viewable:
- Heading: Name of the Plant
- Table displaying the following fields:
- Stage
- Output Type
- Value of Parameters
- Compliance Percentage
- Button to view Trends for a particular Stage.
- Toggle between IoT readings and Lab Results. The selected test type will appear highlighted.
- Switching between Process Flows:
- If there are multiple process flows, the user can switch between them using buttons. The selected process flow will appear highlighted.
- Filters:
- Date Range: The user should be able to filter the dashboard based on the selected date range.
- ULB: For ULB employees, the ULB is automatically set to the ULB to which the plant and employee are tagged. For state users, all ULBs should be available in the dropdown.
- Plant: For ULB Employees, plants tagged to the ULB to which the employee belongs should be available in the dropdown. For State users, all plants should be available.
- Share Functionality:
- User should be able to share filtered charts/tables over WhatsApp in image format.
- Download Functionality:
- User should be able to download filtered charts/tables in PDF and image format.
N/A
The user should have appropriate access rights to view the Treatment Quality Overview dashboard.
N/A
N/A
- The Treatment Quality Overview dashboard should display the KPIs: Total Plants, Count of Plants Passed Treatment Quality, and Count of Plants Failed Treatment Quality.
- The map view should show the location of each plant, color-coded based on whether they have passed or failed Treatment Quality.
- The table should show plant-wise details of Test Results (Pass/Fail) and Compliance Percentage based on the last recorded test. The user should be able to see the change in compliance percentage compared to the last month.
- The drill down functionality should provide detailed information about a specific plant's Stage, Output Type, Value of Parameters, and Compliance Percentage.
- The user should be able to toggle between IoT readings and Lab Results for a specific plant.
- If there are multiple process flows, the user should be able to switch between them using buttons.
Scope:
The user will be able to view a trend chart for a specific parameter over time once they click on the "View Trend" button in the table. The chart will provide a comparison with the benchmark.
Actors:
- ULB Employee/State Employee
- Details:
- When the user clicks on the "View Trend" button in the table, a trend chart will be displayed.
- The chart should not be visible until the "View Trend" button has been clicked.
- The chart will show the trend of the selected parameter over time.
- The chart will also display the benchmark for the selected parameter to provide a comparison.
- A toggle will be available to navigate between different parameters for viewing their trend charts.
Attributes Table:
- N/A
Validations:
- The user should have appropriate access rights to view trend charts.
Configurations:
- N/A
Notifications:
- N/A
User Interface:
Acceptance Criteria:
- The user should be able to view a trend chart for a specific parameter by clicking on the "View Trend" button in the table.
- The chart should show the trend of the selected parameter over time.
- The chart should display the benchmark for the selected parameter to provide a comparison.
- The user should be able to navigate between different parameters using the toggle to view their trend charts.
The following features/components will be available in the TQM module:
The aim is to provide the users of the system with information and control at each level - from defining how operations will be run, to updating status against pending operational tasks, to viewing operational data to draw insights and refine operations.
Computation of the KPIs can be accessed here.
Detailed metric calculations for the Treatment Quality Monitoring dashboard are viewable here.
Attribute | Type | Mandatory | Comments | Validation Required? |
Treatment Process ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a process flow. | N, this value should be system-generated. |
Process Name | Text | Y | This is the commonly-used identifier for the process flow. | Max characters - 256 |
Status | Array | Y | Status of the process flow. | Active/Inactive, Single Select |
Treatment Process Type | Array | Y | The dropdown will be auto-populated based on the list of waste types maintained in the MDMS. | Single Select |
Treatment Process Sub-type | Array | Y | The dropdown will be auto-populated based on the list of waste types maintained in the MDMS. | Single Select |
Attribute | Type | Mandatory | Comments | Validation Required? |
Plant ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a plant. | Auto-generated |
Plant Name | Text | Y | This is the commonly-used identifier for the plant. | Maximum characters - 128 |
Plant Type | Array | Y | | Single Select only, Faecal Sludge, Solid Waste, Co-treatment |
Tenant Id | Text | Y | | |
Status | Array | Y | Status of the plant. | Active/Inactive, Single Select |
Geolocation | Lat,Long | Y | | Capture the exact latitude-longitude. |
Attribute | Type | Mandatory | Comments | Validation Required? |
Stage ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a stage. | Auto-generated |
Stage Name | Text | Y | This is the commonly-used identifier for the stage. | Maximum characters - 128; Minimum characters - NA |
Status | Boolean | Y | Status of the stage. | Active/Inactive, Single Select |
Input Quality Measurement Required | Boolean | Y | This selection allows the user to set up whether the input quality for the particular input type needs to be monitored. The user should be able to enable and disable the input quality measurement requirement independently for each type. | Yes/No, Single Select |
Output Type | Array | Y | The dropdown will be auto-populated based on the list of output types. | Multi-select |
Output Quality Measurement Required | Boolean | Y | This selection allows the user to set up whether the output quality for the particular stage needs to be monitored. The user should be able to enable and disable the output quality measurement requirement independently for each type. | Yes/No, Single Select |
Attribute | Type | Mandatory | Validation |
Quality Parameter | Array | Y | Selection from the predefined list of the above-mentioned quality parameters and standards; Single Select. |
Quality Parameter Unit of Measurement | Array | Y | Selection of the unit of measurement (mg/L, absolute value, etc.); Single Select. |
Benchmark Rule | Array | Y | Options include >= X, <= Y, and = Z; Single Select. |
Benchmark Value | Numeric | Y | Entered by the user; numeric only. |
Testing Frequency - Manual (Days) | Numeric | Y | Selection of a custom frequency range for laboratory testing based on the consent to operate; numeric only. |
Monitoring Frequency - Quality Sensor (Days) | Numeric | N | Selection of a custom frequency. Note: Should be optional if the ULB/State chose NOT to have sensor-based monitoring; numeric only. |
Attribute | Type | Mandatory | Validation |
Test ID | Alphanumeric | View Only | Auto-generated on the creation of schedule. |
Plant Name | Text | View Only | Auto-populated on the creation of schedule. |
Treatment Process | Text | View Only | Auto-populated on the creation of schedule. |
Treatment Process Type | Text | View Only | Auto-populated on the creation of schedule. |
Stage | Text | View Only | Auto-populated on the creation of schedule. |
Output Type | Text | View Only | Auto-populated on the creation of schedule. |
Test Type | Array | | Lab/IoT, auto-selected to Lab. |
Parameter 1…n | Text | View Only | Auto-populated on the creation of schedule. |
Testing Date | Date | View Only | Date calculated through a predefined laboratory testing schedule. |
SLA | Numeric | View Only | Difference between the current date and the testing date. Compliance with a testing schedule can be checked through this field. However, actions based on failed/successful compliance fall under vendor management, which is not currently in scope and will be taken up separately. |
Status | Text | View Only | Status to be auto set to 'Scheduled'. |
Field | Data Type | Required | Description |
Anomaly Type | String | Yes | Specifies the type of anomaly detected. |
Benchmark | Float | Yes | Defines the benchmark value for the test results. |
Deviation Allowed | Float | Yes | Specifies the allowed deviation from the benchmark for anomaly detection. |
Sample Collection Date | DateTime | Yes | The date on which the sample was collected for testing. |
IoT Result Date | DateTime | Yes | The date on which the IoT result was recorded. |
Alert Generation Date | DateTime | Yes | The date on which the alert was generated. |
Lab Test Result | Float | Yes | The result of the manual lab test. |
IoT Test Result | Float | Yes | The result recorded via IoT integration. |
Alert Message | String | Yes | Specifies the message for the generated alert. |
Matching Date | DateTime | Yes | The date on which the closest IoT result is matched with the sample collection date for comparison. |
Date to be matched on | Sample Collection Date. If the IoT result is not available for the sample collection date, the closest later date for which IoT data is available is to be considered. |
Deviation allowed | x% |
Attribute | Type | Required? | Comments |
Alert DateTime | Datetime | Y | Auto-captured based on the date-time. |
Alert Type | Text | Y | Auto-captured |
Plant Name | Text | Y | Auto-captured |
Process Name | Text | Y | Auto-captured |
Process Type | Text | Y | Auto-captured |
Device ID | Numeric | Y | Auto-captured |
Attribute | Type | Mandatory | Validation |
Test ID | Alphanumeric | View Only | Auto-generated on the creation of the schedule. |
Plant Name | Text | View Only | Auto-populated on the creation of the schedule. |
Treatment Process | Text | View Only | Auto-populated on the creation of the schedule. |
Treatment Process Type | Text | View Only | Auto-populated on the creation of the schedule. |
Stage | Text | View Only | Auto-populated on the creation of the schedule. |
Output Type | Text | View Only | Auto-populated on the creation of the schedule. |
Test Type | Array | | Lab/IoT, auto-selected to Lab. |
Parameter 1…n | Text | View Only | Auto-populated on the creation of the schedule. |
Testing Date | Date | View Only | Date calculated through predefined laboratory testing schedule. |
SLA | Numeric | View Only | Difference between the current date and the testing date. Compliance with a testing schedule can be checked through this field. However, actions based on failed/successful compliance fall under vendor management, which is not currently in scope and will be taken up separately. |
Status | Text | View Only | Status to be auto set to 'Scheduled'. |
Test Result Status | Roles | Action | Next Status |
Scheduled | FSTPO ULB employee | Submit the sample for testing. | Pending Results |
Pending Results | FSTPO ULB employee | Update the results. | Submitted |
Attribute | Type | Required? | Comments/Validations |
Test ID | Numeric | Y | Auto-generated by the system. |
Plant Name | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Treatment Process | Array | View Only | Auto-populated on the creation of schedule, single select for on-demand test. |
Treatment Process Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Stage | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Output Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Test Type | Array | | Lab/IoT, auto-selected to Lab for on-demand. |
Lab Submitted to | Text | Y | This will not be required in case test type = IoT. |
Quality Parameter 1 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 2 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 3 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter n | Numeric | Y | Validation to be applied at implementation. |
Collection Time | Date | Y | This is the date-time on which user updates the status to pending results. For IoT, this is the time sensor records reading. |
Attachment | Document | Y | For a given collection location, the photo or PDF proof of the laboratory result mentioning the information of the above-mentioned parameters. |
Attribute | Type | Required? | Comments/Validations |
Test ID | Numeric | Y | Auto-generated by the system. |
Plant Name | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Treatment Process | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Treatment Process Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Stage | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Output Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Test Type | Array | | Lab/IoT, auto-selected to Lab for on-demand. |
Lab Submitted to | Text | Y | This will not be required in case test type = IoT. |
Quality Parameter 1 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 2 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 3 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter n | Numeric | Y | Validation to be applied at implementation. |
Collection Time | Date | Y | This is the date-time on which user updates the status to pending results. For IoT, this is the time sensor records reading. |
Attachment | Document | Y | For a given collection location, the photo or PDF proof of the laboratory result mentioning the information of above-mentioned parameters. |
Attribute | Type | Required? | Comments |
Test ID | Numeric | Y | Auto-generated by the system. |
Plant Name | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Treatment Process | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Treatment Process Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Stage | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Output Type | Array | View Only | Auto-populated on the creation of the schedule, single select for on-demand test. |
Test Type | Array | | Lab/IoT, auto-selected to Lab for on-demand. |
Lab Submitted to | Text | Y | This will not be required in case test type = IoT. |
Quality Parameter 1 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 2 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter 3 | Numeric | Y | Validation to be applied at implementation. |
Quality Parameter n | Numeric | Y | Validation to be applied at implementation. |
Collection Time | Date | Y | This is the date-time on which user updates status to pending Results. For IoT, this is the time sensor records reading. |
Attachment | Document | Y | For a given collection location, the photo or PDF proof of the laboratory result mentioning the information of above-mentioned parameters. |
Attribute | Type | Required? | Comments |
Configuration Date | Datetime | Y | |
Device Type | Text | Y | Selection from the device master data. [“GPS Sensor”, “pH Sensor”, “Accelerometer”, “Light Sensor”] |
Plant | Text | Y | |
Treatment Process | Text | Y | |
Stage | Text | Y | |
Output Type | Text | Y | |
Parameters | Array | Y | The parameters monitored by the device. |
Monitoring Frequency | Numeric | Y | Custom frequency for the device. |
Calibration Date | Datetime | Y | Input from the user about any change in the calibration or maintenance of the device. |
Calibration Accuracy | Array | Y | Range to indicate the permissible deviation in the accuracy. |
IsConnected? | Boolean | Y | To indicate the connectivity of the device. |
Connectivity History | ? | Y | Date-wise device audit log to know the connectivity status. |
Verification History | ? | | Date-wise device verification log to know the days when the device input was verified with the laboratory results. |
Attribute | Type | Required? | Comments and Validations |
Configuration Date | Datetime | Y | |
Device Type | Text | Y | Selection from the device master data. [“GPS Sensor”, “pH Sensor”, “Accelerometer”, “Light Sensor”] |
Plant | Text | Y | |
Treatment Process | Text | Y | |
Stage | Text | Y | |
Output Type | Text | Y | |
Parameters | Array | Y | The parameters monitored by the device |
Monitoring Frequency | Numeric | Y | Custom frequency for the device |
Calibration Date | Datetime | Y | Input from the user about any change in the calibration/maintenance of the device |
Calibration Accuracy | Array | Y | Range to indicate the permissible deviation in the accuracy |
IsConnected? | Boolean | Y | To indicate the connectivity of the device |
Connectivity History | ? | Y | Date-wise device audit log to know the connectivity status |
Verification History | ? | | Date-wise device verification log to know the days when device input was verified with laboratory results |
Attribute | Type | Required? | Comments/Validations |
Test ID | Numeric | Y | Autogenerated by system |
Plant Name | Array | View Only | Auto populated on creation of schedule, Single select for on demand test |
Treatment Process | Array | View Only | Auto populated on creation of schedule, Single select for on demand test |
Treatment Process Type | Array | View Only | Auto populated on creation of schedule, Single select for on demand test |
Stage | Array | View Only | Auto populated on creation of schedule, Single select for on demand test |
Output Type | Array | View Only | Auto populated on creation of schedule, Single select for on demand test |
Test Type | Array | | Lab/IoT, auto-selected to Lab for on-demand |
Lab Submitted to | Text | Y | This will not be required in case Test Type = IoT |
Quality Parameter 1 | Numeric | Y | Validation to be applied at implementation |
Quality Parameter 2 | Numeric | Y | Validation to be applied at implementation |
Quality Parameter 3 | Numeric | Y | Validation to be applied at implementation |
Quality Parameter n | Numeric | Y | Validation to be applied at implementation |
Collection Time | Date | Y | This is the date-time on which user updates status to Pending Results. For IoT, this is the time sensor records reading |
Attachment | Document | Y | For a given collection location, photo or PDF proof of laboratory result mentioning the information of above-mentioned parameters |
Features/Components | Description | Functionality |
Schedule of Tests | This component will be used by treatment plant operators and urban local body (ULB) employees to see the schedule of tests. | View the schedule of lab tests and track the compliance. Track the compliance of IoT test results and cases of failures. |
Recording Test Results | This component will be used by treatment plant operators and ULB employees to upload results manually and track IoT readings. | Create digital records of quality test results. |
Anomaly Detection | This component will be used by treatment plant operators and ULB employees to interpret test results. | Identify in real-time/near real-time when the results of a particular test are not as per the benchmarks, and raise alerts in such cases. |
Dashboards | This module will give stakeholders insights and information regarding the operations of the treatment plant. Users can drill down to identify plants and processes where compliance with testing and/or test results is not up to benchmarks. Dashboards will also help users see trends over time, spot patterns, and identify long-term problem areas. | Dashboard to analyse trends in treatment quality and compliance with the treatment schedule. Dashboard to analyse patterns in issues. In both cases, a drill down will be available from the state level to the ULB and plant levels. |
The waste management process comprises five key stages: generation, containment, transport, treatment, and reuse. To ensure effective waste management, each of these stages must receive adequate attention. Inadequate waste treatment and the subsequent discharge into the environment directly harm both the environment and water quality. Often, effluents find their way into surface water sources. The poor quality of wastewater effluents contributes to the deterioration of the receiving surface water bodies and poses risks to users' health.
Quality and reusability are intimately linked: Well-treated waste output can be repurposed, whether as recycled water or compost. Consumer acceptance of this practice largely hinges on the quality of the output and the benefits it offers.
At present, there is minimal visibility into the handling and treatment of waste at treatment plants. With this in mind, we are introducing Treatment Quality Monitoring (TQM) as an extension to DIGIT Sanitation.
The vision of DIGIT Sanitation is “Zero untreated waste in 1,000 ULBs in 1,000 days”. Read more about why this matters here.
To achieve “zero untreated waste” it is pertinent to ensure that waste is disposed of in the right method, right place, and right time. Subsequently, all waste disposed of should be processed and treated effectively. We can ensure that the waste is being effectively treated by focusing on the following:
Design the best process for waste treatment.
Ensure adherence to the designed process.
Monitor and track the effectiveness of the process.
We now explore each issue separately:
How do you know that the process you have designed is correct/needs modifications?
The success of a treatment will reflect directly in the results of the process, in terms of quality of output and process efficiency.
Testing at various stages of the process and comparing the parameters from the test results against a benchmark will be an indicator of an effective process.
Based on variations between {what the values should be in an ideal condition} and {the measured values} of the parameters, the changes required in each specific part of the process can be identified.
How do you ensure the process is being followed?
There are no digital touchpoints in the process, so there is no unbiased/people-independent method to track adherence.
An indirect method to measure adherence is the “measure of outcomes” (parameter testing) and a review of the process based on the results, thereby leading to the detection of deviations from the SOP.
How do you assess if the process is effective?
Test the treated output at various stages and check whether all test parameters are within the acceptable range as per standards.
Based on the above, monitoring parameters is a key step in ensuring waste is treated and disposed of properly.
Treatment quality monitoring (TQM) is not being carried out in most urban local bodies (ULBs).
Test data is not recorded/analysed for process improvements.
No digital touchpoints in the TQM lifecycle: The lack of such touchpoints creates data silos that hinder the relevant information from being accessible to the right person at the right time.
No accountability system in place to assign responsibility to stakeholders to make necessary changes.
TQM is dependent on having access to a lab and also the logistics of sample collection, transportation, interpersonal communications, and waiting times (for tests/test results).
To address these issues, we want to introduce a product for Treatment Quality Monitoring. The objective is to improve the quality of treated waste.
Our hypothesis of how treatment quality can be improved is as follows:
Configure Treatment Processes and Testing Requirements: Configuration of plants and treatment processes provides a consolidated view to the state/TSUs/stakeholders of the treatment processes operational at plants. Additionally, this allows for the definition of treatment requirements, frequencies, and benchmarks at the state level for various processes and their adoption by plants.
Ensure Regular Testing: Implementing regular and high-frequency testing enables the early detection of deviations from the established benchmarks.
Quick Issue Resolution: Identifying deviations and diagnosing them through an analysis of both current and historical data facilitates rapid issue resolution.
Deepen Enquiry: Is a particular plant consistently performing badly? Is a particular treatment process regularly leading to poor-quality output? Which step in the process flow is the deviation starting from? Do quality test results fluctuate often? Looking at trends in data can help deepen enquiry and identify problem areas.
Urban local body (ULB) officials/employees are responsible for ensuring that regular treatment quality tests are performed at plants. In certain cases, the ULB officials/employees or third-party agencies such as the Pollution Control Board may perform independent tests. In case of non-compliance with treatment quality standards, the ULB officials/employees are to ensure that corrective actions are promptly taken.
Based on the role and responsibilities of the ULB official/employee, the following functionalities are available:
View Test Schedule: Based on the frequency of testing that is configured on the backend, the system generates a testing schedule. ULB officials/employees can view the upcoming tests for all plants tagged to the ULB.
View Past Test Results: A digital record of all the past test results uploaded on the platform can be accessed. Sort and filter functionalities make it easy to view all the test results.
Record Test Result: In case ad hoc tests are conducted, employees have a provision to record the test results. Ad hoc test results will also be available to view in the past test results section.
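The schedule generation described above can be sketched as a simple date roll-forward; this is a minimal sketch assuming the frequency is configured as a number of days between tests — the function name and representation are illustrative, not the platform's actual API.

```python
from datetime import date, timedelta

def generate_test_schedule(start: date, frequency_days: int, count: int) -> list[date]:
    """Roll the configured testing frequency forward from a start date.

    `frequency_days` (days between tests) and the flat list of due dates are
    assumptions for illustration; the real backend configuration may differ.
    """
    return [start + timedelta(days=frequency_days * i) for i in range(1, count + 1)]

# Example: tests every 30 days, next three due dates from 2024-01-01
schedule = generate_test_schedule(date(2024, 1, 1), 30, 3)
```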
Employee login credentials (login ID and password) can be created for employees via the HRMS. Using this, the employees can sign in.
The user can perform the following actions:
Login by entering a username and password, and selecting the city.
Use the "Forgot Password" link to reset password.
After logging in, the user will land on the home screen, which contains a card with links to actions that can be performed by the user.
The "Treatment Quality" card contains the following:
Count of the total pending tests.
Inbox to view the upcoming tests.
View past test results.
Add test result.
Notifications: The notification panel displays the following:
Tests that are overdue for >7 days.
Failed tests where test results are not as per the parameter benchmarks.
The user can view the details of the test by clicking on the “View Details” button. The user can dismiss the notification by clicking on the cross button.
The inbox link will redirect the user to the list of upcoming tests for plants tagged to the ULB.
View the list of upcoming tests: The list of upcoming tests will be sorted by the pending date, where the test with the highest SLA is displayed first. The user can sort the list by clicking on the table headers.
Filter tests: Filters are displayed on the left-hand panel of the screen. The following filters are available:
Treatment process: This will be multi-select showing values for the treatment processes configured for the ULB.
Stages: This will be a dropdown showing values for stages configured for the plant.
Status: This will be a multi-select showing the value for the status in the treatment quality workflow.
Selected filters can be removed by clicking on the refresh button on the top right-hand corner of the filter pane, or the clear button.
Sort: Tests can be sorted by the pending date by clicking on the date column.
Search:
Tests can be searched using the following:
Test ID
Plant name
Users can fill in either the test ID or the plant name, or both, and click on the search button.
Users can clear the search by clicking on the "Clear Search" link.
The user can return to the home screen using the breadcrumbs or the browser's back button.
A test details page can be accessed by clicking on the "Test ID" hyperlink in the inbox.
This will show details about the upcoming test.
The past test results can be viewed from the TQM landing page by clicking on "View Past Results".
On clicking on "View Past Results", the user is redirected to the list of past tests.
The user can perform the following tasks:
View the list of past tests: The results will be sorted on the test date.
View test details: The user can view test details by clicking on the “Test ID” link in each row.
Search tests: The user can search based on the following:
Test ID:
Plant Name.
Treatment process
Test type
Date range
To clear search and view all past tests, the user can click on 'Clear'.
Sort: Tests can be sorted by the pending date by clicking on the date column.
Download test results in Excel and PDF formats using the download button.
On clicking on any "Test ID" hyperlink, the user will be redirected to the test details page. The test details page will display the summary of the submitted test along with the timelines/workflow.
A user can record test results by clicking on the “Add Test Result” link on the treatment quality card.
Clicking on the “Add Test Result” button will redirect the user to the “Add Test Result” page.
The user has to enter the below fields:
Plant Name
Treatment Process
Treatment Stage
Output Type
Quality Parameters: Based on the selections above, the parameters to be tested are populated in the form. At least one parameter needs to be filled in for the submit button to be enabled. If no parameter is filled and the user clicks on the submit button, an error message is displayed: “At least one parameter needs to be filled to add a test result”.
Attachments, if any.
Once the user clicks on the ‘Submit’ button, the test results page is displayed.
Based on the benchmarks defined for the acceptable range of each parameter, the user will be able to view whether the overall test and each parameter has passed or failed.
The user will land on the home page post login. The following actions can be performed by the user:
The plant name is visible on the top right hand corner of the screen.
A help button is available for the user to get a guided view of the page. This is available on every page.
Cards are viewable for the following modules:
a. Vehicle log module (record incoming vehicles)
b. Treatment quality module
c. View dashboard
Clicking on each of these cards will take the user to the Homepage for the specific module.
List of pending tasks: This will show the list of tests pending within the next [X] days. The next action item in the task workflow will be displayed beside a pending task for a user to take prompt action. A button for “View All Pending Tasks” will be displayed which will redirect the user to “All Pending Tasks”.
On clicking on the treatment quality card, the user is redirected to the TQM home page. The following actions can be performed by the user:
View and take action on upcoming tests using the inbox. The inbox will show a count of upcoming tests beside it.
View past test results: Past results from both lab and IoT devices will be displayed here.
View IoT readings: The user can access the record of IoT readings here.
Sensor Monitoring: The user can access a list of IoT devices along with their status here.
View dashboard: The user will be directed to the treatment quality dashboard.
View performance: This widget will show the plant's performance with regard to treatment quality and will display the following KPIs:
a. Test compliance: The plant's compliance percentage with regard to treatment quality, compared to the state-level compliance percentage.
b. Last treatment quality result - Pass/fail and date of the test.
c. Count of alerts raised in the past 30 days.
d. Distribution of alerts based on the alert category.
Go back to the Landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
View List of Upcoming Tests
On clicking on inbox, the user is redirected to the list of upcoming tests. This will show only a list of lab tests. The user can perform the following tasks:
The total count of upcoming tests is displayed beside the inbox in brackets.
View the list of upcoming tests. The following fields will be displayed:
a. Test ID.
b. Treatment process (in case there is only 1 treatment process for the plant, this field will not be displayed).
c. Stage: This will display the process stage where the sample is to be collected from.
d. Output type: Biosolids/effluents.
e. Pending date: This is the test date as per schedule.
f. Status: status of the test.
g. SLA: Show difference between test due date and today.
An action item based on the next status in the workflow will be displayed:
a. For tests in the scheduled stage, “Update Status” will be displayed.
b. For tests in the pending results stage, “Update Results” will be displayed.
Filter tests: On clicking on filter, a pop-up will be displayed:
a. The following filters are available:
Treatment process: (In case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing the values for treatment processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
Output type: This will be a dropdown showing values for output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
Status: This will be a dropdown showing values for the status in the treatment quality workflow. The selected status is displayed here on selection. If not, the field is left blank.
Date range: Selection of date range (calendar view): Selected date range is displayed here on selection. If not, the field is left blank.
b. On selecting values for the filters above, a user can click on filter to filter the inbox.
c. To clear filters, a user can click on clear all.
d. To close the pop-up, a user can click on the cross on the top right hand corner of the screen.
Sort: On clicking on sort, a pop-up will be displayed:
a. Tests can be sorted by the pending date:
Date (Latest first)
Date (Latest Last)
b. On selecting values for sort above, the user can click on sort to sort the inbox.
c. To clear sort, a user can click on clear all.
d. To close the pop-up, a user can click on the cross on the top right hand corner of the screen.
Go back to the landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
View Test Details
Test Details can be viewed by the user in 2 ways:
Via the pending tasks.
Via inbox.
View Test Details Via Pending Tasks
The list of pending tasks can be accessed via the landing page for TQM. This will show the list of tests pending within the next [X] days.
The next action item in the task workflow will be displayed as a button beside a pending task for a user to take prompt action. On clicking on the button, the user will be redirected to the test details page.
View Test Details Via Inbox
A list of tests can be accessed in the inbox. An action item based on the next status in the workflow will be displayed:
For tests in the scheduled stage, “Update Status” will be displayed.
For tests in the pending results stage, “Update Results” will be displayed.
On clicking on the action item, the user will be redirected to the test details page.
The test details page will consist of 2 cards:
The first card will display the following fields:
Test ID
Treatment Process
Stage
Output Type
Pending Date
Status
Parameters to be tested along with their unit of measurement
SLA (This will be displayed in red/green based on the SLA: red if today > pending date, green if today < pending date).
The second card will be based on the test status:
For tests in status ‘Scheduled’, the user will be asked to select a lab.
For tests in status “Pending Results”, the user will be asked to add test results
The user can go back using the back button; redirection will be based on the page from which the user accessed the test details page. A help button is available for the user to get a guided view of the page. This is available on every page.
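The red/green SLA indicator described above reduces to a date comparison. This is a minimal sketch; the spec defines only the strict inequalities, so the handling of the exact-due-date case is an assumption.

```python
from datetime import date

def sla_color(today: date, pending_date: date) -> str:
    """Red when the test is overdue (today past the pending date), green otherwise.

    The spec covers only today > pending date (red) and today < pending date
    (green); treating the equal case as green is an assumption.
    """
    return "red" if today > pending_date else "green"
```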
Update Tests
Tests can be updated by the user from the test details page. The test details page will display the next action item for the user based on the test workflow. For tests with the workflow status ‘Scheduled’, the user will be prompted to confirm whether the sample has been submitted to the lab for testing.
The user can perform the following actions:
Select a lab from a dropdown list configured in the MDMS.
Update the status of the test. The button will be disabled until a lab is selected. Once the user clicks on update status, he/she is redirected back to the page from which the test details were accessed; a snack bar confirms the status update, and the action item button shows the updated next step.
In case an update of status fails, the user will remain on the same page and a failure message will be displayed to the user.
For tests with the workflow status “Pending Results”, the user will be prompted to fill in the test results.
The user can perform the following actions:
Update parameter readings (mandatory fields). The following validations will be applied:
a. Only numerical values will be accepted. In case of non-numerical values, the following error message is displayed: “Only numeric values allowed. Please input in the required format”.
Attach documents (non-mandatory): The following validations will be applied:
a. Only files in the following formats will be supported: .png, .jpg, .pdf. In case a file of an unsupported format is selected, the following error will be displayed: “The file type is not supported. Please upload in the following formats: .pdf, .png, .jpg”.
b. A file size of maximum X MB is allowed. In case the file size is larger than the permitted value, the following error will be displayed: “The file size is too large. Please upload a file below X MB”.
Submit test results by clicking on the ‘Submit’ button. The button will be disabled until all mandatory fields are filled, and will only be enabled once they are. On clicking the submit button, a pop-up will be displayed asking the user to confirm submission.
The following actions will be made available to the user:
Confirm submission by clicking on the ‘Submit’ button.
Go back to the test details page by clicking on the “Go back” button.
If submission fails after clicking the submit button, the user will remain on the same page and a failure message will be displayed.
On successful submission of the test results, the user will be redirected to the summary page and a snack bar will confirm the submission.
At this stage, the user is shown a summary of the test results and whether the test has passed or failed, based on a comparison between the values entered by the user and the benchmarks. In case all values are as per the benchmarks, the test result will be displayed as ‘Pass’. All values will be shown in green, and the user receives feedback that all results are as per the benchmarks. The user can go back to the home page by clicking on the ‘Back’ button.
In case one or more values are not as per the benchmarks, the test results will be displayed as ‘Fail’. All values as per benchmarks will be shown in green. Values not as per the benchmarks are shown in red. The user is provided with information that the test results are not as per benchmark. The user can go back to the Home page by clicking on the ‘Back’ button.
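The pass/fail evaluation above can be sketched as a per-parameter benchmark comparison. Representing each benchmark as a (min, max) acceptable range is an assumption for illustration; the platform's actual benchmark model may differ.

```python
def evaluate_test(values: dict[str, float],
                  benchmarks: dict[str, tuple[float, float]]) -> tuple[str, dict[str, str]]:
    """Compare each recorded value against its assumed (min, max) benchmark range.

    Returns the overall result ('Pass'/'Fail') and a per-parameter colour map
    (green = within benchmark, red = outside), mirroring the display rules above.
    """
    colours = {
        param: "green" if benchmarks[param][0] <= value <= benchmarks[param][1] else "red"
        for param, value in values.items()
    }
    overall = "Pass" if all(c == "green" for c in colours.values()) else "Fail"
    return overall, colours
```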
View Past Test Results
Past test results (both IoT and lab) can be viewed via the TQM landing page by clicking on past tests.
On clicking on past tests, the user is redirected to the list of past tests.
The user can perform the following tasks:
View the list of past tests. The following will be the fields displayed:
a. Test ID.
b. Treatment process (in case there is only 1 treatment process for the plant, this field will not be displayed).
c. Stage: This will display the process stage where the sample is to be collected from.
d. Output type: Biosolids/effluents
e. Pending date: This is the test date as per schedule.
f. Test result: Pass/fail.
View test details: The user can view test details by clicking on the “View Results” button on each card.
Filter tests: On clicking on Filter, a pop-up will be displayed. The following filters are available:
i. Treatment process: (in case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing values for Treatment Processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
ii. Output type: This will be a dropdown showing values for the output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
iii. Test type: This will be a dropdown showing values for the test type (IoT/Lab). The selected test type is displayed here on selection. If not, the field is left blank.
iv. Date range: The selection of date range (calendar view) - The selected date range is displayed here on selection. If not, the field is left blank.
On selecting values for filters above, the user can click on filter to filter the inbox. To clear filters, the user can click on clear all. To close the pop-up, a user can click on the cross on the top right hand corner of the screen. On selection of the filter, the selected filter is displayed on the screen. On clicking the cross button near the displayed filter, the filter is removed.
Sort: On clicking on sort, a pop-up will be displayed:
a. Tests can be sorted by the pending date:
Date (Latest first)
Date (Latest last)
b. On selecting values for sort above, the user can click on sort to sort the inbox.
c. To clear sort, the user can click on clear all.
d. To close the pop-up, the user can click on the cross on the top right hand corner of the screen.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
The user can download the list of tests, filtered by selection in Excel and PDF format.
Go back to the Landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
On clicking the “View Results” button, the user will be redirected to the test summary page.
The page will have 2 cards:
The first card will display the following fields:
Test ID.
Treatment Process.
Stage.
Output Type.
Test Type.
Lab Name/Device ID: This will show Lab Name/Device ID based on the Test type.
Test submitted on.
Test Results: Pass/Fail.
The second card will display the following fields:
Parameters, their unit of measurement, and the values of the parameters recorded. The values will be red/green based on whether they are as per the benchmarks or not.
The user can go back to the list of tests by clicking on the ‘Back’ button, both on the top and bottom of the page.
View IoT Readings
IoT readings can be viewed via the TQM landing page by clicking on view IoT readings.
On clicking on View IoT readings, the user is redirected to the view tests page, filtered on test type: IoT.
The functionality of the page remains the same as the “View Past Tests” page.
Sensor Monitoring
Sensor monitoring can be accessed by clicking on the sensor monitoring link on the TQM landing page.
On clicking on sensor monitoring, the list of IoT devices are displayed.
The following details are displayed on the page:
Total number of IoT devices is displayed beside the page heading in brackets.
A card is available for each device. The following details will be displayed:
- Device ID.
- Treatment Process.
- Stage.
- Output Type.
- Last Calibrated Date.
- Device Status.
- Verification Status.
- Last Verification Date.
- Parameters that the device monitors.
The user can perform the following actions:
Filter devices: On clicking on filter, a pop-up will be displayed.
a. The following filters are available:
i. Treatment process: (in case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing values for the treatment processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
ii. Output type: This will be a dropdown showing values for the output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
iii. Device status: This will be a radio button showing Active/Inactive.
iv. Parameters: This will be a multi-select displaying all parameters configured on the backend.
b. On selecting values for filters above, the user can click on filter to filter the inbox.
c. To clear filters, the user can click on clear all.
d. To close the pop up, the user can click on the cross on the top right hand corner of the screen.
e. On selection of the filter, the selected filter is displayed on the screen. On clicking the cross button near the displayed filter, the filter is removed.
Search: On clicking on search, a pop-up will be displayed.
a. The user can search a device by device ID. Part search to be enabled.
View Dashboard
Dashboards can be accessed by clicking on the “View Dashboards” link on the TQM landing page.
Navigation:
On landing on the dashboard, the user can navigate across the treatment process types to view the dashboard specific to the treatment process type.
Filter:
Date range: Users should be able to filter the dashboard based on a date range.
Other functionalities:
Share:
Users should be able to share a filtered dashboard over WhatsApp in an image format.
Users should be able to share filtered charts/tables over WhatsApp in an image format.
Download:
Users should be able to download the filtered dashboard in PDF and image formats.
Users should be able to download filtered charts/tables in PDF and image formats.
Metrics:
Total incoming sludge: The sum of the total sludge that is disposed of at the plant for the selected time period.
Number of trips: A count of the total incoming vehicles at the treatment plant for the selected time period.
Overall quality: The percentage of tests where all parameters are as per the benchmarks, out of the total number of test results recorded.
Compliance percentage: The percentage of tests where results have been recorded.
Total alerts: A count of the total alerts raised of the following types: Test results not as per the benchmark, no reading from the IoT device, and lab results and IoT results not matching.
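The overall quality and compliance KPIs above reduce to simple ratios. This is a minimal sketch, assuming each test record carries `passed` (all parameters met benchmarks) and `submitted` (results recorded) flags — an illustrative data shape, not the platform's actual schema.

```python
def overall_quality(tests: list[dict]) -> float:
    """Percentage of tests where all parameters met the benchmarks."""
    if not tests:
        return 0.0
    return 100.0 * sum(t["passed"] for t in tests) / len(tests)

def compliance_percentage(tests: list[dict]) -> float:
    """Percentage of tests for which results have been recorded (submitted)."""
    if not tests:
        return 0.0
    return 100.0 * sum(t["submitted"] for t in tests) / len(tests)
```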
Treatment Quality Overview:
KPIs:
Total tests - A count of the total tests for the filtered date range.
A count of tests that have passed treatment quality.
A count of tests that have failed the treatment quality.
Table:
Heading - Name of the plant.
Table displaying the following fields:
a. Stage, output type, value of parameters, and compliance percentage.
b. Button to view the trends for a particular stage.
Toggle to toggle between IoT readings and lab results.
Trends of parameter readings:
This chart will be available once the user clicks on the view trend button in the table.
The chart shows the trend for one parameter over time and provides a view of the benchmark for comparison. A toggle is available to navigate between the parameters. Detailed metric definitions for the Treatment Quality Monitoring dashboard are provided below:
A card for Treatment Quality Monitoring will be made available on the landing page of the employee.
The treatment quality card contains the following:
a. An overview of the total pending tests and how many are nearing SLA.
b. View upcoming tests using the inbox. The inbox will show a count of upcoming tests beside it in brackets.
c. View past test results: Past results from both lab and IoT devices will be displayed here.
d. View IoT readings: The user can access the record of IoT readings here.
e. Sensor monitoring: The user can access a list of IoT devices along with their status here.
f. View dashboard: The user will be directed to the treatment quality dashboard.
Clicking on each of these links will take the user to the specific page.
Notifications: This will show the list of alerts regarding TQM. Currently, this will display the tests that have crossed SLA for greater than 7 days. The user can view the details of the test by clicking on the “View Details” button. The user can dismiss the notification by clicking on the cross button.
Rest of the functionality will remain the same as the current ULB employee landing page.
View List of Upcoming Tests
On clicking on Inbox, the user is redirected to the list of upcoming tests. This will show only a list of lab tests. The user can perform the following tasks:
The total count of upcoming tests is displayed beside the inbox in brackets.
View a list of upcoming tests. The list of upcoming tests will be sorted by the pending date, where the test with the highest SLA is displayed first. The following fields will be displayed:
a. Test ID
b. Plant Name
c. Treatment Process
d. Pending Date: This is the test date as per schedule
e. Status: Status of the test
f. SLA: Shows the difference between the test due date and today. This will be displayed in red if the test due date < today, and in green if the test due date > today.
The user can view test details by clicking on the test ID.
Filter tests: Filters are displayed on the left hand panel of the screen. The following filters are available:
i. Treatment process: This will be multi-select showing values for the treatment processes configured for the ULB. The selected treatment process will be displayed as a tick on the multi-select box. If not, it is left blank.
ii. Stages: This will be a dropdown showing values for stages configured for the plant. The selected stage is displayed here on selection. If not, the field is left blank.
iii. Status: This will be a multi-select showing values for the status in the treatment quality workflow.
On selecting values for filters above, the user can click on filter to filter the inbox. To clear filters, the user can click on the refresh icon on the top right of the filter panel.
Sort: Tests can be sorted by the pending date by clicking on the date column.
Search:
a. Tests can be searched using the following:
i. Test ID.
ii. Plant Name.
b. Part search to be enabled for both.
c. Users can fill either test ID or plant or both and click on the search button.
d. Users can clear search by clicking on the clear search link.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
Redirecting to other links in the TQM module: Users can redirect to other pages in the TQM module via the links provided on the top left of the page. The following links will be displayed:
a. View Past Results
b. View IoT Results
c. Sensor Monitoring
d. View Dashboard
View Test Details
A test details page can be accessed by clicking on the test ID in the inbox. The test details page will consist of the following fields:
The following information will be displayed. In case the information for any field is not available (such as the lab name or values against parameters, depending on the status of the test), the value against that field will be displayed as “To be Updated”.
- Test ID
- Plant Name
- Treatment Process
- Stage
- Output Type
- Test Type
- Test Scheduled on
- Status
- Lab Name
- Sample submitted on
- Test results submitted on
- SLA (For open tests, this will be displayed in red if today > pending date and in green if today < pending date. For closed tests, the SLA value will be displayed).
- Table containing the following details:
i. S.No
ii. Parameter
iii. UoM
iv. Benchmark
v. Value Recorded - The value will be displayed in red/green based on comparison to the benchmark
vi. Overall Test results - Pass/Fail
- Attached documents, if any. The user should be able to view the document by clicking on the document icon. No icon will be present if documents are not attached.
- Test Timeline
The user can go back using the breadcrumbs of the page.
The user can download the test report by clicking on the 'Download' button.
View Past Test Results
Past test results (both IoT and lab) can be viewed via the TQM landing page by clicking on past tests.
On clicking on past test results, the user is redirected to the list of past tests.
The user can perform the following tasks:
View the list of past tests. The results will be sorted on the test date. The following will be the fields displayed:
a. Test ID
b. Plant
c. Treatment Process (in case there is only 1 treatment process for the plant, this field will not be displayed).
d. Test Type
e. Test Date: This is the date the test results are updated
f. Test Result: Pass/Fail
View test details: The user can view test details by clicking on the “Test ID” link in each row.
Search tests: The user can search based on the following:
i. Test ID: Input text field; part search should be enabled.
ii. Plant: Dropdown of list of plants in the ULB.
iii. Treatment process: Dropdown of the list of treatment processes in the ULB.
iv. Test type: This will be a dropdown showing values for the test type (IoT/Lab). The selected test type is displayed here on selection. If not, the field is left blank.
v. Date range: Selection of date range (calendar view): The selected date range is displayed here on selection. If not, the field is left blank.
On selecting values for search above, the user can click on search to filter the inbox. To clear search and view all past tests, the user can click on “Clear Search”.
Sort: Tests can be sorted by the pending date by clicking on the date column.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
Download test results in Excel and PDF formats using the download button.
The user can go back using the breadcrumbs on the top of the page. In case the user has navigated to the test details page from the past test results list, on clicking back, the user should be redirected to the past test results list.
On clicking on any test ID button, the user will be redirected to the test details page (same as redirection from the inbox).
View IoT Readings
IoT readings can be viewed via the TQM landing page by clicking on “View IoT Reading”.
On clicking on IoT readings, the user is redirected to the list of past tests, with the search on test type selected as IoT and the results filtered for IoT readings only. All other functionality will remain the same.
Sensor Monitoring
The list of devices can be viewed via the TQM landing page by clicking on ‘Sensor Monitoring’.
On clicking on sensor monitoring, the user is redirected to the list of devices.
The user can perform the following:
View total devices: The total number of IoT devices is displayed beside the page heading in brackets.
A row is available for each device. The following details will be displayed:
- Device ID
- Plant
- Treatment Process
- Stage
- Output Type
- Device Status
- Parameters: One or multiple parameters that the device is monitoring.
The user can perform the following actions:
a. Search devices: On clicking on search, a pop-up will be displayed. The following search fields are available:
Device ID: Part search should be available here.
Plant: Dropdown based on plants configured in the MDMS.
Treatment process: Dropdown based on the treatment process type.
Stage: Dropdown based on the stage of the selected treatment process.
Output type: This will be a drop down showing values for Output types configured for the plant.
Device status: Dropdown containing Active/Inactive as options.
b. On selecting values for filters above, the user can click on Search to filter the inbox.
c. To clear search, the user can click on clear all.
Record Test Result
Since actors such as PCB might conduct adhoc tests, a provision will be provided to the user to record test results without a schedule. A user can record test results by clicking on the “Add Test Result” link on the card.
Clicking on the “Add Test Result” button will redirect the user to the “Add Test Result” page.
The following fields need to be entered by the user:
Plant Name: This is a dropdown based on the list of plants available in the system. For a state-level user, this should display all plants in the state. For a ULB user, it should display only the plants tagged to the ULB.
Treatment Process: This is a dropdown based on the list of treatment processes in the plant selected.
Treatment Stage: This is a dropdown based on the list of stages in the treatment process selected.
Output Type: This is a dropdown based on the output types available in the stage
Values against parameters: At least one parameter needs to be filled in for the submit button to be enabled. If no parameter is filled and the user clicks on the submit button, an error message is displayed as a snack bar.
Attachments, if any.
Once the user clicks on the Submit button, the test results page is displayed.
This is the same as the “View Test Results” page with the following changes:
Test Type will be displayed as Lab.
Status, lab name, and SLA fields are not displayed.
Workflow will not be displayed.
The user can go back to the “Add Test Results” page via the breadcrumbs.
The TQM Dashboard will be made available to both the ULB employee and the state employee, and can be accessed by clicking on the dashboard link on the landing page.
On clicking on the dashboard, the user is directed to the dashboard view.
The access to data in the dashboard will be based on the following roles:
ULB admin will be able to view the dashboard for all plants in the ULB.
A state admin will be able to view the dashboard for all plants in the state.
Navigation:
On landing on the dashboard, the user can navigate across the treatment process types to view the dashboard specific to the treatment process type.
Filters:
Date range: Users should be able to filter based on the date range.
ULB: Users should be able to filter based on the ULB. For a ULB employee, the ULB is auto-selected to the ULB to which the plant and the employee are tagged. For a state user, all ULBs are available.
Plant: Users should be able to filter based on the plant. For ULB employees, plants tagged to the ULB to which the employee belongs should be available in the dropdown. For state users, all plants are available.
Other functionalities:
Share:
- Users should be able to share a filtered dashboard over WhatsApp in an image format.
- Users should be able to share filtered charts/tables over WhatsApp in an image format.
Download:
- Users should be able to download the filtered dashboard in PDF and image formats.
- Users should be able to download the filtered charts/tables in PDF and image formats.
Metrics
Overall KPIs:
The dashboard will display the following KPIs:
Total incoming sludge: The sum of the total sludge that is disposed of at the plant for the selected time period.
Number of trips: A count of total incoming vehicles at the treatment plant for the selected time period.
Overall quality: The number of tests where all parameters are as per benchmarks, compared to the total number of test results recorded.
Compliance percentage: The percentage of tests where results have been recorded.
Total alerts: A count of total alerts raised of the following types: test results not as per benchmark, no reading from the IoT device, and lab results and IoT results not matching.
Treatment quality overview:
KPIs:
Total plants - A count of the unique plants for the particular treatment process.
Count of plants that have passed the treatment quality as per the last recorded test.
Count of plants that have failed the treatment quality as per the last recorded test.
Treatment quality is said to have passed if all parameters for the final output(s) of a treatment process are as per benchmarks. Treatment quality is said to have failed if one or more parameters for the final output(s) of a treatment process are not as per benchmarks.
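The pass/fail rule above can be sketched in code. This is an illustrative sketch only, not the product implementation; the function names and data shapes are assumptions.

```python
# Illustrative sketch of the treatment quality pass/fail rule:
# a plant passes only if every parameter of every final output
# in its last recorded test meets its benchmark.

def output_quality_passed(parameter_results):
    """parameter_results: dict of parameter name -> bool (meets benchmark)."""
    return all(parameter_results.values())

def plant_quality_passed(final_outputs):
    """final_outputs: dict of output type -> parameter_results dict."""
    return all(output_quality_passed(p) for p in final_outputs.values())

# Example: a single failing parameter in one output fails the whole plant.
plant = {
    "Treated Effluent": {"COD": True, "BOD": True},
    "Biosolids": {"Helminth eggs": False},
}
print(plant_quality_passed(plant))  # False
```

The "failed" count then follows directly as the total plant count minus the passed count, as the dashboard table defines it.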
Map:
A map view of the location of each plant will be displayed as part of the dashboard. Plants here will be colour-coded based on whether they have passed or failed the treatment quality (red = failed, green = passed).
Table:
A table will display plant-wise details of test results (pass/fail) and the compliance percentage, as per the last test result. The user will also be able to see the change in compliance percentage compared to the last month. A drilldown will be made available for a plant via this table. For TRP users and ULBs where only one plant is tagged for the process type, the drilled-down table is automatically visible.
On drilldown, the following is viewable to the user:
Heading - Name of plant.
Table displaying the following fields:
a. Stage, output type, value of parameters, and compliance percentage.
b. Button to view the trends for a particular stage.
Toggle to switch between IoT readings and lab results. The selected test type will appear highlighted.
If there are multiple process flows, then the user can switch between process flows by using the buttons. The selected process flow will appear highlighted.
Trends of parameter readings:
This chart will be available once the user clicks on the view trend button in the table.
The chart shows the trend for one parameter over time and provides a view of the benchmark for comparison. A toggle is available to navigate between parameters.
DIGIT Sanitation is enhanced with a new product: Treatment Quality Monitoring (TQM). Treatment Quality Monitoring aims to improve the treatment quality of plants by ensuring that timely tests are conducted and any deviations are addressed quickly. To know more about Treatment Quality Monitoring (TQM), please follow the link:
Treatment Quality Monitoring (TQM)
Click for Known Issue List.
The treatment plant operator is responsible for ensuring that the treatment quality for the plant is as per benchmarks. For this, treatment quality tests need to be performed regularly and any deviations in quality need to be corrected immediately.
The treatment plant operator has the following functionalities:
View the list of upcoming tests: Based on the frequency of testing that is configured on the backend, a testing schedule is generated for the plant. The plant operator can receive timely reminders and can view overdue and upcoming tests.
View test details: Details of upcoming tests can be accessed to verify outputs, parameters, and test due dates.
Update the test status and results: Test status and results against a scheduled test can be updated.
View past test results: A digital record of all past test results uploaded on the platform can be accessed. Sort and filter functionalities make it easy to view all the test results.
Treatment plant operator credentials (login ID and password) can be created for employees via the HRMS. Using this, the treatment plant operator can sign in.
On this page, the following actions can be performed:
Enter username and password.
Select the city for login.
Reset password by clicking on the “Forgot Password” link.
Home Page
The user will land on the home page post-login. The user can perform the following actions:
User Actions
Access treatment quality module (Update test status and results).
Switch between the plants.
View the list of pending tasks: This will show the list of tests pending within the next [X] days. A “View All Pending Tasks” button will be displayed, which will redirect the user to “All Pending Tasks”.
Other actions such as an inbox to view the incoming vehicles for disposal and record incoming vehicles will be available here, based on the functionality assigned.
On clicking on the treatment quality card, the user is redirected to the TQM home page. The user can perform the following actions:
User Actions:
View and take action on upcoming tests using the inbox. The inbox will show a count of upcoming tests beside it.
Past test results: Past test results submitted for the plant can be viewed here.
View performance: This widget shows the performance of the plant with regard to treatment quality and will display the following KPIs:
Test compliance: Compliance percentage of the plant with regard to treatment quality.
Percentage of test results passed - The total test results passed/the total test results submitted (for the past 30 days).
Count of alerts raised in the past 30 days.
Distribution of alerts based on the alert category.
Go back to the Landing page using the ‘Back’ button.
Access Help
A help button is available for the user to get a guided view of the page. This is available on every page.
View List of Upcoming Tests
On clicking on the inbox link, the user is redirected to the list of upcoming tests. This will show only the list of tests to be completed and updated by the treatment plant operator.
User Actions
The total count of upcoming tests is displayed beside the inbox.
View the list of upcoming tests. The following fields will be displayed:
Test ID.
Treatment process
Stage
Output type
Pending date
Status
SLA
Filter tests: On clicking on the filter icon, a pop-up will be displayed. The following filters are available:
Treatment process: The dropdown shows the values for treatment processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
Output type: The dropdown shows values for output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
Status: The dropdown shows values for the status in the treatment quality workflow. The selected status is displayed here on selection. If not, the field is left blank.
Date range: Selection of date range (calendar view): The selected date range is displayed here on selection. If not, the field is left blank.
On selecting values for the filters above, the user can click on search to filter the inbox.
To clear filters, the user can click on 'Clear'.
To close the pop-up, a user can click on the cross in the top right hand corner of the screen.
Sort: On clicking on sort, a pop-up will be displayed:
Tests can be sorted by the pending date:
Date (Latest First)
Date (Latest Last)
On selecting values for the sort above, the user can click on sort to sort the inbox.
To clear the sort, a user can click on 'Clear All'.
To close the pop-up, a user can click on the cross in the top right hand corner of the screen.
View Test Details
The test details can be viewed by the user in two ways:
Via the pending tasks.
Via the inbox.
View Test Details Via Pending Tasks
The list of pending tasks can be accessed via the landing page for TQM. This will show the list of tests pending within the next 7 days.
This will show tests in both workflow stages:
Submit Sample
Update Results
Based on the workflow stage, the action for the user will be displayed.
On clicking the “Submit Sample/Update Results” button, the user will be redirected to the test details page.
View Test Details Via Inbox
For test results in the scheduled stage, the "Update status" button will be displayed.
For tests in the pending results stage, the "Update results" button will be displayed.
Update Test Details
On clicking on "Update Status", the user will be redirected to the test details page.
The test details page will consist of 2 cards.
The first card will display the following fields:
Test ID
Treatment Process
Stage
Output Type
Pending Date
Status
Parameters to be tested along with their unit of measurement
SLA (This will be displayed in red/green based on the SLA: red if today > pending date, green if today < pending date).
The second card will be based on the test status:
For tests with the status 'Scheduled', the user will be asked to 'Select a lab'.
For tests with the status 'Pending Results', the user will be asked to 'Add test results'.
The user can:
Update parameter readings (mandatory fields): Only numerical values can be entered.
Attach documents (non-mandatory).
Submit test results by clicking on the ‘Submit’ button. The button will be deactivated unless all values have been updated. On clicking the submit button, a pop-up will be displayed to the user to confirm submission.
The user can confirm submission by clicking on the ‘Submit’ button.
The user can go back to the test details page by clicking on the “Go back” button.
On submission, the system displays 'Lab results submitted successfully' along with the test ID.
The user can see the summary of the test results and whether it has passed/failed based on a comparison between the values entered by the user and the benchmarks.
In case all values are as per the benchmarks, the test results will be displayed as ‘Pass’. All values will be shown in green, and the user receives feedback that all results are as per the benchmarks.
In case one or more values are not as per the benchmarks, the test results will be displayed as ‘Fail’. All values as per the benchmarks will be shown in green. Values not as per the benchmarks will be shown in red. The user is provided with information that the test results are not as per the benchmark.
The user can go back to the home page by clicking on the ‘Back’ button.
View Past Test Results
Past test results can be viewed via the TQM landing page by clicking on "View Past Results".
On clicking on the past tests, the user is redirected to the list of past tests.
User Actions:
View the list of past tests. Details of the past tests can be accessed by clicking on "View Results".
Filter and sort tests.
Workbench Setup
For the loading of data, the following will be required:
State-level user for workbench (to allow for loading of state data).
Urban local body (ULB)-level users for workbench (to allow for loading of ULB-specific data). One workbench user with access to all ULBs can also be created, and that user can navigate across the ULBs to upload data for each ULB.
ULB employees are provided with credentials to log in to the system. Role-based access is provided for various steps in the workflow; that is, different individuals can be assigned to create applications, modify applications, or manage vendor, driver, and vehicle details.
User actions
On this page, the following actions can be performed:
Enter username and password.
Select a city for login.
Reset your password by clicking on the “Forgot Password” link.
On clicking continue, employees are redirected to the Workbench home page.
For the proper working of the platform, the following benchmark rules should be configured. If any other benchmark rules are configured, the system will not function:
Inbox
Click on “Add Master Data” to add a new benchmark rule
Inbox
Click on “Add Master Data” to create a new Quality Test Lab
Inbox
Click on “Add Master Data” to create a new material
Inbox
Click on “Add Master Data” to create a new parameter
Inbox
Click on “Add Master Data” to create a plant configuration
Inbox
Click on “Add Master Data” to create a plant type
Inbox
Click on “Add Master Data” to create a new process type
Inbox
Click on “Add Master Data” to create a new unit
Inbox
Click on “Add Master Data” to create a new waste type
Inbox
Click on “Add Master Data” to create a new source type
Inbox
Click on “Add Master Data” to add a new stage
Inbox
Click on “Add Master Data” to create a new process
Once the plant is created, the plant-user mapping has to be done through the backend. If there is already an FSM instance created and running, then the V1 codes for the plant and ULB have to be reused in V2, and the backend team will have to create the plants using the same codes.
Inbox
Click on “Add Master Data” to add a new plant
Inbox
Click on “Add Master Data” to add a new quality criteria
Inbox
Click on “Add Master Data” to add a new test standard
The user can select the lab from the dropdown and click on "Update Status".
Goal | How will it be measured via the product? | How will we know TQM is successful? |
To ensure treated waste is as per quality standards. | The percentage of plants with output quality as per benchmarks. | Increase in the percentage of plants with output quality as per benchmarks over time. |
To ensure treated waste is tested regularly for quick identification of issues. | The percentage compliance against the testing schedule. | Increase in the percentage compliance against the testing schedule over time. |
S.No | Section Heading | Chart Heading | Subheading | Definitions (this will appear on the dashboard whenever a user hovers on the metric, wherever applicable) | Chart Type | X-Axis | Y-Axis | Value | Columns | How to calculate | Boundary | Drilldown/Toggle | Comparison KPIs, if any | Show comparison in | Specific to State/ULB/TRP/all | Tooltip on Hover on Data Point |
1 | Input | Total incoming sludge | NA | The total incoming sludge from registered and unregistered vehicles. | KPI | NA | NA | Total incoming sludge. | NA | Total incoming sludge = (Volume of waste disposed of for registered vehicles) + (Volume of waste disposed of for unregistered vehicles). | State, Plant ULB | NA | NA | NA | NA | NA |
2 | Input | Number of incoming trips | NA | The number of trips disposed of at the treatment plant. | KPI | NA | NA | The count of trips to the treatment plant from registered and unregistered vehicles. | NA | Number of trips disposed = Count (Distinct Trip ID). | State, Plant ULB | NA | NA | NA | NA | NA |
3 | Treatment Quality | Overall quality | NA | The percentage of tests where all parameters are as per the benchmarks. | KPI | NA | NA | The percentage of test results meeting the benchmarks. | NA | Overall quality = (Number of tests where all parameters meet benchmarks / Total number of tests) * 100. | State, Plant ULB | NA | NA | NA | NA | NA |
4 | Treatment Quality | Compliance | NA | The percentage of tests where results have been recorded. | KPI | NA | NA | The percentage of tests with the status 'Submitted' out of the total tests. | NA | Compliance % = (Count of Test IDs in status 'Submitted' / Count (Distinct Trip ID)) * 100. | State, Plant ULB | NA | NA | NA | NA | NA |
5 | Alerts | Total alerts | NA | The total alerts raised by the system in the following categories: 1) Test results not as per the benchmark, 2) No reading from the IoT device, 3) Lab results and IoT results not matching. | KPI | NA | NA | Total alerts. | NA | Count (Distinct Alert ID). | State, Plant ULB | NA | NA | NA | NA | NA |
6 | Treatment Quality Plants | Total plants | NA | NA | NA | NA | NA | Count of plants. | NA | Count (Distinct Plant ID). | State, Plant ULB | NA | NA | NA | NA | NA |
7 | Treatment Quality Plants | Treatment quality passed | NA | Treatment quality is considered passed if all parameters of both biosolids and effluents are as per benchmarks for the output of the treatment process in the last test recorded. | NA | NA | NA | Count of plants with treatment quality passed. | NA | Treatment Quality for Output Type = IF(COUNTIF(All Parameters Meet Benchmarks, FALSE) = 0, "Passed", "Failed"); Treatment Quality for Plant = IF(COUNTIF(Treatment Quality for Output Type, FALSE) = 0, "Passed", "Failed"). | State, Plant ULB | NA | NA | NA | NA | NA |
8 | Treatment Quality Plants | Treatment quality failed | NA | Treatment quality is considered failed when one or more parameters of biosolids or effluents are not as per benchmarks for the output of the treatment process in the last test recorded. | NA | NA | NA | Count of plants with treatment quality failed. | NA | Count (Distinct Plant ID) - Treatment Quality Passed. | State, Plant ULB | NA | NA | NA | NA | NA |
9 | Treatment Quality Plants | NA | NA | NA | Map | NA | NA | Point = geolocation of plant; plant icon colour: green if treatment quality passed, red if failed. | NA | NA | State, Plant ULB | NA | NA | NA | NA | Name of plant |
10 | Treatment Quality Plants | NA | NA | NA | Table | NA | NA | NA | Plant Name, Test Result, Compliance % | Test result: same as S.No 7 and S.No 8. | State, Plant ULB | NA | NA | NA | NA | NA |
11 | Treatment Quality Plants | NA | NA | NA | Table | NA | NA | NA | Stage, Output Type, Parameters 1...n, Compliance % | Mentioned above. | State, Plant ULB | NA | Compliance % | % from last month | NA | NA |
12 | Trend in [Parameter Name] Readings | NA | NA | NA | Multi-line chart | Test dates | Parameter value | Value of device reading; value of lab results. | NA | NA | Plant | NA | NA | NA | NA | Date, lab result (X), device reading (Y) |
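As a sketch, the two percentage KPIs above can be computed as follows. The function and field names are hypothetical, not from the product; note that the spec's compliance formula divides by the distinct trip count, which is represented here by the generic denominator argument.

```python
# Illustrative sketch (not the product implementation) of the
# "Overall quality" and "Compliance %" KPI formulas from the table above.

def overall_quality_pct(tests):
    """tests: list of dicts with a 'passed' flag, True when all
    parameters of the test met their benchmarks."""
    if not tests:
        return 0.0
    return 100.0 * sum(1 for t in tests if t["passed"]) / len(tests)

def compliance_pct(submitted_count, total_count):
    """Compliance % = submitted tests / total (per the spec, the
    distinct trip count) * 100."""
    if total_count == 0:
        return 0.0
    return 100.0 * submitted_count / total_count

tests = [{"passed": True}, {"passed": True}, {"passed": False}, {"passed": True}]
print(overall_quality_pct(tests))  # 75.0
print(compliance_pct(8, 10))       # 80.0
```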
Goal | Category | Objective | How will it be measured via the product | How will we know TQM is successful |
Zero deaths, diseases, and environmental contamination resulting from poor sanitation | Primary | To ensure treated waste is as per the quality standards | The percentage of treatment quality tests passed for a specific period | Increase in the percentage of treatment quality tests passed for a specific period |
Zero deaths, diseases, and environmental contamination resulting from poor sanitation | Secondary | To ensure treated waste is tested regularly for quick identification of issues | The percentage compliance against testing schedule for a specific period | Increase in the percentage compliance against testing schedule over time |
How is the schedule generated? A scheduler runs every ‘X’ days, and new tests are generated within the treatment plant operator’s login.
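The scheduling behaviour described above can be sketched as follows. This is illustrative only; the function name, the horizon parameter, and the assumption that the frequency is expressed in days are ours, not from the source.

```python
from datetime import date, timedelta

def generate_schedule(start, frequency_days, horizon_days):
    """Illustrative sketch: given a test standard's frequency (assumed to
    be in days), emit the due dates that fall within the scheduling
    horizon, starting one interval after 'start'."""
    due = start + timedelta(days=frequency_days)
    end = start + timedelta(days=horizon_days)
    schedule = []
    while due <= end:
        schedule.append(due)
        due += timedelta(days=frequency_days)
    return schedule

# Example: a weekly test standard over a 30-day horizon yields 4 due dates.
print(len(generate_schedule(date(2024, 1, 1), 7, 30)))  # 4
```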
Workflow for Treatment Quality Tests: There are two statuses:
Update Status: When the tests are scheduled and pending a status update, the status of the test is 'Scheduled'. Update Results: Once the status of the test is updated, the status will change to 'Pending Results', that is, waiting for the results to be updated.
Verifying against benchmarks and pass/fail: The pass/fail status of a test depends on benchmarks that are pre-defined.
All the below configurations are listed in order, and the same order should be followed while creating a new test standard:
Duplications are not allowed. Common user actions for the below configuration:
Configuration | Which level this has to be defined? | Definition |
Benchmark rule | State | The rules according to which a test value should be tested. (For example, Greater than, less than, equal to). |
Quality Test Lab | ULB | Quality Test Lab is the laboratory where the testing is happening. This can be configured accordingly if it is in-house or the ULB-Geo Corporation. |
Material | State | Material is a physical substance for which the quality monitoring is done. For example: Effluent or raw water. |
Parameter | State | Criteria used to measure the input and the output for a job. Each parameter will have a unit of measurement. |
Plant configuration | State | Configuration details for a particular plant. |
Plant type | State | The classification of plant based on their processing. |
Process type | State | Defines the type of process. |
Unit | State | The unit for measuring this particular parameter. |
Waste type | State | The classification of waste materials based on their characteristics or origin. |
Source type | State | The origin of this particular test standard. |
Stage | State | Each step within the treatment process. Each job may have one or many input quality parameters, and one or many output quality parameters. |
Process | State | Sequential series of steps required to translate input to output. |
Plant | State | The respective plant for which test standards are created. |
Quality criteria | State | The quality criteria which is applicable for the unique combination of plant, process, stage and material. |
Test standard | ULB | A combination of a parameter, acceptable benchmark of the parameter and its frequency of testing. For example, ph >= 7 tested weekly. |
Code | Name |
LSTOREQ | Less Than Or Equals |
GTROREQ | Greater Than Or Equals |
NEQ | Not Equals |
EQ | Equals |
OSD | Outside Range |
BTW | Between |
LST | Less Than |
GTR | Greater Than |
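A hypothetical evaluator for the benchmark rule codes listed above might look like the following sketch. The function name and signature are assumptions, as is the interpretation that the range rules (BTW, OSD) take a lower and an upper bound.

```python
def evaluate(rule_code, value, benchmark, upper=None):
    """Illustrative evaluator for the benchmark rule codes above.
    'benchmark' is the configured benchmark value; 'upper' is assumed
    to be used only by the range rules BTW (Between) and OSD (Outside
    Range)."""
    rules = {
        "LSTOREQ": lambda: value <= benchmark,
        "GTROREQ": lambda: value >= benchmark,
        "NEQ":     lambda: value != benchmark,
        "EQ":      lambda: value == benchmark,
        "LST":     lambda: value < benchmark,
        "GTR":     lambda: value > benchmark,
        "BTW":     lambda: benchmark <= value <= upper,
        "OSD":     lambda: value < benchmark or value > upper,
    }
    return rules[rule_code]()

# Example: the Test Standard section's "ph >= 7 tested weekly" maps to GTROREQ.
print(evaluate("GTROREQ", 7.4, 7))  # True
```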
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | JATNI_In_House Puri_Third_Party |
Name | Textual or human-readable identity given to a record. | Jatni in-house lab, Puri third-party lab |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | MM_01 MM_02 |
Name | Textual or human-readable identity given to a record. | Effluent Treated effluent |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | PP_01 PP_02 |
Name | Textual or human-readable identity given to a record. | COD BOD |
Description | Details or explanation for a record. | Chemical oxygen demand, Biological oxygen demand |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | DEFAULT_CONFIGURATION |
Lab test escalation days | The number of days after which a scheduled test that is still pending requires escalation. | Decided by the state and the programme team to configure these days. |
Pending tests to display within days | The number of days within which pending tests, assessments, or evaluations should be displayed or considered. | Decided by the state and the programme team to configure these days. |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | FECAL_SLUDGE_TREATMENT_PLANT CO_TREATMENT_PLANT |
Name | Textual or human-readable identity given to a record | Faecal sludge treatment plant Co-treatment plant |
Description | Details or explanation for a record. | Any description related to the plant. |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | FECAL_SLUDGE_TREATMENT 5_STAGE_WATER_TREATMENT |
Name | Textual or human-readable identity given to a record | Faecal sludge treatment, 5-stage water treatment |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | MGPL DC |
Name | Textual or human-readable identity given to a record. | Mg per liter Degree celsius |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | FECAL_SLUDGE MEDICAL_INFECTIOUS_WASTE |
Name | Textual or human-readable identity given to a record. | Faecal sludge Medical infectious waste |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | LAB_ADHOC LAB_SCHEDULED |
Name | Textual or human-readable identity given to a record. | Adhoc tests Scheduled tests |
Description | Details or explanation for a record. | Any explanation regarding the tests |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | OUTLET_OF_POLISHING_POND |
Name | Textual or human-readable identity given to a record. | Outlet of Polishing Pond |
Input material | Materials provided as an input to a stage. | Effluent, faecal sludge. The state and the programme team decide the input and output materials based on the data collected. |
Output material | Materials produced as an output of a stage. | Treated effluent. The state and the programme team decide the input and output materials based on the data collected. |
Description | Details or explanation for a record. | Description of the stage. |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | FECAL_SLUDGE_TREATMENT_PROCESS |
Name | Textual or human-readable identity given to a record. | Faecal sludge treatment process |
Process Type | Defines the type of process. Here the previously defined process type has to be entered. | Faecal sludge treatment |
Order | The number defining the order of the stage within the process. This is a non-mandatory field. | 1 |
Stage Code | A list of stages that come under a particular process. The previously defined stage has to be entered here. | Outlet of polishing pond |
Waste Type | The classification of waste materials based on their characteristics or origin. The previously defined waste type has to be entered here. | Faecal Sludge |
Description | Details or explanation for a record. | Description regarding the process can be entered. |
Field | Description | Example |
ULBs | The respective ULBs the plant belongs to. | od.berhampur od.dhenkanal |
Code | The plant code. This is the same code from V1. | BEHR_001 DHKL_001 |
Name | Textual or human-readable identity given to a record. | BeMC Plant Dhenkanal Plant |
PlusCode | This is the geo-location of the plant. | 7PJQ+Q8 Kusumi, Odisha |
Latitude | Latitude of the plant. | |
Longitude | Longitude of the plant. | |
Plant Type | The classification of plants based on their processing. Note: The previously defined plant type has to be entered here. | Faecal sludge treatment plant |
Process | A list of processes that happen under a particular plant. Note: The previously defined process has to be entered here. | Faecal sludge treatment process |
Waste type | The classification of waste materials based on their characteristics or origin. Note: The previously defined waste type has to be entered here. | Faecal sludge Medical infectious waste |
Description | Details or explanation for a record. | Can be the description of a plant. |
Plant Configuration | Configuration details for a particular plant. Note: The previously defined plant configuration has to be entered here. This is also a mandatory field. | DEFAULT_CONFIGURATION |
Plant Location | Location of the plant. | |
Plant Operational Timings | Timings of the plant. | |
Plant Operational Capacity KLD | Total capacity of the plant. | |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | COD_MGPL |
Unit Of Measurement | The unit for measuring this particular parameter. Note: The previously defined units will be shown in the dropdown and the respective unit can be selected. | MGPL |
Parameter | Anything that is measurable as an input/output for a particular stage. Note: The previously defined parameters are shown in the dropdown and the respective parameter can be selected. | COD |
Benchmark rule | The rules according to which a test value should be tested. (For example, Greater than, Less than, Equal to). | This is entered as per the master data collected. |
Benchmark values | Specific numbers on which the benchmark rule is applied for a test value. | This is entered as per the master data collected. |
Allowed deviation | The acceptable difference from the benchmark values. | This is entered as per the master data collected. |
Field | Description | Example |
Code | Alphanumeric or numeric representation assigned to uniquely identify the field. | TEST_PURI_QM1 |
Plant code | Plant Code for which this Test Standard is registered. Note: The plant codes must match the plant codes added above for Plant. | BEHR_001 DHKL_001 |
Stage code | Stage Code for which this Test Standard is registered. | Outlet of polishing pond |
Process code | Process Code for which this Test Standard is registered. | Fecal Sludge Management |
Material code | Material Code for which this Test Standard is registered. | Effluent |
Frequency | The frequency at which this test standard should be scheduled. | Decided by the state and the programme team based on the data collected. For example: 14. |
Source type | The origin of this particular test standard. | Lab |
Quality criteria | The quality criteria which is applicable for the unique combination of plant, process, stage and material. | COD BOD TSS |
Category | Details |
Known Issues | The email ID is auto-populated when the user goes to change the password. |
| The inbox of both the plant operator and the ULB employee shows the count as 0 because the API takes some time to load. |
Functional Limitations | Lab Operator and Lab Mapping: While the platform provides the functionality for activating third-party vendors/labs to upload test results, the provision to tag operators to specific labs is not yet available. |
| Update Tests: Test results, once submitted, cannot be edited. If the user enters test results incorrectly, there is no provision to go back and modify them. |
| A test once generated cannot be discarded. In case master data is deactivated while a test has already been generated, the test will remain pending. |
| Comments cannot be added while uploading test results. |
Feature | Description |
4. Dashboards | |
The waste value chain has 5 main stages:
Generation
Containment
Transport
Treatment
Reuse
For effective waste management, all of these stages need to be focused on. For example, improper containment of Faecal Sludge can lead to percolation of chemicals into the groundwater and its contamination, or ineffective transportation management can lead to waste being dumped illegally into surface water or land. Similarly, an essential part of effective waste management is the proper treatment of Waste.
Ineffective treatment of waste and its discharge into the environment has a direct adverse impact on the environment and water quality. Effluents are often discharged into surface water sources. The poor quality of wastewater effluents is responsible for the degradation of the receiving surface water body and the health of its users.
Quality and re-use have a direct relationship: Treated waste output can be reused, be it recycled water or as compost. Acceptance of this by consumers will largely depend on its quality and the value it provides. Nobody wants water in their flush that stinks, or manure that does not fertilise plants enough.
Currently, there is little to no visibility into how waste is being handled and treated at the treatment plant. Keeping this in mind, a Treatment Quality Monitoring (TQM) module will be introduced as an extension to DIGIT Sanitation.
The objective is to improve the quality of treated waste.
Our hypothesis of how treatment quality can be improved is as follows:
Define Treatment Processes and Testing Requirements: Mapping of plants and treatment processes will give the state/TSUs/stakeholders a consolidated view of the treatment processes operational at plants. Additionally, this will allow treatment requirements, frequencies and benchmarks to be defined at the state level for various processes and adopted by plants. This further allows for analysis of treatment quality at the process level, and provides insights for improvements over time.
Ensure Regular Testing: Operationalising regular and high-frequency testing will allow for the detection of deviations from benchmarks at the earliest.
Quick Issue Resolution: Identification of deviations and their diagnosis via an analysis of current and historical data will allow for quick issue resolution.
Deepen Enquiry: Is a particular plant consistently performing badly? Is a particular treatment process regularly leading to poor quality output? Which step in the process flow is the deviation starting from? Do quality test results fluctuate often? Looking at trends in data can help deepen enquiry and identify problem areas.
Improvement in processes or a need for further enquiry can be operationalised by redefining treatment processes and testing requirements.
Given the objective to improve quality of treated waste and the steps required to achieve it, the following are needed at each stage:
Keeping this in mind, the following components will be available in the TQM module:
The aim of the modules is to provide the users of the system information and control at each level - from defining how operations will be run, to updating status against pending operational tasks and viewing operational data to draw insights to refine operations.
What is the value being generated?
Through the above, we are looking to address the following challenges:
The following value will be created for users:
The objective of the treatment quality module is to allow data observability of the parameters of treatment quality to improve the performance of the treatment plant. There are two ways to capture data:
Manual input.
Automated capture of test results via configured IoT sensors. While the use of sensors reduces manual intervention in the process, the availability of infrastructure on the ground is a challenge.
Once the initial setup is done, the following image illustrates how the system will be used to monitor treatment quality:
Identified gaps:
Irregularity in laboratory testing.
Unavailability of actionable information on quality.
The flow will be as follows:
The scope of the Treatment Quality Module is as follows:
Register Testing Standards
A plant may have one or multiple treatment processes. For example:
An FSTP is dedicated to the treatment of faecal sludge such as the aerobic treatment process.
A co-treatment plant has two treatment processes: Faecal sludge and septage together.
A treatment process has multiple stages/steps. In case of a plant with multiple processes, stages may converge.
Define input/output quality testing requirements:
Each stage may have multiple input types and output types (for example, effluents and biosolids), and each stage may need to have the input and output quality tested for each of the input and output types.
The state and the ULB admin should be able to perform the following actions:
Define whether one or many input types need to be tested for a particular stage.
Define if one or many output types need to be tested for a particular stage.
Enable/disable testing of a particular input/output type for a particular stage.
The UI screen for this should be enabled with the launch of workbench.
Define testing parameters, benchmarks and frequency:
Testing parameters and benchmarks are set at the national/state level and adhered to by plants. However, based on the geographical location, the benchmarks may vary.
Testing frequencies are set at the state level and adhered to by plants. These may be adjusted for individual plants based on testing results.
Each output and input type will have one or more testing standards (parameters, benchmarks, and frequency). For example, for output type ‘effluent’, one may need to test pH (daily), BOD (weekly) and COD (weekly).
For a particular plant, testing will be done by one or multiple methods including:
- Manual testing in a lab
- Testing via IoT devices
The testing frequency may differ for each method.
The state and the ULB admin should be able to perform the following actions:
Define one or multiple testing standards (parameter, benchmark, and frequency) at an instance level.
Edit testing standards (parameter, benchmark, and frequency) for a plant.
Define different standards for manual and IoT-based testing for a particular input/output type for a stage.
This does not require a UI screen.
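As a sketch of how such standards might be held on the backend, the records below use hypothetical field names and values; they are illustrative, not the actual MDMS schema:

```python
# Hypothetical testing-standard records: plant, stage and output type names,
# benchmark ranges and frequencies are illustrative assumptions.
testing_standards = [
    {"plant": "FSTP-01", "stage": "Drying Beds", "output_type": "Effluent",
     "method": "LAB", "parameter": "pH", "benchmark": (6.5, 8.5), "frequency_days": 1},
    {"plant": "FSTP-01", "stage": "Drying Beds", "output_type": "Effluent",
     "method": "LAB", "parameter": "BOD", "benchmark": (0, 30), "frequency_days": 7},
    {"plant": "FSTP-01", "stage": "Drying Beds", "output_type": "Effluent",
     "method": "IOT", "parameter": "pH", "benchmark": (6.5, 8.5), "frequency_days": 1},
]

def standards_for(plant, stage, output_type, method):
    """Return the testing standards configured for one plant/stage/output/method."""
    return [s for s in testing_standards
            if (s["plant"], s["stage"], s["output_type"], s["method"])
            == (plant, stage, output_type, method)]
```

Note how manual and IoT-based testing carry separate standards for the same input/output type, as required above.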
Generation of Schedule
Schedule for tests will be auto-generated for various parameters based on the frequency.
For manual tests, the schedule will be used to:
Display a list of upcoming tests to the plant operator and stakeholders.
Generation of alerts for upcoming tests.
Escalation in case of non-adherence to the test schedule.
For IoT tests, the schedule will be used to:
Generate alerts in case a reading is not received as per the schedule.
Workflow:
Generation of Schedule: For the generation of the testing schedule:
For manual testing:
a. For 1 particular stage of the treatment process:
Multiple parameters that have the same frequency are to be combined into one test.
For parameters with different frequencies, separate tests should be created.
b. For 2 different stages of the treatment process:
The same parameter with the same frequency will result in two different tests (one per stage).
For IoT testing:
a. Multiple parameters captured by the same device will be a single test.
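The grouping rules for manual tests can be sketched as follows; the standard records and field names are illustrative, not the actual schema:

```python
from collections import defaultdict

# Illustrative standards for one plant; only the fields needed for grouping
# are shown (names and values are assumptions).
standards = [
    {"stage": "Drying Beds", "output_type": "Effluent", "method": "LAB",
     "parameter": "pH",  "frequency_days": 1},
    {"stage": "Drying Beds", "output_type": "Effluent", "method": "LAB",
     "parameter": "BOD", "frequency_days": 7},
    {"stage": "Drying Beds", "output_type": "Effluent", "method": "LAB",
     "parameter": "COD", "frequency_days": 7},
]

def group_manual_tests(standards):
    """One test per (stage, output type, frequency): parameters sharing all
    three are combined; a different stage or frequency means a separate test."""
    tests = defaultdict(list)
    for s in standards:
        if s["method"] == "LAB":
            key = (s["stage"], s["output_type"], s["frequency_days"])
            tests[key].append(s["parameter"])
    return dict(tests)
```

With the sample standards, pH (daily) forms one test while BOD and COD (both weekly) combine into a second.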
View Schedule:
The schedule will be available for view and action to the following users:
a. For Labs
ULB employee: [X] days prior to test date (Inbox). [X] here is configurable.
Treatment plant operator: [Y] days prior to test date (Inbox). [Y] here is configurable.
Treatment plant operator: [Z] days prior to test date (as a pending task in the UI). [Z] here is configurable.
Tests will continue to remain in the inbox/pending tasks list as long as test results are not submitted.
b. For IoT Testing:
No schedule will be displayed to users.
Generation of alerts in case reading is not received as per schedule.
Escalations:
For Manual Testing: Escalations are triggered in the following case via an in-app alert:
Record Test Results:
Test results may be recorded in 2 ways:
Manual Recording by user (lab testing).
Automated recording via integration with IoT device.
For Manual Recording (Lab Testing)-
Recording of test results will be done by the user in two cases:
Recording results against schedule.
Recording Results on demand: Adhoc tests conducted/instructed by governing authorities.
Recording Results Against Schedule-
The process flow will start with the treatment plant operator receiving a notification regarding an upcoming test. The following is the process flow for the manual recording of test results:
Recording Results: Workflow
The status of a scheduled test is auto set as “Scheduled”.
The sample has to be submitted to the lab (internal or external) for testing. The status of the same will be updated in the system by the user to “Pending Results”.
On recording test results, the status will be updated to “Submitted”.
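The status transitions above can be summarised in a minimal sketch; the status names follow this spec, while the actual workflow service is not shown:

```python
# Scheduled-test workflow: "Scheduled" -> "Pending Results" -> "Submitted".
TRANSITIONS = {
    "Scheduled": "Pending Results",   # sample submitted to the lab
    "Pending Results": "Submitted",   # test results recorded
}

def advance(status):
    """Return the next workflow status; "Submitted" is terminal."""
    if status not in TRANSITIONS:
        raise ValueError(f"no transition from status {status!r}")
    return TRANSITIONS[status]
```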
Recording Results On Demand-
The functionality to record results on demand will be made available to the user.
No workflow will be available in case of recording results on demand.
The user will be able to record results by filling a form.
Selection of a lab used for testing is optional in this case.
Status of submitted results will be set as ‘Submitted’.
Automated Recording Via Integration with IoT Device-
In case of integration with IoT devices, results against scheduled tests will be recorded automatically. Alerts in case of non-upload are covered in the alerts section below.
Anomaly Detection:
Anomalies will be generated in case of the following:
Lab results not as per the benchmark.
IoT device results are not as per the benchmark.
Lab results and device results do not match.
Device not working.
Lab results not as per the benchmark-
This is to be generated when the manual test results uploaded by the test uploader are not as per the benchmarks defined (adjusted for deviations, configurable at plant level).
IoT results not as per the benchmark-
This is to be generated when the IoT test results recorded via the integration are not as per the benchmarks defined for [X] days (adjusted for deviations defined while setting testing parameters).
Generation of alerts: Device results and lab results do not match-
In case the data that is recorded by the sensor does not match the data in the lab test result, an auto alert will be generated.
Generation of alert: No reading received from the device-
In case no reading is received from the sensor based on the schedule, an auto alert will be generated in the system.
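The four anomaly categories above can be sketched as a single check; the `tolerance` argument is a placeholder for the configurable deviation/match thresholds, which the spec leaves to setup:

```python
def check_alerts(lab, iot, benchmarks, tolerance=0.1):
    """Run the four alert checks for one stage's readings.

    `lab` and `iot` map parameter -> recorded value (an IoT value is absent
    when no reading was received); `benchmarks` maps parameter -> (low, high)."""
    alerts = []
    for param, (low, high) in benchmarks.items():
        lab_val, iot_val = lab.get(param), iot.get(param)
        if lab_val is not None and not low <= lab_val <= high:
            alerts.append(("LAB_OFF_BENCHMARK", param))
        if iot_val is None:
            alerts.append(("NO_DEVICE_READING", param))
        else:
            if not low <= iot_val <= high:
                alerts.append(("IOT_OFF_BENCHMARK", param))
            # Lab vs device mismatch beyond the (assumed) relative tolerance.
            if lab_val is not None and abs(lab_val - iot_val) > tolerance * max(abs(lab_val), 1):
                alerts.append(("LAB_IOT_MISMATCH", param))
    return alerts
```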
Treatment Process
Plants
Stages
Testing Parameters
Configure IoT Devices
Testing Schedule
Test Results
Alert: Lab and IoT Result Not As Per the Benchmark; Lab and Device Results Do Not Match
Alert: No Reading Received From the Device
Below is an illustration of the communication between monitoring sensor and DIGIT:
Monitoring sensor - Communicates the raw readings (in amperes, etc.) to the vendor’s system.
Sensor vendor - Converts the readings into the values against the parameters.
Sensor adaptor - Communicates the values through the standardised APIs designed by DIGIT.
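As an illustration of the hand-off, the payload below sketches what the sensor adaptor might forward; the field names and shape are assumptions, not the actual standardised DIGIT API contract:

```python
# Illustrative only: "deviceId", "timestamp" and "measurements" are assumed
# field names, not the actual DIGIT API schema.
def to_adaptor_payload(device_id, readings, timestamp):
    """Wrap the converted parameter values (produced on the vendor side)
    into a payload the sensor adaptor could forward to DIGIT."""
    return {
        "deviceId": device_id,
        "timestamp": timestamp,
        "measurements": [{"parameter": p, "value": v}
                         for p, v in readings.items()],
    }
```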
Sensors will be installed at one or multiple stages of the plant, and device details will be recorded in the system.
Sensor monitoring: The user will have a view of all the devices available at the plant along with their status using the sensor monitoring tab at the front end.
The TQM dashboard will be made available to the TRP, the ULB admin, and the state admin. Access to data in the dashboard will be based on the following roles:
The TRP will be able to view the dashboard only for the assigned plant.
The ULB admin will be able to view the dashboard for all the plants in the ULB.
The state admin will be able to view the dashboard for all the plants in the state.
Navigation:
On landing on the dashboard, the user can navigate across treatment process types, to view the dashboard specific to the treatment process type.
Filters:
Date range: Users should be able to filter based on the date range.
ULB: Users should be able to filter based on the ULB. For plant TRPs and ULB employees, the ULB is auto-selected to the one the plant or employee is tagged to. For a state user, all ULBs are available.
Plant: Users should be able to filter based on the plant. For a plant TRP, the plants that the TRP is tagged to are auto-selected. For ULB employees, the plants tagged to the ULB to which the employee belongs should be available in the dropdown. For a state user, all plants are available.
Other functionalities:
Share:
Users should be able to share a filtered dashboard over WhatsApp in an image format.
Users should be able to share filtered charts/tables over WhatsApp in an image format.
Download:
Users should be able to download the filtered dashboard in PDF and image formats.
Users should be able to download filtered charts/tables in PDF and image formats.
Metrics:
Overall KPIs- The dashboard will display the following KPIs:
Total incoming sludge: The sum of the total sludge that is disposed of at the plant for the selected time period.
Number of trips: A count of the total incoming vehicles at the treatment plant for the selected time period.
Overall quality: The number of tests where all parameters are as per the benchmarks as compared to the total number of test results recorded.
Compliance percentage: The percentage of tests where results have been recorded.
Total alerts: A count of the total alerts raised of the following types: Test results not as per the benchmark, no reading from the IoT device, and lab results and IoT results not matching.
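The two ratio KPIs can be sketched as below; for this sketch the denominator is taken as the total number of tests in the selected period, and the field names are assumptions:

```python
def overall_quality(tests):
    """Overall quality KPI: % of tests where all parameters met their
    benchmarks (result 'Pass')."""
    if not tests:
        return 0.0
    return 100.0 * sum(t["result"] == "Pass" for t in tests) / len(tests)

def compliance(tests):
    """Compliance KPI: % of tests whose results were recorded
    (status 'Submitted')."""
    if not tests:
        return 0.0
    return 100.0 * sum(t["status"] == "Submitted" for t in tests) / len(tests)
```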
Treatment Quality Overview:
KPIs:
Total plants: A count of the unique plants for the particular treatment process.
Count of plants who have passed the treatment quality as per the last recorded test.
Count of plants who have failed the treatment quality as per the last recorded test.
Treatment quality is said to have passed if all parameters for final output(s) of a treatment process are as per the benchmarks.
Treatment quality is said to have failed if one or more parameters for final output(s) of a treatment process is not as per the benchmarks.
Map:
A map view of the location of each plant will be displayed as part of the dashboard. Plants here will be colour coded, based on whether it has passed/failed the treatment quality. (Red = Failed, Green = passed).
Table:
A table will be available with plant-wise details of the test results (pass/fail) and compliance percentage, as per the last test result. The user will also be able to see the change in the compliance percentage as compared to the last month. A drilldown will be made available for a plant via this table. For TRP users, and for ULBs where only one plant is tagged for the process type, the drilled-down table is automatically visible. On drilldown, the following is viewable to the user:
Heading - Name of the plant.
Table displaying the following fields:
a. Stage, output type, value of parameters, and compliance percentage.
b. Button to view the trends for a particular stage.
Toggle to toggle between IoT readings and lab results.
Trends of parameter readings:
This chart will be available once the user clicks on the view trend button in the table.
The chart shows the trend for one parameter over time, and provides a view of the benchmark for comparison. A toggle is available to navigate between the parameters. Detailed metric definitions for the Treatment Quality Monitoring dashboard are viewable below:
Guided navigation:
If the user clicks on the help button, a walkthrough of the entire screen is shown, explaining the role of each button. The walkthrough has two buttons:
Skip: If the user wants to skip the walkthrough at any point.
Next: Proceeds to the next step of the walkthrough.
There are multiple business models when it comes to lab testing and O&M of a treatment plant. In the case of some FSTPs in Tamil Nadu, treatment plant operators are employees of a vendor with a build-and-manage model, and are directly responsible for resolving issues. Some FSTPs have an in-house lab to test the quality, whereas others send samples to external labs for testing. In certain cases, the ULB employee is responsible for testing and the resolution of issues. Keeping this in mind, it is imperative that we design roles in the system in a flexible way such that the product can be implemented across various models.
See noun verb mapping for roles below:
The user will land on the home page post login. The following actions can be performed by the user:
The plant name is visible on the top right hand corner of the screen.
A help button is available for the user to get a guided view of the page. This is available on every page.
Cards are viewable for the following modules:
a. Vehicle log module (record incoming vehicles)
b. Treatment quality module
c. View dashboard
Clicking on each of these cards will take the user to the Homepage for the specific module.
List of pending tasks: This will show the list of tests pending within the next [X] days. The next action item in the task workflow will be displayed beside a pending task for a user to take prompt action. A button for “View All Pending Tasks” will be displayed which will redirect the user to “All Pending Tasks”.
On clicking on the treatment quality card, the user is redirected to the TQM home page. The following actions can be performed by the user:
View and take action on upcoming tests using the inbox. The inbox will show a count of upcoming tests beside it.
View past test results: Past results from both lab and IoT devices will be displayed here.
View IoT readings: The user can access the record of IoT readings here.
Sensor Monitoring: The user can access a list of IoT devices along with their status here.
View dashboard: The user will be directed to the treatment quality dashboard.
View performance: This widget will show the performance of the plant in regards to treatment quality and will display the following KPIs:
a. Test compliance: The compliance percentage of the plant with regard to treatment quality, and its comparison to the state-level compliance percentage.
b. Last treatment quality result - Pass/fail and date of the test.
c. Count of alerts raised in the past 30 days.
d. Distribution of alerts based on the alert category.
Go back to the Landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
View List of Upcoming Tests
On clicking on inbox, the user is redirected to the list of upcoming tests. This will show only a list of lab tests. The user can perform the following tasks:
Total count of upcoming tests are displayed beside the inbox in brackets.
View list of upcoming tests. The following will be the fields displayed:
a. Test ID.
b. Treatment process (in case there is only 1 treatment process for the plant, this field will not be displayed).
c. Stage: This will display the process stage where the sample is to be collected from.
d. Output type: Biosolids/effluents.
e. Pending date: This is the test date as per schedule.
f. Status: status of the test.
g. SLA: Show difference between test due date and today.
An action item available based on the next status in the workflow will be displayed:
a. For test results in the scheduled stage, update status will be displayed.
b. For tests in the pending results stage, update results will be displayed.
Filter tests: On clicking on filter, a pop-up will be displayed:
a. The following filters are available:
Treatment process: (In case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing the values for treatment processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
Output type: This will be a dropdown showing values for output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
Status: This will be a dropdown showing values for the status in the treatment quality workflow. The selected status is displayed here on selection. If not, the field is left blank.
Date range: Selection of date range (calendar view): Selected date range is displayed here on selection. If not, the field is left blank.
b. On selecting values for the filters above, a user can click on filter to filter the inbox.
c. To clear filters, a user can click on clear all.
d. To close the pop-up, a user can click on the cross on the top right hand corner of the screen.
Sort: On clicking on sort, a pop-up will be displayed:
a. Tests can be sorted by the pending date:
Date (Latest first)
Date (Latest last)
b. On selecting values for sort above, the user can click on sort to sort the inbox.
c. To clear sort, a user can click on clear all.
d. To close the pop-up, a user can click on the cross on the top right hand corner of the screen.
Go back to the landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
View Test Details
Test Details can be viewed by the user in 2 ways:
Via the pending tasks.
Via inbox.
View Test Details Via Pending Tasks
The list of pending tasks can be accessed via the landing page for TQM. This will show the list of tests pending within the next [X] days.
The next action item in the task workflow will be displayed as a button beside a pending task for a user to take prompt action. On clicking on the button, the user will be redirected to the test details page.
View Test Details Via Inbox
A list of tests can be accessed in the inbox. An action item based on the next status in the workflow will be displayed:
For test results in the scheduled stage, the update status will be displayed.
For tests in the pending results stage, the update results will be displayed.
On clicking on the action item, the user will be redirected to the test details page.
The test details page will consist of 2 cards:
The first card will display the following fields:
Test ID
Treatment Process
Stage
Output Type
Pending Date
Status
Parameters to be tested along with their unit of measurement
SLA (This will be displayed in red/green based on the SLA: if today > pending date, red; if today < pending date, green).
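The red/green SLA rule can be sketched as:

```python
from datetime import date

def sla_colour(pending_date, today=None):
    """Red if the test is overdue (today > pending date), green otherwise."""
    today = today or date.today()
    return "red" if today > pending_date else "green"
```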
The second card will be based on the test status:
For tests in status ‘Scheduled’, the user will be asked to select a lab.
For tests in status “Pending Results”, the user will be asked to add test results.
The user can go back using the back button - The redirection will be based on the page the user has accessed the test details page from. A help button is available for the user to get a guided view of the page. This is available on every page.
Update Tests
Tests can be updated by the user from the test details page. The test details page will display the next action item for the user based on the test workflow. For tests with the workflow status ‘Scheduled’, the user will be prompted to confirm whether the sample has been submitted to the lab for testing.
The user can perform the following actions:
Select a lab from a dropdown list configured in the MDMS.
Update the status of the test. The button will be deactivated if a lab is not selected, and will only be activated once a selection is made. Once the user clicks on update status, he/she is redirected back to the page from which the test details were accessed, a snack bar confirms the status update, and the action item button shows the updated next step.
In case an update of status fails, the user will remain on the same page and a failure message will be displayed to the user.
For tests with the workflow status “Pending Results”, the user will be prompted to fill in the test results.
The user can perform the following actions:
Update parameter readings (mandatory fields) The following validations will be applied:
a. Only numerical values will be accepted here. In case of non-numerical values, the following error message is displayed: “Only numeric values allowed. Please input in the required format”.
Attach documents (non-mandatory): The following validations will be applied:
a. Only files in the following formats will be supported: .png, .jpg, .pdf. In case a file of an unsupported format is selected, the following error will be displayed: “The file type is not supported. Please upload in the following formats: .pdf, .png, .jpg”.
b. A file size of maximum [X] MB is allowed. In case the file size is larger than the permitted value, the following error will be displayed: “The file size is too large. Please upload a file below [X] MB”.
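The validations above can be sketched as follows; `MAX_FILE_MB` is a placeholder, since the spec leaves the size limit (“X mb”) configurable:

```python
import os

ALLOWED_EXTENSIONS = {".png", ".jpg", ".pdf"}
MAX_FILE_MB = 5  # placeholder: the actual limit is left configurable in the spec

def validate_reading(value):
    """Parameter readings must be numeric; return an error message or None."""
    try:
        float(value)
        return None
    except (TypeError, ValueError):
        return "Only numeric values allowed. Please input in the required format"

def validate_attachment(filename, size_mb):
    """Attachments must be .png/.jpg/.pdf and under the size limit."""
    if os.path.splitext(filename)[1].lower() not in ALLOWED_EXTENSIONS:
        return ("The file type is not supported. "
                "Please upload in the following formats: .pdf, .png, .jpg")
    if size_mb > MAX_FILE_MB:
        return f"The file size is too large. Please upload a file below {MAX_FILE_MB} MB"
    return None
```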
Submit test results by clicking on the ‘Submit’ button. The button will be deactivated until all mandatory fields are filled, and will only be activated once they are. On clicking the submit button, a pop-up will be displayed to the user to confirm submission.
The following actions will be made available to the user:
Confirm submission by clicking on the ‘Submit’ button.
Go back to the test details page by clicking on the “Go back” button.
On clicking the submit button and failure to submit test results, the user will remain on the same page and a failure message will be displayed.
On clicking the submit button and successful submission of the test results, the user will be redirected to the summary page and a snack bar will confirm the submission.
At this stage, the user will be displayed the summary of test results and whether it has passed/failed based on a comparison between the values entered by the user and the benchmarks. In case all values are as per the benchmarks, the test results will be displayed as ‘Pass’. All values will be shown in green and the user receives feedback that all results are as per benchmarks. The user can go back to the home page by clicking on the ‘Back’ button.
In case one or more values are not as per the benchmarks, the test results will be displayed as ‘Fail’. All values as per benchmarks will be shown in green. Values not as per the benchmarks are shown in red. The user is provided with information that the test results are not as per benchmark. The user can go back to the Home page by clicking on the ‘Back’ button.
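The pass/fail summary logic above can be sketched as:

```python
def summarise_results(values, benchmarks):
    """Colour each recorded value against its benchmark range and derive
    the overall Pass/Fail result shown on the summary page."""
    colours = {}
    for param, value in values.items():
        low, high = benchmarks[param]
        colours[param] = "green" if low <= value <= high else "red"
    result = "Pass" if all(c == "green" for c in colours.values()) else "Fail"
    return colours, result
```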
View Past Test Results
Past test results (both IoT and Lab) can be viewed via the TQM landing page and by clicking on past tests.
On clicking on past tests, the user is redirected to the list of past tests.
The user can perform the following tasks:
View the list of past tests. The following will be the fields displayed:
a. Test ID.
b. Treatment process (in case there is only 1 treatment process for the plant, this field will not be displayed).
c. Stage: This will display the process stage where the sample is to be collected from.
d. Output type: Biosolids/effluents
e. Pending date: This is the test date as per schedule.
f. Test result: Pass/fail.
2. View test details: The user can view test details by clicking on “View Results” button on each card.
Filter tests: On clicking on Filter, a pop-up will be displayed. The following filters are available:
i. Treatment process: (in case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing values for Treatment Processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
ii. Output type: This will be a dropdown showing values for the output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
iii. Test type: This will be a dropdown showing values for the test type (IoT/Lab). The selected test type is displayed here on selection. If not, the field is left blank.
iv. Date range: The selection of date range (calendar view) - The selected date range is displayed here on selection. If not, the field is left blank.
On selecting values for filters above, the user can click on filter to filter the inbox. To clear filters, the user can click on clear all. To close the pop-up, a user can click on the cross on the top right hand corner of the screen. On selection of the filter, the selected filter is displayed on the screen. On clicking the cross button near the displayed filter, the filter is removed.
Sort: On clicking on sort, a pop-up will be displayed:
a. Tests can be sorted by the pending date:
Date (Latest first)
Date (Latest last)
b. On selecting values for sort above, the user can click on sort to sort the inbox.
c. To clear sort, the user can click on clear all.
d. To close the pop-up, the user can click on the cross on the top right hand corner of the screen.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
The user can download the list of tests, filtered by selection in Excel and PDF format.
Go back to the Landing page using the back button.
A help button is available for the user to get a guided view of the page. This is available on every page.
On clicking the “View Results” button, the user will be redirected to the test summary page.
The page will have 2 cards:
The first card will display the following fields:
Test ID.
Treatment Process.
Stage.
Output Type.
Test Type.
Lab Name/Device ID: This will show Lab Name/Device ID based on the Test type.
Test submitted on.
Test Results: Pass/Fail.
The second card will display the following fields:
Parameters, their unit of measurement and the values of the parameters recorded. The values will be red/green based on whether they are as per benchmarks or not.
The user can go back to the list of tests by clicking on the ‘Back’ button, both on the top and bottom of the page.
View IoT Readings
IoT readings can be viewed via the TQM landing page and clicking on view IoT readings.
On clicking on View IoT readings, the user is redirected to the view tests page, filtered on test type: IoT.
The functionality of the page remains the same as the “View Past Tests” page.
Sensor Monitoring
Sensor monitoring can be accessed by clicking on the sensor monitoring link on the TQM landing page.
On clicking on sensor monitoring, the list of IoT devices are displayed.
The following details are displayed on the page:
Total number of IoT devices is displayed beside the page heading in brackets.
A card is available for each device. The following details will be displayed:
- Device ID.
- Treatment Process.
- Stage.
- Output Type.
- Last Calibrated Date.
- Device Status.
- Verification Status.
- Last Verification Date.
- Parameters that the device monitors.
The user can perform the following actions:
Filter devices: On clicking on filter, a pop-up will be displayed.
a. The following filters are available:
i. Treatment process: (in case there is only 1 treatment process for the plant, this field will not be displayed). This will be a dropdown showing values for the treatment processes configured for the plant. The selected treatment process is displayed here on selection. If not, the field is left blank.
ii. Output type: This will be a dropdown showing values for the output types configured for the plant. The selected output type is displayed here on selection. If not, the field is left blank.
iii. Device status: This will be a radio button showing active/inactive
iv. Parameters: This will be a multi-select displaying all parameters configured on the backend.
b. On selecting values for filters above, the user can click on filter to filter the inbox.
c. To clear filters, the user can click on clear all.
d. To close the pop up, the user can click on the cross on the top right hand corner of the screen.
e. On selection of the filter, the selected filter is displayed on the screen. On clicking the cross button near the displayed filter, the filter is removed.
Search: On clicking on search, a pop-up will be displayed.
a. The user can search a device by device ID. Partial search is to be enabled.
View Dashboard
Dashboards can be accessed by clicking on the “View Dashboards” link on the TQM landing page.
Navigation:
On landing on the dashboard, the user can navigate across the treatment process types to view the dashboard specific to the treatment process type.
Filter:
Date range: Users should be able to filter the dashboard based on the date range.
Other functionalities:
Share:
Users should be able to share a filtered dashboard over WhatsApp in an image format.
Users should be able to share filtered charts/tables over WhatsApp in an image format.
Download:
Users should be able to download the filtered dashboard in PDF and image formats.
Users should be able to download filtered charts/tables in PDF and image formats.
Metrics:
Total incoming sludge: The sum of the total sludge that is disposed of at the plant for the selected time period.
Number of trips: A count of the total incoming vehicles at the treatment plant for the selected time period.
Overall quality: The number of tests where all parameters are as per the benchmarks as compared to the total number of test results recorded.
Compliance percentage: The percentage of tests where results have been recorded.
Total alerts: A count of the total alerts raised of the following types: Test results not as per the benchmark, no reading from the IoT device, and lab results and IoT results not matching.
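The metric definitions above can be sketched in code. This is a minimal illustration only; the record fields used here (trip_id, volume_kl, all_params_ok, status) are assumed names, not the platform's actual schema:

```python
# Illustrative sketch of the dashboard KPI formulas above.
# Field names (trip_id, volume_kl, all_params_ok, status) are assumptions.

def total_incoming_sludge(trips):
    # Sum of sludge volume across registered and unregistered vehicle trips.
    return sum(t["volume_kl"] for t in trips)

def number_of_trips(trips):
    # Count of distinct trip IDs disposed at the treatment plant.
    return len({t["trip_id"] for t in trips})

def overall_quality(tests):
    # % of tests where all parameters meet benchmarks.
    if not tests:
        return 0.0
    passed = sum(1 for t in tests if t["all_params_ok"])
    return 100.0 * passed / len(tests)

def compliance_percentage(tests, scheduled_count):
    # % of scheduled tests whose results were recorded (status 'Submitted').
    if not scheduled_count:
        return 0.0
    submitted = sum(1 for t in tests if t["status"] == "Submitted")
    return 100.0 * submitted / scheduled_count
```

All four functions take plain lists of dictionaries, so they can be exercised against any filtered date range of records.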
Treatment Quality Overview:
KPIs:
Total tests - A count of the total tests for the filtered date range.
A count of tests that have passed the treatment quality.
A count of tests that have failed the treatment quality.
Table:
Heading - Name of the plant.
Table displaying the following fields:
a. Stage, output type, value of parameters, and compliance percentage.
b. Button to view the trends for a particular stage.
A toggle to switch between IoT readings and lab results.
Trends of parameter readings:
This chart will be available once the user clicks on the view trend button in the table above.
The chart shows the trend for one parameter over time and provides a view of the benchmark for comparison. A toggle is available to navigate between the parameters. Detailed metric definitions for the Treatment Quality Monitoring dashboard are provided below:
A card for Treatment Quality Monitoring will be made available on the landing page of the employee.
The treatment quality card will contain the following:
a. An overview of the total pending tests and how many are nearing SLA.
b. View upcoming tests using the inbox. The inbox link will show a count of upcoming tests beside it in brackets.
c. View past test results: Past results from both lab and IoT devices will be displayed here.
d. View IoT readings: The user can access the record of IoT readings here.
e. Sensor monitoring: The user can access a list of IoT devices along with their status here.
f. View dashboard: The user will be directed to the treatment quality dashboard.
Clicking on each of these links will take the user to the specific page.
Notifications: This will show the list of alerts regarding TQM. Currently, this will display the tests that have crossed SLA for greater than 7 days. The user can view the details of the test by clicking on the “View Details” button. The user can dismiss the notification by clicking on the cross button.
Rest of the functionality will remain the same as the current ULB employee landing page.
View List of Upcoming Tests
On clicking on Inbox, the user is redirected to the list of upcoming tests. This will show only a list of lab tests. The user can perform the following tasks:
The total count of upcoming tests is displayed beside the inbox in brackets.
View a list of upcoming tests. The list of upcoming tests will be sorted by the pending date, where the test with the highest SLA is displayed first. The following fields will be displayed:
a. Test ID
b. Plant Name
c. Treatment Process
d. Pending Date: This is the test date as per schedule
e. Status: Status of the test
f. SLA: Shows the difference between the test due date and today. This will be displayed in red if the test due date < today, and in green if the test due date >= today.
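The SLA display rule above can be sketched as follows, assuming the SLA is the day difference between the due date and today:

```python
from datetime import date

def sla_days(due_date: date, today: date) -> int:
    # Positive while the test is still within schedule, negative once overdue.
    return (due_date - today).days

def sla_colour(due_date: date, today: date) -> str:
    # Red once the due date has passed, green otherwise.
    return "red" if due_date < today else "green"
```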
The user can view test details by clicking on the test ID.
Filter tests: Filters are displayed on the left hand panel of the screen. The following filters are available:
i. Treatment process: This will be multi-select showing values for the treatment processes configured for the ULB. The selected treatment process will be displayed as a tick on the multi-select box. If not, it is left blank.
ii. Stages: This will be a dropdown showing values for stages configured for the plant. The selected stage is displayed here on selection. If not, the field is left blank.
iii. Status: This will be a multi-select showing values for the status in the treatment quality workflow.
On selecting values for filters above, the user can click on filter to filter the inbox. To clear filters, the user can click on the refresh icon on the top right of the filter panel.
Sort: Tests can be sorted by the pending date by clicking on the date column.
Search:
a. Tests can be searched using the following:
i. Test ID.
ii. Plant Name.
b. Part search to be enabled for both.
c. Users can fill either test ID or plant or both and click on the search button.
d. Users can clear search by clicking on the clear search link.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
Redirecting to other links in the TQM module: Users can redirect to other pages in the TQM module via the links provided on the top left of the page. The following links will be displayed:
a. View Past Results
b. View IoT Results
c. Sensor Monitoring
d. View Dashboard
View Test Details
A test details page can be accessed by clicking on the test ID in the inbox. The test details page will consist of the following fields:
The following information will be displayed. In case the information on any field is not available such as the lab name/value against parameters based on the status of the test, the value against the fields will be displayed as “To be Updated”
- Test ID
- Plant Name
- Treatment Process
- Stage
- Output Type
- Test Type
- Test Scheduled on
- Status
- Lab Name
- Sample submitted on
- Test results submitted on
- SLA (This will be displayed in red/green based on the SLA: for open tests, red if today > pending date and green if today < pending date. For closed tests, the SLA value will be displayed.)
- Table containing the following details:
i. S.No
ii. Parameter
iii. UoM
iv. Benchmark
v. Value Recorded - The value will be displayed in red/green based on comparison to the benchmark
vi. Overall Test results - Pass/Fail
- Attached documents, if any. The user should be able to view the document by clicking on the document icon. No icon will be present if documents are not attached.
- Test Timeline
The user can go back using the breadcrumbs of the page.
The user can download the test report by clicking on the 'Download' button.
View Past Test Results
Past test results (both IoT and lab) can be viewed by clicking on past tests on the TQM landing page.
On clicking on past test results, the user is redirected to the list of past tests.
The user can perform the following tasks:
View the list of past tests. The results will be sorted on the test date. The following will be the fields displayed:
a. Test ID
b. Plant
c. Treatment Process (in case there is only 1 treatment process for the plant, this field will not be displayed).
d. Test Type
e. Test Date: This is the date the test results are updated
f. Test Result: Pass/Fail
View test details: The user can view test details by clicking on the “Test ID” link in each row.
Search tests: The user can search based on the following:
i. Test ID: Input text field; part search should be enabled.
ii. Plant: Dropdown of the list of plants in the ULB.
iii. Treatment process: Dropdown of the list of treatment processes in the ULB.
iv. Test type: This will be a dropdown showing values for the test type (IoT/Lab). The selected test type is displayed here on selection. If not, the field is left blank.
v. Date range: Selection of date range (calendar view): The selected date range is displayed here on selection. If not, the field is left blank.
On selecting values for search above, the user can click on search to filter the inbox. To clear search and view all past tests, the user can click on “Clear Search”.
Sort: Tests can be sorted by the test date by clicking on the date column.
In case filters/sort/search is applied and the user navigates to the test details page, on going back, the values of the filters/sort/search should remain the same.
Download test results in Excel and PDF formats using the download button.
The user can go back using the breadcrumbs on the top of the page. In case the user has navigated to the test details page from the past test results list, on clicking back, the user should be redirected to the test details page.
On clicking on any test ID button, the user will be redirected to the test details page (same as redirection from the inbox).
View IoT Readings
IoT readings can be viewed by clicking on “View IoT Reading” on the TQM landing page.
On clicking on IoT readings, the user is redirected to the list of past tests, with the search on test type selected as IoT and the results filtered for IoT readings only. All other functionality will remain the same.
Sensor Monitoring
The list of devices can be viewed by clicking on ‘Sensor Monitoring’ on the TQM landing page.
On clicking on sensor monitoring, the user is redirected to the list of devices.
The user can perform the following:
View total devices: The total number of IoT devices is displayed beside the page heading in brackets.
A row is available for each device. The following details will be displayed:
- Device ID
- Plant
- Treatment Process
- Stage
- Output Type
- Device Status
- Parameters: One or multiple parameters that the device is monitoring.
The user can perform the following actions:
a. Search devices: On clicking on search, a pop-up will be displayed. The following search fields are available:
Device ID: Part search should be available here.
Plant: Dropdown based on plants configured in the MDMS.
Treatment process: Dropdown based on the treatment process type.
Stage: Dropdown based on the stage of the selected treatment process.
Output type: This will be a dropdown showing values for the output types configured for the plant.
Device status: Dropdown containing Active/Inactive as options.
b. On selecting values for filters above, the user can click on Search to filter the inbox.
c. To clear search, the user can click on clear all.
Record Test Result
Since actors such as the PCB might conduct ad hoc tests, a provision will be made for the user to record test results without a schedule. A user can record test results by clicking on the “Add Test Result” link on the card.
Clicking on the “Add Test Result” button will redirect the user to the “Add Test Result” page.
The following fields need to be entered by the user:
Plant Name: This is a dropdown based on the list of plants available in the system. For a state-level user, this should display all plants in the state. For a ULB user, it should display only the plants tagged to the ULB.
Treatment Process: This is a dropdown based on the list of treatment processes in the plant selected.
Treatment Stage: This is a dropdown based on the list of stages in the treatment process selected.
Output Type: This is a dropdown based on the output types available in the stage.
Values against parameters: At least 1 parameter needs to be filled for the submit button to be enabled. If no parameter is filled and the user clicks on the submit button, an error message is displayed as a snack bar.
Attachments, if any.
Once the user clicks on the Submit button, the test results page is displayed.
This is the same as the “View Test Results” page with the following changes:
Test Type will be displayed as Lab.
The status, lab name, and SLA fields are not displayed.
Workflow will not be displayed.
The user can go back to the “Add Test Results” page via the breadcrumbs.
The TQM Dashboard will be made available to both the ULB employee and the state employee, and can be accessed by clicking on the dashboard link on the landing page.
On clicking on the dashboard, the user is directed to the dashboard view.
The access to data in the dashboard will be based on the following roles:
ULB admin will be able to view the dashboard for all plants in the ULB.
A state admin will be able to view the dashboard for all plants in the state.
Navigation:
On landing on the dashboard, the user can navigate across the treatment process types to view the dashboard specific to the treatment process type.
Filters:
Date range: Users should be able to filter based on the date range.
ULB: Users should be able to filter based on the ULB. For a ULB employee, the ULB is auto-selected to the ULB the plant and employee is tagged to. For a state user, all ULBs are available.
Plant: Users should be able to filter based on the plant. For ULB employees, plants tagged to the ULB to which the employee belongs should be available in the dropdown. For state users, all plants are available.
Other functionalities:
Share:
- Users should be able to share a filtered dashboard over WhatsApp in an image format.
- Users should be able to share filtered charts/tables over WhatsApp in an image format.
Download:
- Users should be able to download the filtered dashboard in PDF and image formats.
- Users should be able to download the filtered charts/tables in PDF and image formats.
Metrics
Overall KPIs:
The dashboard will display the following KPIs:
Total incoming sludge: The sum of the total sludge that is disposed of at the plant for the selected time period.
Number of trips: A count of total incoming vehicles at the treatment plant for the selected time period.
Overall quality: The number of tests where all parameters are as per benchmarks as compared to the total number of test results recorded.
Compliance percentage: The percentage of tests where results have been recorded
Total alerts: A count of the total alerts raised of the following types: test results not as per the benchmark, no reading from the IoT device, and lab results and IoT results not matching.
Treatment quality overview:
KPIs:
Total plants - A count of the unique plants for the particular treatment process.
Count of plants that have passed the treatment quality as per the last recorded test.
Count of plants that have failed the treatment quality as per the last recorded test.
Treatment quality is said to have passed if all parameters for the final output(s) of a treatment process are as per benchmarks, and to have failed if 1 or more parameters for the final output(s) are not as per benchmarks.
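The pass/fail rule above can be sketched as follows; the (value, low, high) benchmark representation is an assumption for illustration, not the platform's data model:

```python
def plant_quality_passed(last_test_params):
    # A plant passes if every parameter of the final output(s) in its last
    # recorded test is within benchmark; one or more failures means 'failed'.
    # Each entry is assumed to be (value, low, high), with either bound
    # optional (None means no bound on that side).
    for value, low, high in last_test_params:
        if low is not None and value < low:
            return False
        if high is not None and value > high:
            return False
    return True
```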
Map:
A map view of the location of each plant will be displayed as part of the dashboard. Plants here will be colour-coded based on whether they have passed/failed the treatment quality (red = failed, green = passed).
Table:
A table will display plant-wise details of test results (pass/fail) and the compliance percentage, as per the last test result. The user will also be able to see the change in compliance percentage compared to the last month. A drilldown will be made available for a plant via this table. For TRP users and ULBs where only 1 plant is tagged for the process type, the drilled-down table is automatically visible.
On drilldown, the following is viewable to the user:
Heading - Name of plant.
Table displaying the following fields:
a. Stage, output type, value of parameters, and compliance percentage.
b. Button to view the trends for a particular stage.
A toggle to switch between IoT readings and lab results. The selected test type will appear highlighted.
If there are multiple process flows, then the user can switch between process flows by using the buttons. The selected process flow will appear highlighted.
Trends of parameter readings:
This chart will be available once the user clicks on the view trend button in the table above.
The chart shows the trend for one parameter over time and provides a view of the benchmark for comparison. A toggle is available to navigate between parameters.
The following is out of scope:
Testing for parameters beyond those defined for a plant. If testing needs to be recorded for an additional parameter, these need to be configured for a plant first before testing can be performed.
Verification of test result documents uploaded by the user: While the system will mandate the user to upload the test result document, verification of the document and ensuring correctness of the information recorded is not in scope.
There is some initial uptake in the market for mobile treatment units, which bring treatment onsite to the citizen. Here treatment is done within the mobile treatment unit. We are unaware of how this technology works in terms of recording and storing treatment quality and hence this is out of scope.
Editing test results once recorded by the user in case of manual testing.
Editing IoT device readings.
Recording of the data trail in cases when testing parameters have been added/deleted/updated.
Testing parameters may be changed at any point. However, test schedules may already be generated and testing will be done based on old parameters defined when the test schedule is generated.
No tracking of reason if test results are not recorded as per schedule.
In case a schedule for a test is available, but the user fills the test results directly via the “Record Result” page, this will not be adjusted.
Escalation in case of non-compliance to the testing schedule will be done once per test. If test results are still not recorded, no further action will be taken.
UI for defining testing requirements and adding/disabling/editing testing parameters.
As per our field study, one plant has only one treatment process per treatment process type. The dashboard does not cover calculations for a plant that may have 2 processes for the same treatment process type. For example, for faecal sludge, the plant will either run an aerobic or an anaerobic process, not 2 process flows (one aerobic and one anaerobic).
Activating/deactivating IoT devices from the frontend.
Effluent Parameters
While each state is free to monitor the parameters applicable in their context, the platform mandates MOEFCC’s October 13, 2017 Notification as the minimum requirement for effluent quality monitoring.
*Metro Cities are Mumbai, Delhi, Kolkata, Chennai, Bengaluru, Hyderabad, Ahmedabad, and Pune.
During implementation, these will be set up by the system integrators as per the state policy.
Biosolids Parameters
Regulations for biosolids are unclear - we currently don't have a definitive GoI-notified standard for faecal sludge-derived biosolids. The Advisory and Primer on FSSM (along with the CPHEEO Manual) recommended the US EPA Class A Biosolids criteria (looking at E Coli/F Coli, Salmonella, Helminth Eggs). The National Policy on FSSM, issued after all three, mentions the SWM Rules, 2016, as the guiding document for processed sludge quality.
Drawing from Quality in Faecal Sludge Management (Benchmarks, Standards, and Specifications) by NFSSMA, the Plain English Guide to the Part 503 Biosolids Rule, and the SWM Rules, 2016, the platform mandates the minimum parameters listed below, while keeping the flexibility for states to choose or add as per their context.
Medical Waste: Brazil
The following are the acceptable parameters for medical waste according to Brazilian Standard NBR:
Platform: Master: Overall Scope
| Term | Definition |
|---|---|
| Plant | A facility that takes raw materials as inputs and converts them into a set of expected outputs through a series of set-up processes, with the use of equipment operated by people. |
| Treatment Process | A sequential series of steps required to translate input to output. |
| Stages | Each step within the treatment process. Each stage may have one or many input quality parameters and one or many output quality parameters. |
| Assets | Physical infrastructure required to execute a stage in a treatment process. |
| Parameters | Criteria used to measure input and output for a stage. Each parameter will have a unit of measurement. |
| Frequency | The duration of time between two subsequent tests. |
| Benchmarks | Acceptable value ranges for each parameter. |
| Testing Standards | A combination of a parameter, the acceptable benchmark of the parameter, and its frequency of testing. For example, pH >= 7 tested weekly. |
| Goal | Category | Objective | How will it be measured via the product | How will we know TQM is successful |
|---|---|---|---|---|
| Zero deaths, diseases, and environmental contamination resulting from poor sanitation | Primary | To ensure treated waste is as per the quality standards | The percentage of plants with output quality as per benchmarks | Increase in the percentage of plants with output quality as per benchmarks over time |
| Zero deaths, diseases, and environmental contamination resulting from poor sanitation | Secondary | To ensure treated waste is tested regularly for quick identification of issues | The percentage compliance against the testing schedule | Increase in the percentage compliance against the testing schedule over time |
1. The TQM module should be configurable for multiple waste streams such as faecal sludge, solid waste, medical waste, waste water, etc.
2. The TQM module should be usable by itself, without the need to set up/deploy the rest of the DIGIT FSM modules.
3. In the design of DIGIT Sanitation v1.3, treatment plant operators are employees of a ULB. Based on our learnings, treatment plants can be managed by:
   - Self-help groups (as in the case of Orissa and Trichy, Tamil Nadu). In such a case, a member of the SHG performs the role of a treatment plant operator.
   - Direct employees of the ULB (as in the case of some plants in Tamil Nadu).
   - Vendors to whom plants were outsourced on a build-and-manage model.
   This requires the capability for vendors and individuals to manage operations and to access and use the system.
4. Currently, in the product, there is a 1-to-1 mapping between ULBs and plants. The TQM module should allow for tagging as per the following: many plants - 1 boundary; 1 boundary - 1 plant; multiple boundaries - 1 plant.
5. There are multiple operating models when it comes to lab testing and O&M of a treatment plant. Roles in the system need to be defined to support the various operating models.
Case 1:
The treatment quality workflow defined in the PRD has two steps:
Submission of sample to a lab
Uploading test results
In the case of Orissa, where labs for testing the quality of treated waste are in-house, both these workflow steps are performed by the same person. However, in Tamil Nadu, testing is outsourced to a central lab for 3 or 4 plants. In this case, the functionality to record test results will be provided to the labs.
| Component | Description | Functionality |
|---|---|---|
| Schedule of Tests | This component will be used by treatment plant operators and ULB employees to see the schedule of tests. | View the schedule of lab tests and track compliance. Track compliance of IoT test results and cases of failures. |
| Recording Test Results | This component will be used by treatment plant operators and ULB employees to upload results manually and track IoT readings. | Create digital records of quality test results. Alerts in the following cases: IoT device not working; lab results do not match IoT results. |
| Anomaly Detection | This component will be used by treatment plant operators and ULB employees to interpret test results. | Identify in real time/near real time when the results of a particular test are not as per benchmarks. Alerts when results are not up to benchmark. |
| Dashboards | This module will give stakeholders insights and information regarding the operations of the treatment plant. Users can use this to drill down and identify plants and processes where compliance to testing and/or test results is not up to benchmarks. Dashboards will also help users see trends over time to spot patterns and identify long-term problem areas. | Dashboard to analyse trends in treatment quality and compliance with the treatment schedule; drill-down will be made available from state to ULB and to plant level. Dashboard to analyse patterns in issues; drill-down will be made available from state to ULB and to plant level. |
| Category | Challenge | How will the product address it? |
|---|---|---|
| Informational | No knowledge of treatment and disposal processes | Awareness of when waste has to be tested and for which parameters. |
| Informational | Changing standards for treated waste | Operationalising changing standards of waste by configuring additional parameters/frequency for testing. |
| Operational | No record keeping of waste quality | Record and maintain digital records of treatment quality. |
| Operational | No mechanism to monitor treatment plant operators | Track and measure compliance against the testing schedule. |
| Operational | No clear definition of responsibilities for the ULB | Set up, via roles, who is responsible for quality testing and issue resolution. |
| User | Value Bundle |
|---|---|
| State administration/Pollution Control Board/stakeholders (DDWS, WATSAN boards, WATCO, etc.) | Ease of monitoring treatment quality and treatment quality trends, and detecting variations across plants. Ease of monitoring and ensuring compliance to the quality treatment schedule across plants. Ability to change testing parameters and frequency basis test results, across plants or for a particular plant. |
| Urban local body | Ease of monitoring treatment quality and treatment quality trends, and detecting variations. Ease of monitoring and ensuring compliance to the quality treatment schedule. Centralised view of issues across plants. Ability to change testing and maintenance frequency basis test results. |
| Treatment plant operators/vendors managing treatment plants | Timely reminders to perform testing. Ease of sharing records with the ULB/stakeholders. |
| Role | Escalation |
|---|---|
| Governing Body + ULB employee | Test results pending beyond [X] days as per the test schedule |
| Test Result Status | Roles | Action | Next Status |
|---|---|---|---|
| Scheduled | FSTPO, ULB employee | Submit sample for testing | Pending results |
| Pending results | FSTPO, ULB employee | Update results | Submitted |
- Date to be matched on: Sample collection date. If an IoT result is not available for the sample collection date, the closest date after it for which IoT data is available will be considered.
- Deviation allowed: X%
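The matching rule above can be sketched as follows. The dictionary-of-readings representation and function names are illustrative assumptions, and the allowed deviation is kept as a parameter since the X% value is configured per deployment:

```python
from datetime import date

def match_iot_reading(sample_date, iot_readings):
    # iot_readings: {date: value}. Use the sample collection date if present,
    # else the closest date after it for which IoT data is available.
    if sample_date in iot_readings:
        return iot_readings[sample_date]
    later = [d for d in iot_readings if d > sample_date]
    return iot_readings[min(later)] if later else None

def within_deviation(lab_value, iot_value, max_deviation_pct):
    # True if the IoT value deviates from the lab value by no more than
    # the configured percentage (the X% above).
    return abs(iot_value - lab_value) <= lab_value * max_deviation_pct / 100.0
```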
| Attribute | Type | Mandatory | Comments | Validation Required? |
|---|---|---|---|---|
| Treatment Process ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a process flow | N, this value should be system generated |
| Process Name | Text | Y | This is the commonly used identifier for the process flow | Max characters - 256 |
| Status | Array | Y | Status of the process flow | Active/Inactive, single select |
| Treatment Process Type | Array | Y | The dropdown will be auto-populated basis the list of waste maintained in the MDMS | Single select |
| Treatment Process Subtype | Array | Y | The dropdown will be auto-populated basis the list of waste maintained in the MDMS | Single select |
| Attribute | Type | Mandatory | Comments | Validation Required? |
|---|---|---|---|---|
| Plant ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a plant | Auto-generated |
| Plant Name | Text | Y | This is the commonly used identifier for the plant | Maximum characters - 128 |
| Plant Type | Array | Y | | Single select only: faecal sludge, solid waste, co-treatment |
| Tenant ID | Text | Y | | |
| Status | Array | Y | Status of the plant | Active/Inactive, single select |
| Geolocation | Lat, Long | Y | Capture the exact latitude-longitude | |
| Attribute | Type | Mandatory | Comments | Validation Required? |
|---|---|---|---|---|
| Stage ID | Numeric | Y | Auto-generated numeric value which will act as a unique identifier for a stage | Auto-generated |
| Stage Name | Text | Y | This is the commonly used identifier for the stage | Maximum characters - 128; minimum characters - NA |
| Status | Boolean | Y | Status of the stage | Active/Inactive, single select |
| Input Quality Measurement Required | Boolean | Y | This selection will allow the user to set up whether the input quality for the particular input type needs to be monitored. A user should be able to enable and disable the input quality measurement requirement independently for each type | Yes/No, single select |
| Output Type | Array | Y | The dropdown will be auto-populated basis the list of output types | Multi-select |
| Output Quality Measurement Required | Boolean | Y | This selection will allow the user to set up whether the output quality for the particular stage needs to be monitored. A user should be able to enable and disable the output quality measurement requirement independently for each type | Yes/No, single select |
| Attribute | Type | Mandatory | Validation |
|---|---|---|---|
| Quality Parameter | Array | Y | Selection from the predefined quality parameters and standards mentioned above; single select |
| Quality Parameter Unit of Measurement | Array | Y | Selection of the unit of measurement (mg/L, absolute value, etc.); single select |
| Benchmark Rule | Array | Y | Options include >=X, <=Y and =Z; single select |
| Benchmark Value | Numeric | Y | Entered by user; numeric only |
| Testing Frequency - Manual (Days) | Numeric | Y | Selecting a custom frequency range for laboratory testing based on consent to operate; numeric only |
| Monitoring Frequency - Quality Sensor (Days) | Numeric | N | Selecting a custom frequency. Note: should be optional if the ULB/state chooses not to have sensor-based monitoring; numeric only |
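Evaluating a recorded value against the benchmark rule and benchmark value attributes above can be sketched as follows; this is a minimal illustration assuming the common rule forms (>=, <=, =):

```python
def meets_benchmark(value, rule, benchmark):
    # Evaluate a recorded value against a benchmark rule and benchmark value,
    # per the Benchmark Rule / Benchmark Value attributes above.
    if rule == ">=":
        return value >= benchmark
    if rule == "<=":
        return value <= benchmark
    if rule == "=":
        return value == benchmark
    raise ValueError(f"unknown benchmark rule: {rule}")
```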
| Attribute | Type | Required? | Comments |
|---|---|---|---|
| Configuration Date | Datetime | Y | |
| Device Type | Text | Y | Selection from the device master data, e.g. [“GPS Sensor”, “pH Sensor”, “Accelerometer”, “Light Sensor”] |
| Plant | Text | Y | |
| Treatment Process | Text | Y | |
| Stage | Text | Y | |
| Output Type | Text | Y | |
| Parameters | Array | Y | The parameters monitored by the device |
| Monitoring Frequency | Numeric | Y | Custom frequency for the device |
| Calibration Date | Datetime | Y | Input from the user about any change in the calibration/maintenance of the device |
| Calibration Accuracy | Array | Y | Range to indicate the permissible deviation in the accuracy |
| IsConnected? | Boolean | Y | To indicate the connectivity of the device |
| Connectivity History | ? | Y | Date-wise device audit log to know the connectivity status |
| Verification History | ? | | Date-wise device verification log to know the days when the device input was verified with laboratory results |
| Attribute | Type | Mandatory | Validation |
|---|---|---|---|
| Test ID | Alphanumeric | View only | Auto-generated on the creation of the schedule |
| Plant Name | Text | View only | Auto-populated on the creation of the schedule |
| Treatment Process | Text | View only | Auto-populated on the creation of the schedule |
| Treatment Process Type | Text | View only | Auto-populated on the creation of the schedule |
| Stage | Text | View only | Auto-populated on the creation of the schedule |
| Output Type | Text | View only | Auto-populated on the creation of the schedule |
| Test Type | Array | | Lab/IoT, auto-selected to Lab |
| Parameter 1…n | Text | View only | Auto-populated on the creation of the schedule |
| Testing Date | Date | View only | Date calculated through the predefined laboratory testing schedule |
| SLA | Numeric | View only | Difference between the current date and the testing date. Compliance to a testing schedule can be checked through this field; however, actions based on failed/successful compliance fall under vendor management, which is not in scope currently and will be taken up separately under vendor management |
| Status | Text | View only | Status to be auto-set to ‘Scheduled’ |
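Deriving testing dates from the configured testing frequency (days) can be sketched as follows; the start/until window is an assumption for illustration, not a specified interface:

```python
from datetime import date, timedelta

def generate_schedule(start, frequency_days, until):
    # Produce lab testing dates from a start date at the configured
    # testing frequency (in days), up to and including 'until'.
    dates = []
    current = start + timedelta(days=frequency_days)
    while current <= until:
        dates.append(current)
        current += timedelta(days=frequency_days)
    return dates
```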
| Attribute | Type | Required? | Comments |
|---|---|---|---|
| Test ID | Numeric | Y | Auto-generated by the system |
| Plant Name | Array | View only | Auto-populated on the creation of the schedule; single select for an on-demand test |
| Treatment Process | Array | View only | Auto-populated on the creation of the schedule; single select for an on-demand test |
| Treatment Process Type | Array | View only | Auto-populated on the creation of the schedule; single select for an on-demand test |
| Stage | Array | View only | Auto-populated on the creation of the schedule; single select for an on-demand test |
| Output Type | Array | View only | Auto-populated on the creation of the schedule; single select for an on-demand test |
| Test Type | Array | | Lab/IoT, auto-selected to Lab for on-demand |
| Lab Submitted to | Text | Y | This will not be required in case test type = IoT |
| Quality Parameter 1 | Numeric | Y | Validation to be applied at implementation |
| Quality Parameter 2 | Numeric | Y | Validation to be applied at implementation |
| Quality Parameter 3 | Numeric | Y | Validation to be applied at implementation |
| Quality Parameter n | Numeric | Y | Validation to be applied at implementation |
| Collection Time | Date | Y | This is the date-time at which the user updates the status to ‘Pending results’. For IoT, this is the time the sensor records the reading |
| Attachment | Document | Y | For a given collection location, photo or PDF proof of the laboratory result mentioning the information of the above-mentioned parameters |
| Attribute | Type | Required? | Comments |
|---|---|---|---|
| Alert DateTime | Datetime | Y | Auto-captured based on date-time |
| Alert Type | Text | Y | Auto-captured: lab test results not as per the benchmark |
| Plant Name | Text | Y | |
| Process Name | Text | Y | |
| Process Type | Text | Y | |
| Parameter 1…n | Text | Y | |
| UoM | Text | Y | |
| Benchmark | Number | Y | |
| Results | Number | Y | |
| Test Type | Text | Y | Auto-selected to Lab/IoT, or both |
Attribute
Type
Required?
Comments
Alert DateTime
Datetime
Y
Auto-captured based on date-time
Alert Type
Text
Y
Auto-captured
No reading received from the device
Plant Name
Text
Y
Process Name
Text
Y
Process Type
Text
Y
Device ID
Numeric
Y
Section Heading
Chart Heading
Subheading
Definitions (This will appear on the dashboard whenever a user hovers on the metric, wherever applicable)
Chart Type
X-Axis
Y-Axis
Value
Columns
How to calculate
Boundary
Drilldown/Toggle
Comparison KPIs, if any
Show comparison in
Specific to State/ULB/ TRP/all
Tooltip on Hover on Data Point
Input Fields
1
Input
Total incoming sludge
NA
The total incoming sludge from registered and unregistered vehicles.
KPI
NA
NA
Total incoming sludge.
NA
Total incoming sludge = (Volume of waste disposed of for registered vehicles) + (Volume of waste disposed of for unregistered vehicles).
State, Plant ULB
NA
NA
NA
NA
2
Input
Number of incoming trips
NA
The number of trips disposed of at the treatment plant.
KPI
NA
NA
The count of trips to the treatment plant from registered and unregistered vehicles.
NA
Number of trips disposed = Count (Distinct trip ID).
State, Plant ULB
NA
NA
NA
NA
3
Treatment Quality
Overall quality
NA
The percentage of tests where all parameters are as per the benchmarks.
KPI
NA
NA
The percentage of test results meeting the benchmarks.
NA
Overall quality = (Number of tests where all parameters meet benchmarks / Total number of tests) * 100.
State, Plant ULB
NA
NA
NA
NA
4
Treatment Quality
Compliance
NA
The percentage of tests where results have been recorded.
KPI
NA
NA
The percentage of tests with the status as submitted out of the total tests.
Compliance percentage = (Count of test ID in status 'Submitted' / Count (Distinct trip ID)) * 100.
State, Plant ULB
NA
NA
NA
NA
5
Alerts
Total alerts
NA
The total alerts raised by the system in the following categories: 1) Test results not as per the benchmark, 2) No reading from the IoT device, 3) Lab results and IoT results not matching.
KPI
NA
NA
Total alerts.
Count (Distinct alert ID).
State, Plant ULB
NA
NA
NA
NA
6
Treatment Quality Plants
Total Plants
NA
NA
NA
NA
NA
Count of Plants
Count (Distinct PlantID)
State, Plant ULB
NA
NA
NA
NA
7
Treatment Quality Plants
Treatment Quality Passed
NA
Treatment quality is considered passed if all parameters of both Biosolids and Effluents are as per the benchmarks for the output of the treatment process in the last test recorded.
NA
NA
NA
Count of Plants with Treatment Quality Passed
Treatment quality for output type = IF(COUNTIF(All parameters meet benchmarks, FALSE) = 0, "Treatment quality for output type passed", "Treatment quality for output type failed"). Treatment quality for plant = IF(COUNTIF(Treatment quality for output type, FALSE) = 0, "Treatment Quality Passed", "Treatment Quality Failed").
State, Plant ULB
NA
NA
NA
NA
8
Treatment Quality Plants
Treatment Quality Failed
NA
Treatment quality is considered failed when one or more parameters of Biosolids or Effluents are not as per the benchmarks for the output of the treatment process in the last test recorded.
NA
NA
NA
Count of Plants with Treatment Quality Failed
Count (Distinct PlantID) - Treatment Quality Passed
State, Plant ULB
NA
NA
NA
NA
9
Treatment Quality Plants
NA
NA
NA
Map
NA
NA
Point = Geolocation of the plant. Plant icon colour: green if Treatment Quality Passed, red if Treatment Quality Failed.
State, Plant ULB
NA
NA
NA
NA
Name of Plant
10
Treatment Quality Plants
NA
NA
NA
Table
NA
NA
NA
Plant Name, Test Result, Compliance %
Test Result: same as S.No 7 and S.No 8.
State, Plant ULB
NA
NA
NA
NA
11
Treatment Quality Plants
NA
NA
NA
Table
NA
NA
NA
Stage, Output Type, Parameters 1...n, Compliance %
Mentioned above
State, Plant ULB
NA
Compliance %
% from last month
NA
12
Trend in [Parameter Name] Readings
NA
NA
NA
Multi-Line Chart
Test Dates
Parameter Value
Value of device reading; value of lab results.
NA
NA
Plant
NA
NA
NA
NA
Date (X-axis); lab result and device reading (Y-axis).
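The KPI formulas above (rows 3, 4, 7 and 8) can be sketched in Python. This is illustrative only: the record shapes (a `status` field, per-parameter benchmark flags, last test per output type) are assumptions, and the compliance denominator here is the total number of tests, following the KPI definition:

```python
# Illustrative sketch of the Overall quality, Compliance and
# Treatment Quality Passed/Failed calculations defined above.

def overall_quality(tests):
    """Row 3: % of tests where all parameters meet their benchmarks."""
    if not tests:
        return 0.0
    passed = sum(1 for t in tests
                 if t["parameters_meet_benchmark"]
                 and all(t["parameters_meet_benchmark"]))
    return passed / len(tests) * 100

def compliance_pct(tests):
    """Row 4: % of tests whose results were recorded (status 'Submitted')."""
    if not tests:
        return 0.0
    return sum(1 for t in tests if t["status"] == "Submitted") / len(tests) * 100

def plant_quality(last_tests_by_output):
    """Rows 7/8: a plant passes only if, for every output type
    (Biosolids, Effluents), the last recorded test meets all benchmarks."""
    ok = all(all(p["meets_benchmark"] for p in test["parameters"])
             for test in last_tests_by_output.values())
    return "Treatment Quality Passed" if ok else "Treatment Quality Failed"

tests = [
    {"status": "Submitted", "parameters_meet_benchmark": [True, True]},
    {"status": "Submitted", "parameters_meet_benchmark": [True, False]},
    {"status": "Scheduled", "parameters_meet_benchmark": []},
    {"status": "Scheduled", "parameters_meet_benchmark": []},
]
plant = {"Biosolids": {"parameters": [{"meets_benchmark": True}]},
         "Effluents": {"parameters": [{"meets_benchmark": False}]}}
```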
Entities
Actions
Create
Read
Search
Update
Delete
Deactivate
Test Schedule
X
X
X
Status
X
Lab Result (Evidence)
X
X
Device Result
X
X
Anomalies
X
Super Admin
All
Plant Admin
Create, edit and disable plants
Create, edit and disable workcentres
Create, edit and disable assets
Create, edit and disable devices
Map assets and devices
Map process flow and workcentres
Map assets and jobs
Set up escalation matrix
Process Admin
Create, edit and disable process flow
Create, edit and disable jobs
Map jobs and process flow
Test Viewer
1) View upcoming tests and status
Plant Operator
1) Update status of test to Sample Submitted
Test Uploader
1) Upload test results
Dashboard Viewer
1) View dashboards
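The role capabilities listed above could be represented as a simple role-to-permission map. In this sketch the entity and action names follow the spec's wording, while the storage format and the `can` helper are assumptions:

```python
# Illustrative sketch: each role maps to the (entity, action) pairs it may
# perform, per the role descriptions above.
ROLE_ACTIONS = {
    "Test Viewer":      {("Test Schedule", "Read")},
    "Plant Operator":   {("Status", "Update")},       # mark sample submitted
    "Test Uploader":    {("Lab Result", "Create")},   # upload test results
    "Dashboard Viewer": {("Dashboards", "Read")},
}

def can(role, entity, action):
    """True if the role is allowed to perform the action on the entity."""
    return (entity, action) in ROLE_ACTIONS.get(role, set())
```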
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
Jobs
X
Standards
x
Plant
X
Workcentre
X
Assets
X
Test Schedule
X
X
X
Status
X
X
Lab Result (Evidence)
X
Device Result
X
X
Anomalies
X
X
Issue
x
Dashboards
x
x
Plant Admin
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
x
x
x
x
Jobs
x
x
x
x
Standards
x
x
x
x
Plant
X
Workcentre
X
Assets
X
Test Schedule
X
X
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Process Admin
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
Jobs
Standards
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Test Viewer
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Plant Operator
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
Status
X
Lab Result (Evidence)
X
Device Result
X
X
X
Anomalies
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Test Uploader
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
Status
X
Lab Result (Evidence)
X
Device Result
X
X
X
Anomalies
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Plant Operator
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
Status
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
X
X
X
x
Dashboards
X
X
X
X
x
Issue Creator
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
x
Dashboards
X
X
X
X
x
Issue Editor (different for different types of issues)
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
x
Dashboards
X
X
X
X
x
Dashboards
Entity
Actions
Create (C)
Read (R)
Search (S)
Update (U)
Delete (D)
Deactivate
Process Flows
X
X
X
X
X
X
Jobs
X
X
X
X
X
X
Standards
X
X
X
X
X
X
Plant
X
X
X
X
X
X
Workcentre
X
X
X
X
X
X
Assets
X
X
X
X
X
X
Test Schedule
X
X
X
X
X
Status
X
X
X
X
X
Lab Result (Evidence)
X
X
X
X
X
Device Result
X
X
X
X
X
Anomalies
X
X
X
X
X
Issue
X
X
X
X
X
Dashboards
X
X
x
Theme
Assumption
Testing Parameters and Standards
Testing parameters are a defined set for a particular process type, stage and input/output type combination, and so do not vary at a high frequency. Hence, they can be configured in the MDMS.
Testing
One user will be responsible for managing the treatment quality of one plant for a process type.
Multiple parameters at one stage of a plant with the same testing frequency are tested together and monitored as one test.
Governing bodies such as the PCB perform ad hoc tests and will use the system to create a record of the results.
Anomaly Detection
For a test result containing multiple parameters, if one parameter is not as per the benchmark, the test result is categorised as not as per benchmarks.
Since IoT readings will be captured at high frequency, anomalies will be generated only if the test value for a parameter is not as per benchmarks for [X] days.
IoT Integration
A standard adaptor will be made available for IoT integration. However, the adaptor may need to be modified on a case-by-case basis during implementation.
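The anomaly-detection assumption above (raise an anomaly only when a parameter stays out of benchmark for [X] days) can be sketched as follows; the data shape (one boolean per day, True meaning within benchmark) and the function name are assumptions:

```python
# Illustrative sketch: since IoT readings arrive at high frequency, an
# anomaly is raised only when the parameter has been out of benchmark for
# the last x_days consecutive days.
def should_raise_anomaly(daily_ok_flags, x_days):
    """daily_ok_flags: per-day booleans, True = within benchmark.
    Returns True if the most recent x_days are all out of benchmark."""
    if len(daily_ok_flags) < x_days:
        return False
    return not any(daily_ok_flags[-x_days:])
```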
Parameter
Unit
Value Range
Location
pH
-
6.5 - 9
Anywhere
BOD
mg/L
20
Metro cities and all State capitals except in the States of Arunachal Pradesh, Assam, Manipur, Meghalaya, Mizoram, Nagaland, Tripura, Sikkim, Himachal Pradesh, Uttarakhand, Jammu and Kashmir, and the Union Territories of Andaman and Nicobar Islands, Dadra and Nagar Haveli, Daman and Diu, and Lakshadweep
30
Other areas
TSS
mg/L
< 50
Metro cities and all State capitals except in the States of Arunachal Pradesh, Assam, Manipur, Meghalaya, Mizoram, Nagaland, Tripura, Sikkim, Himachal Pradesh, Uttarakhand, Jammu and Kashmir, and the Union Territories of Andaman and Nicobar Islands, Dadra and Nagar Haveli, Daman and Diu, and Lakshadweep
< 100
Other areas
F. Coliform
N/100mL
< 1000
Anywhere
COD
mg/L
(?)
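A sketch of checking an effluent reading against the location-dependent benchmarks in the table above. The `metro`/`other` category keys and the use of strict upper bounds are assumptions (the table gives BOD as a bare value and TSS/F. Coliform as `<` limits):

```python
# Illustrative benchmark lookup: limits are taken from the table above;
# BOD and TSS depend on the location category.
BENCHMARKS = {
    "pH":          {"metro": (6.5, 9.0), "other": (6.5, 9.0)},  # range
    "BOD":         {"metro": 20,   "other": 30},                # mg/L, max
    "TSS":         {"metro": 50,   "other": 100},               # mg/L, max
    "F. Coliform": {"metro": 1000, "other": 1000},              # N/100mL, max
}

def meets_benchmark(param, value, location="other"):
    limit = BENCHMARKS[param][location]
    if isinstance(limit, tuple):   # range check (pH)
        lo, hi = limit
        return lo <= value <= hi
    return value < limit           # upper-bound check
```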
Parameter
Unit
Value Range
Temperature
Degree Celsius
(?)
Moisture
Percentage
(?)
E.Coli
OR
F. Coli
MPN
< 1000 MPN (E-coli)/g Total solids
OR
< 1000 CFU (Faecal coliform)/g total solids by dry weight
Heavy metals (Arsenic, cadmium, chromium, copper, lead, mercury, nickel, zinc)
grams
(?)