Available design artifacts
Find below the OpenAPI specifications defined for this guide:
As a part of this guide, we are going to build a single birth registry. We will re-use the user registry from the DIGIT core. This will capture the mother and father details and store them in the user registry. The baby's details will remain in the birth registry.
A single birth service will manage the registry.
Artefacts required before beginning the development phase
These are the inputs to the development phase.
The docs below guide you through the steps and the resources required to build and design the module.
eg_bt_registration
This table holds the baby's information
eg_bt_address
This table holds the address of the applicant who applied for the birth registration.
Follow the steps outlined on this page to set up the DIGIT development environment.
To set up the DIGIT development environment -
Run the Kafka and PostgreSQL on the development machine and re-use other services from the DIGIT development environment. The following tools are required for development:
Install IDE - For creating SpringBoot/Java applications we recommend using IntelliJ IDE. IntelliJ can be downloaded from the following links -
Install the Lombok plugins for IntelliJ as we use Lombok annotations in this module.
Install Kafka (this guide uses version 3.2.0) - To install and run Kafka locally, follow the links below -
Install Postman - To install postman, follow the following links -
Install Kubectl - Kubectl is the tool that we use to interact with services deployed on our sandbox environment -
https://kubernetes.io/docs/tasks/tools/install-kubectl-windows/
https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/
Install aws-iam-authenticator - (if the DIGIT development environment is in AWS) - https://docs.aws.amazon.com/eks/latest/userguide/install-aws-iam-authenticator.html
Install PostgreSQL v10 locally
Add configuration - Post installation of kubectl, add the following configuration in the Kubernetes config file to allow access to resources present in our sandbox environment. Kubernetes config file is usually present in the user's home directory (which varies from OS to OS).
For example, on a Mac, it is present at: /Users/xyz/.kube/config
Once the config file is created, add the following content to it.
Note that you will have to replace the placeholder keys and tokens in the below snippet with your AWS-specific keys. Please contact your system administrator for help with this.
Once the configuration file is created, access the pods running in the environment by typing: kubectl get pods
This lists all the pods in the development environment.
In case you run into an error stating “Error: You must be logged in to the server (Unauthorized)”, add sudo before the command. For example, “sudo kubectl get pods”. That should resolve the error.
Generate Project Stub
This page provides detailed steps for generating projects using the given API specifications.
Prepare Swagger contracts that detail all the APIs that the service is going to expose for external consumption. eGov uses a customised Swagger Codegen tool.
Download the jar file and make sure it is available in the classpath. Use the Swagger Codegen tool to generate client SDKs using these Swagger contracts.
Refer to the following tutorials to understand the creation of Swagger contracts -
OpenAPI 3.0 Tutorial| Swagger Tutorial For Beginners | Design REST API Using Swagger Editor
Use the generic command below to create an API skeleton for any Swagger contract:
The following sequence is used to generate the API skeleton using codegen jar:
Navigate to the folder where you have downloaded the codegen jar.
Execute the following command:
OR
Download the contract available here and save it in a file locally. Run the following command to generate the skeleton code from the contract.
Rename the output folder to birth-registration.
Import it in Eclipse or VS Code.
Update the spring-boot-starter-parent version to 2.2.6.RELEASE in pom.xml.
Perform a maven update once the spring boot version is updated.
Make sure the following dependency is present in the pom.xml. If the version is different, make sure to update it to the version given below:
Add the server.contextPath property, prefixed with a slash, to the application.properties file; this helps request handlers serve requests -
Add the below external dependencies to pom.xml:
The application.properties file is already populated with default values. Read on to learn how to customise and add extra values for your application (if required).
Kafka topics for the module that need to be added are detailed here.
There are three ways to access services:
a. Run the code locally.
b. Access the service in a DIGIT environment.
c. Access the service locally via port forwarding. This bypasses Zuul's authentication and authorization.
Wherever the localhost is in the URL, the Kubernetes port forwarding has been set up from the development environment to the specified port. In your setup, modify the URLs for the various services depending on whether you are using them from an environment or running them locally or accessing them via port-forwarding.
For example, if no port forwarding has been done, you will have to provide the FQDN of your DIGIT install instead of localhost. Also, without port forwarding, you will have to update the auth tokens in your .aws profile file periodically.
Include all the necessary service host URLs and API endpoints in the "application.properties" file.
This guide specifically references the User, Localisation, HRMS, IDGen, MDMS, and Workflow services that are operational within the DIGIT development environment.
Add the following properties to the application.properties file.
The following properties configure the database and the Kafka server. Once you have added the external dependencies to the pom.xml file and reloaded the Maven changes, include these properties in the application.properties file. The default values below work for development and can be overridden during deployment. Append further Kafka configurations as per the specific requirements of the DIGIT services - each module may use different configurations to manage its topics.
DIGIT backend development guide
This guide provides detailed steps for developers to create a new microservice on top of DIGIT. At the end of this guide, you will be able to run the provided code, test it locally, and deploy it using CI/CD to your DIGIT environment.
Steps to create a microservice:
Set up your development environment
Develop the registries, services, and APIs for a birth registration module as described in the design phase
Integrate with an existing DIGIT environment and re-use a lot of the common services using Kubernetes port forwarding
Test the new module and debug
Build and deploy the new service in the DIGIT environment
The guide is divided into multiple sections for ease of use. Click on the section cards below to follow the development steps.
Access the sample module. Download and run it in the local environment.
Once PostgreSQL (v10) has been installed and the basic setup is done, we use Flyway to create the tables.
Configure the below properties in the application.properties file to enable flyway migration:
Add the Flyway SQL scripts in the following structure under resources/db/migration/main:
Add the migration files to the main folder. Follow the specified nomenclature while naming the file. The file name should be in the following format:
Example: V20180920110535__tl_tradelicense_ddl.sql
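To make the nomenclature concrete, here is a small sketch that checks a migration file name against the pattern implied by the example above. The 14-digit timestamp is inferred from that example; Flyway itself accepts other version formats, so treat this as an illustration rather than Flyway's actual rule.

```java
import java.util.regex.Pattern;

// Hypothetical helper illustrating the naming convention described above:
// V<timestamp>__<description>.sql, e.g. V20180920110535__tl_tradelicense_ddl.sql
class MigrationNameChecker {
    // 'V' + 14-digit timestamp + double underscore + description + '.sql'
    private static final Pattern FLYWAY_NAME =
            Pattern.compile("^V\\d{14}__[a-z0-9_]+\\.sql$");

    static boolean isValid(String fileName) {
        return FLYWAY_NAME.matcher(fileName).matches();
    }
}
```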
For this sample service, use the following SQL script to create the required tables.
Methods in the service layer, upon performing all the business logic, call methods in the repository layer to persist or lookup data i.e. it interacts with the configured data store. For executing the queries, JdbcTemplate class is used. JdbcTemplate takes care of the creation and release of resources such as creating and closing the connection etc. All database operations namely insert, update, search and delete can be performed on the database using methods of JdbcTemplate class.
On DIGIT the create and update operations are handled asynchronously.
The persister service listens on the topic to which service applications are pushed for insertion and updation. Persister then takes care of executing insert and update operations on the database without clogging the application’s threads.
The execution of search queries on the database returns applications as per the search parameters provided by the user.
Define POJOs - The Address object is defined in the common contract (refer to the API spec). Link it to the birth registration table via the registrationId as defined in the DB schema.
Update the Address POJO using the below code:
Define a BirthApplicationSearchCriteria POJO to take care of search requests in the DB.
Create packages - Add the querybuilder and rowmapper packages within the repository folder.
Create a class - by the name of BirthApplicationQueryBuilder in the querybuilder folder and annotate it with the @Component annotation.
Insert the following content in BirthApplicationQueryBuilder class -
Create a class - by the name of BirthApplicationRowMapper within the rowmapper package and annotate it with @Component.
Add the following content to the class.
Create a class - by the name of BirthRegistrationRepository within the repository folder and annotate it with @Repository annotation.
Add the following content to the class.
The repository layer is implemented.
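The query-builder pattern described above can be sketched in isolation. The class, method, and column names below are assumptions for illustration, not the module's actual code: each search filter is appended to the base query only when present, and its value is collected into an ordered parameter list for JdbcTemplate's placeholder binding.

```java
import java.util.List;

// Illustrative sketch of a dynamic search-query builder (names are assumed)
class BirthApplicationQueryBuilderSketch {
    private static final String BASE_QUERY =
            "SELECT * FROM eg_bt_registration btr";

    static String getSearchQuery(String tenantId, String applicationNumber,
                                 List<Object> preparedStmtList) {
        StringBuilder query = new StringBuilder(BASE_QUERY);
        if (tenantId != null) {
            addClause(query);
            query.append(" btr.tenantid = ? ");
            preparedStmtList.add(tenantId);
        }
        if (applicationNumber != null) {
            addClause(query);
            query.append(" btr.applicationnumber = ? ");
            preparedStmtList.add(applicationNumber);
        }
        return query.toString();
    }

    // Appends WHERE for the first filter and AND for subsequent ones
    private static void addClause(StringBuilder query) {
        if (query.indexOf("WHERE") == -1) {
            query.append(" WHERE ");
        } else {
            query.append(" AND ");
        }
    }
}
```

In the real repository class, the returned query string and parameter list would be handed to JdbcTemplate together with the row mapper.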
This section provides the complete details of the system pre-requisites, prepping and setting up the system for development.
Follow the steps in the docs resources below:
Find the steps to create the and on DIGIT.
All business validation logic should be added to this class. For example, verifying the values against the master data, ensuring non-duplication of data etc.
Follow the steps below to create the validation layer.
Create a package called validators. This ensures the validation logic is separate so that the code is easy to navigate through and readable.
Create a class by the name of BirthApplicationValidator
Annotate the class with @Component annotation and insert the following content in the class -
NOTE: For the sake of simplicity the above-mentioned validations are implemented. Required validations will vary on a case-to-case basis.
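A minimal sketch of such a validator is shown below. The field names and exception type are illustrative only; DIGIT services typically throw a CustomException from the tracer library, and real validations would also check master data and duplicates as noted above.

```java
// Illustrative business-validation sketch (names and exception are assumed)
class BirthApplicationValidatorSketch {

    static class Application {
        String tenantId;
        String babyFirstName;
    }

    // Rejects applications missing mandatory fields
    static void validate(Application application) {
        if (application.tenantId == null || application.tenantId.isEmpty()) {
            throw new IllegalArgumentException(
                    "tenantId is mandatory for creating birth registration applications");
        }
        if (application.babyFirstName == null || application.babyFirstName.isEmpty()) {
            throw new IllegalArgumentException("baby's first name is mandatory");
        }
    }
}
```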
This layer enriches the request. System-generated values like id, auditDetails etc. are generated and added to the request. In the case of this module, since the applicant is the parent of a baby and a child cannot be a user of the system directly, both the parents' details are captured in the User table. The user ids of the parents are then enriched in the application.
Follow the steps below to create the enrichment layer.
Create a package under DIGIT by the name of enrichment. This ensures the enrichment code is separate from business logic so that the codebase is easy to navigate through and readable.
Create a class by the name of BirthApplicationEnrichment
Annotate the class with @Component and add the following methods to the class -
NOTE: For the sake of simplicity the above-mentioned enrichment methods are implemented. Required enrichment will vary on a case-to-case basis.
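The enrichment step described above can be sketched as follows: a system-generated UUID and audit timestamps are stamped onto the incoming application. Field names here are simplified stand-ins for the module's POJOs.

```java
import java.util.UUID;

// Illustrative enrichment sketch (types are simplified stand-ins)
class BirthApplicationEnrichmentSketch {

    static class AuditDetails {
        String createdBy;
        long createdTime;
        String lastModifiedBy;
        long lastModifiedTime;
    }

    static class Application {
        String id;
        AuditDetails auditDetails;
    }

    // Generates the system-owned values for a freshly created application
    static void enrichForCreate(Application application, String requesterUuid) {
        application.id = UUID.randomUUID().toString();
        AuditDetails audit = new AuditDetails();
        long now = System.currentTimeMillis();
        audit.createdBy = requesterUuid;
        audit.createdTime = now;
        audit.lastModifiedBy = requesterUuid;
        audit.lastModifiedTime = now;
        application.auditDetails = audit;
    }
}
```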
The pom.xml typically includes most of the dependencies listed below at project generation time. Review and ensure that all of these dependencies are present. Update the pom.xml in case any are missing.
Models/POJOs of the dependent service can be imported from the digit-core-models library (work on creating the library is ongoing). These models are used to integrate with the dependent services.
These are pre-written libraries which contain tracer support, common models like MDMS, Auth and Auth and the capability to raise custom exceptions.
Once these core models are imported, it is safe to delete the RequestInfo, and ResponseInfo classes from the models folder and use the ones present in the common contract that is imported.
Before starting development, create/update the following classes as given below.
Create RequestInfoWrapper POJO within the Models package -
Update the Applicant POJO with the following content -
Create the BTRConfiguration class within the configuration package. The MainConfiguration class should already exist inside the config package.
The core models are imported.
This section showcases how to create a basic Spring Boot project using an API spec and configure the database.
Follow the steps detailed below:
The service layer performs business logic on the RequestData and prepares the Response to be returned back to the client.
Follow the steps below to create the service layer.
Create a new package called service.
Create a new class in this folder by the name BirthRegistrationService.
Annotate the class with @Service annotation.
Add the following content to the class -
NOTE: At this point, your IDE may show a lot of errors. Do not worry - we will add all the dependent layers as we progress through this guide and the errors will go away.
Follow the steps detailed below to implement Kafka Producer & Consumer.
Producer classes help in pushing data from the application to Kafka topics. DIGIT has a custom implementation of KafkaTemplate class in the tracer library called CustomKafkaTemplate. This implementation of the Producer class does not change across services of DIGIT.
Access the producer implementation details here - Producer Implementation.
The Codegen jar already has created a Producer class. We will continue using it.
Make sure the tracer dependency version in the pom.xml is 2.0.0-SNAPSHOT.
For our guide, we will be implementing a notification consumer in the following section.
Once an application is created/requested or progresses further in the workflow, notifications can be triggered as each of these events is pushed onto Kafka topics which can be listened to and an sms/email/in-app notification can be sent to the concerned user(s).
For our guide, we will be implementing a notification consumer which will listen to the topic on which birth registration applications are created. Create a customised message and send it to the notification service (sms/email) to trigger notifications to the concerned users.
Sending SMS notifications to the customer:
Once an application is created/updated the data is pushed on Kafka topic. We trigger notifications by consuming data from this topic. Whenever any message is consumed the service will call the localisation service to fetch the SMS template. It will then replace the placeholders in the SMS template with the values in the message it consumed.
(For example, It will replace the {NAME} placeholder with the owner name from the data consumed). Once the SMS text is ready, the service pushes this data on the notification topic. SMS service consumes data from notification topic and triggers SMS.
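The placeholder substitution described above can be sketched in isolation. The template text and placeholder keys are illustrative; in the real flow the template comes from the localisation service and the values from the consumed Kafka message.

```java
import java.util.Map;

// Illustrative SMS-template substitution sketch (template and keys are assumed)
class SmsTemplateSketch {

    // Replaces every {KEY} placeholder in the template with its value
    static String buildMessage(String template, Map<String, String> values) {
        String message = template;
        for (Map.Entry<String, String> entry : values.entrySet()) {
            message = message.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return message;
    }
}
```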
Open Kafka/NotificationConsumer.java and paste the following code:
Create a POJO by the name of SMSRequest in the web.models package and add the following content to it:
Create a class by the name of NotificationService under the service folder to handle the preparation of customised messages and push the notifications.
Add the following content to it -
We have completed integration with key services and we can now test out the REST APIs we have built.
Ensure that Kafka and Persister are running in the local environment.
Run birth-registration-service that we just created.
Import the postman collection of the APIs that this sample service exposes from here: Birth Registration Postman Collection
Test the following APIs:
Hit the _create API request to create a birth registration application. You should see entries in the local database.
Hit the _search API request to search for the created birth registration applications.
Hit _update API request to update your application by changing the baby's firstName field. You should be able to see the updates in the DB.
Integration with signed audit
Enabling signed audit for a module ensures that all transactions - creates, updates, deletes - are recorded in a digitally signed fashion. Learn more about the signed audit service here.
Enabling signed audit is optional but highly recommended to ensure data security.
Add the following lines of code to the birth registration persister after the fromTopic attribute under mappings:
Integration with other DIGIT services
A separate class should be created for integrating with each dependent microservice. Only one method from that class should be called from the main service class for integration.
This guide showcases the steps to integrate our microservices with other microservices like
Code for developer guide till this stage is available here. Postman collection corresponding to this stage is available here.
For interacting with other microservices, we can create and implement the following ServiceRequestRepository class under the repository package -
Describes how to integrate with DIGIT's ID Gen service
This page provides the steps to integrate with the IDGen Service. Each application needs to have a unique ID. The IDGen service generates these unique IDs. ID format can be customised via configuration in MDMS.
Add the ID format that needs to be generated in this file - Id Format Mdms File. The following config has been added for this module:
Restart the IDGen service and MDMS service and port-forward IDGen service to port 8285:
Note that you can set the ID format in the application.properties file of IDGen service and run the service locally if you don't have access to a DIGIT environment.
Hit the below curl to verify that the format is added properly. The "ID" name needs to match exactly with what was added in MDMS.
Once verified, we can call the ID generation service from within our application and generate the registrationId.
Add the following model POJOs under the models folder:
IdGenerationRequest.java
IdGenerationResponse.java
IdRequest.java
IdResponse.java
In the BirthApplicationEnrichment class, update the enrichBirthApplication method as shown below:
Make sure below ID generation host configuration is present in the application.properties file. Make sure to fill in the correct values for the host.
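The enrichment side of this integration can be sketched as follows: the IDs returned by the IDGen call are assigned, in order, to the applications in the request. The types below are simplified stand-ins for the module's POJOs.

```java
import java.util.List;

// Illustrative sketch of assigning IDGen-returned IDs to applications
class IdAssignmentSketch {

    static class Application {
        String applicationNumber;
    }

    // Assigns the i-th generated ID to the i-th application in the request
    static void assignIds(List<Application> applications, List<String> generatedIds) {
        if (applications.size() != generatedIds.size()) {
            throw new IllegalStateException("IDGen returned " + generatedIds.size()
                    + " ids for " + applications.size() + " applications");
        }
        for (int i = 0; i < applications.size(); i++) {
            applications.get(i).applicationNumber = generatedIds.get(i);
        }
    }
}
```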
The user service provides the capabilities of creating a user, searching for a user and retrieving the details of a user. This module will search for a user and if not found, create that user with the user service.
DIGIT's user service masks PII that gets stored in the database using the encryption service.
Create the following POJOs under the model directory:
Create a class by the name of UserService under service folder and add the following content to it:
Add the below methods to the enrichment class we created. When we search for an application, the code below will search for the users associated with the application and add in their details to the response object.
Add in a userService object:
And enhance the following two methods in BirthRegistrationService.java:
Add the following properties in application.properties file:
Note that if you are port-forwarding using k8s, you will use localhost. Else, if you have a valid auth token, please provide the name of the host here.
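The search-or-create behaviour described above can be sketched with an in-memory map standing in for DIGIT's user service: the parent is looked up by a key such as the mobile number, and a user record is created only when none exists.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Illustrative search-or-create sketch; the map stands in for the user registry
class UserSearchOrCreateSketch {

    static final Map<String, String> USERS_BY_MOBILE = new HashMap<>();

    // Returns the user's uuid, creating the user only when not found
    static String searchOrCreate(String mobileNumber) {
        return USERS_BY_MOBILE.computeIfAbsent(mobileNumber,
                m -> UUID.randomUUID().toString());
    }
}
```

In the real module, both lookups go over HTTP to the user service, and the returned UUIDs are enriched into the application as described above.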
TBD
In governments, each region, city or state has custom rates and charges for the same domain. For example, birth certificate registration rates may differ from city to city and the way it is computed can also vary. This cannot be generalised into a platform service. However, billing is a generic platform service.
The way DIGIT solves for this is to "unbundle" the problem and separate out billing from calculations. A customisable service called a "calculator" is used for custom calculation. The calculator then calls the billing service to generate the bill. Each service/module ships with a "default" calculator which can then be customised for that city.
Code of the developer guide till this stage is available here. The Postman collection of btr-calculator is available here. The Postman collection of btr-services is available here.
We will call into MDMS deployed in the sandbox environment. All MDMS config data needs to be uploaded into the MDMS repository (DEV branch if you are deploying/testing in your dev environment).
Integration with MDMS requires the following steps to be followed:
Add a new MDMS file in MDMS repo. For this guide, a sample MDMS file has already been added available here. Copy this file into your repository.
Restart MDMS service after adding the new file via Jenkins build UI.
Once restarted, hit the curl mentioned below to verify that the new file has been properly added.
Call the MDMS service post verification from within our application and fetch the required master data. For this, create a Java class by the name of MdmsUtil under utils folder. Annotate this class with @Component and put the following content in the class -
Add the following properties in application.properties file -
The birth registration module follows a simple workflow derived from the swimlane diagrams. Please check the design inputs section for correlation as well as the design guide for info on how the workflow configuration is derived.
Integration with workflow service requires the following steps -
Add a workflow object to the BirthRegistrationApplication POJO (this may already exist. Do not add if it exists already).
Create POJOs to support workflow - Create the following POJOs under the digit.web.models package.
Create Workflow service - Create a class to transition the workflow object across its states. For this, create a class by the name of WorkflowService.java under the service directory and annotate it with @Service annotation.
Add the below content to this class -
Add workflow to BirthRegistrationService.
Add the below field to BirthRegistrationService.java
Transition the workflow - Modify the following methods in BirthRegistrationService.java as follows. Note that we are adding calls into the workflow service in each of these methods.
Configure application.properties - Add the following properties to application.properties file of the birth registration module. Depending on whether you are port forwarding or using the service directly on the host, please update the host name.
Run the workflow service locally - the application will call into it to create the necessary tables in the DB and effect the workflow transitions.
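As a toy illustration of the transitions the workflow service performs for this module, the sketch below maps an action to the next application status. The action and status names are assumptions for illustration; the real ones come from the business-service JSON configured for the module.

```java
import java.util.Map;

// Illustrative action-to-status transition sketch (names are assumed)
class WorkflowTransitionSketch {

    static final Map<String, String> ACTION_TO_STATUS = Map.of(
            "APPLY", "PENDING_FOR_VERIFICATION",
            "VERIFY", "PENDING_FOR_APPROVAL",
            "APPROVE", "APPROVED",
            "REJECT", "REJECTED");

    static String transition(String action) {
        String next = ACTION_TO_STATUS.get(action);
        if (next == null) {
            throw new IllegalArgumentException("Unknown workflow action: " + action);
        }
        return next;
    }
}
```

In the real module, the service layer calls the workflow service with the action and the workflow service persists and returns the resulting state.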
MDMS data is the master data used by the application. New modules with master data need to be configured inside the /data/<tenant>/ folder of the MDMS repo. Each tenant should have a unique ID and sub-tenants can be configured in the format state.cityA, state.cityB etc. Further hierarchies are also possible with tenancy.
If you already have a DIGIT environment configured with a tenant and CITIZEN/EMPLOYEE roles, that is sufficient data to get this module running locally. Configuring role-action mapping is necessary during deployment of the app in the DIGIT environment. It will not be needed to run the application locally.
For more information on MDMS see here. To read about how to design for MDMS, please see the design guide.
In the birth registration use case, we use the following master data:
tenantId = "pb"
User roles - CITIZEN and EMPLOYEE roles configured in roles.json (see below section for more info)
Actions - URIs to be exposed via Zuul (see below section for more info)
Role-action mapping - for access control (see below section for more info)
Make sure to add data to the correct branch of the MDMS repository. i.e. if you have setup CD CI to deploy the DEV branch of the repository to the dev environment (default), then make sure to add the information in the DEV branch. If you are testing in staging or some other environment, make sure to add the master data to the corresponding branch of MDMS.
Create a folder called "pb" in the data folder of the MDMS repository "DEV" branch. You will have a new folder path as follows:
<MDMS repo URL path>/data/pb
Restart the MDMS service in the development environment where DIGIT is running once data is added to the MDMS repository. This loads the newly added/updated MDMS configs.
A sample MDMS config file can be viewed here - Sample MDMS data file.
URIs (actions), roles and URI-role mapping will be defined in MDMS. These will apply when the module is deployed into an environment where Zuul is involved (not while locally running the app). In this sample app, we have used "pb" as a tenantId. In your environment, you can choose to define a new one or work with an existing one.
All folders mentioned below need to be created under the data/pb folder in MDMS.
You can choose to use some other tenantId. Make sure to change the tenant ID everywhere.
Actions need to be defined inside the /data/pb/ACCESS-CONTROL-ACTIONS/actions.json file. Below are the actions for the birth registration module. Append this to the bottom of the actions.json file. Make sure the "id" field in the JSON is incremented. It needs to be unique in your environment.
Note that the IDs in the actions.json config are generated manually.
Roles config happens at a state level. For birth registration, we need only CITIZEN and EMPLOYEE roles in the /data/pb/ACCESSCONTROL-ROLES/roles.json file. Here are some sample roles that can be defined in an environment. If these roles are already present in the file, then there is no need to add them in again.
Append the below code to the "roleactions" key in the /data/pb/ACCESSCONTROL-ROLEACTIONS/roleactions.json file.
Note that other role-action mappings may already be defined in your DIGIT environment, so please make sure to append the below. The actionid refers to the URI ID defined in the actions.json file.
Information on creating a custom calculator service
This calculator service integrates with the core platform's billing service & generates a demand. A bill can be fetched based on the demand and presented to the user. This page provides details about creating a custom calculator service.
Code for the custom calculator service is here. A separate API spec is written for the calculator service.
A calculator service typically has three APIs:
_calculate - This API returns the calculation for a given service application.
getbill or createbill - Creates and returns the bill associated with a particular application.
_search - to search for calculations.
The birth registration service calls the _calculate API to generate a demand for birth registration charges.
Learn all about the development pre-requisites, design inputs, and environment setup
The first step is to create and configure a spring boot project
The next step is to integrate the Persister service and Kafka to enable read/write from the DB
Steps on how to integrate with other key DIGIT services
Learn how to integrate the billing and payment services to the module
Learn how to integrate advanced services to the built module
Test run the built application in the local environment
Deploy and run the modules
This section contains information on steps for integrating with the Persister Service and Kafka.
Code of developer guide till this stage is available here. Postman collection corresponding to this stage is available here.
Steps to integrate Persister Service and Kafka:
Implementing the controller layer in Spring
The web/controller layer handles all the incoming REST requests to a service.
The @RequestMapping("/v1") annotation is added on top of the controller class. This contains the version of the API (this becomes a part of the API endpoint URL).
Follow the steps below to set up the request handler in the controller layer.
Make a call to the method in the Service Layer and get the response back from it.
Build the responseInfo.
Build the final response to be returned to the client.
The controller class here contains the below content -
NOTE: At this point, your IDE may show a lot of errors. Do not worry - we will add all the dependent layers as we progress through this guide and the errors will go away.
The Codegen jar creates the search API with search parameters annotated with @RequestParam rather than taking the request parameters as a POJO. To address this, we will create a POJO by the name of BirthApplicationSearchCriteria within the Models package. Insert the following content in the POJO.
The web layer is now set up.
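The controller flow described above (delegate to the service layer, build a ResponseInfo, wrap both into the response) can be sketched without the Spring machinery. In the real module these are @RestController methods using DIGIT's common-contract classes; the types below are simplified stand-ins.

```java
import java.util.List;

// Framework-free sketch of the controller's response-building flow
class ControllerFlowSketch {

    static class ResponseInfo {
        String apiId;
        String status;
    }

    static class SearchResponse {
        ResponseInfo responseInfo;
        List<String> applications;
    }

    // Wraps the service-layer result and request metadata into the final response
    static SearchResponse buildSearchResponse(String apiId, List<String> foundApplications) {
        ResponseInfo info = new ResponseInfo();
        info.apiId = apiId;
        info.status = "successful"; // set from the request outcome in real code
        SearchResponse response = new SearchResponse();
        response.responseInfo = info;
        response.applications = foundApplications;
        return response;
    }
}
```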
Workflow configuration should be created based on the business requirements. More details on extracting workflow in the design guide.
For our guide, we will configure the following workflow which was the output of the design phase:
We will re-use the workflow service which is deployed in the development/sandbox environment.
This guide assumes you can call the development environment workflow service directly with a valid auth token.
A sample curl is posted below. Make sure to replace the server hostname and the username and password in the below statement:
In Postman, create a new POST request and paste the below content in the body. The URL for the request is http://yourserver.digit.org/egov-workflow-v2/egov-wf/businessservice/_create to create the workflow.
Make sure to replace the authToken field in the body with appropriate auth token in your environment. Login to the server as a CITIZEN or EMPLOYEE user (depending on which one you've created) and obtain the authToken from the response body.
In DIGIT, the API Gateway (Zuul) enriches user information based on the auth token for all requests that go via the gateway. Port forwarding by-passes the API gateway. In this case, when accessing a service directly, for a request to be valid, a user has to send the userInfo JSON inside the RequestInfo object. This is true not just for Workflow but for any service. Sample:
"userInfo": {
  "id": 24226,
  "uuid": "11b0e02b-0145-4de2-bc42-c97b96264807",
  "userName": "sample_user",
  "roles": [
    {
      "name": "Citizen",
      "code": "CITIZEN"
    }
  ]
}
Note that UUID and roles can be dummy place-holder entities in this case for local testing.
Below is the URL and POST body for the business service creation request.
Calculating costs for a service and raising demand for bill generation
The calculation class will contain the calculation logic for the birth certificate registration charges. This can vary from city to city. Based on the application submitted, the calculator class will calculate the tax/charges and call the billing service to generate the demand.
What is a demand?
A demand is the official communication sent by a government authority to a citizen requesting them to pay for a service. A demand leads to a bill. When a bill is paid, a receipt is generated. A demand can be modified prior to bill generation.
For our guide, we are going to create a Calculation Service that will call the calculator to generate a demand. Follow the steps below -
Create a class under the service folder by the name of CalculationService.
Annotate this class with @Service annotation and add the following logic within it -
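As an illustration of what such a calculator computes, the sketch below applies a flat charge per application and prepares demand details for the billing service. The flat fee and tax-head code are assumptions; real calculators read city-specific rates from MDMS or configuration, as discussed above.

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

// Illustrative calculation sketch (fee and tax-head code are assumed)
class CalculationServiceSketch {

    static final BigDecimal FLAT_REGISTRATION_FEE = new BigDecimal("100");

    static class DemandDetail {
        String taxHeadCode;
        BigDecimal taxAmount;
    }

    // One demand detail per application, each carrying the flat fee
    static List<DemandDetail> calculate(int numberOfApplications) {
        List<DemandDetail> details = new ArrayList<>();
        for (int i = 0; i < numberOfApplications; i++) {
            DemandDetail detail = new DemandDetail();
            detail.taxHeadCode = "BTR_REGISTRATION_FEE"; // assumed tax head
            detail.taxAmount = FLAT_REGISTRATION_FEE;
            details.add(detail);
        }
        return details;
    }
}
```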
Run the deployed application in the local environment
It is time to test our completed application! Follow the steps on this page to test and run the deployed application.
Before testing our application, we need to run a couple of services locally -
To run the persister service locally -
Run Kafka.
Open egov-persister folder which should be present under core-services repository.
Update the following properties in application.properties file -
To run the indexer service locally (optional) -
Run Kafka.
Run Elasticsearch.
Open the egov-indexer folder, which is present in the core-services repository.
Update the following properties in the application.properties file -
For example, the path to the config files would be something like -
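The property values are not reproduced on this page; the indexer is pointed at the local indexer config file via a property like the following (the path is an illustrative assumption):

```properties
# Illustrative local value - point the indexer at the local config file
egov.indexer.yaml.repo.path=file:///home/<user>/configs/egov-indexer/digit-developer-guide.yml
```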
To run and test our sample application, follow the below steps -
Ensure that Kafka, the persister, the indexer and the PDF service are running locally, and run the code of birth-registration-service from the DIGIT_DEVELOPER_GUIDE branch (for consistency).
Port-forward the following services -
egov-user to port 8284 (e.g. kubectl port-forward egov-user-58d6dbf966-8r9gz 8284:8080)
egov-localization to port 8286 (e.g. kubectl port-forward egov-localization-d7d5ccd49-xz9s9 8286:8080)
egov-filestore to port 8288 (e.g. kubectl port-forward egov-filestore-86c688bbd6-zzk72 8288:8080)
egov-mdms to port 8082 (e.g. kubectl port-forward egov-mdms-service-c9d4877d7-kd4zp 8082:8080)
Run birth-registration-service that we just created.
Setup environment variables in Postman:
hostWithPort - e.g. yourserver.digit.org:8080, or yourserver.digit.org if the service is running on port 80.
applicationNumber - used in the search/update requests to search for a specific application number. Set it after making the create birth registration application call.
If no workflow has been configured, run the scripts to configure the workflow and search for it. Double-check ID gen by running the ID gen script.
a. Hit the _create API request to create a birth registration application.
b. Hit the _search API request to search for the created birth registration applications.
c. Hit the _update API request to update your application or transition it further in the workflow by changing the actions.
A list of FAQs for developers by developers
Follow the instructions on this page to build and deploy applications on DIGIT.
eGov recommends setting up the CI/CD infrastructure before developing on top of DIGIT. This ensures that new modules can be developed and deployed in a streamlined way. DIGIT ships with CI as code as part of the DevOps repository. Run the CI/CD setup prior to developing on DIGIT.
Step 1: Add an entry in the build-config.yaml file in the master branch of the forked DevOps repository. This will set up the job pipeline in Jenkins. Make sure to also add the same config to the feature branch you are working on.
Step 2: Follow the instructions for the Jenkins and deployment configuration.
Step 3: Go to the Jenkins build page, select "Job Builder" and click on "Build now". This will pull the config from build-config.yaml and identify all the modules that need to be built.
Step 4: Once the build is done, go to your Jenkins build page. The service will appear under the repository path in which it has been added, i.e. if the service is added under core-services, it will show up in the core-services section on the aforementioned page.
Step 5: Most likely, you will be working on a feature branch for this module and not on "master". Click on "Build with parameters" for the module and search for the branch name in the filter box. Select the feature branch you are working on and then click "Build". This will make sure that Jenkins builds the module pulling code from the branch you prefer.
Step 6: Click on "Console Output". If the build pipeline and docker registries have been set up properly as part of the CD/CI setup, the docker image will be built and pushed to the registry. The console output will contain the docker image ID for the module. Scroll down to the bottom and copy the docker image ID.
Step 7: After copying the docker image ID, go to your Jenkins server home page, click on "Deployments" and scroll to find your deployment environment. Deployment environments have the template of deploy-to-<env name> and get created as part of CD/CI setup. If multiple environments have been configured, you will see multiple deploy-to-* entries.
Step 8: It is best practice to always test out any new module in the dev environment. Select the environment you would like to deploy to and click on the "Run" icon on the right-hand side of the page against the environment. In the Images text box, paste the copied docker image ID and click "Build". Refer to the screenshot below.
Jenkins will now take care of deploying the new image to the DIGIT environment.
Step 9: Test your new service by testing out the APIs via Postman.
The final step in this process is creating the configurations to generate a birth registration PDF for citizens to download. For this, we make use of DIGIT's PDF service, which uses the PDFMake and Mustache libraries to generate PDFs. Detailed documentation on generating PDFs using the PDF service is available in the DIGIT documentation.
Follow the steps below to set up the PDF service locally and generate the PDF for our birth registration service -
Clone the DIGIT-Dev repo.
Clone the configs repo.
Navigate to the DIGIT-Dev repo and open up a terminal. Checkout DIGIT_DEVELOPER_GUIDE branch.
Navigate to the configs folder and, under the pdf-service data-config folder, create a file by the name of digit-developer-guide.json.
Add the following content in this newly created data config file -
Create a file by the name of digit-developer-guide.json under the format-config folder and place the following content in it -
Open the PDF service (under the core-services repository of DIGIT-Dev) in your IDE. Open the Environment.js file and change the following properties to point to the local config files created above. For example, in a local setup, these would point to the local files just created -
Make sure that Kafka and Workflow services are running locally and port-forward the following services -
egov-user to port 8284
egov-localization to port 8286
egov-filestore to 8288
egov-mdms to 8082
The PDF service is now ready to be started. Execute the following commands to start it up -
Once the PDF service is up, hit the following cURL to look at the created PDF -
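The commands themselves are not reproduced on this page; for the Node.js-based PDF service, a typical sequence looks like the sketch below. The port, key, tenantId and request body are illustrative assumptions — the key must match the name of the data/format config files created earlier:

```shell
# From the pdf-service folder (assumes Node.js is installed; script names may differ)
npm install
npm run start

# In another terminal: ask the PDF service to render a PDF.
# key must match the config file name; tenantId and body are illustrative.
curl -X POST 'http://localhost:8080/pdf-service/v1/_create?key=digit-developer-guide&tenantId=pb' \
  -H 'Content-Type: application/json' \
  -d '{"RequestInfo": {}, "BirthRegistrationApplications": [ ] }'
```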
Note: Follow the steps below when the code is deployed to the DIGIT environment. These steps are not applicable for deployment in the local environment. You may choose to follow these when you build and deploy.
Navigate to the forked DIGIT-DevOps repository.
Find the deployment helm chart that was used to deploy DIGIT within the deploy-as-code/helm/environments directory.
Find "pdf-service" in the deployment helm chart (which was used to set up the DIGIT environment).
Find the "data-config-urls" property and add the path to your new PDF config file to it. For this module, we have added file:///work-dir/configs/pdf-service/data-config/digit-developer-guide.json to the end of data-config-urls. The code block is shown below for reference:
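The original block is not captured here; a typical helm chart entry of this shape is sketched below, where <existing-configs> stands for whatever paths your environment already lists:

```yaml
pdf-service:
  # ...other chart values...
  data-config-urls: "file:///work-dir/configs/pdf-service/data-config/<existing-configs>.json,file:///work-dir/configs/pdf-service/data-config/digit-developer-guide.json"
```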
Raise a PR for this to the appropriate branch of DevOps which was forked/used to create the deployment.
Restart the PDF service in the k8s cluster, once the PR is merged. It will pick up the latest config from the file above.
The indexer is designed to perform all the indexing tasks of the DIGIT platform. The service reads records posted on specific Kafka topics and picks the corresponding index configuration from the YAML file provided by the respective module. Configurations are YAML-based. A detailed guide to creating indexer configs is available in the following document - .
Create a new file named digit-developer-guide.yml under the egov-indexer folder in the configs repo and put the following content into it -
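The config content is not reproduced on this page; a minimal sketch of the shape such an indexer config takes is below. The topic, index name, and jsonPath values are assumptions for this module and must match what your service actually produces:

```yaml
ServiceMaps:
  serviceName: birth-registration-service
  version: 1.0.0
  mappings:
    - topic: save-bt-application        # assumption: topic the module publishes applications to
      configKey: INDEX
      indexes:
        - name: btr-index               # assumption: target Elasticsearch index
          type: general
          id: $.id
          isBulk: true
          jsonPath: $.BirthRegistrationApplications
```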
Note: Follow the steps below when the code is deployed to the DIGIT environment. These steps are not applicable for deployment in the local environment. You may choose to follow these when you build and deploy.
Navigate to the forked DIGIT-DevOps repository. Under the deploy-as-code/helm/environments
directory, find the deployment helm chart that was used to deploy DIGIT.
In the deployment helm chart (which was used to set up the DIGIT environment), find "egov-indexer". Find the "egov-indexer-yaml-repo-path" property and add the path to your new indexer file here. The code block is shown below for reference:
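The referenced code block is not reproduced on this page; the helm chart entry typically looks like the sketch below, where <existing-configs> stands for whatever paths your environment already lists:

```yaml
egov-indexer:
  # ...other chart values...
  egov-indexer-yaml-repo-path: "file:///work-dir/configs/egov-indexer/<existing-configs>.yml,file:///work-dir/configs/egov-indexer/digit-developer-guide.yml"
```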
Raise a PR to the DevOps branch which was forked/used to create the deployment. Once that is merged, restart the indexer service and make sure the cluster configs are propagated.
Status of payment
A demand leads to a bill which then leads to payment by a citizen. Once payment is done, the application status has to be updated. Since we have a microservices architecture, the two services can communicate with each other either through API calls or using events.
The collection service publishes an event on a Kafka topic when payment is collected for an application. Any microservice that wants to be notified when payments are done can subscribe to this topic. Once the service consumes the payment message, it checks whether the payment was made for its own service by checking the businessService code. It then changes the application status to PAID or triggers a workflow transition.
For our guide, follow the steps below to create the payment back-update consumer -
Create a consumer class by the name of PaymentBackUpdateConsumer. Annotate it with the @Component annotation and add the following content to it -
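The class body is not reproduced on this page; the structural sketch below shows the typical shape of such a consumer. It assumes spring-kafka and Lombok's @Slf4j on the classpath, plus the PaymentUpdateService and PaymentRequest types created in the following steps; the default topic name is an assumption to verify against your environment:

```java
// Structural sketch only - not the guide's exact code.
@Slf4j
@Component
public class PaymentBackUpdateConsumer {

    @Autowired
    private PaymentUpdateService paymentUpdateService;

    @Autowired
    private ObjectMapper mapper;

    // Listen on the topic the collection service publishes payments to.
    @KafkaListener(topics = {"${kafka.consumer.payment.topic:egov.collection.payment-create}"})
    public void listen(final HashMap<String, Object> record) {
        try {
            PaymentRequest paymentRequest = mapper.convertValue(record, PaymentRequest.class);
            paymentUpdateService.process(paymentRequest);
        } catch (Exception e) {
            // Log and skip so one bad message does not block the consumer
            log.error("Error while processing payment update", e);
        }
    }
}
```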
Create a new class by the name of PaymentUpdateService in the service folder and annotate it with @Service. Put the following content in this class -
Create the following POJOs under models folder:
PaymentRequest.java
Payment.java
PaymentDetail.java
Bill.java
BillDetail.java
BillAccountDetail.java
Import the postman collection of the APIs that this sample service exposes from here -
Add Kafka Configuration
Implement Kafka Producer & Consumer
Add Persister Configuration
Run Application In Local Environment
This page provides the steps on how to add persister configuration.
The persister configuration is written in a YAML format. The INSERT and UPDATE queries for each table are added in prepared statement format, followed by the jsonPaths of values that have to be inserted/updated.
For example, for a table named studentinfo with id, name, age, and marks fields, the following configuration will get the persister ready to insert data into studentinfo table -
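The configuration itself is not reproduced on this page; a minimal sketch for the studentinfo example, assuming the producer publishes a Student object on a save-student topic, would look like this:

```yaml
serviceMaps:
  serviceName: student-service          # assumption: illustrative service name
  mappings:
    - version: 1.0
      description: Persists student records into the studentinfo table
      fromTopic: save-student           # assumption: topic the producer publishes to
      isTransaction: true
      queryMaps:
        - query: INSERT INTO studentinfo(id, name, age, marks) VALUES (?, ?, ?, ?);
          basePath: Student
          jsonMaps:
            - jsonPath: $.Student.id
            - jsonPath: $.Student.name
            - jsonPath: $.Student.age
            - jsonPath: $.Student.marks
```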
Fork the configs repo (skip this if already done) and clone it to the local environment.
Persister configurations for all modules are present under the egov-persister folder. Create a file named btr-persister.yml (or any other name) under the egov-persister folder.
Add the following content to it -
Import all the core-services projects as Maven projects into your IDE. It is assumed that you have already cloned the DIGIT code locally.
Modify the application.properties file in the egov-persister project and set the following property:
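The property itself is not shown on this page; with the file created earlier in this guide, it would look like the following (the local path is an illustrative assumption):

```properties
# Absolute file:// path to the local persister config file(s)
egov.persist.yml.repo.path=file:///home/<user>/configs/egov-persister/btr-persister.yml
```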
Note: You can set a comma-separated list of files as the value of the above property. If you are running multiple services locally, then this has to be a comma-separated list of persister config files. Make sure you always give the absolute path.
Make sure the Spring DB configurations and Flyway config reflect the same database as what has been set in the module itself. Otherwise, we will see failures in the persister code.
Make sure Kafka is running locally. Now, go ahead and run the EgovPersistApplication from the IDE. Check the console to make sure it is listening to the right topics, as configured in your module's application.properties file.
The persister is now ready for use.
Note: The steps below apply when you deploy your code to the DIGIT environment, not to local development. You may choose to do this when you build and deploy.
Push the code to the appropriate branch from which your environment will read it.
Navigate to your fork of the DIGIT-DevOps repository. Under the deploy-as-code/helm/environments
directory, find the deployment helm chart that was used to deploy DIGIT.
In the deployment helm chart (which was used to set up the DIGIT environment), find "egov-persister". Find the "persist-yml-path" property and add the path to your new persister file here.
In the snippet below, file:///work-dir/configs/egov-persister/birth-module-developer-guide.yml has been added to the end of the persist-yml-path property.
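A typical helm chart entry of this shape is sketched below, where <existing-configs> stands for whatever paths your environment already lists:

```yaml
egov-persister:
  # ...other chart values...
  persist-yml-path: "file:///work-dir/configs/egov-persister/<existing-configs>.yml,file:///work-dir/configs/egov-persister/birth-module-developer-guide.yml"
```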
Raise a PR to the appropriate branch of the DevOps repo (master, in the eGov case) which was forked/used to create the deployment. Once that is merged, restart the persister service in your environment so that it picks up this new config for the module.