Available design artifacts
Find below the Open API specifications defined for this guide:
As a part of this guide, we are going to build a single birth registry. We will re-use the user registry from the DIGIT core. This will capture the mother and father details and store them in the user registry. The baby's details will remain in the birth registry.
A single birth service will manage the registry.
Artefacts required before beginning the development phase
The outputs of the design phase are the inputs to the development phase.
The docs below provide the steps and the resources required to design and build the module.
Step 1: Create Project
Generate the project stub using the API specs - prepare the Swagger contracts
Step 2: Create Database
Create database tables
Step 3: Configure Application Properties
Configure and customize the application properties
Step 4: Import Core Models
Import dependent services models for integration
Step 5: Implement Repository Layer
Perform business logic and configure data stores
Step 6: Create Validation Layers
Add business data validation logic
Step 7: Implement Service Layer
Create the service layer required to process business requests
Step 8: Build Web Layer
Implement web controller layer to manage incoming REST requests
The application.properties file is already populated with default values. Read on to learn how to customise and add extra values for your application (if required).
The Kafka topics that need to be added for the module are detailed here.
There are three ways to access services:
a. Run the code locally.
b. Access the service in a DIGIT environment.
c. Access the service locally via port forwarding. This bypasses Zuul's authentication and authorization.
Wherever localhost appears in a URL, Kubernetes port forwarding has been set up from the development environment to the specified port. In your setup, modify the URLs for the various services depending on whether you are accessing them in an environment, running them locally, or accessing them via port forwarding.
For example, if no port forwarding has been done, you will have to provide the FQDN of your DIGIT install instead of localhost. Also, without port forwarding, you will have to update the auth tokens in your .aws profile file periodically.
Include all the necessary service host URLs and API endpoints in the "application.properties" file.
This guide specifically references the User, Localisation, HRMS, IDGen, MDMS, and Workflow services that are operational within the DIGIT development environment.
Add the following properties to the application.properties file.
The following properties must be added to configure the database and the Kafka server. Default values are provided for tuning the Kafka server; these can be overwritten during deployment.
Once you have finished adding the external dependencies to the pom.xml file and reloading the Maven changes, include the following properties in the application.properties file to configure the database and Kafka for development purposes.
Append Kafka configurations as per the specific requirements of the DIGIT services. Each module may use different configurations to manage its topics.
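For local development, the database and Kafka properties typically look like the sketch below. The property names follow standard Spring Boot and spring-kafka conventions; the hosts, credentials, and topic names are placeholders for your setup, not the module's exact values:

```properties
# Database (placeholders - point these at your local PostgreSQL)
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost:5432/birthregistration
spring.datasource.username=postgres
spring.datasource.password=postgres

# Kafka (placeholders - defaults suitable for a local single-broker setup)
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=birth-registration

# Module topic names (illustrative - match these to your persister config)
kafka.topics.save.btr=save-btr-application
kafka.topics.update.btr=update-btr-application
```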
DIGIT backend development guide
This guide provides detailed steps for developers to create a new microservice on top of DIGIT. At the end of this guide, you will be able to run the sample module provided (code provided), test it out locally and also deploy it using CI/CD to your DIGIT environment.
Steps to create a microservice:
Set up your development environment
Develop the registries, services, and APIs for a birth registration module as described in the Design Guide
Integrate with an existing DIGIT environment and re-use a lot of the common services using Kubernetes port forwarding
Test the new module and debug
Build and deploy the new service in the DIGIT environment
The guide is divided into multiple sections for ease of use. Click on the section cards below to follow the development steps.
Access the sample module here. Download and run this in the local environment.
The pom.xml typically includes most of the dependencies listed below at project generation time. Review and ensure that all of these dependencies are present. Update the pom.xml in case any are missing.
Models/POJOs of the dependent service can be imported from the digit-core-models library (work on creating the library is ongoing). These models are used to integrate with the dependent services.
These are pre-written libraries which contain tracer support, common models like MDMS, Auth and Auth and the capability to raise custom exceptions.
Once these core models are imported, it is safe to delete the RequestInfo, and ResponseInfo classes from the models folder and use the ones present in the common contract that is imported.
Before starting development, create/update the following classes as given below.
Delete the following classes which have been generated from codegen:
- Address
- AuditDetails
- Document
- Error
- ErrorResponse
- RequestInfo
- RequestInfoWrapper
- ResponseInfo
- Role
- User
- Workflow

After this, import the above classes from the egov common-service library as follows:
Create the BirthApplicationSearchRequest POJO with the following content -
Update BirthApplicationAddress pojo
Create the BTRConfiguration class within the configuration package. The MainConfiguration class should already exist inside the config package.
The core models are imported.
Once PostgreSQL has been installed (the environment setup section installs v14) and the basic setup is done, we use Flyway to create the tables.
Configure the below properties in the application.properties file to enable flyway migration:
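The exact values depend on your database; a minimal sketch using the standard Spring Boot Flyway property names (adjust URL and credentials for your setup):

```properties
spring.flyway.url=jdbc:postgresql://localhost:5432/birthregistration
spring.flyway.user=postgres
spring.flyway.password=postgres
spring.flyway.locations=classpath:/db/migration/main
spring.flyway.baseline-on-migrate=true
spring.flyway.enabled=true
```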
Add the Flyway SQL scripts in the following structure under resources/db/migration/main:
Add the migration files to the main folder. Follow the specified nomenclature while naming the file. The file name should be in the format V<timestamp>__<module>_<description>.sql.
Example: V20180920110535__tl_tradelicense_ddl.sql
For this sample service, use the following SQL script to create the required tables.
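The actual script ships with the sample module. As an illustration only, a sketch of the two tables this module uses (eg_bt_registration for the baby's details and eg_bt_address for the applicant's address; the column names here are illustrative, not the module's exact schema):

```sql
-- Illustrative DDL sketch; use the sample module's actual migration script.
CREATE TABLE eg_bt_registration (
    id                VARCHAR(64) PRIMARY KEY,
    tenantid          VARCHAR(64)  NOT NULL,
    applicationnumber VARCHAR(64),
    babyfirstname     VARCHAR(128) NOT NULL,
    babylastname      VARCHAR(128),
    fatherid          VARCHAR(64),
    motherid          VARCHAR(64),
    doctorname        VARCHAR(128),
    hospitalname      VARCHAR(128),
    timeofbirth       BIGINT,
    createdby         VARCHAR(64),
    createdtime       BIGINT,
    lastmodifiedby    VARCHAR(64),
    lastmodifiedtime  BIGINT
);

CREATE TABLE eg_bt_address (
    id             VARCHAR(64) PRIMARY KEY,
    tenantid       VARCHAR(64),
    registrationid VARCHAR(64) REFERENCES eg_bt_registration (id),
    addressline1   VARCHAR(256),
    addressline2   VARCHAR(256),
    city           VARCHAR(128),
    pincode        VARCHAR(16)
);
```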
Generate Project Stub
This page provides detailed steps for generating projects using the given API specifications.
Prepare contracts that detail all the APIs that the service is going to expose for external consumption. eGov uses a customised tool.
Download the codegen jar file and make sure it is available in the classpath. Use the codegen jar to generate client SDKs from these Swagger contracts.
Refer to the following tutorials to understand the creation of Swagger contracts -
Use the generic command below to create an API skeleton for any Swagger contract:
The following sequence is used to generate the API skeleton using codegen jar:
Navigate to the folder where you have downloaded the codegen jar.
Execute the following command:
OR
Rename the output folder to birth-registration.
Import it in Eclipse or VS Code.
Update the spring-boot-starter-parent to 3.2.2 in pom.xml.
Perform a maven update once the spring boot version is updated.
Put a slash in front of server.contextPath and add this property to the application.properties file; it helps request handlers serve requests -
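Assuming the service is named birth-registration, the entry might look like the sketch below. Note that on Spring Boot 3.x (which this guide upgrades to) the equivalent standard property is server.servlet.context-path:

```properties
server.contextPath=/birth-registration
server.servlet.context-path=/birth-registration
```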
Add the below external dependencies to pom.xml:
Download the contract and save it in a file locally. Run the following command to generate the skeleton code from the contract.
Learn all about the development pre-requisites, design inputs, and environment setup
The first step is to create and configure a spring boot project
The next step is to integrate the Persister service and Kafka to enable read/write from the DB
Steps on how to integrate with other key DIGIT services
Learn how to integrate the billing and payment services to the module
Learn how to integrate advanced services to the built module
Test run the built application in the local environment
Deploy and run the modules
Follow the steps outlined on this page to set up the DIGIT development environment.
To set up the DIGIT development environment -
Run the Kafka and PostgreSQL on the development machine and re-use other services from the DIGIT development environment. The following tools are required for development:
Install IDE - To create SpringBoot/Java applications it is recommended to use IntelliJ IDE. IntelliJ can be downloaded from the following links -
Install the Lombok plugins for IntelliJ as we use Lombok annotations in this module.
Install Kafka (version 3.2.0 or later) - To install and run Kafka locally, follow the links below -
Install Postman - To install Postman, follow the following links -
Install Kubectl - Kubectl is the tool that we use to interact with services deployed on our sandbox environment - kubectl for windows
https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/
Install aws-iam-authenticator - (if the DIGIT development environment is in AWS) - https://docs.aws.amazon.com/eks/latest/userguide/install-aws-iam-authenticator.html
Install PostgreSQL v14 locally
Add configuration - Post installation of Kubectl, add the following configuration in the Kubernetes config file to enable access to resources in our sandbox environment. Kubernetes config file is available in the user's home directory (which varies from OS to OS).
For example, on a Mac, it is available at:/Users/xyz/.kube/config
Once the config file is created, add the following content to it.
Note: Replace the placeholder keys and tokens in the snippet below with your AWS-specific keys. Contact your system administrator for help with this.
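The file follows the standard kubeconfig structure; a sketch with placeholders (the cluster name, API endpoint, and certificate data come from your system administrator):

```yaml
apiVersion: v1
kind: Config
clusters:
  - name: digit-sandbox
    cluster:
      server: https://<your-cluster-api-endpoint>
      certificate-authority-data: <base64-encoded-ca-cert>
contexts:
  - name: digit-sandbox
    context:
      cluster: digit-sandbox
      user: digit-user
current-context: digit-sandbox
users:
  - name: digit-user
    user:
      exec:
        apiVersion: client.authentication.k8s.io/v1beta1
        command: aws-iam-authenticator
        args: ["token", "-i", "<cluster-name>"]
```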
Once the configuration file is created, access the pods running in the environment by typing: kubectl get pods
This lists all the pods in the development environment.
In case you get an error stating “Error: You must be logged in to the server (Unauthorized)”, add sudo before the command. For example, “sudo kubectl get pods”. That should resolve the error.
The service layer performs business logic on the RequestData and prepares the Response to be returned back to the client.
Follow the steps below to create the service layer.
Create a new package called service.
Create a new class in this folder by the name BirthRegistrationService.
Annotate the class with @Service annotation.
Add the following content to the class -
NOTE: At this point, your IDE may show a lot of errors. Do not worry; we will add all the dependent layers as we progress through this guide, and the errors will go away.
Find the steps to create the Validation Layer and Enrichment Layer on DIGIT.
All business validation logic should be added to this class. For example, verifying the values against the master data, ensuring non-duplication of data etc.
Follow the steps below to create the validation layer.
Create a package called validators. This ensures the validation logic is separate so that the code is easy to navigate through and readable.
Create a class by the name of BirthApplicationValidator
Annotate the class with @Component annotation and insert the following content in the class -
NOTE: For the sake of simplicity the above-mentioned validations are implemented. Required validations will vary on a case-to-case basis.
This layer enriches the request. System-generated values like id, auditDetails, etc. are generated and added to the request. In the case of this module, since the applicant is the parent of a baby and a child cannot directly be a user of the system, both parents' details are captured in the User table. The user IDs of the parents are then enriched in the application.
Follow the steps below to create the enrichment layer.
Create a package under DIGIT by the name of enrichment. This ensures the enrichment code is separate from business logic so that the codebase is easy to navigate through and readable.
Create a class by the name of BirthApplicationEnrichment
Annotate the class with @Component and add the following methods to the class -
NOTE: For the sake of simplicity the above-mentioned enrichment methods are implemented. Required enrichment will vary on a case-to-case basis.
Methods in the service layer, upon performing all the business logic, call methods in the repository layer to persist or lookup data i.e. it interacts with the configured data store. For executing the queries, JdbcTemplate class is used. JdbcTemplate takes care of the creation and release of resources such as creating and closing the connection etc. All database operations namely insert, update, search and delete can be performed on the database using methods of JdbcTemplate class.
On DIGIT the create and update operations are handled asynchronously.
The persister service listens on the topic to which service applications are pushed for insertion and updation. Persister then takes care of executing insert and update operations on the database without clogging the application’s threads.
The execution of search queries on the database returns applications as per the search parameters provided by the user.
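The query-building half of this pattern can be sketched in plain Java. This is not the module's actual code; the class, table, and column names are illustrative. The key idea is that clauses are appended only for the search parameters actually supplied, while the values are collected separately for a prepared statement so user input is never concatenated into the SQL:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the dynamic-WHERE-clause pattern used by DIGIT query builders.
// Table and column names are illustrative, not the module's exact schema.
public class QueryBuilderSketch {

    private static final String BASE_QUERY =
            "SELECT * FROM eg_bt_registration btr";

    // Builds the search query, appending one clause per non-null parameter.
    // The matching values are collected for use in a prepared statement.
    public static String buildSearchQuery(String tenantId,
                                          String applicationNumber,
                                          List<Object> preparedStmtValues) {
        StringBuilder query = new StringBuilder(BASE_QUERY);
        if (tenantId != null) {
            addClause(query);
            query.append("btr.tenantid = ?");
            preparedStmtValues.add(tenantId);
        }
        if (applicationNumber != null) {
            addClause(query);
            query.append("btr.applicationnumber = ?");
            preparedStmtValues.add(applicationNumber);
        }
        return query.toString();
    }

    // Appends WHERE before the first condition and AND before the rest.
    private static void addClause(StringBuilder query) {
        query.append(query.indexOf("WHERE") == -1 ? " WHERE " : " AND ");
    }
}
```

The resulting SQL string and value list would then be handed to JdbcTemplate together with a row mapper.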
Create packages - Add the querybuilder and rowmapper packages within the repository folder.
Create a class - by the name of BirthApplicationQueryBuilder in the querybuilder folder and annotate it with the @Component annotation.
Insert the following content in BirthApplicationQueryBuilder class -
Create a class - by the name of BirthApplicationRowMapper within the rowmapper package and annotate it with @Component.
Add the following content to the class.
Create a class - by the name of BirthRegistrationRepository within the repository folder and annotate it with @Repository annotation.
Add the following content to the class.
The repository layer is implemented.
Follow the steps detailed below to implement Kafka Producer & Consumer.
Producer classes help in pushing data from the application to Kafka topics. DIGIT has a custom implementation of KafkaTemplate class in the tracer library called CustomKafkaTemplate. This implementation of the Producer class does not change across services of DIGIT.
Access the producer implementation details here - Producer Implementation.
The Codegen jar already has created a Producer class. We will continue using it.
Make sure the tracer dependency version in the pom.xml is 2.9.0-SNAPSHOT.
For our guide, we will be implementing a notification consumer in the following section.
Once an application is created/requested or progresses further in the workflow, each of these events is pushed onto a Kafka topic. These topics can be listened to, and an SMS/email/in-app notification can be sent to the concerned user(s).
For our guide, we will be implementing a notification consumer which will listen to the topic on which birth registration applications are created. Create a customised message and send it to the notification service (sms/email) to trigger notifications to the concerned users.
Sending SMS notifications to the customer:
Once an application is created/updated, the data is pushed onto a Kafka topic. We trigger notifications by consuming data from this topic. Whenever a message is consumed, the service calls the localisation service to fetch the SMS template. It then replaces the placeholders in the SMS template with the values in the message it consumed (for example, it replaces the {NAME} placeholder with the owner name from the data consumed). Once the SMS text is ready, the service pushes this data onto the notification topic. The SMS service consumes data from the notification topic and triggers the SMS.
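The placeholder-substitution step described above can be sketched in a few lines of Java. This is an illustrative helper, not the module's actual code; the placeholder names are assumptions:

```java
import java.util.Map;

// Sketch of the template step in the notification consumer: the localisation
// service returns an SMS template with placeholders like {NAME}, and the
// consumer substitutes values taken from the consumed Kafka record.
public class SmsTemplateSketch {

    public static String buildSmsText(String template, Map<String, String> values) {
        String result = template;
        for (Map.Entry<String, String> entry : values.entrySet()) {
            result = result.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return result;
    }
}
```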
Open Kafka/NotificationConsumer.java and paste the following code:
Create a POJO by the name of SMSRequest in the web.models package and add the below content to it:
Create a class by the name of NotificationService under the service folder to handle the preparation of customised messages and the pushing of notifications. Add the below content to it -
Integration with signed audit
Enabling signed audit for a module ensures that all transactions - creates, updates, deletes - are recorded in a digitally signed fashion. Learn more about the signed audit service here.
Enabling signed audit is optional but highly recommended to ensure data security.
Add the following lines of code to the birth registration persister after the fromTopic attribute under mappings:
This page provides the steps on how to add persister configuration.
The persister configuration is written in a YAML format. The INSERT and UPDATE queries for each table are added in prepared statement format, followed by the jsonPaths of values that have to be inserted/updated.
For example, for a table named studentinfo with id, name, age, and marks fields, the following configuration will get the persister ready to insert data into studentinfo table -
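For the studentinfo example, the configuration would look roughly like the sketch below. The serviceName, topic name, and JSON paths are illustrative; the structure (prepared-statement query plus ordered jsonPaths) is the part that matters:

```yaml
serviceMaps:
  serviceName: student-service
  mappings:
    - version: 1.0
      description: Persists student details in the studentinfo table
      fromTopic: save-student
      isTransaction: true
      queryMaps:
        - query: INSERT INTO studentinfo(id, name, age, marks) VALUES (?, ?, ?, ?);
          basePath: Student
          jsonMaps:
            - jsonPath: $.Student.id
            - jsonPath: $.Student.name
            - jsonPath: $.Student.age
            - jsonPath: $.Student.marks
```

The jsonPaths are listed in the same order as the `?` placeholders in the prepared statement.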
Fork the configs repo. Ignore if done already. Clone configs repo in the local environment.
Persister configurations for all modules are present under the egov-persister folder. Create a file by the name of btr-persister.yml (or any other name) under the egov-persister folder. Add the below content to it -
Import all the core-services projects as Maven projects into your IDE. It is assumed that you have already cloned the DIGIT code locally.
Modify the application.properties file in the egov-persister project and set the following property:
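The property name below is the one conventionally used by egov-persister to locate config files (verify it against your checkout of the service); the path must be absolute:

```properties
egov.persist.yml.repo.path=file:///<absolute-path-to>/configs/egov-persister/btr-persister.yml
```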
Note: You can set a comma-separated list of files as the value of the above property. If you are running multiple services locally, then this has to be a comma-separated list of persister config files. Make sure you always give the absolute path.
Make sure the Spring DB configurations and Flyway config reflect the same database as what has been set in the module itself. Otherwise, we will see failures in the persister code.
Make sure the Kafka is running locally. Now, go ahead and run the EgovPersistApplication from the IDE. Check the console to make sure it is listening to the right topics as configured in your module's application.properties file.
The persister is now ready for use.
Note: Below steps are for when you deploy your code to the DIGIT env, not for local development. You may choose to do this when you build and deploy.
Push the code to the appropriate branch from which your environment will read it.
Navigate to your fork of the DIGIT-DevOps repository. Under the deploy-as-code/helm/environments directory, find the deployment helm chart that was used to deploy DIGIT.
In the deployment helm chart (which was used to set up the DIGIT environment), find "egov-persister". Find the "persist-yml-path" property and add the path to your new persister file here.
In the snippet below, the path file:///work-dir/configs/egov-persister/birth-module-developer-guide.yml points to the new persister file.
Raise a PR to the appropriate branch of the DevOps repo (master, in the eGov case) which was forked/used to create the deployment. Once it is merged, restart the persister service in your environment so that it picks up this new config for the module.
Describes how to integrate with DIGIT's ID Gen service
This page provides the steps to integrate with the IDGen Service. Each application needs to have a unique ID. The IDGen service generates these unique IDs. ID format can be customised via configuration in MDMS.
Add the ID format that needs to be generated in this file - Id Format Mdms File. The following config has been added for this module:
Restart the IDGen service and MDMS service and port-forward IDGen service to port 8285:
Note that you can set the ID format in the application.properties file of IDGen service and run the service locally if you don't have access to a DIGIT environment.
Hit the below curl to verify that the format is added properly. The "ID" name needs to match exactly with what was added in MDMS.
Once verified, we can call the ID generation service from within our application to generate the registrationId.
In the BirthApplicationEnrichment class, update the enrichBirthApplication method as shown below:
Make sure the below ID generation host configuration is present in the application.properties file. Make sure to fill in the correct values for the host.
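The property names here are assumptions based on common DIGIT module conventions; replace the host and port to match your port-forwarding setup or environment:

```properties
# IDGen host configuration (names assumed from typical DIGIT modules)
egov.idgen.host=http://localhost:8285/
egov.idgen.path=egov-idgen/id/_generate
```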
Integration with other DIGIT services
A separate class should be created for integrating with each dependent microservice. Only one method from that class should be called from the main service class for integration.
This guide showcases the steps to integrate our microservice with other microservices such as User, Localisation, HRMS, IDGen, MDMS, and Workflow.
For interacting with other microservices, create and implement the following ServiceRequestRepository class under the repository package -
This section provides the complete details of the system pre-requisites, prepping and setting up the system for development.
Follow the steps in the docs resources below:
This section contains information on steps for integrating with the Persister Service and Kafka.
Steps to integrate Persister Service and Kafka:
Implementing the controller layer in Spring
The web/controller layer handles all the incoming REST requests to a service.
Follow the steps below to set up the request handler in the controller layer.
Make a call to the method in the Service Layer and get the response back from it.
Build the responseInfo.
Build the final response to be returned to the client.
The controller class reflects the below content -
NOTE: At this point, your IDE may show a lot of errors. Do not worry; we will add all the dependent layers as we progress through this guide, and the errors will go away.
The web layer is now set up.
The birth registration module follows a simple workflow derived from the swimlane diagrams. Please refer to the design guide for the correlation and for information on how the workflow configuration is derived.
Integration with workflow service requires the following steps -
Create Workflow service - Create a class to transition the workflow object across its states. For this, create a class by the name of WorkflowService.java under the service directory and annotate it with @Service annotation.
Add the below content to this class -
Add workflow to BirthRegistrationService.
Add the below field to BirthRegistrationService.java
Transition the workflow - Modify the following methods in BirthRegistrationService.java as follows. Note that we are adding calls into the workflow service in each of these methods.
Configure application.properties - Add the following properties to application.properties file of the birth registration module. Depending on whether you are port forwarding or using the service directly on the host, please update the host name.
Run the workflow service locally - the application will call into it to create the necessary tables in the DB and effect the workflow transitions.
We will call into MDMS deployed in the sandbox environment. All MDMS config data needs to be uploaded into the MDMS repository (DEV branch if you are deploying/testing in your dev environment).
Integration with MDMS requires the following steps to be followed:
Add a new MDMS file in the MDMS repo. For this guide, a sample MDMS file has already been added. Copy this file into your repository.
Restart MDMS service after adding the new file via Jenkins build UI.
Once restarted, hit the curl mentioned below to verify that the new file has been added properly.
Call the MDMS service post verification from within our application and fetch the required master data. For this, create a Java class by the name of MdmsUtil under utils folder. Annotate this class with @Component and put the following content in the class -
Add the following properties in application.properties file -
Overview
The user service provides the capabilities of creating a user, searching for a user and retrieving the details of a user. This module searches for a user and, if the user is not found, creates that user with the user service.
DIGIT's user service masks the PII that gets stored in the database using the encryption service.
Create a class by the name of UserService under service folder and add the following content to it:
Add the below methods to the enrichment class we created. When we search for an application, the code below will search for the users associated with the application and add in their details to the response object.
Add in a userService object:
And enhance the following two methods in BirthRegistrationService.java:
Add the following properties in application.properties file:
Note: If you're port-forwarding using k8s, use "localhost". Otherwise, if you have a valid auth token, provide the hostname here.
MDMS data is the master data used by the application. New modules with master data need to be configured inside the /data/<tenant>/ folder of the MDMS repo. Each tenant has a unique ID, and sub-tenants can be configured in the format state.cityA, state.cityB, etc. Further hierarchies are also possible with tenancy.
If you've configured the DIGIT environment with a tenant and CITIZEN/EMPLOYEE roles, you have sufficient data to run this module locally. Configuring role-action mapping is only necessary during app deployment in the DIGIT environment and won't be needed for local application execution.
Refer to the MDMS docs for more information, including guidance on how to design MDMS data.
In the birth registration use case, we use the following master data:
tenantId = "pb"
User roles - CITIZEN and EMPLOYEE roles configured in roles.json (see below section for more info)
Actions - URIs to be exposed via Zuul (see below section for more info)
Role-action mapping - for access control (see below section for more info)
Ensure that you add data to the appropriate branch of the MDMS repository. For example, if you've set up CD/CI to deploy the DEV branch of the repository to the development environment (default), add the information to the DEV branch. If you're testing in staging or another environment, make sure to add the master data to the corresponding branch of MDMS.
Create a folder called "pb" in the data folder of the MDMS repository "DEV" branch. You will have a new folder path as follows:
<MDMS repo URL path>/data/pb
Restart the MDMS service in the development environment where DIGIT is running once data is added to the MDMS repository. This loads the newly added/updated MDMS configs.
A sample MDMS config file can be viewed here.
URIs (actions), roles and URI-role mapping will be defined in MDMS. These will apply when the module is deployed into an environment where Zuul is involved (not while locally running the app). In this sample app, we have used "pb" as a tenantId. In your environment, you can choose to define a new one or work with an existing one.
All folders mentioned below need to be created under the data/pb folder in MDMS.
You can choose to use some other tenantId. Make sure to change the tenant ID everywhere.
Actions need to be defined inside the /data/pb/ACCESS-CONTROL-ACTIONS/actions.json file. Below are the actions for the birth registration module. Append this to the bottom of the actions.json file. Make sure the "id" field in the JSON is incremented; it needs to be unique in your environment.
Note that the IDs in the actions.json config are generated manually.
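An illustrative entry is sketched below. The field names are modelled on typical DIGIT actions entries, and the id, url, and displayName are placeholders; use the next free id in your environment and your module's actual API paths:

```json
{
  "id": 2001,
  "name": "Create birth registration application",
  "url": "/birth-registration/v1/registration/_create",
  "displayName": "BTR Create",
  "enabled": true
}
```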
Append the below code to the "roleactions" key in the /data/pb/ACCESSCONTROL-ROLEACTIONS/roleactions.json file. Note that other role-action mappings may already be defined in your DIGIT environment, so make sure to append the below. The actionid refers to the URI ID defined in the actions.json file.
In governments, each region, city or state has custom rates and charges for the same domain. For example, birth certificate registration rates may differ from city to city and the way it is computed can also vary. This cannot be generalised into a platform service. However, billing is a generic platform service.
The way DIGIT solves for this is to "unbundle" the problem and separate out billing from calculations. A customisable service called a "calculator" is used for custom calculation. The calculator then calls the billing service to generate the bill. Each service/module ships with a "default" calculator which can then be customised for that city.
The Postman collection of btr-calculator is linked here.
Information on creating a custom calculator service
This calculator service integrates with the core platform's billing service & generates a demand. A bill can be fetched based on the demand and presented to the user. This page provides details about creating a custom calculator service.
The code for the custom calculator service is available here. A separate API spec is written for the calculator service.
A calculator service typically has three APIs:
_calculate - This API returns the calculation for a given service application.
getbill or createbill - Creates and returns the bill associated with a particular application.
_search - to search for calculations.
The birth registration service calls the _calculate API to generate a demand for birth registration charges.
Workflow configuration should be created based on the business requirements. More details are available in the workflow service documentation.
For our guide, we will configure the following workflow which was the output of the design phase:
We will re-use the workflow service which is deployed in the development/sandbox environment.
This guide assumes you can call the development environment workflow service directly with a valid auth token.
A sample curl is posted below. Make sure to replace the server hostname and the username and password in the below statement:
In Postman, create a new POST request and paste the below content in the body. The URL for the request is http://yourserver.digit.org/egov-workflow-v2/egov-wf/businessservice/_create to create the workflow.
Make sure to replace the authToken field in the body with appropriate auth token in your environment. Login to the server as a CITIZEN or EMPLOYEE user (depending on which one you've created) and obtain the authToken from the response body.
In DIGIT, the API Gateway (Zuul) enriches user information based on the auth token for all requests that go via the gateway. Port forwarding by-passes the API gateway. In this case, when accessing a service directly, for a request to be valid, a user has to send the userInfo JSON inside the RequestInfo object. This is true not just for Workflow but for any service. Sample:
"userInfo": {
"id": 24226,
"uuid": "11b0e02b-0145-4de2-bc42-c97b96264807",
"userName": "sample_user",
"roles": [
{
"name": "Citizen", "code": "CITIZEN"}
]
}
Note that UUID and roles can be dummy place-holder entities in this case for local testing.
Below is the URL and POST body for the business service creation request.
Calculating costs for a service and raising demand for bill generation
The calculation class contains the calculation logic for the birth certificate registration charges. This can vary from city to city. Based on the application submitted, the calculator class will calculate the tax/charges and call the billing service to generate the demand.
What is a demand?
A demand is the official communication sent by a government authority to a citizen requesting them to pay for a service. A demand leads to a bill. When a bill is paid, a receipt is generated. A demand can be modified prior to bill generation.
For our guide, we are going to create a Calculation Service that will call the calculator to generate a demand. Follow the steps below -
Create a class under the service folder by the name of CalculationService.
Annotate this class with the @Service annotation and add the following logic within it -
Roles config happens at the state level. For birth registration, we need only the CITIZEN and EMPLOYEE roles in the /data/pb/ACCESSCONTROL-ROLES/roles.json file. Here are some sample roles that can be defined in an environment. If these roles are already present in the file, there is no need to add them again.
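An illustrative roles.json entry is sketched below; the field names are modelled on typical DIGIT role definitions and the description text is a placeholder:

```json
{
  "code": "CITIZEN",
  "name": "Citizen",
  "description": "Citizen who can apply for birth registration"
}
```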
eg_bt_registration
This table holds the baby's information
eg_bt_address
This table holds the address of the applicant who applied for the birth registration.
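As an illustration only (the column names below are assumptions; the authoritative DDL comes from the Flyway migration scripts created earlier in the guide), the two tables could be sketched as:

```sql
-- Hypothetical sketch of the birth-registry tables.
CREATE TABLE eg_bt_registration (
    id                 VARCHAR(64) PRIMARY KEY,
    tenantid           VARCHAR(64) NOT NULL,
    applicationnumber  VARCHAR(64),
    babyfirstname      VARCHAR(128),
    babylastname       VARCHAR(128),
    timeofbirth        BIGINT,
    createdby          VARCHAR(64),
    createdtime        BIGINT
);

CREATE TABLE eg_bt_address (
    id              VARCHAR(64) PRIMARY KEY,
    tenantid        VARCHAR(64) NOT NULL,
    registrationid  VARCHAR(64) REFERENCES eg_bt_registration (id),
    addressline1    VARCHAR(256),
    city            VARCHAR(128),
    pincode         VARCHAR(16)
);
```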
Add Kafka Configuration
Implement Kafka Producer & Consumer
Add Persister Configuration
Run the deployed application in the local environment
It is time to test our completed application! Follow the steps on this page to test and run the deployed application.
Before testing our application, we need to run a couple of services locally -
To run the persister service locally -
Run Kafka.
Open the egov-persister folder, which should be present under the core-services repository.
Update the following properties in the application.properties file -
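For example (the file path below is an assumption about where you have checked out the configs repo and what you named your persister config; the property names are the ones egov-persister typically reads):

```properties
kafka.config.bootstrap_server_config=localhost:9092
egov.persist.yml.repo.path=file:///home/user/configs/egov-persister/<your-persister-config>.yml
```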
To run indexer service locally (optional) -
Run Kafka.
Run ElasticSearch.
Open the egov-indexer folder, which should be present under the core-services repository.
Update the following properties in the application.properties file -
For example, the path to config files would be something like -
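A sketch, assuming the configs repo is checked out at /home/user/configs (the property name is the one egov-indexer typically reads):

```properties
egov.indexer.yml.repo.path=file:///home/user/configs/egov-indexer/digit-developer-guide.yml
```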
To run and test our sample application, follow the below steps -
Ensure that Kafka, Persister, Indexer and PDF services are running locally, and run the code of the birth-registration-service from the DIGIT_DEVELOPER_GUIDE branch (for consistency).
Port-forward the following services -
egov-user to port 8284 (e.g. kubectl port-forward egov-user-58d6dbf966-8r9gz 8284:8080)
egov-localization to port 8286 (e.g. kubectl port-forward egov-localization-d7d5ccd49-xz9s9 8286:8080)
egov-filestore to port 8288 (e.g. kubectl port-forward egov-filestore-86c688bbd6-zzk72 8288:8080)
egov-mdms to port 8082 (e.g. kubectl port-forward egov-mdms-service-c9d4877d7-kd4zp 8082:8080)
Run birth-registration-service that we just created.
Import the Postman collection of the APIs that this sample service exposes from here - Birth Registration Postman Collection
Setup environment variables in Postman:
hostWithPort - Eg. yourserver.digit.org:8080 or yourserver.digit.org if the service is running on port 80.
applicationNumber - used in the search/update requests to search for that specific application number. Set it after making the create birth registration application call.
In case no workflow has been configured, run the scripts to configure and search for workflow. Double-check ID gen by running the ID gen script.
a. Hit the _create API request to create a birth registration application.
b. Hit the _search API request to search for the created birth registration applications.
c. Hit the _update API request to update your application or transition it further in the workflow by changing the actions.
A list of FAQs for developers by developers
The final step in this process is the creation of configurations to generate a birth registration PDF for citizens to download. For this, we will make use of DIGIT's PDF service, which uses the PDFMake and Mustache libraries to generate PDFs. Detailed documentation on generating PDFs using the PDF service is available here.
Follow the steps below to set up the PDF service locally and generate PDFs for our birth registration service -
Clone DIGIT Services repo.
Clone Configs repo.
Navigate to the DIGIT-Dev repo and open up a terminal. Checkout DIGIT_DEVELOPER_GUIDE branch.
Navigate to the configs folder and, under the pdf-service data config, create a file by the name of digit-developer-guide.json.
Add the following content to this newly created data config file -
Navigate to the format-config folder within the pdf-service folder. Create a file by the name of digit-developer-guide.json and add the below content to it -
Open the PDF service (under the core-services repository of DIGIT-Dev) in your IDE. Open the Environment.js file and change the following properties to point to the local config files created. For example, in my local setup I have pointed these to the local files that I created -
Make sure that Kafka and Workflow services are running locally and port-forward the following services -
egov-user to port 8284
egov-localization to port 8286
egov-filestore to 8288
egov-mdms to 8082
The PDF service is now ready to be started. Execute the following commands to start it up -
Once the PDF service is up, hit the following cURL to look at the created PDF -
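As an illustration only (the host/port, key, and tenantId values are assumptions based on the config created above, and the request body is kept in a separate placeholder file containing the RequestInfo and module data your data config references):

```sh
curl -X POST 'http://localhost:8080/pdf-service/v1/_create?key=digit-developer-guide&tenantId=pb' \
  -H 'Content-Type: application/json' \
  -d @request-body.json
```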
Note: Follow the steps below when the code is deployed to the DIGIT environment. These steps are not applicable for deployment in the local environment. You may choose to follow these when you build and deploy.
Navigate to the forked DIGIT-DevOps repository.
Find the deployment helm chart that was used to deploy DIGIT within the deploy-as-code/helm/environments directory.
Find "pdf-service" in the deployment helm chart (which was used to set up the DIGIT environment).
Find the "data-config-urls" property.
Add the path to your new PDF config file here. For this module, we have added file:///work-dir/configs/pdf-service/data-config/digit-developer-guide.json to the end of the data-config-urls. The code block is shown below for reference:
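A sketch of what the pdf-service block might look like after the change (the pre-existing URLs are represented by a placeholder; your chart will list the actual config files already deployed):

```yaml
pdf-service:
  data-config-urls: "<existing-config-urls>,file:///work-dir/configs/pdf-service/data-config/digit-developer-guide.json"
```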
Raise a PR for this to the appropriate branch of DevOps which was forked/used to create the deployment.
Restart the PDF service in the k8s cluster, once the PR is merged. It will pick up the latest config from the file above.
The indexer is designed to perform all the indexing tasks of the DIGIT platform. The service reads records posted on specific Kafka topics and picks the corresponding index configuration from the YAML file provided by the respective module configuration. Configurations are YAML-based. A detailed guide to creating indexer configs is available in the following document - Indexer Configuration Guide.
Create a new file under egov-indexer in the configs repo by the name of digit-developer-guide.yml and place the below content into it -
Note: Follow the steps below when the code is deployed to the DIGIT environment. These steps are not applicable for deployment in the local environment. You may choose to follow these when you build and deploy.
Navigate to the forked DIGIT-DevOps repository. Under the deploy-as-code/helm/environments
directory, find the deployment helm chart that was used to deploy DIGIT.
In the deployment helm chart (which was used to set up the DIGIT environment), find "egov-indexer". Find the "egov-indexer-yaml-repo-path" property and add the path to your new indexer file here. The code block is shown below for reference:
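A sketch of the resulting egov-indexer block (the pre-existing YAML paths are represented by a placeholder; your chart will list the indexer configs already deployed):

```yaml
egov-indexer:
  egov-indexer-yaml-repo-path: "<existing-yaml-paths>,file:///work-dir/configs/egov-indexer/digit-developer-guide.yml"
```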
Raise a PR to the DevOps branch which was forked/used to create the deployment. Once that is merged, restart the indexer service and make sure the cluster configs are propagated.
Status of payment
A demand leads to a bill which then leads to payment by a citizen. Once payment is done, the application status has to be updated. Since we have a microservices architecture, the two services can communicate with each other either through API calls or using events.
The collection service publishes an event on a Kafka topic when a payment is collected for an application. Any microservice that wants to be notified when payments are made can subscribe to this topic. Once the service consumes the payment message, it checks whether the payment was made against its own module by comparing the businessService code. If it matches, the application status is changed to PAID or the application is transitioned further in the workflow.
For our guide, follow the steps below to create payment back update consumer -
Create a consumer class by the name of PaymentBackUpdateConsumer. Annotate it with @Component annotation and add the following content to it -
Create a new class by the name of PaymentUpdateService in the service folder and annotate it with @Service. Put the following content in this class -
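The steps above can be sketched in plain Java as follows. This is a simplified illustration of the back-update check only (the businessService code "BTR", the class name, and the field set are assumptions; the real consumer is a Spring Kafka listener that deserializes the full PaymentRequest and calls the repository to persist the status change):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the core check inside PaymentUpdateService:
// only payment details raised against our module's businessService code
// should trigger a status update to PAID.
public class PaymentBackUpdateSketch {

    static final String MODULE_BUSINESS_SERVICE = "BTR"; // assumed code

    static class PaymentDetail {
        final String businessService;
        final String consumerCode; // the application number the bill was raised for
        PaymentDetail(String businessService, String consumerCode) {
            this.businessService = businessService;
            this.consumerCode = consumerCode;
        }
    }

    // Returns the application numbers whose status should be moved to PAID.
    static List<String> applicationsToMarkPaid(List<PaymentDetail> details) {
        List<String> result = new ArrayList<>();
        for (PaymentDetail d : details) {
            if (MODULE_BUSINESS_SERVICE.equals(d.businessService)) {
                result.add(d.consumerCode);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<PaymentDetail> details = List.of(
                new PaymentDetail("BTR", "BT-0001"),
                new PaymentDetail("PT", "PT-0042")); // a property-tax payment, ignored
        System.out.println(applicationsToMarkPaid(details)); // prints [BT-0001]
    }
}
```

This filtering is why the businessService code matters: every module's bills flow through the same payment topic, and each consumer acts only on its own.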
Create the following POJOs under models folder:
PaymentRequest.java
Payment.java
PaymentDetail.java
Bill.java
BillDetail.java
BillAccountDetail.java
Follow the instructions on this page to build and deploy applications on DIGIT.
eGov recommends CD/CI be set up before developing on top of DIGIT. This ensures that new modules can be developed and deployed in a streamlined way. DIGIT ships with CI as code as part of the DevOps repository. Run the CI installer to set up DIGIT CD/CI before developing on DIGIT.
Step 1: Add entry in build-config.yaml file in the master branch of the forked MDMS repository. This will set up the job pipeline in Jenkins. Make sure to also add the same config to the feature branch you are working on. Refer to this example here.
Step 2: Follow the instructions for the persister, indexer and PDF service configuration.
Step 3: Go to the Jenkins build page, select "Job Builder" and click on "Build now". This will pull the config from build-config.yaml and identify all modules that need to be built.
Step 4: Once the build is done, go to your Jenkins build page. The service will appear under the repository path in which it has been added, i.e. if the service is added under core-services, it will show up in the core-services section on the aforementioned page.
Step 5: Most likely, you will be working on a feature branch for this module and not on "master". Click on "Build with parameters" for the module and search for the branch name in the filter box. Select the feature branch you are working on and then click "Build". This will make sure that Jenkins builds the module pulling code from the branch you prefer.
Step 6: Click on "Console Output". If the build pipeline and docker registries have been set up properly as part of the CD/CI setup, the docker image will be built and pushed to the registry. The console output will have the docker image ID for the module. Scroll down to the bottom and copy the following information -
Step 7: After copying the docker image ID, go to your Jenkins server home page, click on "Deployments" and scroll to find your deployment environment. Deployment environments have the template of deploy-to-<env name> and get created as part of the CD/CI setup. If multiple environments have been configured, you will see multiple deploy-to-* entries.
Step 8: It is best practice to always test out any new module in the dev environment. Select the environment you would like to deploy to and click on the "Run" icon on the right-hand side of the page against the environment. In the Images text box, paste the copied docker image ID and click "Build". Refer to the screenshot below.
Jenkins will now take care of deploying the new image to the DIGIT environment.
Step 9: Test your new service by testing out the APIs via Postman.