iFIX Core v2.3 contains the following changes.
The data stores have been shifted:
From MongoDB to PostgreSQL
From Druid to ElasticSearch
Updated Feature | Description |
---|---|
TransactionalDB moved to PostgreSQL | |
Analytics DB moved to ElasticSearch | |
Updated dashboard is built using DSS | |
Technical Documents
With the iFIX v2.3-alpha update, some of the DIGIT core services also need to be deployed. The builds of the same are also listed below. These have been picked from DIGIT-v2.7.
Infrastructure Setup
iFIX/mGramSeva is a microservices-based distributed cloud-native application. The microservices streamline processes to meet outcomes at scale and speed. Each of the microservices is dockerized and deployed on Kubernetes infrastructure.
It is essential to understand some of the key concepts, benefits and best practices of the Kubernetes platform before proceeding with the deployment of iFIX/mGramSeva.
Know the basics of Kubernetes:
Know the commands
Know Kubernetes manifests:
Know how to manage env values, secrets of any service deployed in Kubernetes
Know how to port forward to a pod running inside k8s cluster and work locally
Know sops to secure your keys/creds:
Choose the target infra type and follow the instructions to set up a Kubernetes cluster before moving on to the deployment.
Before we begin the deployment, it is important to understand the deployment architecture that starts from the source code to the production-ready stage. Deploying and managing Kubernetes have emerged as a streamlined way to deploy containers in the cloud infrastructure. When running Kubernetes at scale, managing, operating, and scaling its infrastructure to maximize cluster utilization can be challenging. There are too many parameters the development team needs to manage and configure. This includes selecting the best instance type and size, determining when to scale up or down, and making sure all of the containers are scheduled and running on the best instances — and that is even before starting to think about cost resource optimization.
The simplest way to get started with the deployment process is to manage deployment configuration as code. Each service deployment configuration is defined as Helm charts and deployed into the Kubernetes cluster. We can collocate the deployment-as-code with the source code, leveraging all the benefits of source control including change tracking and branching, and then package it. The source code repo below contains all the deployment-as-code for iFIX.
Clean up the cluster setup using the command below, if required. This deletes the entire cluster and other cloud resources that were provisioned for the mGramSeva Infra Setup.
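As an illustration, assuming the infra was provisioned with the Terraform scripts in the iFix-DevOps repo, the teardown is the standard Terraform destroy run from the same directory:

```bash
# Run from the terraform directory used to provision the cluster
# (e.g. Infra-as-code/terraform/<your-env>); this is irreversible.
terraform destroy
```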
With this, the local/cloud infrastructure setup and the deployment of iFIX into the Kubernetes cluster are complete.
Migration to DIGIT architecture primarily involves the following changes to iFIX-Core:
Transactional DB is switched from MongoDB to PostgreSQL
Analytics DB is switched from Druid to ElasticSearch
Correspondingly, the dashboard is switched from Metabase to the DSS dashboard
This migration will require the following tasks to be performed:
Copy data to ElasticSearch
Copy data to PostgreSQL
Fiscal Events
Apart from iFIX-Core, other services that relied on MongoDB have also moved to PostgreSQL, so their data needs to be migrated as well.
Department Entity Service
Department Hierarchy Level
Department Entity
Adapter Master Data Service
Expenditure
Department
Project
About the Platform
The platform is built as a Digital Public Good and follows a key set of design principles listed below.
Single Source of Truth - Data resides in multiple systems across departments and getting an integrated, consistent and disaggregated view of data is imperative.
Federated - Central, State and Local Governments represent the federated structure of government. This federated structure must be taken into account while designing systems that enable intergovernmental information exchange.
Unbundled - Break down complex systems into smaller manageable and reusable parts that are evolvable and scale independently.
Minimum - Minimum data attributes that have a well-defined purpose are mandatory; everything else is optional.
Privacy and Security - Ensuring the privacy of citizens, employees and users while ensuring the system's security.
Performance at Scale - The system is designed to perform at scale.
Open Standards - The system is based on open standards so that it is easy to discover, comprehend, integrate, operate and evolve.
The platform applies the microservice architecture concept. Access the list of services supported by iFIX here.
The services are designed to comply with standards which consist of the Common Information Model and APIs. Click here to view the details.
To set up the platform, follow the installation steps listed here.
Category | Services | Docker Artifact ID | Remarks |
---|---|---|---|
IFIX Domain Services | IFIX Master Data Service | | Removed government master from the service; migrated chart of account from MongoDB to PostgreSQL |
 | Fiscal Event Service | | Moved to DIGIT architecture |
 | IFIX ES Pipeline | | New Kafka stream pipeline to transform data that will be stored in ES |
 | Fiscal Event Post Processor | | Changes maintain compatibility with other updated services |
Frontend | DIGIT-UI | | |
Core Services | dashboard-analytics | dashboard-analytics:v1.1.7-1ffb5fa2fd-49 | |
 | egov-mdms-service | egov-mdms-service:v1.3.2-72f8a8f87b-12 | |
 | egov-indexer | egov-indexer:v1.1.7-f52184e6ba-25 | |
 | egov-localization | egov-localization:v2.5-log4j-12a9331b-2 | |
 | egov-persister | egov-persister:v1.1.4-72f8a8f87b-6 | |
MGramSeva iFIX Adapter v2.3 contains the changes below.
Data store moved from MongoDB to ElasticSearch
Non-backward-compatible minor change to the /_create and /_update APIs of the department-entity and project masters.
v2.0-alpha release details
IFIX 2.3-alpha release details and highlights are available in the below release notes.
With this update, the iFIX Adapter services now depend on some of the DIGIT Core services. egov-persister should be deployed and configured.
Category | Services | Docker Artifact ID | Remarks |
---|---|---|---|
IFIX Adaptor | IFIX Department Entity Service | | |
 | Adapter Master Data Service | | |
Deploy ifix-migration-toolkit-db:develop-b7a2050-13 to the ifix environment.
Port forward to the ifix-migration-toolkit pod
The following common request body is used to hit the ifix-migration-toolkit endpoints:
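A minimal sketch of these two steps is shown below; the deployment name, namespace, local port and endpoint path are assumptions for illustration, and the request body file should contain the common request body referred to above:

```bash
# Port-forward the migration toolkit to localhost (names/ports are placeholders).
kubectl port-forward deploy/ifix-migration-toolkit 8080:8080 -n ifix

# Call a migrate endpoint with the common request body saved in request-body.json
# (the exact path comes from the migration-toolkit's API contract).
curl -X POST "http://localhost:8080/migration/coa/_migrate" \
  -H "Content-Type: application/json" \
  -d @request-body.json
```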
Government master data is moved to MDMS. Manually copy the government (tenant) data to the tenants master of the tenant module. A sample is linked here.
First, deploy the latest image of the ifix-master-data-service (so that the tables get created in PostgreSQL).
Deploy the plain-search Docker image of ifix-master-data:
Ensure the corresponding persister configuration is set up in egov-persister.
Hit the Chart of Account migrate endpoint of the migration-toolkit.
With this, we migrate data from MongoDB to both PostgreSQL and ElasticSearch. Data is read from MongoDB using the fiscal-event-service plain-search API and pushed to a Kafka topic. From there, the data reaches PostgreSQL via egov-persister.
From the same Kafka topic, the data passes through ifix-es-pipeline, which posts the processed data to the next Kafka topic in the pipeline. From there, egov-indexer reads the topic and indexes the data into ElasticSearch.
First, deploy the latest image of the fiscal-event-service (so that the tables get created in PostgreSQL).
Follow the instructions in the ifix-es-pipeline project's README to set up the ES mapping before performing the migration.
Deploy plain search docker image of fiscal-event-service:
Deploy the latest image of ifix-es-pipeline
Hit the Fiscal Event migrate endpoint of the migration-toolkit
Ensure the relevant persister configuration is set up in egov-persister.
Ensure the relevant indexer configuration is set up in egov-indexer.
(Ensure the migration-progress configuration is set up in egov-persister to record the progress of the migration. If the migration fails midway, it resumes from where it left off.)
Deploy ifix-migration-toolkit-db:develop-2557812-9 to the mgramseva environment.
Port forward to the ifix-migration-toolkit pod
The following common request body is used to hit the ifix-migration-toolkit endpoints:
First, deploy the latest image of the ifix-department-entity-service (so that the tables get created in PostgreSQL).
Deploy plain search docker image of ifix-department-entity-service:
Ensure department-entity-persister.yml is configured in the egov-persister.
(Ensure department-entity-migration-progress is configured in egov-persister to record the progress of the migration. If the migration fails midway, it resumes from where it left off.)
Hit the Department Hierarchy migrate endpoint of the migration-toolkit
Hit the Department Entity migrate endpoint of the migration-toolkit
First, deploy the latest image of the ifix-department-entity-service (so that the tables get created in PostgreSQL).
Deploy plain search docker image of ifix-department-entity-service:
Ensure adapter-master-data-service.yml is configured in egov-persister.
Hit the Department migrate endpoint of the migration-toolkit
Hit the Expenditure migrate endpoint of the migration-toolkit
Hit the Project migrate endpoint of the migration-toolkit
iFix specification details
iFix is a fiscal data exchange platform that enables the exchange of standardized fiscal data between various agencies and ensures the visibility of fiscal data. iFix makes it possible to chain fiscal data together and establish a chain of custody for the entire lifecycle from budgeting to accounting.
From the iFIX perspective, there are two types of agencies
Fiscal Data Provider - posts the fiscal data into iFix using well-defined formats.
Fiscal Data Consumer - can query the fiscal data.
Both these roles are interchangeable.
Providers and consumers need to register on iFix before they can post or query fiscal data. To register, the concerned person from the agency must provide the following information on the iFix portal - Name of the Agency, Contact Person's Name, Contact Person's Phone Number, and Contact Person's Official Email Address. The OTP sent to this email address should be used to complete the registration.
Registered users -
logs in to the iFix portal using the official email address
registers one or more systems as a provider or consumer or both
provides the name of the system
a unique ID is generated for each system (example: mgramseva@punjab.ifix.org)
a secret API key is also generated for each system - use this key to post or query fiscal data
The API key can be regenerated if required - only one API key is active at a given point in time
The portal provides the ability to generate new keys for each system.
Fiscal data providers post the fiscal data in two ways.
A fiscal message - is directed to a specific consumer and is delivered to intended consumers. These messages are available for query by intended consumers only.
A fiscal event - iFix stores the events for consumers to query
Fiscal Event consists of
Header
From
To
Date of Posting
Body
Fiscal Event Type e.g. Revenue, Expenditure, Debt
Fiscal Event Subtype
Revenue - Estimate, Plan, Demand, Receipt, Credit
Expenditure - Estimate, Plan, Bill, Payment, Debit
Debt - in progress - will be provided later.
Array of fiscal line items
Amount
CoA
Location Code - from Location Registry
Program Code - from Program Registry
Project Code - from Project Registry
Administrative Hierarchy Code from Administrative Hierarchy Registry
Start Date of Period
End Date of Period
….
….
Attachment - Attachments consist of additional attributes like key-value pairs e.g. Account Number, Correlation ID or Documents
Signature - Fiscal messages can be signed by multiple agencies; each agency adds its signature to the Signature array, which contains the values below -
Array of Signature
System
eSign - Signed Value of the Fiscal Event/Message Body using the System Key
Purpose - Acknowledgement or Approval or Rejection
Comments
Date of Signing
Data providers can reverse a previous fiscal message or event. The data provider reverses the data by posting the same event with a negative amount in the line item(s). The data consumers should handle reversals appropriately.
Data consumers can query fiscal data. They can query Messages - the unread messages that have been delivered to them. When consumers read the unread messages, these messages are marked as read. Events - Consumers can also query fiscal events posted by other data providers.
Location
Administrative
Chart of Account
Department Entity (For example, Ropar Water Department)
Business Services
Fiscal Event
Post-Fiscal Event
Search Fiscal Event
This object captures the fiscal information of external systems.
Captures the department attributes.
Captures the COA data as a map.
Captures the Expenditure attributes - encapsulates all scheme and non-scheme expenditure details.
Captures the Project attributes.
Platform architecture
The iFIX platform architecture and the interacting systems are divided into 3 parts:
iFIX Platform
External Agency Systems
Common Reference Data
Fiscal Event Service: It receives/sends fiscal events from/to the external systems.
Fiscal Event Store: After basic checks, the raw fiscal events are stored in the fiscal event store.
Fiscal Event Post Processor: Post Processor de-references the master data and flattens the object to store the event in the analytical data store.
Fiscal Event Analytical Store: Fiscal Events will be stored in a separate analytical store to run analytics queries.
Fiscal Event Analytical Service: All the analytics queries might not be able to run on the raw fiscal events, so a new fiscal event analytical service is developed to process the raw data and prepare it for the analytics queries.
Client Registry: All the registered clients of iFIX are maintained in this registry.
Fiscal Event Producer: They are the on-ground systems that generate new financial transactional information.
Producer Adapter: It helps the producer systems send the fiscal information to the iFIX Platform and link that transaction with some common reference data.
Fiscal Event Consumer: These are systems on the other end of the transaction that execute some process based on the fiscal transaction information.
Consumer Adapter: It helps communicate fiscal events from the iFIX Platform to the consumer systems.
Reference Dashboard: Based on the consumed fiscal events, reference dashboards can be built with aggregated amounts by reference data.
We can link fiscal events with common reference data to derive more value from the fiscal events data. This allows us to run better analytics queries.
Administrative Registry: It will maintain the hierarchy of entities for a department integrated with iFIX.
Program Registry: It will maintain the master data for government missions/schemes.
Location Registry: It will maintain location reference data. This data is shared across all iFIX clients.
Chart of Account Registry: It will maintain the chart of account details.
iFix Infra Setup & Deployment
Tenant Management
Master Data Management (Chart of Accounts)
Fiscal Project Management
Immutable Fiscal events
Type of events: Bill, Demand, Payment, Receipt (budgetary, cash and accrual events in the budget cycle)
Deduplication, Reversals, Adjustments - Planned
Fiscal Messaging and Subscription (coordination between different entities) - Planned
Fiscal reporting and analytics - Planned
Reference Adaptor
Reference Dashboard
The Azure Kubernetes Service (AKS) is one of the Azure services used for deploying, managing, and scaling any distributed and containerized workloads. Here we provision the AKS cluster on Azure from the ground up in an automated way (infra-as-code) using Terraform, and then deploy the DIGIT-iFIX services config-as-code using Helm.
This quickstart assumes a basic understanding of Kubernetes concepts. For more information, see the Kubernetes core concepts for AKS.
If you don't have an Azure subscription, create a free account before you begin.
Use the Bash environment in Azure Cloud Shell.
If you prefer, install the Azure CLI to run the CLI reference commands locally.
If you're using a local installation, sign in to the Azure CLI by using the az login command. To finish the authentication process, follow the steps displayed in your terminal. For additional sign-in options, see the Azure CLI sign-in documentation.
When you're prompted, install Azure CLI extensions on first use. For more information about extensions, see the Azure CLI extensions documentation.
Run az version to find the version and dependent libraries that are installed. To upgrade to the latest version, run az upgrade.
This article requires version 2.0.64 or greater of the Azure CLI. If using Azure Cloud Shell, the latest version is already installed.
Make sure the identity you are using to create your cluster has the appropriate minimum permissions. For more details on access and identity for AKS, see the AKS access and identity documentation.
Install kubectl on your local machine to interact with the Kubernetes cluster.
Install Helm, which helps you package the services along with the configurations, envs, secrets, etc. into a Helm chart.
Install Terraform (version 0.14.10) for the Infra-as-code (IaC) to provision cloud resources as code with the desired resource graph; it also helps destroy the cluster in one go.
Note: Run the commands as administrator if you plan to run the commands in this quickstart locally instead of in Azure Cloud Shell.
Before we provision the cloud resources, we need to understand and be sure about what resources need to be provisioned by terraform to deploy DIGIT. The following picture shows the various key components. (AKS, Worker Nodes, Postgres DB, Volumes, Load Balancer)
Considering the above deployment architecture, the following is the resource graph that we are going to provision using terraform in a standard way so that every time and for every env, it'll have the same infra.
AKS (Azure Kubernetes Service) master
Worker node group (VMs with the estimated number of vCPUs and memory)
Volumes (persistent volumes)
PostgreSQL database
Virtual Network
Users for access, deployment and read-only roles
Here we have already written the terraform script that provisions the production-grade DIGIT Infra and can be customized with the specified configuration.
The following main.tf contains the detailed resource definitions that need to be provisioned; please have a look at it.
Dir: iFix-DevOps/Infra-as-code/terraform/aks-ifix-dev
You can define your configurations in variables.tf and provide the environment-specific cloud requirements so that using the same terraform template you can customize the configurations.
The following values need to be mentioned in the files below; the blank ones will be prompted for input during execution.
Now that we know what the terraform script does, the resource graph it provisions, and the custom values to be given for your env, let's run the terraform scripts to provision the infra required to deploy DIGIT on Azure.
First, cd into the following directory and run the commands below one by one, watching the output closely.
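A sketch of the standard Terraform workflow for this step (directory taken from the path above):

```bash
cd iFix-DevOps/Infra-as-code/terraform/aks-ifix-dev
terraform init    # download providers and initialise the working directory
terraform plan    # review the resources that will be created
terraform apply   # provision the AKS cluster and related resources
```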
Upon successful execution, the following resources get created, which can be verified with the command "terraform output":
Network: Virtual Network.
AKS cluster: with nodepool(s), master(s) & worker node(s).
Storage(s): for es-master, es-data-v1, es-master-infra, es-data-infra-v1, zookeeper, kafka, kafka-infra.
This downloads the credentials and configures the Kubernetes CLI to use them.
Finally, verify that you are able to connect to the cluster by running the following command:
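For example (resource group and cluster names are placeholders; use the values from your variables.tf):

```bash
# Fetch the AKS credentials into your kubeconfig, then check the nodes.
az aks get-credentials --resource-group <resource-group> --name <cluster-name>
kubectl get nodes
```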
iFIX business specifications
This page provides the details of the fiscal event-based approach for building out iFIX as an information exchange platform. These details are used to define the technical specifications for iFIX and the functional specifications that are open for validation and inputs internally and from the ecosystem.
The specifications have been defined from the lens of a sub-national government (the state in the current context) and are limited to interactions from a financial information perspective. The interactions covered on this page include:
State Finance Department (FD) and Central (National) Government
State Finance Department (FD) and Line Department(s) at the state level
State Line Department’s interactions with other line departments
State Line Department’s interactions with government autonomous bodies including local government and/or non-government agencies
The functional specifications are built using publicly available information and inputs from the Department of Water Supply and Sanitation (DWSS), Punjab, and Finance Department, Punjab. Financial information-related interactions of the Central Government with other Central Line Departments are out of the scope of the current specs.
The broad objective of iFIX is to enable the flow of reliable and verifiable fiscal information in a timely manner. Recognizing the multiplicity of unique types of information flows in the current PFM system, iFIX aims at simplifying this network of information flow. The key driver of this simplification will be the use of standardised formats for fiscal information exchange. To arrive at a crystallised set of formats and protocols covering all fiscal information exchanges, the following steps have been followed:
Step 1 - Define what is a fiscal event (and its types) or what is the scope of relevant fiscal information from an iFIX perspective (refer section)
Step 2 - Document the current public finance management processes at a generic level, i.e. defined using actors, verbs, inputs and outputs to ensure they are representative of all minor variants of the process at the level of various sub-national governments in India. (refer to tables in , columns 1-6)
Step 3 - Apply the definition of the fiscal event to the current processes to collapse or abstract the fiscal event essence of the whole process to a set of fiscal events. (refer to tables in , columns 7-9).
Step 4 - Extract the current data attributes used for fiscal information exchange to define the format for fiscal information exchange. (refer to tables in , columns 10)
Events that trigger the generation of relevant fiscal information, at any stage in the budget cycle, are termed fiscal events. To be classified as a fiscal event, the event will need to meet one of the following criteria:
Transaction-Based Fiscal Events: Such fiscal events are triggered when there is fiscal information being generated due to an actual change of hands of a financial asset or in simple words due to a financial transaction.
Non-Transactional Fiscal Events: While there are numerous non-transactional events that occur throughout the budget cycle, these are termed Fiscal events only if they meet at least one of the following criteria:
Minimum Degree of Finality: A non-transactional event will be considered a fiscal event only when the action resulting from or the document produced from the event has definitive implications for the budgetary cycle. Examples:
A draft of the budget that has been prepared at the state line department level (DDO) and sent to the next competent authority (BCO) for approval will trigger a fiscal event. The reason for this is that it is assumed that the line department at its level has done the calculations for arriving at a final figure which then needs to be approved by the next competent authority.
With respect to any payments to be made out of state treasury, any verbal/ email-based intra-Finance department go-ahead (between State Treasury and Cash Planning Department) to make the payment will not be considered a Fiscal Event. Only the generation of Payment Advice to the concerned bank will trigger a fiscal event.
Change in ability to claim/ use/ dispose of a financial asset: A non-transactional event will be considered a fiscal event if it results in (de-)authorizing a certain individual or entity to claim/ utilize/ dispose of a financial asset. Examples:
Allocation of the budget by the department to respective DDOs authorises the DDOs to utilize the funds as planned in the budget and will trigger a fiscal event.
The fiscal information generated during each fiscal event needs to be recorded in a specific format for the purpose of exchange. Based on the current budgetary practices and guidelines followed at the sub-national and national levels in India, the fiscal events triggered during the course of the budget cycle can be classified into four major types:
Revenue Receipts: Revenue receipts comprise receipts that do not result in the creation of a liability on the government. The total revenue receipts include State’s Own Tax and Non-Tax revenues and Grants-in-Aid and Share in Central Taxes from the Government of India. The non-tax revenues consist mainly of interest and dividends on investments made by the Government, fees and other receipts for services rendered by the Government.
Capital Receipts: The capital receipts are loans raised by the Government from the public (these are termed as market loans), borrowings by the Government through the sale of Treasury Bills, the loans received from Central Government and bodies, disinvestment receipts and recoveries of any loans and advances given.
Revenue Expenditure: Revenue expenditure is for the normal running of different Government Departments and for the rendering of various services, making interest payments on debt, meeting subsidies, grants in aid, etc. Broadly, the expenditure which does not result in the creation of assets for the Government of India is treated as revenue expenditure. All grants given by the State are also treated as revenue expenditure even though some of the grants may be used for the creation of capital assets.
Capital Expenditure: Capital payments consist of capital expenditure on the acquisition of assets like land, buildings, machinery, and equipment, as also investments in shares, etc., and loans and advances made by the State Government to boards, corporations and other institutions.
Further within each major type of fiscal event, there are varied types of events. To enable information exchange using an easily understandable and standardised format, subtypes of fiscal events are identified based on the similarity in nature of fiscal information generated due to these events.
Budget Cycle comprises the following stages:
Budget Planning
Budget Preparation
Budget Approval
Budget Allocation
Budget Execution
Budget Accounting
Budget Auditing
Additionally, Budget Planning is an activity that sits outside of the Budget Cycle and takes place throughout the year based on need. Examples:
A scheme/project announced by a state government official can happen at any point of the year, the planning for which begins right after the announcement. Thereafter, all the required approvals happen and estimates are prepared accordingly which then feed into the budget preparation phase of the budget cycle
Planning for already approved projects and planning to get approval
Process for Planning for the New Projects/ Schemes (for Revenue and Capital Expenditure)
Budget Preparation for Revenue Receipt
Budget Preparation for Revenue Expenditure
Budget Preparation for Capital Expenditure - Capital Outlay
Budget Approval from Legislature (Same process for Revenue and Capital nature Receipts and Expenditures)
[5]Communication and Distribution of funds (Same process for Revenue and Capital nature Expenditure)
Request for Release of Funds from State Treasury - Scheme-related Capital Expenditure
Release of funds to DDO
Work awarded to vendors
Work bill payment to vendors
Request for Release of Funds from State Treasury - Revenue Expenditure
Collection of Revenue Receipt into the State Treasury
Request for Supplementary Grants (Same process for Revenue and Capital Expenditure)
Surrender of Excess / Savings (Same process for Revenue and Capital expenditure)
Process for monthly budget accounting/reporting for Receipts (Same Process for Revenue and Capital Receipts)
Process for monthly budget accounting/reporting -Expenditure (Same Process for Revenue and Capital Expenditure)
Process for budget auditing (have to explore if there are variations for revenue and expenditure)
Each fiscal event extracted above is defined in terms of
Header
Body
Array of Fiscal Line Items
Attachment
Signature
Technology used for the platform
Java 8
Apache Kafka
Keycloak
Postgres
Zuul
MongoDB
Apache Druid
Metabase
mGramSeva Quickstart - this is not for production use.
Quickstart Installation helps you jump-start with the mGramSeva basic installation with limited functionalities.
mGramSeva is a distributed microservice-based platform that comprises many services that are containerized. Depending upon the required features, the specific services can be run on any container-supported orchestration platform like docker-compose, Kubernetes, etc.
The Quickstart guide covers the installation steps for the basic services needed to get the platform up. Before setting up mGramSeva, create a lightweight Kubernetes cluster with k3d on a local machine with the specified H/W requirements. The H/W requirements are listed below; ensure these are met before proceeding.
To provision a lightweight Kubernetes cluster, please follow the instructions below in context to your OS and install the k3d on your machine.
min 4 vCPUs (recommended 8)
min 8GiB of RAM (recommended 16)
min 30GiB of HDD (recommended 30+)
Linux distribution running in a VM or bare metal
Ubuntu 18.04 or Debian 10 (VM or bare metal)
Install Docker on Linux.
Open a terminal and install k3d on Linux using the command below:
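For reference, the upstream install script can be used as below (verify the URL against the current k3d documentation):

```bash
wget -q -O - https://raw.githubusercontent.com/k3d-io/k3d/main/install.sh | bash
```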
OSX or Mac - Docker with the local Kubernetes cluster enabled.
Install Docker on Mac.
Install k3d on Mac from the terminal using Homebrew (available for macOS) with the command below:
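For example, with Homebrew installed:

```bash
brew install k3d
```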
Windows 10 or above - the following need to be installed:
Install Docker on Windows.
Install Chocolatey, the package manager for Windows.
Install Git Bash as an alternative command prompt that allows most of the Linux commands on Windows.
Open Git Bash and install k3d on Windows using the command below:
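For example, assuming Chocolatey is installed:

```bash
choco install k3d
```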
Once the above prerequisites are met, run the following tasks depending upon your OS.
login/ssh into the machine, go to terminal/command prompt and run the following commands as an admin user.
Create a /kube directory and change its permissions so it can be used as a persistent data mount - data from all container logs will be stored here.
Create a cluster with a single master node and 2 agents (worker nodes), mounting the pre-created directory (for data persistence).
When cluster creation is successful, get the kubeconfig file, which allows you to connect to the cluster at any time.
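A sketch of these steps with k3d; the cluster name, agent count and mount path are illustrative:

```bash
# Directory for persistent data (container logs etc.)
sudo mkdir -p /kube && sudo chmod 777 /kube

# Single server (master) + 2 agents (workers), mounting /kube into all nodes
k3d cluster create mgramseva --servers 1 --agents 2 --volume /kube:/kube@all

# Save the kubeconfig so you can connect to the cluster at any time
k3d kubeconfig get mgramseva > ~/.kube/mgramseva-config
export KUBECONFIG=~/.kube/mgramseva-config
```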
Verify the cluster creation by running the following commands from your local machine where kubectl is installed.
You can verify the worker nodes created by using the following command:
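For example:

```bash
kubectl get nodes          # master/server and agent nodes should show as Ready
kubectl get nodes -o wide  # extra detail on each worker node
```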
Once the above steps are completed successfully, your cluster is up and running and you are ready to proceed with the DIGIT deployment.
Now that we have the infra set up, we can proceed with the DIGIT deployment. The following tools need to be installed on the machine before proceeding with the DIGIT services deployment.
What we'll deploy in Quickstart:
mGramSeva core platform services
After cloning the repo, cd into the iFix-DevOps folder and type the "code ." command, which opens the visual editor with all the files from the iFix-DevOps repo.
Once the prerequisite setup is complete, go to the following repo, run the command and follow the instructions.
You can now test the DIGIT application status in the command prompt/terminal by using the below command.
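For example (namespaces depend on what the deployment scripts created):

```bash
kubectl get pods -A        # all pods across namespaces should move to Running
```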
Choose your infra type and provision the necessary infra before you actually deploy the services
iFIX/mGramSeva is a microservices-based distributed cloud-native application. Each of these context-specific microservices is dockerized and deployed on Kubernetes infrastructure.
It is essential to understand some of the key concepts, benefits and best practices of the Kubernetes platform before we understand the deployment of the iFIX/mGramSeva.
Know the basics of Kubernetes:
Know the commands
Know Kubernetes manifests:
Know how to manage environment values and secrets of any service deployed in Kubernetes
Know how to port forward to a pod running inside k8s cluster and work locally
Know sops to secure your keys/creds:
Choose the target infra type and follow the instructions to set up a Kubernetes cluster before moving on to the deployment.
Before we begin the deployment, it is important to understand the deployment architecture that starts from the source code to the production-ready stage. Deploying and managing Kubernetes have emerged as a streamlined way to deploy containers in the cloud infrastructure. When running Kubernetes at scale, managing, operating, and scaling its infrastructure to maximize cluster utilization can be challenging. There are too many parameters the development team needs to manage and configure. This includes selecting the best instance type and size, determining when to scale up or down, and making sure all of the containers are scheduled and running on the best instances — and that is even before starting to think about cost resource optimization.
The simplest way to get started with the deployment process is to manage deployment configuration as code. Each service deployment configuration is defined as Helm charts and deployed into the Kubernetes cluster. We can collocate the deployment-as-code as source code, leveraging all the benefits of source control including change tracking and branching and then packaging it. The source code repo below contains the deployment-as-code details for iFIX.
Use the command below to clean up the cluster setup. This deletes the entire cluster and other cloud resources that were provisioned for the mGramSeva Infra Setup.
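For a local quickstart cluster created with k3d, the teardown would look like this (cluster name is a placeholder):

```bash
k3d cluster delete mgramseva
```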
With this, the local/cloud infrastructure setup and the deployment of iFIX into the Kubernetes cluster are complete.
Explore the detailed iFIX (PFM) roadmap below, organised by key themes across quarters (Q1-Q4).
The Amazon Elastic Kubernetes Service (EKS) is one of the AWS services for deploying, managing, and scaling any distributed and containerized workloads. Here we provision the EKS cluster on AWS from the ground up in an automated way (infra-as-code) using Terraform, and then deploy the DIGIT-iFIX services config-as-code using Helm.
Know about EKS:
Know what is terraform:
An AWS account with admin access to provision the EKS service. You can always subscribe to a free AWS account to learn the basics and try things out, but there is a limit to what the free tier offers; for this demo, you need a commercial subscription to the EKS service.
Install kubectl on your local machine to interact with the Kubernetes cluster.
Install Helm, which helps you package the services along with the configurations, envs, secrets, etc. into a Helm chart.
Install Terraform (version 0.14.10) for the Infra-as-code (IaC) to provision cloud resources as code with the desired resource graph; it also helps destroy the cluster in one go.
Install the AWS CLI on your local machine so that you can use aws cli commands to provision and manage the cloud resources on your account.
Install an authenticator (such as aws-iam-authenticator) that helps you authenticate your connection from your local machine so that you can deploy DIGIT services.
Use the credentials provided for Terraform to connect with your AWS account and provision the cloud resources.
You'll get a Secret Access Key and Access Key ID. Save them safely.
Open the terminal and run the following command once the AWS CLI is installed. Provide the credentials when prompted; you can leave the region and output format blank.
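For example:

```bash
aws configure
# AWS Access Key ID [None]: <your access key id>
# AWS Secret Access Key [None]: <your secret access key>
# Default region name [None]:      (can be left blank)
# Default output format [None]:    (can be left blank)
```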
The above will create the credentials file on your machine at ~/.aws/credentials.
Before we provision the cloud resources, we need to understand and be sure about what resources need to be provisioned by terraform to deploy DIGIT. The following picture shows the various key components. (EKS, Worker Nodes, Postgres DB, EBS Volumes, Load Balancer)
Considering the above deployment architecture, the following is the resource graph that we are going to provision using terraform in a standard way so that every time and for every environment, it'll have the same infra.
EKS Control Plane (Kubernetes Master)
Worker node group (VMs with the estimated number of vCPUs and memory)
Node pools (mgramseva and ifix)
EBS Volumes (persistent volumes)
RDS (Postgresql)
VPCs (private network)
Users for access, deployment and read-only roles
Here we have already written the terraform script that provisions the production-grade DIGIT Infra and can be customized with the specified configuration.
Example:
VPC Resources:
VPC
Subnets
Internet Gateway
Route Table
EKS Cluster Resources:
IAM Role to allow EKS service to manage other AWS services
EC2 Security Group to allow networking traffic with EKS cluster
EKS Cluster
EKS Worker Nodes Resources:
IAM role allowing Kubernetes actions to access other AWS services
EC2 Security Group to allow networking traffic
Data source to fetch the latest EKS worker AMI
AutoScaling Launch Configuration to configure worker instances
AutoScaling Group to launch worker instances
Database
Configuration in this directory creates a set of RDS resources including DB instance, DB subnet group, and DB parameter group.
Storage Module
Configuration in this directory creates EBS volume and attaches it together.
The following main.tf will create an S3 bucket to store the state of the terraform execution, to keep track of it.
iFix-DevOps/Infra-as-code/terraform/sample-eks/remote-state
The following main.tf contains the detailed resource definitions that need to be provisioned; please have a look at it.
Dir: iFix-DevOps/Infra-as-code/terraform/sample-eks
Define your configurations in variables.tf. Provide the environment-specific cloud requirements and use the same terraform template to customize the configurations.
The values given below must be mentioned in the following files; the blank ones will be prompted for input during execution.
Important: Create your own Keybase key before you run the terraform scripts.
Now that we know what the terraform script does, the resource graph it provisions, and the custom values to be given for your env, let's run the Terraform scripts to provision the infra required to deploy DIGIT on AWS.
First, cd into the following directory and run the commands below one by one, watching the output closely.
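A sketch of the standard Terraform workflow for this step, using the directories shown above (remote state first, then the cluster itself):

```bash
# 1. Create the S3 bucket that stores the terraform state
cd iFix-DevOps/Infra-as-code/terraform/sample-eks/remote-state
terraform init
terraform apply

# 2. Provision the EKS cluster and related resources
cd ..
terraform init
terraform plan
terraform apply
```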
Upon successful execution, the following resources get created, which can be verified with the command "terraform output":
s3 bucket: to store terraform state.
Network: VPC, security groups.
EKS cluster: with master(s) & worker node(s).
Storage(s): for es-master, es-data-v1, es-master-infra, es-data-infra-v1, zookeeper, kafka, kafka-infra.
Verify that you are able to connect to the cluster by running the following command:
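For example (cluster name and region are placeholders from your terraform variables):

```bash
aws eks update-kubeconfig --name <cluster-name> --region <region>
kubectl get nodes
```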
Ideally, one would write the terraform script from scratch using the reference linked here.
Let's clone the GitHub repo where the terraform script to provision the AKS cluster is available; below is the structure of the files.
To manage a Kubernetes cluster, use the Kubernetes command-line client, kubectl. kubectl is already installed if you use Azure Cloud Shell. Install kubectl locally using the az aks install-cli command.
Configure kubectl to connect to your Kubernetes cluster using the az aks get-credentials command. The following command uses ~/.kube/config, the default location for the Kubernetes configuration file. Specify a different location for your Kubernetes configuration file using --file.
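For example (resource group, cluster name and alternate kubeconfig path are placeholders):

```bash
az aks install-cli
az aks get-credentials --resource-group <resource-group> --name <cluster-name>

# Or write the credentials to a non-default kubeconfig location:
az aks get-credentials --resource-group <resource-group> --name <cluster-name> --file ./ifix-kubeconfig
```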
Voila! All set - now you can proceed with the deployment.
mGramSeva uses automated scripts (v1.13.3 required) to deploy the builds onto Kubernetes.
All mGramSeva services are packaged using helm charts
kubectl is a CLI to connect to the Kubernetes cluster from your machine.
Install a client for making API calls.
Install the Visual Studio Code IDE for better code/configuration editing capabilities.
The mGramSeva services deployment configurations are in the DevOps repo, which needs to be forked and then cloned to your local machine.
Install the tooling needed to run some DIGIT bootstrap scripts.
Check the environment file that needs to be configured with any specific values according to your needs. (For a quick start, you can run it as is.)
Add the following entries to your hosts file (/etc/hosts) depending on your OS; instructions can be found here.
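The exact hostnames depend on your environment configuration; as a purely hypothetical example:

```bash
# Append illustrative entries to /etc/hosts (replace with your actual hostnames)
echo "127.0.0.1  mgramseva.local ifix.local" | sudo tee -a /etc/hosts
```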
Ideally, one would write the terraform script from scratch using the reference linked here.
Let's clone the GitHub repo where the terraform script to provision the EKS cluster is available; below is the structure of the files.
IAM user authentication: use Keybase to create the admin, deployer and read-only users. Use the linked URL to create a Keybase key pair; this creates both public and private keys on your machine. Upload the public key into the Keybase account that you have just created, give it a name, and ensure that you mention it in your terraform. This allows the encryption of all sensitive information.
Example: the keybase user (in the eGov case, "egovterraform") needs to be created and its public key uploaded.
You can use this key to decrypt your secret key. To decrypt a PGP message, upload the PGP message, the PGP private key and the passphrase.
Use this link to get the kubeconfig file so that you can connect to the cluster from your local machine and deploy DIGIT services to the cluster.
Voila! All set - now you can proceed with the deployment.
Government (Tenant)

Field | Type | Description |
---|---|---|
id | String (64,1) | Unique Identifier |
name | String (256,2) | Name of the Government or Tenant - e.g. India, Nigeria, Punjab |

Department

Field | Type | Description |
---|---|---|
id | string | Unique Identifier |
name | string | Name of the Department - e.g. Department of Water Supply and Sanitation |
code | string | Unique Code e.g. DWSS |

Chart of Account

Field | Type | Description |
---|---|---|
name | String | Name of the Account |
coaCode | String | Full Chart of Account String e.g. 1234-123-123-12-12-12 |
majorHead | String | Major head code |
majorHeadName | String | Major head name |
majorHeadType | String | Major head code type |
subMajorHead | String | Sub-Major head code |
subMajorHeadName | String | Sub-Major head name |
minorHead | String | Minor head code |
minorHeadName | String | Minor head name |
subHead | String | Sub-Head code |
subHeadName | String | Sub-Head name |
groupHead | String | Group head code |
groupHeadName | String | Group head name |
objectHead | String | Object head code |
objectHeadName | String | Object head name |

Expenditure

Field | Type | Description |
---|---|---|
id | String | Unique Identifier |
name | String | Name of the Expenditure - this could be a scheme or a non-scheme expenditure - e.g. Jal Jeevan Mission |
code | String | Unique Code e.g. JJM |
type | Enum | Type of Expenditure - [Scheme, Non-Scheme] |

Project

Field | Type | Description |
---|---|---|
id | String | Unique Identifier |
name | String | Name of the Project - e.g. Kartarpur Sahib Water Supply Project |
code | String | Code of the Project - e.g. S572 |

Amount

Field | Type | Description |
---|---|---|
id | String | Unique Identifier |
amount | Number | Transaction Amount |
coaId | String | Chart of Account ID |
fromBillingPeriod | Integer($int64) | Start date of the billing period for which the transaction is applicable |
toBillingPeriod | Integer($int64) | End date of the billing period for which the transaction is applicable |

Fiscal Event

Field | Type | Description |
---|---|---|
version | String | Version of the Information Model |
id | String | System Generated Unique ID |
tenantId | String | Government ID |
projectId | String | Unique ID of the Associated Project |
eventType | String | Type of Fiscal Event e.g. Demand, Bill, Payment, Receipt |
eventTime | Integer($int64) | Time Stamp when the event occurred in the source system |
referenceId | String | Unique Transaction Reference ID in the source system |
parentEventId | String | Unique ID of the Parent Event to which this event is linked |
amountDetails | Array[Amount] | Array of type Amount |
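To make the model concrete, the sketch below posts one fiscal event built only from the fields documented above. The host, endpoint path and all identifier values are illustrative assumptions, and any request envelope (request header, signatures) required by the actual API is omitted; refer to the published API contract for the exact format.

```bash
# Hypothetical fiscal-event push; host, path and identifiers are placeholders.
curl -X POST "https://<ifix-host>/fiscal-event-service/events/v1/_push" \
  -H "Content-Type: application/json" \
  -d '{
    "version": "1.0.0",
    "tenantId": "pb",
    "projectId": "<project-id>",
    "eventType": "Demand",
    "eventTime": 1628177497000,
    "referenceId": "<source-system-transaction-id>",
    "parentEventId": null,
    "amountDetails": [
      {
        "amount": 1000.0,
        "coaId": "<chart-of-account-id>",
        "fromBillingPeriod": 1622505600000,
        "toBillingPeriod": 1625097600000
      }
    ]
  }'
```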
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
---|---|---|---|---|---|---|---|---|---|
1 | Department/s | Announcement of the new Project/ Scheme by Chief Minister/ Department Minister | Create project/scheme | New Project/ Scheme created | Announcement Date, Place, Project/Scheme Name, Department Name, Financial Year, Announcement By | N | - | - | DPR Preparation and Approval CORE: Name of project, Type of Project, Goal & Objectives (Outcome) of the Project, Work Plan, Mode of funding, Funding Agency, Major milestones, Outputs of each activity, Cost of each Activity, Total Cost of the Project ANCILLARY: Administrative Approval Date, Approved By, Approval Remarks, Rejection Remarks, |
2. | HoD | New Project/ Scheme created | Draft Detailed Project Report | Drafted DPR | Name of project, Type of Project, Goal & Objectives (Outcome) of the Project, Work Plan, Mode of funding, Funding Agency, Major milestones, Outputs of each activity, Cost of each Activity, Total Cost of the Project | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
3. | Administrative Department (AD) | Drafted DPR | Review DPR | Approved DPR | Administrative Approval Date, Approved By, Remark | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
2. | Administrative Department (AD) | Drafted DPR | Review the DPR | Rejected DPR Sent to HoD for review | Remarks for rejecting the DPR | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
4. | Chief Minister/ Department Minister | Approved DPR | Review the DPR | Approved DPR with administrative sanction | Administrative sanction date, Approved By, Remark | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
3. | Chief Minister/ Department Minister | Approved DPR | Review the DPR | Rejected DPR Sent to AD for review | Remarks for rejecting the DPR | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
5. | AD/ HoD | Approved DPR with administrative sanction | Prepare Financial sanction proposal Prepare case for financial sanction | Financial sanction proposal | Project Name, total Budget, Multi Year Plan, Financial Year, Project Start Year, Project Duration, Remark, Proposed COA, Amount | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan | Financial Sanction Preparation and Approval CORE: Name of project, Type of Project, Goal & Objectives (Outcome) of the Project, Work Plan, Mode of funding, Funding Agency, Major milestones, Outputs of each activity, Cost of each Activity, Total Cost of the Project ANCILLARY: Administrative Approval Date, Approved By, Approval Remarks, Rejection Remark |
6. | FD/Planning Department | Financial sanction proposal | Review financial sanction proposal | Approved, Generate Financial Sanction | Financial Year, Sanction No., Sanction Date, Sanction Amount, COA | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
5. | FD/Planning Department | Financial sanction proposal | Review Financial Sanction | Rejected Objection sent to HoD for consideration and review | Remarks for rejecting financial proposal | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
6. | AD/ HoD | Approved DPR with financial sanction | Prepare plan for budget provision and send to the FD via BFC | Budget provision[2] (as RE for the current fiscal year) | Department Name, Project Name, Project Amount, Financial Year, COA | Y | Capital Expenditure/ Revenue Expenditure/ Revenue Receipt/ Capital Receipt | Plan |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
---|---|---|---|---|---|---|---|---|---|
1. | Finance Department | Initiate budget process | Issue Budget Circular and Budget Calendar Budget Circular is issued to all departments inviting estimates of receipts and expenditure of the respective department | Budget Circular and budget calendar issued | Department name, annual receipts and expenditure estimates (BE and RE both), Demand for Grant details, Details of competent authority (BCO, DDO), dates | N | - | - | - |
2. | [3]Estimating Officer / DDO/HoD | Budget Circular | Prepare revenue receipt estimates Estimating officers prepare Budget Estimate for current FY and Revised Estimate for previous FY based on historic trends, projection of future demand and effect of any policy change to be formulated by the department/State | Budget Estimate for current FY and Revised Estimate for previous FY created | Department Name, COA details, projection of demand (for service/goods) and associated revenue receipt - gross amount, arrears, refunds, existing rate of tariff/fee/tax, proposed change in tariff/fee/tax as sanctioned by government | Y | Revenue Receipt | Estimate | Inputs for Department Level Proposed Budget CORE: Department Name, COA details, projection of demand and associated revenue - gross amount, arrears, refunds, ANCILLARY: Existing rate of tariff/fee/tax, proposed change in tariff/fee/tax as sanctioned by government. |
3. | Estimating Officer / DDO | Budget Estimates - current FY and Revised Estimates - previous FY | Upload on IFMS | Budget Estimates and Revised Estimates uploaded on IFMS | Department Name, COA details, estimated revenue receipt - gross amount, arrears, refunds | Y | Revenue Receipt | Estimate |
4. | HoD | Budget Estimates and Revised Estimates | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated revenue receipt - gross amount, arrears, refunds | Y | Revenue Receipt | Estimate | Department Level Proposed Budget CORE: Department Name, COA details, estimated revenue receipt - gross amount, arrears, refunds ANCILLARY: Remarks for cuts and modifications introduced by HOD/ BCO etc. |
5. | AD | Approved estimates by BCO | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated revenue receipt - gross amount, arrears, refunds | Y | Revenue Receipt | Estimate |
6. | FD | Approved estimates by AD | Constitute BFC | BFC constituted | Not Applicable | N | - | - | - |
7. | BFC | Approved estimates by AD | Review estimates | Approved/rejected/modified estimates by BFC | Department Name, COA details, estimated revenue receipt- gross amount, arrears, refunds | Y | Revenue Receipt | Estimate | State Level Proposed Budget CORE: Department Name, COA details, estimated revenue receipts - gross amount, arrears, refunds ANCILLARY: Remarks for cuts and modifications introduced by BFC, Cabinet etc. - |
8a. | Finance Department | Approved estimates by BFC | Consolidate budget estimates for cabinet approval. FD makes any necessary changes to the received estimates and consolidates department-wise detailed estimates | Budget documents | Department Name, COA, estimated revenue receipts | Y | Revenue Receipt | Estimate |
8b. | Finance Department | Approved estimates by BFC | Communicate details of finalized estimates to concerned ADs of departments for information | Consolidated department-wise estimates | Department Name, COA details, estimated revenue receipts - gross amount, arrears, refunds | N | Revenue Receipt | Estimate |
9. | Finance Department | Budget documents | Prepare memorandum based on the budget documents and submit to the Cabinet for approval | Memorandum containing all budget documents | [4]Multiple Documents - Annual Financial Statement, Receipt Budget, Budget At A Glance | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
---|---|---|---|---|---|---|---|---|---|
1. | Finance Department | | Issue Budget Circular along with Budget Calendar. Budget Circular is issued to all departments inviting estimates of expenditure (Revenue) of the respective department | Budget Circular along with budget calendar issued | Department name, annual revenue expenditure estimates (BE and RE both), Demand for Grant details, Details of competent authority (BCO, DDO), dates | N | - | - | - |
2. | Estimating Officer / DDO/HoD | Budget Circular | Prepare expenditure (Revenue) estimates. Estimating officers prepare Budget Estimate for current FY and Revised Estimate for previous FY based on historic trends, projection of future demand and effect of any policy change to be formulated by the department/State | Budget Estimate for current FY and Revised Estimate for previous FY created | Department Name, COA details, estimated revenue expenditure, changes in pay scale, no. of employees, electricity tariffs, interest rates etc. | Y | Revenue Expenditure | Estimate | Inputs for Department Level Proposed Budget CORE: Department Name, COA details, estimated revenue expenditure ANCILLARY: changes in pay scale, no. of employees, electricity tariffs, interest rates etc. |
3. | Estimating Officer / DDO | Budget and Revised Estimates | Upload on IFMS | Budget and Revised Estimates uploaded on IFMS | Department Name, COA details, estimated revenue expenditure | Y | Revenue Expenditure | Estimate |
4. | HoD | Budget and Revised Estimates prepared by DDO | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated revenue expenditure | Y | Revenue Expenditure | Estimate | Department Level Proposed Budget CORE: Department Name,COA, estimated revenue expenditure ANCILLARY: Remarks for cuts and modifications introduced by HoD, AD etc. |
5. | AD | Approved estimates by HoD | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated revenue expenditure | Y | Revenue Expenditure | Estimate |
6. | Finance Department | Approved estimates by AD | Form Budget Finalization Committee to scrutinize the submitted estimates | Approved/rejected/modified estimates by BFC | Department Name, COA details, estimated revenue expenditure | Y | Revenue Expenditure | Estimate | State Level Proposed Budget CORE: Department Name, COA, estimated revenue expenditure ANCILLARY: Remarks for cuts and modifications introduced by BFC, Cabinet etc. |
7a. | Finance Department | Approved estimates by BFC | Prepare budget documents for submission to cabinet for approval. FD makes any necessary changes to the received estimates and consolidates department-wise detailed estimates | Budget documents | Department Name, COA, estimated revenue expenditure | Y | Revenue Expenditure | Estimate |
7b. | Finance Department | Approved estimates by BFC | Communicate details of finalized estimates to concerned ADs of departments for information | Consolidated department-wise estimates | Department Name, COA details, estimated revenue expenditure | Y | Revenue Expenditure | Estimate |
8. | Finance Department | Budget documents | Prepare memorandum based on the budget documents and submit to the Cabinet for approval | Memorandum containing all budget documents | Multiple Documents - Annual Financial Statement, Expenditure Budget, Budget At A Glance | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | Planning Department | Issue planning Circular along with meeting Calendar Planning circular is issued to all departments inviting ceiling of expenditure (Project/Scheme) of the respective department | Planning Circular along with meeting calendar issued | Department Name, Meeting Date | N | - | - | - |
2. | Department HoD | Discussion on ceiling for New project/Scheme | Planning Dept gives the budget ceiling In the discussion, the planning department defines the ceiling for the budget preparation for the respective department | Budget ceiling defined for new project/Scheme | Department name, annual new capital/project expenditure ceiling, Details of competent authority (HoD, DDO), dates | N | - | - | - |
3. | Finance Department | Issue Budget Circular along with Budget Calendar Budget Circular is issued to all departments inviting estimates of expenditure (capital) of the respective department | Budget Circular along with budget calendar issued | Department name, annual capital outlay estimates (BE and RE both), Demand for Grant details, Details of competent authority (BCO, DDO), dates | N | - | - | - |
4. | Estimating Officer / DDO/HoD | Budget Circular | Prepare expenditure (Capital) estimates under the defined ceiling by the planning dept. Estimating officers prepare Budget Estimate for current FY and Revised Estimate for previous FY based on historic trends, projection of future demand and effect of any policy change to be formulated by the department/State | Budget Estimate for current FY and Revised Estimate for previous FY created | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate | Inputs for Department Level Proposed Budget CORE: Department Name, COA details, estimated capital outlay ANCILLARY: None |
5. | Estimating Officer / DDO | Budget and Revised Estimates | Upload on IFMS | Budget and Revised Estimates uploaded on IFMS | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate |
6. | HoD | Budget and Revised Estimates prepared by DDO | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate | Department Level Proposed Budget CORE: Department Name, COA details, estimated capital outlay ANCILLARY: Remarks for cuts and modifications introduced by HOD, BCO, Planning Department etc. |
7. | AD | Approved estimates by BCO | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate |
8. | Planning Department | Approved estimates by AD | Scrutinize the estimates | Approved/rejected/modified estimates | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate |
9. | Finance Department | Approved estimates by PD | Form Budget Finalization Committee to scrutinize the submitted estimates | Approved/rejected/modified estimates by BFC | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate | State Level Proposed Budget CORE: Department Name, COA details, estimated capital outlay ANCILLARY: Remarks for cuts and modifications introduced by BFC, Cabinet etc. |
10a. | Finance Department | Approved estimates by BFC | Prepare budget documents for submission to cabinet for approval FD makes any necessary changes to the received estimates and consolidates department-wise detailed estimates | Budget documents | Department Name, COA, estimated capital outlay | Y | Capital Expenditure | Estimate |
10b. | Finance Department | Approved estimates by BFC | Communicate details of finalized estimates to concerned ADs of departments for information | Consolidated department-wise estimates | Department Name, COA details, estimated capital outlay | Y | Capital Expenditure | Estimate |
11. | Finance Department | Budget documents | Prepare memorandum based on the budget documents and submit to the Cabinet for approval | Memorandum containing all budget documents | Multiple Documents - Annual Financial Statement, Expenditure Budget, Budget At A Glance | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | Finance Department | Memorandum containing all budget documents | Present to the Legislative Assembly | Budget speech of the FM | Priorities of the Government, Current status of some important existing schemes, New schemes and programmes to be launched during the ensuing year, Tax/Tariff proposals and reliefs to be granted, if any, Summary of Revised Estimates and Budget Estimates | N | - | - | - |
2. | Finance Department | Memorandum containing all budget documents | General discussion on the budget as a whole and/or on any question of principle or policy involved therein | Tabling of the Budget Documents | Not Applicable | N | - | - | - |
3. | Finance Department | Memorandum containing all budget documents | Vote on Demand for Grants | Voted Demand for Grants | Demand No., Department Name, COA, Amount for BE, RE and Actuals | N | - | - | - |
4. | Finance Department | Demand for Grants | Introduce Appropriation Bill | Appropriation Bill | N | - | - | - |
5. | Finance Department | Appropriation Bill | Obtain Governor’s assent | Appropriation Act | N | - | - | - |
6a. | Finance Department | Appropriation Act | Issue a circular authorising incurring of expenditure as per guidelines contained in the Appropriation Act | Circular containing details as per the Appropriation Act | N | - | - | - |
6b. | Finance Department | Appropriation Act | Issue notification in the Official Gazette | Gazette notification | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1a. | Finance Department | Circular containing details as per the Appropriation Act | Upload budget for department on IFMS | Budget by department There is detailed information of allotments placed at the disposal of each department during the budget year | Department name, Department code, COA heads, amount, financial year, authorization authority with details | Y | Revenue Expenditure/ Revenue Receipts/ Capital Expenditure/ Capital Receipts | Estimate | State Level Approved Budget CORE: Department name, Department code, COA heads, amount, financial year ANCILLARY: Authorization authority with details |
1b. | Finance Department | Circular containing details as per the Appropriation Act | Send the minutes of the meeting of BFC to the concerned AD | Minutes of the BFC | Department name, Department code, COA heads, amount, financial year, authorization authority with details | N | - | - | - |
2. | AD | Minutes of the BFC | Issue orders for DDO-wise estimates | Order to prepare DDO wise estimates | Department name, Department code, COA heads, amount, financial year, authorization authority with details | N | - | - | - |
3. | HoD | Order to prepare DDO wise estimates | Prepare DDO wise estimates | DDO wise estimates available on IFMS | Department name, Department code, COA heads, amount, financial year, authorization authority with details, DDO Name and Code | Y | Capital Expenditure/ Revenue Expenditure | Estimate | DDO Level Approved Budget CORE: Department name, Department code, COA heads, amount, financial year, DDO Name and Code ANCILLARY: Authorization Authority with Details |
4. | DDO | DDO wise estimates available on IFMS | View allocation on IFMS | DDO receives details of allotments provided for expenditure | Department name, Department code, COA heads, amount, financial year, DDO Name and Code | Y | Capital Expenditure/ Revenue Expenditure | Estimate |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | DDO | 60% of first tranche of project/scheme funds utilized | Prepare Utilization Certificate | Utilization Certificate | Department name, department code, Vendor name, project details, work done against total deliverable, amount sanctioned, amount utilized, balance amount, DDO name and DDO code | N | - | - | - |
2. | HoD/AD | Utilization Certificate | Review utilization certificate | Approved, forward to FD with demand request | Department name, department code, Vendor name, project details, work done against total deliverable, amount sanctioned, amount utilized, balance amount, amount requested (next tranche), DDO name and DDO code | N | - | - | - |
3. | FD | Utilization Certificate, and demand request | Review utilization certificate and demand request | Approve next tranche Amount reflected on IFMS, visible to HoD | Approved UC + Department name, department code, amount, COA | Y | Capital Expenditure | Demand | Demand Request by DDO CORE: Department name, department code, amount, COA, DDO name and DDO code ANCILLARY: (Approved UC: Department name, department code, Vendor name, project details, work done against total deliverable, amount sanctioned, amount utilized, balance amount, DDO name and DDO code) |
4. | HoD | Approved next tranche information | Inform to respective DDO | Amount reflected on IFMS, visible to DDO | Department name, department code, amount, COA, DDO name and DDO code | Y | Capital Expenditure | Demand |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | HoD/DDO | Approved DPR with financial sanction | Invite bids from vendor/contractor for work Review process of tender | Bid invitation | Bid Reference no., Department name, Project details, date of bid invitation, selection/eligibility criteria | N | - | - | - |
2. | HoD/DDO | Bid invitation | Award work to a vendor/contractor | Work awarded to vendor/contractor | Bid Reference no. , Vendor name, date of award of work, Project details - name, terms and conditions, deliverables expected, vendor account details, Tender value, Contract rate | Y | Capital Expenditure | Plan | Work Allocation to Vendor CORE: Bid Reference No., Contract Reference No., Vendor Name, Vendor account details, Tender value, Contract rate, Date of award, Project Name ANCILLARY: Project Details- Terms and Conditions, Deliverables expected |
3. | HoD/DDO | Contract sign | Create Work order/purchase order | Work Execution start | Bid Reference no., Contract Reference No., Vendor name, date of award of work, project details - name, terms and conditions, deliverables expected, vendor account details, Tender value, Contract rate | Y | Capital Expenditure | Plan |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | Vendor | Work execution | Execute work and submit bill with Physical report of assign work | Physical report with invoice bill | Vendor name, date of award of work, project details - terms and conditions, deliverables expected, vendor account details, physical report, tender value, contract rate, invoice amount, date | Y | Capital Expenditure | Bill | Bill Generation by Vendor CORE: Work order reference, Bill date, Name of vendor, Bill amount (Gross, Deduction, Net), Bill type, Bank account details ANCILLARY: (Physical Report: Vendor name, date of award of work, project details - terms and conditions, deliverables expected, vendor account details, physical report, tender value, contract rate, invoice amount, date) |
2. | Vendor | Eligibility criteria (Milestones, MRN, PO) | Submit bill Bill submitted with an invoice covering letter as per the work done on the deliverable designed | Acknowledge bill receipt When the bill is submitted, an acknowledgement is received from the DDO on the invoice covering letter | Work order reference, Bill date, Name of vendor, Bill amount (Gross, Deduction, Net), Bill type, Vendor Bank account details | Y | Capital Expenditure | Bill |
3. | DDO | Vendor bill | Review bill DDO reviews the invoice against set rules/norms (verification process) | Verified vendor bill | Project name, Sanction, Cost, Deliverable, Timeline, Bill amount against work done, Vendor Bank account details | Y | Capital Expenditure | Bill | Bill Generation by DDO CORE: Bill date, Name of vendor, Bill amount (Gross, Deduction, Net), Bill type, COA head, DDO code, Department code, Vendor Bank account details, Reference No. ANCILLARY: Token No., Treasury rules, Bill approval status, Approval or Rejection Remarks, Relevant Fields from Physical Report (Project name, Sanction, Cost, Deliverable, Timeline, Bill amount against work done, Vendor Bank account details) |
4. | DDO | Verified vendor bill | Generate bill (government format) Bill generated by DDO based on verified bill | Bill available in department system Bill available for sharing/uploading on treasury system | Bill date, Name of vendor, Bill amount (Gross, Deduction, Net), Bill type, COA head, DDO code, Department code, Vendor Bank account details . Reference No. | Y | Capital Expenditure | Bill |
5. | Treasury Token Section | Bill received from DDO | Assign token number and audit officer | Bill tagged with token number reflected in Treasury system | Token number (Bill No.), date, gross amount, deduction, net amount, COA head, DDO code, Department code, Vendor Bank account details | Y | Capital Expenditure | Bill |
6. | Treasury Audit Section - auditor, accountant and treasury officer | Bill with token number | Audit bill (as per treasury rules) | Bill status - approved or rejected, with remark | Treasury rules, bill approval status (approved or rejected, with remark) | Y | Capital Expenditure | Bill |
7. | Treasury Audit Section | Rejection criteria | Create intimation (with details for reason for rejection) | Status and rejection criteria is reflected in Department Bill system | Remarks DDO reviews the remarks, rectifies the bill, and sends again to treasury for payment | Y | Capital Expenditure | Bill |
8. | Treasury Payment Section | Bill - approved | Generate payment advice | Payment advice pushed to banking system | Token number, date, gross amount, deduction, net amount, COA head, DDO code, Department code, Vendor Bank account details | Y | Capital Expenditure | Payment | Payment Advice Generation CORE: Token number, date, gross amount, deduction, net amount, COA head, DDO code, Department code, Vendor Bank account details ANCILLARY: Payment Advice Status, Remarks |
9. | RBI/Bank | Payment advice receipt | Validate payment advice | Payment advice status - accepted or rejected | Payment advice status and remarks, if rejected | Y | Capital Expenditure | Payment |
10. | RBI/Bank | Payment advice - accepted | Credit beneficiary account Debit State Govt account | e-scroll pushed to Treasury System | Token number, date, gross amount, deduction, Vendor Bank account details | Y | Capital Expenditure | Payment |
11. | Treasury Audit Section | e-scroll | Generate voucher number | Consolidate e-scrolls for preparation of AG accounts | Voucher number, Date, Amount, COA head, DDO code, Department code | Y | Capital Expenditure | Debit | Payment Completion CORE: Voucher number, Date, Amount, COA head, DDO code, Department code ANCILLARY: None |
12. | RBI/Bank | Payment advice - rejected | Credit suspense head | Status and rejection criteria is reflected in Treasury system | Remarks | Y | Capital Expenditure | Payment | (Included in Ancillary attributes mentioned against Row 8-10 of this process. ) |
13. | Treasury Audit Section | Rejection criteria | Intimation of rejection DDO rectifies the bill and sends it again to treasury for payment | Status and rejection criteria are reflected in Department Bill System | Remarks DDO reviews the remarks, rectifies the bill, and sends it again to treasury for payment | Y | Capital Expenditure | Bill | (Included in Ancillary attributes mentioned against Rows 3-7 of this process.) |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | DDO (Maker) | Eligibility criteria (Monthly Salary Bill/ reimbursement of TA/Medical and Other Expenditure) | Submit bill Bill submitted by the maker(DDO) | Acknowledge bill receipt When bill is submitted to DDO officer for approval | Department code, DDO Code, Employee name, Bank details, Month, Year, designation, Gross Amount, Net Amount, Deduction Amount, bill type(Salary/TA/Medical and other expense ), Reference No. | Y | Revenue Expenditure | Bill | Bill Generation by DDO CORE: Department code, DDO Code, Employee name, Bank details, Month, Year, designation, Gross Amount, Net Amount, Deduction Amount, bill type (Salary/TA/Medical and other expense ), COA (Budget Head), Reference No. ANCILLARY: Token Number, Treasury Rules, Bill Approval Status, Approval or Rejection Remarks |
2. | DDO (Approval) | Revenue bill | Review bill DDO reviews the bill against set rules/norms (verification process) | Verified Revenue bill | Department code, DDO Code, Employee name, Bank details, Month, Year, designation, Gross Amount, Net Amount, Deduction Amount, bill type(Salary/TA/Medical and other expense ), Reference No. | Y | Revenue Expenditure | Bill |
3. | DDO | Verified Revenue bill | Generate bill (government format) Bill generated by DDO based on verified bill | Bill available in department system Bill available for sharing/uploading on treasury system | Department code, DDO Code, Employee name, Bank details, Month, Year, designation, Gross Amount, Net Amount, Deduction Amount, bill type(Salary/TA/Medical and other expense, COA (Budget Head) | Y | Revenue Expenditure | Bill |
4. | Treasury Token Section | Bill received from DDO | Assign token number and audit officer | Bill tagged with token number reflected in Treasury system | Token number (Bill No.) , date, gross amount, deduction, net amount, COA head, DDO code, Department code, Bank account details | Y | Revenue Expenditure | Bill |
5. | Treasury Audit Section - auditor, accountant and treasury officer | Bill with token number | Audit bill (as per treasury rules) | Bill status - approved or rejected, with remark | Treasury rules, bill approval status (approved or rejected, with remark) | Y | Revenue Expenditure | Bill |
6. | Treasury Audit Section | Rejection criteria from Treasury Audit | Create intimation (with details of the reason for rejection) | Status and rejection criteria are reflected in Department Bill System | Remarks DDO reviews the remarks, rectifies the bill, and sends it again to treasury for payment | Y | Revenue Expenditure | Bill |
7. | Treasury Payment Section | Bill - approved | Generate payment advice | Payment advice pushed to banking system | Token number, date, gross amount, deduction, net amount, COA head, DDO code, Department code, Bank account details | Y | Revenue Expenditure | Payment | Payment Advice Generation CORE: Token number, date, gross amount, deduction, net amount, COA head, DDO code, Department code, Bank account details ANCILLARY: Payment Advice Status, Remarks |
8. | RBI/Bank | Payment advice receipt | Validate payment advice | Payment advice status - accepted or rejected | Payment advice status and remarks, if rejected | Y | Revenue Expenditure | Payment |
9. | RBI/Bank | Payment advice - accepted | Debit State Govt account Credit beneficiary account | e-scroll pushed to Treasury System | Token number, date, gross amount, deduction, Bank Account details | Y | Revenue Expenditure | Payment |
10. | Treasury Audit Section | e-scroll | Generate voucher number | Consolidate e-scrolls for preparation of AG accounts | Voucher number, Date, Amount, COA head, DDO code, Department code | Y | Revenue Expenditure | Debit | Payment Completion CORE: Voucher number, Date, Amount, COA head, DDO code, Department code ANCILLARY: None |
11. | RBI/Bank | Payment advice - rejected | Credit suspense head | Status and rejection criteria is reflected in Treasury system | Remarks | Y | Revenue Expenditure | Payment | (Included in Ancillary attributes mentioned against Row 7-9 of this process. ) |
12. | Treasury Audit Section | Rejection criteria from RBI | Create intimation (with details of the reason for rejection) DDO rectifies the bill and sends it again to treasury for payment | Status and rejection criteria are reflected in Department Bill System | Remarks DDO reviews the remarks and rectifies the bill | Y | Revenue Expenditure | Bill | (Included in Ancillary attributes mentioned against Rows 1-6 of this process.) |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | DDO | Receipt (water charges collected) | Create profile DDO logs in eReceipt Module of IFMS and creates profile to start receipt deposit process | Profile created Profile with all relevant details created which can be edited, enabled, disabled | User name (DDO), Department Name and Code, Treasury and Sub-Treasury Name and Code, COA Head | N | - | - | - |
2. | DDO | DDO profile | Fill Challan details DDO fills out all necessary details including the type of payment, bank and mode of payment | e-Challan created | DDO details - name, division, etc., Nature of payment, Financial Year, Period (annual, monthly, etc.), COA head, gross amount, net amount, mode of payment | Y | Revenue Receipt | Demand | Challan Created CORE: DDO details - name, division, etc., Nature of payment, Financial Year, Period (annual, monthly, etc.), COA head, gross amount, net amount, mode of payment ANCILLARY: None |
3a. | DDO | e-Challan | Deposit payment - online DDO chooses a payment mode and fills in all required details for making payment | Bill Invoice reflected in IFMS eReceipt | Bank details, date and time of payment, CIN, Reference Number, | Y | Revenue Receipt | Payment | Payment Initiation CORE: DDO details - name, division, etc., Nature of payment, Financial Year, Period (annual, monthly, etc.), COA head, gross amount, net amount, mode of payment ANCILLARY: None |
3b. | DDO | e-Challan | Deposit payment - offline[6] | Challan | Bank details, date and time of payment, CIN, Reference Number, | Y | Revenue Receipt | Payment |
4. | Bank | Deposited Challan | Credit the treasury account | E-scroll generated | UTR No., CIN, date and time of transaction, Department Code, treasury code, COA head, amount | Y | Revenue Receipt | Credit | Collection Completion CORE: DDO details - name, division, etc., Nature of payment, Financial Year, Period (annual, monthly, etc.), COA head, gross amount, net amount, mode of payment ANCILLARY: None |
5. | Treasury | e-scroll | Generate Treasury Challan No. and consolidate all challans | Consolidated report for AG | Treasury challan No. Treasury Challan Date, Amount, COA, Department Code | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | DDO | Excess expenditure /Need for supplementary grants | Prepare Statement of need of Supplementary Demand for Grants | Submit Statement of Supplementary Demand for Grants | Department Name and Code, Demand for Grant No., DDO code, COA, Original Appropriation, Actual Expenditure, Supplementary grant (amount) required and reasons for same | Y | Revenue Expenditure/ Capital Expenditure | Estimate | Estimation of Supplementary Grants and Approval CORE: Department Name and Code, Demand for Grant No., DDO code, COA, Original Appropriation, Actual Expenditure, Supplementary grant (amount) ANCILLARY: Reasons for supplementary grant, comments from BCO, comments from AD, Approval / Rejection Remarks |
2. | BCO | Statement of Supplementary Demand for Grants | Review against budget provision and prepare proposal for Supplementary Grants | Proposal for Supplementary Grants | Department Name and Code, Demand for Grant No., COA, Original Appropriation, Actual Expenditure, Supplementary grant (amount) required and reasons for same, comments of BCO | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
3. | AD | Proposal for Supplementary Grants | Review the proposal | Approved, submit proposal to FD | Department Name and Code, Demand for Grant No., COA, Original Appropriation, Actual Expenditure, Supplementary grant (amount) required and reasons for same, comments of AD | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
4. | FD | Proposal for Supplementary Grants | Review the proposal | Approved, send to Legislative Assembly for approval | Department Name and Code, Demand for Grant No., COA, Original Appropriation, Actual Expenditure, Supplementary grant (amount) required | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
4b. | FD | Proposal for Supplementary Grants | Review the proposal | Rejected, send communication with remarks to AD | Remarks for rejection | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
5. | Legislative Assembly | Proposal for Supplementary Grants | Review the proposal | Approved, issue communication to FD on Supplementary Demand for Grants | Department Name and Code, Demand for Grant No., COA, Original Appropriation, Actual Expenditure, Approved Supplementary grant (amount) | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | DDO | Excess savings/Need for reappropriation | Prepare Statement of expected savings/Revised Estimates of Expenditure | Submit Statement of expected savings/Revised Estimates of Expenditure | Department Name and Code, Demand for Grant No., DDO code, COA, Original Appropriation, Actual Expenditure, Savings and Amount | Y | Revenue Expenditure/ Capital Expenditure | Estimate | Excess/Savings Estimation and Approval CORE: Department Name and Code, Demand for Grant No., DDO code, COA, Original Appropriation, Actual Expenditure, Savings/Excess Amount ANCILLARY: Comments from BCO, Comments from AD, Remarks for rejection |
2. | BCO | Statement of expected savings/Revised Estimates of Expenditure | Review against budget provision and consolidate all Revised Estimates | Approved, submit proposal to AD | Comments from BCO | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
3. | AD | Consolidate Revised Estimates | Review the Revised Estimates | Approved, submit proposal to FD | Comments from AD | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
4a. | FD | Consolidated Revised Estimates | Review the proposal | Approved, upload Revised Estimates on IFMS | Department Name and Code, Demand for Grant No., DDO code, COA, Original Appropriation, Actual Expenditure, Approved Savings/Excess,- Amount | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
4b. | FD | Consolidated Revised Estimates | Review the proposal | Rejected, send communication with remarks to AD | Remarks for Rejection | Y | Revenue Expenditure/ Capital Expenditure | Estimate |
5.[7] | FD | Approved request for savings | Revoke amount under the COA of respective department | Savings surrendered under the COA of respective department | Department Name and Code, Demand for Grant No., COA, Savings, Amount surrendered |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | Treasury | e-scroll | Generate Treasury Challan No. and consolidate all challans | Consolidated challans | Treasury challan No. Treasury Challan Date, Amount, COA, Department Code | N | - | - | - |
2. | Treasury | Consolidated challans | Prepare monthly receipt report | Consolidated receipt report sent to AG | Department name and code, total budget provision, receipt in the last month, total receipts till last month of the FY, expected collections remaining | N | - | - | - |
3. | DDO | Prepare daily/bi-weekly/monthly receipt report | Consolidated report sent to BCO | Department name and code, total budget provision, receipt in the last month, total receipts till last month of the FY, expected collections remaining | N | - | - | - |
4. | HoD/AD | Consolidated receipt report | Review consolidated receipt report | Approved, shared with AG | Department name and code, total budget provision, receipt in the last month, total receipts till last month of the FY, expected collections remaining | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1. | Treasury Audit Section | e-scroll | Generate voucher number and consolidate all vouchers | Consolidated vouchers | Voucher number, Date, Amount, COA head, DDO code, Department code | N | - | - | - |
2. | Treasury | Consolidated vouchers | Prepare monthly expenditure reports | Consolidated expenditure report sent to AG | Department name and code, COA, total budget provision, expenditure incurred in the last month, total expenditure incurred till last month of the FY, balance remaining | N | - | - | - |
3. | DDO | Prepare monthly expenditure reports | Consolidated expenditure report sent to HoD | Department name and code, total budget provision, expenditure incurred in the last month, total expenditure incurred till last month of the FY, balance remaining | N | - | - | - |
4. | HoD/AD | Consolidated expenditure report | Review consolidated expenditure report | Approved, shared with AG | Department name and code, total budget provision, expenditure incurred in the last month, total expenditure incurred till last month of the FY, | N | - | - | - |
# (1) | Actors (2) | Input (3) | Verb and Noun (4) | Output (5) | Attributes (6) | Does it trigger a fiscal event (7) | Fiscal Event Type (8) | Fiscal Event Sub-Type (9) | Data Attributes for iFIX (10) |
1a. | AG | Consolidated receipt /expenditure report received from Treasury and HoD/AD | Review consolidated receipt/expenditure report | Approved, publish final accounts | <To be Identified> | N | - | - | - |
1b. | AG | Consolidated receipt /expenditure report received from HoD/AD | Review consolidated receipt/expenditure report | Rejected, reconcile accounts with DDO | <To be Identified> | N | - | - | - |
Core service configuration and promotion docs
Building Shared Narrative | i. Bluebook gap areas addressed and partner feedback incorporated ii. Finalize list of actors to approach for publishing the Bluebook iii. List of PFM Conferences identified to participate in iv. Draft of Health-financing white paper | i. Engage with identified actors to get commitment for publishing the Bluebook ii. 1 Op-ed iii. Health-financing white paper published | i. Bluebook V6 Published with GOI/ State Govt ii. Participation in 1 conference | i. Op-ed ii. mGramSeva case study with CPR |
Design and Build DPGs | i. IFMIS integration demonstrated in Odisha ii. iFIX enhancements for Works/ MUKTASoft iii. Webinar to release V1 of iFIX Specs iv. Works V1 Development | i. Specs V2 developed through collaborative workshops with Samaaj and Sarkaar partners ii. Works V2 iii. Explore PFMS integration through iFIX exemplar in Punjab | i. iFIX health use case (ASHA worker payments) to be specified for adoption by Piramal in Jharkhand, subject to funding commitment from BMGF ii. Works V3 iii. Webinar to release V2 of iFIX Specs/ iFIX for Smart Payments iv. Dashboard Prototypes - Scheme Dashboard, Chief Minister's/ Chief Secretary's Dashboard for Financial Performance Management at State-Level, and Dept Head Dashboard | Asha Worker use case implementation starts |
Scale @ Speed Exemplar to Ship Imagination | i. mGramSeva handover to DWSS with PSPCL integration live ii. iFIX program initiated with FD with a signed MoU specifying the program design, exemplar timelines and a clear handover plan for FD to own the program iii. iFIX Exemplar kicked off in Punjab iv. Hiring Engagement Manager | i. Initiate the adoption approach with identified line departments ii. State iFIX dashboard live | Case Study on FD iFix success |
i. MUKTA V1 customisation and deployment ii. iFIX integration with IFMS in Odisha to showcase Smart Payments through MUKTASoft iii. Account Management Hiring | i. MUKTA V2 customisation and deployment ii. MUKTA adoption monitoring | i. MUKTA V3 customisation and deployment ii. MUKTA handover plan for HUDD ownership iii. Mukta Webinar and product proposition for smart payments | i. MUKTA handed back to HUDD |
Catalysing Scale and Adoption | i. Note on benefits of IFMS+iFIX vs Only IFMS presented to WB ii. iFIX value proposition collateral: - Microsite updated - Scheme dashboard prototype - Artefacts/ Success Stories/ Case Study iii. Finalise champion in IMF who we will be approaching for the board note | i. Engagement with IMF on board note and readiness for it (advocacy) | i. Draft of board note circulated with IMF for feedback and finalisation | i. IMF issues board note endorsing fiscal events approach |
Sustaining and Institutionalisation | i. Identify 4-5 agencies who should publish V2 of iFIX specs ii. Connect to 16th FC through Policy team | i. Advocacy for handover and publishing of specs by the agency or by convening a working group under the CGA or DoE ii. Approach backup agencies if DoE (FinMin) or CAG do not engage at the required pace | i. Advocacy for handover and publishing of specs by the agency ii. Approach backup agencies if DoE (FinMin) or CAG do not engage at the required pace | V2 Specs launched
# | Fiscal Event Sub-Type | Definition
Fiscal Event Types: Revenue Receipts | Capital Receipts | Revenue Expenditure | Capital Expenditure
1 | Estimate | Event resulting in fiscal information containing a high-level view regarding what amount of receipts/ expenditure is expected/ needed. |
2 | Plan | Event resulting in fiscal information containing a detailed view regarding how the estimated receipts/ expenditure will be met/ utilized. |
3 | Demand | Event resulting in fiscal information containing a request for transfer or payment of money into the government account. |
4 | Bill | Event resulting in fiscal information containing a request for transfer or payment of money out of the government account. |
5 | Receipt | Event resulting in fiscal information containing banking transaction initiation details for any fund transferred into the government account. |
6 | Payment | Event resulting in fiscal information containing banking transaction initiation details for any fund transferred out of the government account. |
7 | Debit | Event resulting in fiscal information containing banking transaction completion details for any fund transferred out of the government account. |
8 | Credit | Event resulting in fiscal information containing banking transaction completion details for any fund transferred into the government account. |
Only the administrator can generate the API key and secret. The steps for doing this are as follows.
The Keycloak console will be available at https://<host-name>/auth. The admin logs in using the username and password secret.
Open the Keycloak console
Near the top-left corner, in the realm drop-down menu, select Add Realm.
Select the ifix-realm.json file.
After the realm gets created, select the ifix realm from the drop-down near the top-left corner.
Remember to select the ifix realm from the Keycloak console before proceeding.
From the Clients section of Keycloak Admin Console, create a client.
Provide a unique username for the client.
Go to the client's settings
Change Access Type to confidential
Turn on Service Account Enabled
In the Valid Redirect URIs field, provide the root URL of the iFIX instance (not used for our purposes, but it must be set because the field is mandatory)
Save these changes
In the Service Account Roles tab, assign the role "fiscal-event-producer"
In the Mappers tab, create a new mapper to associate the client with a tenantId:
Select Mapper Type to be "Hardcoded claim"
In Token Claim Name, write "tenantId"
In Claim value, write the tenantId under which the client is being created (for example, "pb")
Set Name to be the same as Token Claim Name, i.e. "tenantId"
Select Claim Json Type to be "String"
Now you can get the credentials from the Credentials tab and configure them in the client's system.
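Once the client is configured, a quick way to verify the credentials is to exchange them for an access token against the standard Keycloak token endpoint. This is a minimal sketch; the host name, client id and client secret below are placeholders for whatever was configured above.

```sh
# Exchange the client credentials for an access token (ifix realm).
# <host-name>, <client-id> and <client-secret> are placeholders.
curl -s -X POST "https://<host-name>/auth/realms/ifix/protocol/openid-connect/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials" \
  -d "client_id=<client-id>" \
  -d "client_secret=<client-secret>"
```

The access_token in the response is what the client system then sends as a Bearer token when calling the secured iFIX APIs.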
CI/CD setup
Post infra setup (Kubernetes cluster), we start by deploying Jenkins and the kaniko-cache-warmer.
Sub-domain to expose the CI/CD URL
GitHub OAuth App
Docker hub account details (username and password)
SSL Certificate for the sub-domain
Prepare a master config file <ci.yaml> and a <ci-secrets.yaml> (you can name these files as you wish); they will have the following configurations.
Credentials and secrets (these need to be encrypted using sops into a separate ci-secrets.yaml; a sample sops command follows this list)
Check and update the ci-secrets.yaml details (such as the GitHub OAuth app clientId and clientSecret, and the GitHub user details gitReadSshPrivateKey and gitReadAccessToken)
To create the Jenkins namespace, mark this flag as true
Add your environments' kubeconfigs under kubConfigs, as in https://github.com/misdwss/iFix-DevOps/blob/mgramseva/deploy-as-code/helm/environments/ci-secrets.yaml#L19
The kubeConfig environment name and the deploymentJobs name in ci.yaml should be the same
Update the CIOps and DIGIT-DevOps repo names with your forked repo names and provide the GitHub user read-only access to those repos.
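A minimal sketch of the sops encryption step mentioned above, assuming an AWS KMS key is used (a PGP fingerprint via --pgp works as well); the key ARN and file name are placeholders:

```sh
# Encrypt the plain-text secrets file in place with sops.
# The KMS key ARN below is a placeholder for your own key.
sops --encrypt --kms "arn:aws:kms:ap-south-1:<account-id>:key/<key-id>" \
  --in-place ci-secrets.yaml
```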
Jenkins is now launched. You can access it through the sub-domain you configured in ci.yaml.
The Jenkins CI pipeline is configured and managed 'as code'.
Example URL - https://<Jenkins_domain>
Since there are many services and the development code is part of various git repos, you need to understand the concept of cicd-as-service which is open-sourced. This page also guides you through the process of creating a CI/CD pipeline.
As a developer - To integrate any new service/app to the CI/CD below is the starting point:
Once the desired service is ready for the integration: decide the service name, type of service, whether DB migration is required or not. While you commit the source code of the service to the git repository, the following file should be added with the relevant details which are mentioned below:
build-config.yml – It is present under the build directory in each repository.
This file contains the below details which are used for creating the automated Jenkins pipeline job for your newly created service.
While integrating a new service/app, the above content needs to be added in the build-config.yml file of that app repository. For example: If we are onboarding a new service called egov-test, then the build-config.yml should be added as mentioned below.
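A minimal, illustrative sketch of such an entry, assuming the conventional DIGIT build-config layout (the folder path and field names should be verified against existing entries in the repository):

```yaml
# build/build-config.yml - illustrative entry for the egov-test service
config:
  - name: "builds/core-services/egov-test"   # Jenkins folder/job path for the pipeline
    build:
      - work-dir: "core-services/egov-test"  # directory containing the service source/Dockerfile
        image-name: "egov-test"              # docker image to be built and pushed
```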
If a job requires multiple images to be created (DB Migration) then it should be added as below,
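For the multi-image (DB migration) case, the same entry simply carries an additional image under build; again illustrative:

```yaml
config:
  - name: "builds/core-services/egov-test"
    build:
      - work-dir: "core-services/egov-test"
        image-name: "egov-test"
      - work-dir: "core-services/egov-test/src/main/resources/db"  # flyway migration scripts
        image-name: "egov-test-db"
```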
Note - If a new repository is created then the build-config.yml should be created under the build folder and then the config values are added to it.
The git repository URL is then added to the Job Builder parameters
When the Jenkins Job => Job Builder is executed, the CI pipeline gets created automatically based on the above details in build-config.yml. E.g. the egov-test job will be created under the core-services folder in Jenkins because the build-config.yml was edited under core-services, and that edit should be on the master branch only. Once the pipeline job is created, it can be executed for any feature branch with build parameters (specifying which branch to build – master or any feature branch).
As a result of the pipeline execution, the respective app/service docker image will be built and pushed to the Docker repository.
Job Builder – Job Builder is a generic Jenkins job that automatically creates the Jenkins pipelines which are then used to build the application, create its docker image, and push the image to the docker repository. The Job Builder job requires the git repository URL as a parameter. It clones the respective git repository, reads the build/build-config.yml file of that repository, and uses it to create the service build job.
Check whether the git repository URL is available in ci.yaml
If the git repository URL is available, build the Job Builder job
If the git repository URL is not available, ask the DevOps team to add it.
The services are deployed and managed on a Kubernetes cluster in cloud platforms like AWS, Azure, GCP, OpenStack, etc. Here, we use helm charts to manage and generate the Kubernetes manifest files and use them for deployment to the respective Kubernetes cluster. Each service is created as a chart, which contains the files mentioned below.
To deploy a new service, we need to create a helm chart for it. The chart should be created under the charts/helm directory in the iFix-DevOps repository.
We have an automatic helm chart generator utility that needs to be installed on the local machine. The utility prompts for user inputs about the newly developed service (app specifications) and creates the chart with configuration values based on the inputs provided.
Name of the service? test-service
Application Type? NA
Kubernetes health checks to be enabled? Yes
Flyway DB migration container necessary? No
Expose service to the internet? Yes
Route through API gateway [zuul]? No
Context path? hello
The generated chart will have the following files.
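The exact file set depends on the generator version and on whether shared templates are reused, but a typical generated chart looks roughly like this (illustrative):

```
charts/helm/test-service/
├── Chart.yaml           # chart metadata (name, version, description)
├── values.yaml          # image, replicas, env values, health checks, ingress/context path
└── templates/
    ├── deployment.yaml  # Kubernetes Deployment (with the health checks chosen above)
    ├── service.yaml     # ClusterIP Service for the app
    └── ingress.yaml     # present when the service is exposed to the internet
```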
This chart can also be modified further based on user requirements.
The deployment of manifests to the Kubernetes cluster is made very simple and easy. We have Jenkins jobs for each state, and they are environment-specific. We need to provide the image name or the service name in the respective Jenkins deployment job.
The deployment Jenkins job internally performs the following operations,
Reads the image name or the service name given and finds the chart that is specific to it.
Generates the Kubernetes manifests files from the chart using the helm template engine.
Executes the deployment manifest with the specified docker image(s) against the Kubernetes cluster (a conceptual sketch follows).
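Conceptually, this is equivalent to rendering the chart with the environment values and applying the result, along the lines of the sketch below (chart path, environment file and image tag are illustrative; the actual Jenkins job wraps this in its own tooling):

```sh
# Render the service chart for the target environment and apply the manifests.
helm template fiscal-event-service deploy-as-code/helm/charts/fiscal-event-service \
  -f deploy-as-code/helm/environments/dev.yaml \
  --set image.tag=<docker-image-tag> \
  | kubectl apply -f -
```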
The Fiscal Event Service maintains the fiscal event flow activities of multiple projects. The root level entity is a tenant, which is basically the State Government. The tenant has several projects that are tagged to multiple attributes like department, scheme, mission, hierarchy, etc. All the entity information and COA (Chart of Account) details are passed along with amount details, which constitutes an array of fiscal event instances under the system.
Current version : 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met.
Java 8
MongoDB instance
Required Service Dependencies.
iFIX-Master-Data-Service
iFIX-Fiscal Event Post Processor
Apache Kafka Server
This service provides two features - Push fiscal event data and search fiscal event data.
It is a secure endpoint and user info details are required to access it.
It receives fiscal details, along with mandatory tenant and project info under the attributes detail, in push requests; this triggers the whole fiscal event process and forwards the event to the post-processor service.
Before accepting an event, it validates all the referenced entities via the iFIX core Master Data Service.
When any fiscal event instance is created in the system, both the ingestion time and the event occurrence timestamp are logged, which makes the event flow traceable over time.
After processing all these activities, it passes fiscal event information to other fiscal event post-processor services for further processing by publishing fiscal data into the Kafka topic "fiscal-event-request-validated" and “fiscal-event-mongodb-sink“.
It also responds with a snapshot of the enriched fiscal event data, which the source system can use for reference or any further processing.
It provides a search feature on the existing fiscal events which has been processed before by post-processor service and persisted into the MongoDB instance.
We can mainly search on eventType, tenantId, referenceId and the event time interval, and in return we get enriched data (dereferenced, with nested ID details).
Note: Kafka topic needs to be configured with respect to the environment
Update all the DB and URI configuration in the dev.yaml, qa.yaml, prod.yaml file.
Make sure the keycloak server is up and running and has been configured with the required client ID.
Before the iFIX platform can be used, the master data must be configured directly into the respective collections in MongoDB. Associated APIs for configuring Master Data will be made available in the next version.
Add Government - Insert into MongoDB collection
Add Department
Add Department Hierarchy - It defines the hierarchy definition for the department
department id: It is master department id (UUID)
level: It defines the depth of hierarchy of department level
Root level department hierarchy will not contain any parent value and level value will be zero
Level value incrementation rule:
When the parent id has a value, the parent is looked up in the department hierarchy records to evaluate the hierarchy level
The level value is taken from the parent department hierarchy and the current department hierarchy level is set to that value incremented by one
parent: It provides details about department parent (UUID)
Add Department Entity - It contains department entity information along with its hierarchy level and also attaches the master department information (department id - UUID). Here we keep a children list at every department node (department record). A leaf-level department will not have any children info. The children list contains a list of department entity ids, which makes the current department entity the parent of all the entities in that list; that is how the department entity level is maintained. Entities can be created by:
First defining the hierarchy levels top to bottom, because each level has its parent's reference
Then adding the department entities bottom to top, because each entity holds its children's references.
Note:
The root department hierarchy will have the label "Department", and the root department entity will be the department itself
To update the existing children list of a Department Entity, update it using MongoDB commands like the ones below:
Find the department entity parent where the new children need to be added. Search by name and hierarchy level: db.departmentEntity.find({"name" : "<current_department_entity's_name>","hierarchyLevel": <current_department_entity_hierarchy_level>});
Append the new department entity id at the end of the current department entity's children list. First find the length n of the current array, then set the element at that index: db.departmentEntity.update({"_id": "<current_department_entity_id>"},{$set: {"children.n": "<new_child_department_entity_id>"}}) (use "children.n+1", "children.n+2", ... for any additional children).
Add Chart of Accounts - Insert manually into MongoDB collection
Add Expenditure - Insert manually into MongoDB collection.
Master data service maintains information about Government and Chart of Accounts. We can create these details and search for the same details based on the given parameters/request data.
Current version : 2.0.0
Before we proceed with the configuration, make sure the following pre-requisites are met:
Java 8
MongoDB instance
It creates secure endpoints for the master data service. The access token is required to create any master data.
The subsequent sections on this page discuss the service details maintained by the iFIX core master data service.
This service provides the capabilities to maintain the Government details and allow users to Create and Search data. For creating the Government, we need a unique Id for the Government and a name for the same. Optionally, we can pass some additional details as part of the attribute. In the case of search, passing the unique ID(s) as search parameters can give you all the details of the required Government.
This service provides the capabilities to maintain the Chart of Account (COA) details and support create and search of COA. The following information is passed while creating the Chart of Accounts - Government Id, majorHead, subMajorHead, minorHead, subHead, groupHead, objectHead and corresponding head names & types. A unique code named COACode is generated by combining (concatenating) majorHead, subMajorHead, minorHead, subHead, groupHead, objectHead with a hyphen ("-") and stored with the given request. Searching the details for COA is done based on the given search parameters like the Chart of Account IDs, COACodes, Government ID, majorHead, subMajorHead, minorHead, subHead, groupHead, objectHead.
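As a small illustration (the head values below are made up), a COA created with majorHead 0215, subMajorHead 01, minorHead 102, subHead 00, groupHead 00 and objectHead 001 would get the generated code:

```
COACode = 0215-01-102-00-00-001
```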
No environment variables are required specific to the environment (migration).
Update the DB and URI configurations in the dev.yaml, qa.yaml, prod.yaml file.
Make sure the keycloak server is up and running and has been configured with the required client ID.
The Department Entity service manages the department and its hierarchies metadata. It deals with the department entity and department hierarchy only. Department Hierarchy stores only the hierarchy definition (metadata for department levels), while Department Entity stores the actual department data with its ancestry information.
Current Version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met.
Java 8
MongoDB instance
Required Service Dependencies - Adapter-Master-Data-Service
It defines the hierarchy definition for the department.
department id: It is the ID of the department from the department master
level: It defines the depth of hierarchy of department-level
parent: It provides details about department hierarchy parent (UUID)
Root level department hierarchy should not contain any parent value and the level value will be zero
When the parent ID has a value, the parent is looked up in the department hierarchy record to evaluate the hierarchy level.
Get level value from the parent department hierarchy and increment the current department hierarchy level value by one.
It contains department entity information along with its hierarchy level and also attaches master department information (department id - UUID). It keeps all child level information lists at every department node (department record). Leaf level department does not have any children info. Child list contains department entity ID list, which makes the current department entity parent of all children list (department ID list). This is how it maintains the department entity level.
Define the hierarchy levels top to bottom, because each level has its parent's reference. Department entities, in contrast, are created using the bottom-to-top approach. The leaf department entity does not have any child reference; only when a department entity sits higher up (as a parent) does it define child references in its children list.
Create Department Hierarchy: We pass the current hierarchy level and its parent details along with master department and tenant information. It stores data as meta-information about hierarchy level for department entity data processing, it works as a reference meta index which will tell about hierarchy level information.
Search Department Hierarchy: It just provides a preview of department hierarchies by providing request parameters - tenant ID, hierarchy level or department hierarchy ID.
Create Department Entity: It passes the tenant id, master department id, hierarchy level and its children list along with the department entity name and code. Tenant, hierarchy level and the master department are the root info about the department entity. If the department entity does not contain any child, that means it is a leaf department entity; it can only refer to its parent.
Search Department Entity: A search request can be made based on any department entity attribute but cannot skip the tenant information. It returns the whole department entity details along with its child information, and finds all ancestry information of the current department entity.
There will not be any environment variables required specific to the environment (migration).
Update the DB and URI configurations in the dev.yaml, qa.yaml, prod.yaml file.
Department Hierarchy defines the definition (metadata) for actual department entity creation.
In the above image, the second row contains the actual Department Entities. Department Entity Create API - follows Bottom to Up approach.
In the above example data, the very first Department Entity will be created for “BARUWAL” with code “7278”. Below are the Steps to create Department Entity :
The first Department Entity will be created for “BARUWAL” (Since the Department Hierarchy has been defined from “Department” to “GPWSC”. Please refer Department-Hierarchy-API-With-Example.)
The details that need to be passed in the Department Entity Create API are:
tenantId: This will be from the iFIX Master Data Service
departmentId: This will be from Adapter Master Data Service
code: For the first Department Entity, as per the above data image, it is “7278”
name: For the first Department Entity, as per the above data image, it is “BARUWAL”
hierarchyLevel: For the first Department Entity, as per the above data image and defined Department Hierarchy, It will be “6”.(Bottom to Up)
children: For the first Department Entity, as per the above data image, it will be an empty set
For the next Department Entity Creation - repeat step 2 by making sure the subsequent request has - next Department Entity - Code, name, hierarchyLevel (this will decrement by one for every next Department entity), children (this will contain all or at least one previously created Department Entity Id(s)).
For reference, the Bottom to Up request & response of Department Entity Create API will look like :
Request :[For Department Entity at Bottom - “BARUWAL”]
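The original request payload is not reproduced here. As a rough, illustrative sketch built only from the fields listed above (the wrapper and header names are assumptions, not the exact API contract), the request for the bottom-most entity could look like:

```json
{
  "requestHeader": { "ts": 1627193067, "msgId": "dept-entity-create" },
  "departmentEntity": {
    "tenantId": "pb",
    "departmentId": "<master-department-uuid>",
    "code": "7278",
    "name": "BARUWAL",
    "hierarchyLevel": 6,
    "children": []
  }
}
```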
Response:
Subsequent Request: [For Department Entity - “Kiratpur Sahib”]
Response:
And so on for the next Department Entity(ies) ...
Note: Below is the observation from the above example :
Tenant Id and Department Id will be created before creating the Department Entity hence before Department Hierarchy Create. Tenant Id and Department Id will be the same in all hierarchy levels for a Single Department Hierarchy and corresponding Department Entity(ies).
In the children attribute, we can pass a set of previous Department Entity Ids (or at least one or empty in case of Bottom level Department Entity).
To update the existing children list of a Department Entity, update it using MongoDB commands like the ones below
Find the parent department entity where the new children need to be added. We should know beforehand which Department Entity is the parent. Search by name and hierarchy level and note the corresponding parent department entity's id (UUID).
Append the new department entity id at the end of the parent department entity's children list obtained in step 1. Count the length of the parent department entity's children list (0-based indexing), call it 'n' (it can be any integer as per the children list size), and then set it as
"children.n": "<picked_from_step_1_query_department_entity_id>"
e.g.
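A minimal sketch of the two steps above in the MongoDB shell; the entity name, hierarchy level, ids and the index n used here are illustrative:

```javascript
// Step 1: find the parent department entity (search by name and hierarchy level)
db.departmentEntity.find({ "name": "Kiratpur Sahib", "hierarchyLevel": 5 });

// Step 2: append the new child id at the next free index of the children array.
// Here the parent already has 2 children, so n = 2 (0-based indexing).
db.departmentEntity.update(
  { "_id": "<parent_department_entity_id_from_step_1>" },
  { "$set": { "children.2": "<new_child_department_entity_id>" } }
);
```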
Fiscal Event Post Processor is a streaming pipeline for validated fiscal event data. Apache Kafka is used to stream the validated fiscal event data, which is then dereferenced, unbundled, flattened, and finally pushed to the Druid data store.
Current version : 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Apache Kafka and Kafka-Connect server should be up and running
Druid DB should be up and running
The following dependent services are required:
iFIX Master Data Service
iFIX Fiscal Event Service
Fiscal Event post-processor consumes the fiscal event validated data from Kafka topic named “fiscal-event-request-validated” and processes it by following the below steps:
Fiscal event validated data gets dereferenced. For dereferencing, service ids such as the COA id, tenant id etc. are passed to the corresponding master data services to fetch the corresponding object(s). Once the fiscal event data is dereferenced, it is pushed to the dereferenced topic.
Unbundle consumers pick up the dereferenced fiscal event data from the dereferenced topic. The dereferenced fiscal event data gets unbundled and then flattened. Once the flattening is complete, the data is pushed to the Druid sink topic.
Flattened fiscal event data is pushed to Druid DB from a topic named: fiscal-event-druid-sink.
The Kafka-connect is used to push the data from a Kafka topic to MongoDB. Follow the steps below to start the connector.
Connect (port-forward) with the Kafka-connect server.
Create a new connector with a POST API call to localhost:8083/connectors.
The request body for that API call is available in the fiscal-event-mongodb-sink file.
Within that file, replace every ${---} placeholder with the actual value for the environment. Get ${mongo-db-authenticated-uri} from the configured secrets of the environment. (Optional) Verify and make changes to the topic names.
The connector is ready. You can check it using API call GET localhost:8083/connectors.
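For example, assuming the connector config has been saved locally as fiscal-event-mongodb-sink.json and the Kafka-connect server has been port-forwarded to localhost:8083:
curl -X POST localhost:8083/connectors -H "Content-Type: application/json" -d @fiscal-event-mongodb-sink.json
curl localhost:8083/connectors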
The Druid console is used to start ingesting data from a Kafka topic to the Druid data store. Follow the steps below to start the Druid Supervisor.
Open the Druid console
Go to the Load Data section
Select Other
Click on Submit Supervisor
Copy and paste the JSON from the druid-ingestion-config.json file into the available text box (a trimmed sketch of the general shape of such a spec is shown after these steps).
Verify the Kafka topic name and the Kafka bootstrap server address before submitting the config
Submit the config and the data ingestion should start into the fiscal-event data source
Note: Kafka topic needs to be configured with respect to the environment
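For orientation only, here is a heavily trimmed sketch of the general shape of a Kafka ingestion spec of this kind; the actual JSON must be taken from the druid-ingestion-config.json file. The datasource and topic names are the ones referred to above, the bootstrap server placeholder is an assumption.
{
  "type": "kafka",
  "spec": {
    "dataSchema": { "dataSource": "fiscal-event" },
    "ioConfig": {
      "topic": "fiscal-event-druid-sink",
      "consumerProperties": { "bootstrap.servers": "<kafka-bootstrap-server>:9092" }
    }
  }
}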
Update the DB, Kafka producer & consumer, and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
The Keycloak console is available at https://<host-name>/auth. The Ops team will provide the username and password secrets.
Open the Keycloak console
Near the top-left corner in the realm drop-down menu, select Add Realm
Select the ifix-realm.json file
After the realm gets created, select the ifix realm from the drop-down near the top-left corner
Remember to select the ifix realm from the Keycloak console before proceeding
From the Clients section of Keycloak Admin Console, create a client
Provide a unique username for the client
Go to the client's settings
Change Access Type to confidential
Turn on Service Account Enabled
In the Valid Redirect URIs field, provide the root URL of the iFIX instance (not important for our purposes, but it needs to be set because the field is mandatory)
Save these changes
In the Service Account Roles tab, assign the role "fiscal-event-producer"
In the Mappers tab, create a new mapper to associate the client with a tenantId
Select Mapper Type to be "Hardcoded claim"
In Token Claim Name, write "tenantId"
In Claim value, write the tenant id under which the client is being created (for example, "pb")
Set Name same as Token Claim Name, i.e. "tenantId"
Select Claim Json Type to be "String"
Now you can get the credentials from the Credentials tab and configure them in the client's system.
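For instance, the client system can then obtain an access token using the client-credentials grant against the ifix realm created above (the token endpoint path follows the standard Keycloak layout under /auth):
curl -X POST "https://<host-name>/auth/realms/ifix/protocol/openid-connect/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials" \
  -d "client_id=<client-id>" \
  -d "client_secret=<client-secret>"
The access_token in the response is what gets passed as the bearer token while calling the fiscal event APIs.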
As part of the iFIX-2.0-alpha release, we have migrated the following master data from ifix_db to mgramseva_db:
Adapter Master Data Service
Department
Expenditure
Project
Department Entity Service
Department Entity
Department Hierarchy
Out of these, the project data SHOULD NOT be copied to the new DB because a new feature of the multi-tenant (GP) project is introduced with this release. New projects can be created and linked to multiple GPs.
Other master data can be copied.
Make sure the playground pod image is “dwssio/playground:mongo-v2” or newer.
Keep the MongoDB Credentials handy in the following format
Host(in this format): “<host-address>:27017”
Username - a user that has access to BOTH source and destination dbs
Password
Source and Destination DB Names
List of collection names to be copied
The mongo-migration.sh script copied to the playground pod. (You must have the necessary kube permissions to copy a file to a pod.)
Sample Command to copy the file:
kubectl cp mongo-migration.sh ifix/<playground pod>:/
A MongoDB Dump script is provided that will copy a list of collections from a source DB to the destination DB.
Execute the provided script using the following parameters:
-h = Host Address
-u = username
-p = password
-s = source db
-d = destination db
-c = collection name - You can provide multiple collections, as depicted in the example below
The Department Entity service manages the department and its hierarchy metadata. It deals with department entities and department hierarchy only. Department Entity and Department Hierarchy were earlier part of the iFIX core. Now, they have been moved to the mGramSeva iFIX adapter side. This page provides details on how to migrate that data from the iFIX DB to the mGramSeva iFIX adapter DB.
Create (if it's not available) DB schema (Mongo) in mgramseva namespace.
We can drop unused collections from iFix DB using the below steps :
Connect to ifix namespace playground pod
kubectl exec -it <playground-pod-name> -n ifix -- /bin/bash
Connect to the particular mongo db
mongo --host <hostname>:27017 -u <username> -p <password>
Use db
use <db_name>
Check the above-mentioned collection name using the below commands
db.getCollectionNames();
If the above-mentioned collections are present, drop them using the commands below
db.departmentEntity.drop();
db.departmentHierarchyLevel.drop();
Port-forward the Department Entity service in localhost from a specific environment (like QA/UAT/Prod). Below is the command to port-forward :
kubectl port-forward <pod-name> 8032:8080 -n mgramseva
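With the port-forward in place, the service can be reached on localhost:8032. For example, the Department Entity create endpoint can be called as below; department-entity-create.json is a hypothetical local file holding the request payload, and any additional context path prefix is an assumption to verify for your environment.
curl -X POST "localhost:8032/departmentEntity/v1/_create" -H "Content-Type: application/json" -d @department-entity-create.json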
mGramSeva Deployment
Once the infra setup (Kubernetes cluster) is complete, the deployment process begins.
Pipeline as code is the practice of defining deployment pipelines through source code managed in a version control system such as Git. Pipeline as code is part of a larger “as code” movement that includes infrastructure as code. Teams can configure builds, tests, and deployment in code that is trackable and stored in a centralized source repository. Teams can use a declarative YAML approach or a vendor-specific programming language, such as Groovy for Jenkins, but the premise remains the same.
A pipeline as code file specifies the stages, jobs, and actions for a pipeline to perform. Because the file is versioned, changes in pipeline code can be tested in branches with the corresponding application release.
The pipeline as code model of creating continuous integration pipelines is an industry best practice, but deployment pipelines used to be created very differently.
The deployment process has 2 stages and 2 modes. Let us look at the modes first and then the stages.
Essentially, mGramSeva deployment means generating Kubernetes manifests for each individual service. We use Helm, which is an easy, effective and customizable packaging and deployment solution. Depending on where and in which environment you initiate the deployment, there are 2 modes in which you can deploy.
From a local machine - which is what we have been trying in this sample exercise so far.
Advanced: Set up a CI/CD system like Jenkins - the steps will vary depending on how you want to set up your CI/CD and your expertise; however, here you can find how eGov has set up an exemplar CI/CD on Jenkins, where the pipelines are created automatically without any manual intervention.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Please follow the below steps to create iFIX Master Data. Have a look at the individual service documentation for details here.
Create the Government by providing valid government details. Once created, you will get an id along with the provided details in the response. We refer to this id as the tenant id.
Create the Chart of Account (COA) by providing valid details. While creating, you need to pass a valid tenant id in the COA request. Once created, you will get an id along with all the provided details in the response. We refer to this id as the COA id and to the code as the COACode.
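For illustration, a hedged sketch of the Government create call follows; the wrapper key names (requestHeader, government) and the government body fields are assumptions, so refer to the Swagger Yaml for the authoritative payload. The COA call is analogous against /chartOfAccount/v1/_create, with the tenant id passed inside the COA body.
curl -X POST "https://<host-name>/government/v1/_create" -H "Content-Type: application/json" -d '{
  "requestHeader": { "ts": 1628177497000, "version": "1.0.0", "msgId": "<unique-message-id>" },
  "government": { "id": "pb", "name": "Punjab" }
}'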
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
The platform consists of the following core services.
The master data service manages the following master data:
Department
Expenditure
Project
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Key Domain Service Details
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
iFix uses Keycloak to manage clients. Refer to the Keycloak setup instructions for setting up a Keycloak server and managing clients.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
This page provides details on the steps involved in cleaning up the iFIX core data from various environments. Follow the instructions specific to the selected environment listed below.
Druid Data Clean-Up
Mongo DB Data Clean-Up
Postgres Data Clean-Up
Open the druid console in the respective environment.
Go to Ingestion → Supervisors and select the particular supervisor (fiscal-event)
In Action, click on Terminate. This terminates the supervisor. Wait for a minute.
Go to DataSources and select the particular data source name (fiscal-event). And scroll down to Action.
Click on Mark as unused all segments and confirm Mark as unused all segments in the dialog that appears.
Click on Delete unused segments (issue kill task) and enable the permission to delete.
Once the clean-up process is completed, follow the instructions here - IFIX Fiscal Event Post Processor | Druid-Sink to run the supervisor.
Connect to the playground pod and run the command below to connect with mongo DB. mongo --host <mongo-db-host>:<mongo-db-port> -u <mongo-db-username> -p <mongo-db-password>
Use the ifix core DB; use <db name>
Run the below command to delete all the data from fiscal_event collection. db.fiscal_event.remove({});
If you want to delete all fiscal event records of a particular Gram Panchayat, run a targeted delete command.
Here, let us assume we have to delete the "LODHIPUR" GP (Gram Panchayat) details (that is, hierarchy level 6) in the DWSS department. A sketch of the command is given below.
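A hedged sketch only: the exact filter depends on how the enriched fiscal event document is structured in the fiscal_event collection, so inspect a sample document with db.fiscal_event.findOne() first. The attribute path used below is an assumption and must be adjusted to your data.
db.fiscal_event.remove({
  "attributes.departmentEntity.name": "LODHIPUR",
  "attributes.departmentEntity.hierarchyLevel": 6
});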
If you want to delete a fiscal event record based on some other attributes, you have to write a custom mongo delete query.
Connect to the playground pod and run the below command to connect with postgresDB.
psql -h <psql-host> -p <psql-port> -d <psql-database> -U <psql-username>
It prompts for a password. Enter the password.
Run the below query to delete all the data from fiscal_event_aggregated.
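For example (the table name is the one mentioned above):
DELETE FROM fiscal_event_aggregated;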
If you want to delete only a particular Gram Panchayat's details from the fiscal_event_aggregated records, run a targeted DELETE with a WHERE clause matching that GP; the exact predicate depends on the table's columns.
Here, let us assume we want to delete the "LODHIPUR" GP (Gram Panchayat) details (that is, hierarchy level 6) in the DWSS department.
Note: If you are not sure about deleting a specific fiscal event aggregated record, you can delete all the records from the fiscal_event_aggregated table. Once the records are deleted, either run the fiscal event aggregate cron job manually to upsert all the records, or let the system upsert the records every midnight from Druid to Postgres.
The fiscal event post-processor receives a collection of fiscal event lists and, after processing those events, pushes them to the Mongo DB and Druid DB. Before deploying the fiscal event post-processor build to the respective environment, make sure you have ingested the updated druid config file into the Druid DB as per the instructions in the Druid-Sink section (IFIX Fiscal Event Post Processor | Druid-Sink).
Before upgrading the iFIX fiscal event service to v2.0, clean up the existing fiscal event data. This is a one-time activity. Follow the steps outlined in the iFIX core data clean-up section.
Get the access token from keycloak server and pass it as a bearer token while requesting to iFIX fiscal event service.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
The mGramSeva iFIX Adapter receives multiple event requests from the mGramSeva service and helps convert their attributes to map to the iFIX fiscal event service request data.
Environment variable
Configure "fiscal-event-service", "adapter-master-data-service" and "ifix-department-entity-service" in egov-service-host in respective environment yaml file ( ).
Redis data clean up
Execute the below commands on redis-cli before doing any activity on the mGramSeva iFix adapter when performing the deployment for the first time.
Connecting Redis Client:
kubectl exec -it <redis-pod> -n mgramseva -- /bin/bash
redis-cli
e.g.
kubectl exec -it redis-647449f6b9-gg4gw -n mgramseva -- /bin/bash
redis-cli
keys * (check all keys)
Redis cleanup command
del <key>
e.g.
del "10101" "10102" "10201" "20101" "20201" "20301" "20401"
keys * (check all keys)
Note: Make sure all client codes mentioned in the "ifix_adapter_coa_map" table are listed in the redis "DEL" command, and recheck that they have been removed using the "keys *" command.
Use the Postman tool to verify the mGramSeva iFix adapter API endpoints. They are open endpoints and therefore do not require any access token.
This section provides the steps and details required to migrate the data from one service to another.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
This page provides step-wise details on how to migrate Department, Project and Expenditure from iFix Master Data Service to Adapter Master Data Service.
Yaml configurations
Update "mongo-db-username", "mongo-db-password" and "mongo-db-authenticated-uri" in secrets db config in respective environment secret yaml file ( ).
Update "mongo-db-name", "mongo-db-host" and "mongo-db-url" in egov-config of configmaps in respective environment yaml file ( ).
Update "adapter-master-data-service" and "ifix-department-entity-service" in egov-service-host in respective environment yaml file ( ).
Create a new DB in the MongoDB instance for these new services.
Next, create the same master data again in the new (mgramseva) database using the adapter-master-data-service's /_create APIs.
After restoring DB collections, drop the unused collections from the iFIX MongoDB.
Follow the steps below to drop the unused collections.
Connect to ifix namespace playground pod
kubectl exec -it <playground-pod> -n ifix -- /bin/bash
Connect to the particular mongo db
mongo --host <hostname>:27017 -u <username> -p <password>
Use db
use <db_name>
Check the above-mentioned collection name using the below commands
db.getCollectionNames();
If the collections are present, drop them using the commands below
e.g.
db.department.drop();
db.expenditure.drop();
db.project.drop();
Port-forward the Adapter master data service in localhost from a specific environment (like QA/UAT/Prod).
Below is the command to port-forward :
e.g.
kubectl port-forward <pod-name> 8030:8080 -n mgramseva
We need to get the access token from the Keycloak server and pass it as a bearer token while calling the ifix-master-data-service create APIs.
Essentially, there are 2 stages that should allow you to use the full potential of DeploymentConfig and pipeline-as-code.
Stage 1: Clone the DevOps repository and choose your iFIX product branch as mGramSeva
Prepare an <env> master config file. You can name this file as you wish; it will have the following configurations, and this env file needs to be in line with your cluster name.
each service's global and local env variables
credentials, secrets (you need to encrypt these and create a <env>-secret.yaml separately)
Number of replicas/scale of individual services (Depending on whether dev or prod)
mdms, config repos (Master Data, ULB, Tenant details, Users, etc)
sms g/w, email g/w, payment g/w
GMap key (In case you are using Google Map services in your PGR, PT, TL, etc)
S3 Bucket for Filestore
URL/DNS on which the DIGIT will be exposed
SSL Certificate for the above URL
End-points configs (Internal/external)
Stage 2: Run the mGramSeva_setup deployment script and simply answer the questions that it asks
Create/Add new COA on iFix for a tenant
Details for the new COA + RequestHeader (meta data of the API).
RequestHeader should be used to carry meta information about the requests to the server as described in the fields below. All eGov APIs will use requestHeader as a part of the request body to carry this meta information. Some of this information will be returned back from the server as part of the ResponseHeader in the response body to ensure correlation.
time in epoch
The version of the API
Unique request message id from the caller
Capture the user information
System Generated User id of the authenticated user.
List of roles assigned to a user
List of tenants assigned to a user
Hash describing the current RequestHeader
Captures the COA data as map
Unique system generated UUID
Chart of account concatenated string
Unique tenant identifier
Capture the major head code
Capture the major head code name
Capture the major head code type
"Revenue"
Capture the sub major head code
Capture the sub major head code name
Capture the minor head code
Capture the minor head code name
Capture the sub head code
Capture the sub head code name
Capture the group head code
Capture the group head code name
Capture the object head code
Capture the object head code name
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Request has been accepted for processing
ResponseHeader should be used to carry metadata information about the response from the server. apiId, ver and msgId in ResponseHeader should always correspond to the same values in respective request's RequestHeader.
response time in epoch
unique response message id (UUID) - will usually be the correlation id from the server
message id of the request
status of request processing
Hash describing the current Request
The version of the API
Unique system generated UUID
Chart of account concatenated string
Unique tenant identifier
Capture the major head code
Capture the major head code name
Capture the major head code type
"Revenue"
Capture the sub major head code
Capture the sub major head code name
Capture the minor head code
Capture the minor head code name
Capture the sub head code
Capture the sub head code name
Capture the group head code
Capture the group head code name
Capture the object head code
Capture the object head code name
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Based on the criteria get the list of COA.
RequestHeader meta data.
RequestHeader should be used to carry meta information about the requests to the server as described in the fields below. All eGov APIs will use requestHeader as a part of the request body to carry this meta information. Some of this information will be returned back from the server as part of the ResponseHeader in the response body to ensure correlation.
time in epoch
The version of the API
Unique request message id from the caller
Capture the user information
System Generated User id of the authenticated user.
List of roles assigned to a user
List of tenants assigned to a user
Hash describing the current RequestHeader
The object contains all the search criteria of the fund
Tenant Id
List of COA ids
Chart of account concatenated string
Search by major head
Search by sub major head
Search by minor head
Search by sub head
Search by group head
Search by object head
Successful response
ResponseHeader should be used to carry metadata information about the response from the server. apiId, ver and msgId in ResponseHeader should always correspond to the same values in respective request's RequestHeader.
response time in epoch
unique response message id (UUID) - will usually be the correlation id from the server
message id of the request
status of request processing
Hash describing the current Request
The version of the API
Unique system generated UUID
Chart of account concatenated string
Unique tenant identifier
Capture the major head code
Capture the major head code name
Capture the major head code type
"Revenue"
Capture the sub major head code
Capture the sub major head code name
Capture the minor head code
Capture the minor head code name
Capture the sub head code
Capture the sub head code name
Capture the group head code
Capture the group head code name
Capture the object head code
Capture the object head code name
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Create the new fiscal event
Details for the new fiscal event + RequestHeader (meta data of the API).
RequestHeader should be used to carry meta information about the requests to the server as described in the fields below. All eGov APIs will use requestHeader as a part of the request body to carry this meta information. Some of this information will be returned back from the server as part of the ResponseHeader in the response body to ensure correlation.
time in epoch
The version of the API
Unique request message id from the caller
Capture the user information
System Generated User id of the authenticated user.
List of roles assigned to a user
List of tenants assigned to a user
Hash describing the current RequestHeader
Version of the Data Model Definition
"1.0.0"
System generated UUID.
"fecbbf1d-d6e3-4f24-9935-02c33b9248e0"
Tenant Id
"pb"
Client id of the registered source system(Get the client id from the request header)
Client ids of the registered data receivers system
Captures the event type (eg- 1.a. DEMAND, 1.b. BILL, 2.a. RECEIPT, 2.b. PAYMENT, 2.c. INTER_TRANSFER, 2.d. INTRA_TRANSFER, 3.a. SANCTION, 3.b. APPROPRIATION, 3.c. ALLOCATION)
"Appropriation"
when the event occurred at the source system level
1628177497000
when the event arrived in ifix
1628177497000
reference unique id(transaction id) of the caller system
"013e9c56-8207-4dac-9f4d-f1e20bd824e7"
If this is a follow up event then it will refer to the parent event using this reference id.
"7d476bb0-bc9f-48e2-8ad4-5a4a36220779"
If this is a follow up event then it will refer to the parent event in source system using this reference id.
"77f23efe-879d-407b-8f23-7b8dd5b2ecb1"
System generated UUID
"51c9c03c-1607-4dd5-9e0e-93bbf860f6f7"
Transaction Amount
10234.5
Chart of account code. Publish request should contain coaCode, but search response will not contain it.
"1234-123-123-12-12-12"
Unique system generated Chart of Account UUID
"e9f940d4-69aa-4bbb-aa82-111b8948a6b6"
Start date of the billing period for which transaction is applicable
1622907239000
End date of the billing period for which transaction is applicable
1628177643000
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
location code
location hierarchy type
"State, District etc"
location name
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Capture the extra information as a json
Event published successfully
ResponseHeader should be used to carry metadata information about the response from the server. apiId, ver and msgId in ResponseHeader should always correspond to the same values in respective request's RequestHeader.
response time in epoch
unique response message id (UUID) - will usually be the correlation id from the server
message id of the request
status of request processing
Hash describing the current Request
The version of the API
Version of the Data Model Definition
"1.0.0"
System generated UUID.
"fecbbf1d-d6e3-4f24-9935-02c33b9248e0"
Tenant Id
"pb"
Client id of the registered source system(Get the client id from the request header)
Client ids of the registered data receivers system
Captures the event type (eg- 1.a. DEMAND, 1.b. BILL, 2.a. RECEIPT, 2.b. PAYMENT, 2.c. INTER_TRANSFER, 2.d. INTRA_TRANSFER, 3.a. SANCTION, 3.b. APPROPRIATION, 3.c. ALLOCATION)
"Appropriation"
when the event occurred at the source system level
1628177497000
when the event arrived in ifix
1628177497000
reference unique id(transaction id) of the caller system
"013e9c56-8207-4dac-9f4d-f1e20bd824e7"
If this is a follow up event then it will refer to the parent event using this reference id.
"7d476bb0-bc9f-48e2-8ad4-5a4a36220779"
If this is a follow up event then it will refer to the parent event in source system using this reference id.
"77f23efe-879d-407b-8f23-7b8dd5b2ecb1"
System generated UUID
"51c9c03c-1607-4dd5-9e0e-93bbf860f6f7"
Transaction Amount
10234.5
Chart of account code. Publish request should contain coaCode, but search response will not contain it.
"1234-123-123-12-12-12"
Unique system generated Chart of Account UUID
"e9f940d4-69aa-4bbb-aa82-111b8948a6b6"
Start date of the billing period for which transaction is applicable
1622907239000
End date of the billing period for which transaction is applicable
1628177643000
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
location code
location hierarchy type
"State, District etc"
location name
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Capture the extra information as a json
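To make the fiscal event model described above concrete, here is a hedged sketch of a push request body assembled from the field descriptions. The field names (fiscalEvent, eventType, eventTime, referenceId, amountDetails, fromBillingPeriod, toBillingPeriod, etc.) are assumptions inferred from the descriptions; the Swagger Yaml remains the authoritative definition.
{
  "requestHeader": { "ts": 1628177497000, "version": "1.0.0", "msgId": "<unique-message-id>" },
  "fiscalEvent": {
    "version": "1.0.0",
    "tenantId": "pb",
    "eventType": "Demand",
    "eventTime": 1628177497000,
    "referenceId": "013e9c56-8207-4dac-9f4d-f1e20bd824e7",
    "amountDetails": [
      {
        "amount": 10234.5,
        "coaCode": "1234-123-123-12-12-12",
        "fromBillingPeriod": 1622907239000,
        "toBillingPeriod": 1628177643000
      }
    ]
  }
}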
Based on the criteria get the list of events.
RequestHeader meta data.
RequestHeader should be used to carry meta information about the requests to the server as described in the fields below. All eGov APIs will use requestHeader as a part of the request body to carry this meta information. Some of this information will be returned back from the server as part of the ResponseHeader in the response body to ensure correlation.
time in epoch
The version of the API
Unique request message id from the caller
Capture the user information
System Generated User id of the authenticated user.
List of roles assigned to a user
List of tenants assigned to a user
Hash describing the current RequestHeader
The object contains all the search criteria of the fiscal events
List of event ids
Tenant Id
Captures the event type(eg- bill, receipt, expenditure)
Search events b/w transaction time(Start date)
Search events b/w transaction time(End date)
Client id of the registered data receiver system
"mGramSeva"
Search events b/w ingestion time (Start date - the time when the event was published)
Search events b/w ingestion time (End date - the time when the event was published)
Successful response
ResponseHeader should be used to carry metadata information about the response from the server. apiId, ver and msgId in ResponseHeader should always correspond to the same values in respective request's RequestHeader.
response time in epoch
unique response message id (UUID) - will usually be the correlation id from the server
message id of the request
status of request processing
Hash describing the current Request
The version of the API
Version of the Data Model Definition
"1.0.0"
System generated UUID.
"fecbbf1d-d6e3-4f24-9935-02c33b9248e0"
Tenant Id
"pb"
Client id of the registered source system(Get the client id from the request header)
Client ids of the registered data receivers system
Captures the event type (eg- 1.a. DEMAND, 1.b. BILL, 2.a. RECEIPT, 2.b. PAYMENT, 2.c. INTER_TRANSFER, 2.d. INTRA_TRANSFER, 3.a. SANCTION, 3.b. APPROPRIATION, 3.c. ALLOCATION)
"Appropriation"
when the event occurred at the source system level
1628177497000
when the event arrived in ifix
1628177497000
reference unique id(transaction id) of the caller system
"013e9c56-8207-4dac-9f4d-f1e20bd824e7"
If this is a follow up event then it will refer to the parent event using this reference id.
"7d476bb0-bc9f-48e2-8ad4-5a4a36220779"
If this is a follow up event then it will refer to the parent event in source system using this reference id.
"77f23efe-879d-407b-8f23-7b8dd5b2ecb1"
System generated UUID
"51c9c03c-1607-4dd5-9e0e-93bbf860f6f7"
Transaction Amount
10234.5
Chart of account code. Publish request should contain coaCode, but search response will not contain it.
"1234-123-123-12-12-12"
Unique system generated Chart of Account UUID
"e9f940d4-69aa-4bbb-aa82-111b8948a6b6"
Start date of the billing period for which transaction is applicable
1622907239000
End date of the billing period for which transaction is applicable
1628177643000
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
location code
location hierarchy type
"State, District etc"
location name
Captures the location where the fiscal event happened. This object represents the geographical hierarchy
Collection of audit related fields used by most models
UUID (preferred) or userid of the user that created the object
UUID (preferred) or userid of the user that last modified the object
epoch of the time object is created
epoch of the time object is last modified
Capture the extra information as a json
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
./mongo-migration.sh -h <host-address> -u <ifix username> -p <ifix password> -s <ifix db> -d <mgramseva db> -c department -c expenditure -c departmentHierarchyLevel -c departmentEntity
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Link
/events/v1/_push
/events/v1/_search
Key: fiscal-kafka-push-topic
Value: fiscal-event-request-validated
Description: Once the fiscal event data gets validated and enriched, it is published to this topic for further processing.
Key: fiscal.event.kafka.mongodb.topic
Value: fiscal-event-mongodb-sink
Description: Once the fiscal event data gets validated and enriched, the details are pushed to the mongo db sink.
Title
Link
Swagger Yaml
Postman collection
Title
Link
/government/v1/_create
/government/v1/_search
Title
Link
/chartOfAccount/v1/_create
/chartOfAccount/v1/_search
Title
Link
Swagger Yaml
Postman collection
Title
Link
/departmentEntity/hierarchyLevel/v1/_create
/departmentEntity/hierarchyLevel/v1/_search
Title
Link
/departmentEntity/v1/_create
/departmentEntity/v1/_search
Title
Link
Swagger Yaml
Postman collection
Key: fiscal-event-kafka-push-topic
Value: fiscal-event-request-validated
Description: Fiscal event post processor will consume data from this topic
Remarks: Kafka topic should be same as configured in Fiscal event service.
Key: fiscal-event-kafka-dereferenced-topic
Value: fiscal-event-request-dereferenced
Description: Dereferenced fiscal event data will be pushed to this topic
Remarks: NA
Key: fiscal-event-kafka-flattened-topic
Value: fiscal-event-line-item-flattened
Description: NA
Remarks: NA
Key: fiscal-event-processor-kafka-druid-topic
Value: fiscal-event-druid-sink
Description: Flattened Fiscal Event data will be pushed to this topic.
Remarks: While druid ingest of fiscal event, make sure it has the same topic as mentioned here.
Title
Link
Swagger Yaml