The Fiscal Event Service maintains the fiscal event flow activities of multiple projects. The root-level entity is a tenant, which is essentially the State Government. A tenant has several projects, each tagged with multiple attributes such as department, scheme, mission, and hierarchy. All the entity information and the COA (Chart of Account) are passed along with amount details, which together constitute an array of fiscal event instances in the system.
Current version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met.
Java 8
MongoDB instance
Required Service Dependencies:
iFIX-Master-Data-Service
iFIX-Fiscal Event Post Processor
Apache Kafka Server
This service provides two features: push fiscal event data and search fiscal event data.
It is a secure endpoint and user info details are required to access it.
It receives the fiscal details, along with the mandatory tenant and project information under the attributes section, in publish requests; this triggers the whole fiscal event process and forwards the event to the post-processor service.
Before accepting an event, it validates all the referenced entities via the iFIX core Master Data Service.
When a fiscal event instance is created in the system, both the ingestion time and the event occurrence timestamp are logged, which makes the event flow traceable across timestamps.
After processing these activities, it passes the fiscal event information to the fiscal event post-processor service for further processing by publishing the fiscal data to the Kafka topics "fiscal-event-request-validated" and "fiscal-event-mongodb-sink".
It also responds with a snapshot of the enriched fiscal event data, which the source system can use for reference or further processing.
It provides a search feature on the existing fiscal events that have been processed by the post-processor service and persisted in the MongoDB instance.
Searches can mainly be made on eventType, tenantId, referenceId and the event time interval, and return enriched data (dereferenced, with nested ID details).
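To make the publish flow concrete, a hypothetical /events/v1/_push request body is sketched below. The field names (events, eventType, eventTime, referenceId, attributes, amountDetails, coaCode) are assumptions based on the description above; verify them against the Swagger contract.

```json
{
  "events": [
    {
      "tenantId": "pb",
      "eventType": "Demand",
      "eventTime": 1628177497000,
      "referenceId": "018d9c61-0aa1-4a83-bb3a-7dd9b41a36bb",
      "attributes": {
        "projectId": "c54bf327-05fd-4674-b89f-f04fdd9b5de1"
      },
      "amountDetails": [
        { "amount": 10000, "coaCode": "0215-01-102-00-00-00" }
      ]
    }
  ]
}
```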
Note: Kafka topic needs to be configured with respect to the environment
Update all the DB and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Make sure the Keycloak server is up and running and has been configured with the required client ID.
The Master Data Service maintains information about the Government and the Chart of Accounts. We can create these details and search for them based on the given parameters/request data.
Current version: 2.0.0
Before we proceed with the configuration, make sure the following pre-requisites are met:
Java 8
MongoDB instance
It exposes secure endpoints for the master data service; an access token is required to create any master data.
The subsequent sections on this page discuss the service details maintained by the iFIX core Master Data Service.
This service provides the capabilities to maintain the Government details and allows users to create and search the data. For creating the Government, we need a unique ID and a name for the Government. Optionally, some additional details can be passed as part of the attributes. In the case of search, passing the unique ID(s) as search parameters returns all the details of the required Government.
This service provides the capabilities to maintain the Chart of Account (COA) details and supports the create and search of COAs. The following information is passed while creating a Chart of Account - Government ID, majorHead, subMajorHead, minorHead, subHead, groupHead, objectHead and the corresponding head names & types. A unique code named COACode is generated by concatenating majorHead, subMajorHead, minorHead, subHead, groupHead and objectHead with a hyphen ("-") and is stored with the given request. Searching the COA details is done based on the given search parameters such as the Chart of Account IDs, COACodes, Government ID, majorHead, subMajorHead, minorHead, subHead, groupHead and objectHead.
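For example (with made-up head values), a COA created with majorHead 0215, subMajorHead 01, minorHead 102, subHead 00, groupHead 00 and objectHead 00 gets the COACode "0215-01-102-00-00-00". A hypothetical create request body is sketched below; the wrapper and field names are assumptions, so verify them against the Swagger contract.

```json
{
  "chartOfAccount": {
    "governmentId": "pb",
    "majorHead": "0215",
    "subMajorHead": "01",
    "minorHead": "102",
    "subHead": "00",
    "groupHead": "00",
    "objectHead": "00"
  }
}
```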
No environment-specific variables are required (migration).
Update the DB and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Make sure the Keycloak server is up and running and has been configured with the required client ID.
Fiscal Event Post Processor is a streaming pipeline for validated fiscal event data. Apache Kafka is used to stream the validated fiscal event data, which is then dereferenced, unbundled, flattened, and finally pushed to the Druid data store.
Current version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Apache Kafka and Kafka-Connect server should be up and running
Druid DB should be up and running
The following dependent services are required:
iFIX Master Data service
iFIX Fiscal Event service
Fiscal Event Post Processor consumes the validated fiscal event data from the Kafka topic named "fiscal-event-request-validated" and processes it in the following steps:
The validated fiscal event data is dereferenced. For dereferencing, service IDs such as the COA ID and tenant ID are passed to the corresponding master data service, and the corresponding objects are fetched. Once the fiscal event data is dereferenced, it is pushed to the dereferenced topic.
The unbundle consumer picks up the dereferenced fiscal event data from the dereferenced topic. The dereferenced fiscal event data is unbundled and then flattened. Once the flattening is complete, the data is pushed to the Druid sink topic.
The flattened fiscal event data is pushed to Druid from the topic named fiscal-event-druid-sink.
Kafka-Connect is used to push data from a Kafka topic to MongoDB. Follow the steps below to start the connector.
Connect (port-forward) to the Kafka-Connect server.
Create a new connector with a POST API call to localhost:8083/connectors. The request body for that API call is written in the file .
Within that file, replace every ${---} placeholder with the actual value for the environment. Get ${mongo-db-authenticated-uri} from the configured secrets of the environment. (Optional) Verify and adjust the topic names.
The connector is now ready. You can verify it with a GET call to localhost:8083/connectors.
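For reference, a minimal sketch of the connector-creation call is shown below, assuming the official MongoDB Kafka sink connector. The connector name, database and collection values are placeholders/assumptions; the request body from the file mentioned above takes precedence.

```sh
# Sketch only: create a MongoDB sink connector on the Kafka-Connect server.
curl -X POST localhost:8083/connectors \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "fiscal-event-mongodb-sink",
    "config": {
      "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
      "topics": "fiscal-event-mongodb-sink",
      "connection.uri": "${mongo-db-authenticated-uri}",
      "database": "<db-name>",
      "collection": "fiscal_event",
      "key.converter": "org.apache.kafka.connect.storage.StringConverter",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": "false"
    }
  }'
```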
The Druid console is used to start ingesting data from a Kafka topic to the Druid data store. Follow the steps below to start the Druid Supervisor.
Open the Druid console
Go to the Load Data section
Select Other
Click on Submit Supervisor. Copy and paste the JSON from the file into the available text box.
Verify the Kafka topic name and the Kafka bootstrap server address before submitting the config
Submit the config and the data ingestion should start into the fiscal-event data source
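For reference, a minimal Kafka supervisor spec is sketched below. The data source and topic names follow this page; the timestamp column, input format and bootstrap server address are assumptions to be replaced with the environment-specific values from the actual supervisor config file.

```json
{
  "type": "kafka",
  "spec": {
    "ioConfig": {
      "type": "kafka",
      "topic": "fiscal-event-druid-sink",
      "consumerProperties": { "bootstrap.servers": "<kafka-bootstrap-server>:9092" },
      "inputFormat": { "type": "json" }
    },
    "dataSchema": {
      "dataSource": "fiscal-event",
      "timestampSpec": { "column": "eventTime", "format": "millis" },
      "dimensionsSpec": { "dimensions": [] },
      "granularitySpec": { "rollup": false }
    },
    "tuningConfig": { "type": "kafka" }
  }
}
```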
Note: Kafka topic needs to be configured with respect to the environment
Update the DB, Kafka producer & consumer, and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Please follow the steps below to create the iFIX master data. Have a look at the individual service documentation for the details here.
Create the Government by providing valid government details. Once created, you get an ID along with the provided details in the response. We refer to this ID as the tenant ID.
Create the Chart of Account (COA) by providing valid details. While creating it, you need to pass the valid tenant ID in the COA request. Once created, you get an ID with all the provided details in the response. We refer to this ID as the COA ID and to the generated code as the COACode.
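A hypothetical sketch of the Government create call is shown below; the host, path prefix, wrapper fields and header names are assumptions, so consult the Swagger contract before use.

```sh
# Sketch only: create a Government (tenant) via the master data service.
curl -X POST 'https://<host-name>/government/v1/_create' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer <access-token>' \
  -d '{
        "government": { "id": "pb", "name": "Punjab" }
      }'
```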
Key Domain Service Details
| Link |
| --- |
| /events/v1/_push |
| /events/v1/_search |
| Key | Value | Description |
| --- | --- | --- |
| fiscal-kafka-push-topic | fiscal-event-request-validated | Once the fiscal event data is validated and enriched, it is published to this topic for further processing. |
| fiscal.event.kafka.mongodb.topic | fiscal-event-mongodb-sink | Once the fiscal event data is validated and enriched, the details are pushed to the MongoDB sink. |
| Title | Link |
| --- | --- |
| Swagger Yaml | |
| Postman collection | |
| Key | Value | Description | Remarks |
| --- | --- | --- | --- |
| | fiscal-event-request-validated | Fiscal Event Post Processor consumes data from this topic. | The Kafka topic must be the same as the one configured in the Fiscal Event service. |
| | | Dereferenced fiscal event data is pushed to this topic. | NA |
| | fiscal-event-druid-sink | Flattened fiscal event data is pushed to this topic. | While ingesting fiscal events into Druid, make sure it uses the same topic as mentioned here. |
| Title | Link |
| --- | --- |
| /government/v1/_create | |
| /government/v1/_search | |
| Title | Link |
| --- | --- |
| /chartOfAccount/v1/_create | |
| /chartOfAccount/v1/_search | |
| Title | Link |
| --- | --- |
| Swagger Yaml | |
| Postman collection | |
| Title | Link |
| --- | --- |
| Swagger Yaml | |
The Keycloak console is available at https://<host-name>/auth. The Ops team will provide the username and password secrets.
Open the Keycloak console
Near the top-left corner in the realm drop-down menu, select Add Realm
Select the ifix-realm.json file
After the realm gets created, select the ifix realm from the drop-down near the top-left corner
Remember to select the ifix realm from the Keycloak console before proceeding
From the Clients section of Keycloak Admin Console, create a client
Provide a unique username for the client
Go to the client's settings
Change Access Type to confidential
Turn on Service Account Enabled
In the Valid Redirect URIs field, provide the root URL of the iFIX instance (not important for our purposes, but the field is mandatory)
Save these changes
In the Service Account Roles tab, assign the role "fiscal-event-producer"
In the Mappers tab, create a new mapper to associate the client with a tenantId
Select Mapper Type to be "Hardcoded claim"
In Token Claim Name, write "tenantId"
In Claim value, write the tenant ID under which the client is being created (for example, "pb")
Set Name the same as Token Claim Name, i.e. "tenantId"
Select Claim Json Type to be "String"
Now you can get the credentials from the Credentials tab and configure them in the client's system.
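With the client configured, the client's system can fetch an access token via the standard Keycloak client-credentials flow. A minimal sketch, assuming the ifix realm created above:

```sh
# Exchange the client credentials for an access token.
curl -X POST 'https://<host-name>/auth/realms/ifix/protocol/openid-connect/token' \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -d 'grant_type=client_credentials' \
  -d 'client_id=<client-id>' \
  -d 'client_secret=<client-secret>'
```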
The Department Entity service manages the department and its hierarchy metadata. It deals with department entities and department hierarchies only. A department hierarchy stores only the hierarchy definition, i.e. metadata about a department level, while a department entity stores the actual department data along with its ancestry information.
Current Version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met.
Java 8
MongoDB instance
Required Service Dependencies - Adapter-Master-Data-Service
It defines the hierarchy definition for the department.
department id: It is the ID of the department from the department master
level: It defines the depth of hierarchy of department-level
parent: It provides details about department hierarchy parent (UUID)
The root-level department hierarchy should not contain any parent value, and its level value will be zero.
When the parent ID has a value, the parent is looked up in the department hierarchy records to evaluate the hierarchy level.
The level value is taken from the parent department hierarchy, and the current department hierarchy's level is set to that value incremented by one.
It contains the department entity information along with its hierarchy level, and also attaches the master department information (department ID - UUID). Every department node (department record) keeps a list of its child entities; a leaf-level department does not have any children. The child list contains department entity IDs, which makes the current department entity the parent of every entity in that list. This is how the department entity level is maintained.
Hierarchy levels are defined top to bottom because each level holds a reference to its parent. Department entities, in contrast, are created bottom to top: a leaf department entity has no child references, and only when a higher (parent) department entity is created does it define child references in its child list.
Create Department Hierarchy: We pass the current hierarchy level and its parent details, along with the master department and tenant information. The stored data acts as meta-information about the hierarchy level for department entity data processing; it works as a reference meta index that describes the hierarchy level.
Search Department Hierarchy: It provides a view of the department hierarchies based on the request parameters - tenant ID, hierarchy level or department hierarchy ID.
Create Department Entity: We pass the tenant ID, master department ID, hierarchy level and the children list, along with the department entity name and code. The tenant, hierarchy level and master department are the root information of the department entity. If a department entity does not contain any children, it is a leaf department entity and can only point to its parent.
Search Department Entity: A search request can be made on any department entity attribute, but the tenant information cannot be skipped. It returns the whole department entity details along with the child information, and finds all the ancestry information of the current department entity.
No environment-specific variables are required (migration).
Update the DB and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Department Hierarchy defines the definition (metadata) for the actual department entity creation.
In the image above, the second row contains the actual department entities. The Department Entity Create API follows a bottom-to-top approach.
In the example data above, the very first department entity to be created is "BARUWAL" with code "7278". Below are the steps to create a department entity:
For reference, the bottom-to-top request and response of the Department Entity Create API will look like:
Request: [For the Department Entity at the bottom - "BARUWAL"]
Response:
Subsequent Request: [For Department Entity - “Kiratpur Sahib”]
Response:
And so on for the next Department Entity(ies) ...
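The actual payloads are not reproduced here; below is a hypothetical sketch of the bottom-level create request. The wrapper object, field names and the hierarchy level value are assumptions based on the attribute descriptions above, so verify them against the Swagger contract.

```json
{
  "departmentEntity": {
    "tenantId": "pb",
    "departmentId": "<master-department-uuid>",
    "name": "BARUWAL",
    "code": "7278",
    "hierarchyLevel": 6,
    "children": []
  }
}
```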
Note: Below are the observations from the above example:
The tenant ID and department ID are created before the department entity, and hence before the department hierarchy create. The tenant ID and department ID stay the same across all hierarchy levels for a single department hierarchy and its corresponding department entity(ies).
In the children attribute, we can pass a set of previously created department entity IDs (at least one, or an empty set in the case of a bottom-level department entity).
When the existing children list of a department entity has to be updated, update it using a MongoDB command as described below.
Find the parent department entity to which the new children need to be added. We should know beforehand which department entity is the parent. Search by name and hierarchy level, and note the corresponding parent department entity's ID (UUID).
Append the new department entity ID at the end of the parent department entity's children list obtained in step 1. Count the length of the parent's children list (it uses 0-based indexing); call it 'n' (it can be any integer, depending on the children list size), and then set it as
"children.n": "<picked_from_step_1_query_department_entity_id>"
for example:
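A hypothetical mongo shell sketch of this update follows; the collection name, field names and the hierarchy level value are assumptions based on the descriptions above.

```javascript
// Step 1: find the parent department entity by name and hierarchy level,
// and note its UUID and the current length of its children list.
// (Collection and field names are assumptions.)
var parent = db.departmentEntity.findOne({ name: "Kiratpur Sahib", hierarchyLevel: 5 });
var n = parent.children.length; // next free 0-based index

// Step 2: set the new child department entity ID at index n of the
// parent's children list.
var update = { $set: {} };
update.$set["children." + n] = "<picked_from_step_1_query_department_entity_id>";
db.departmentEntity.update({ _id: parent._id }, update);
```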
This page provides details on the steps involved in cleaning up the iFIX core data from various environments. Follow the instructions for the selected environment listed below.
Druid Data Clean-Up
Mongo DB Data Clean-Up
Postgres Data Clean-Up
Open the druid console in the respective environment.
Go to Ingestion → Supervisors and select the particular supervisor (fiscal-event).
In Actions, click on Terminate. This terminates the supervisor. Wait for a minute.
Go to DataSources, select the particular data source name (fiscal-event), and scroll down to Actions.
Click on Mark as unused all segments → Mark as unused all segments.
Click on Delete unused segments (issue kill task) and enable the permission to delete.
Once the clean-up process is complete, follow the instructions here - iFIX Fiscal Event Post Processor | Druid-Sink - to run the supervisor.
Connect to the playground pod and run the command below to connect to Mongo DB: mongo --host <mongo-db-host>:<mongo-db-port> -u <mongo-db-username> -p <mongo-db-password>
Switch to the iFIX core DB: use <db name>
Run the command below to delete all the data from the fiscal_event collection: db.fiscal_event.remove({});
If you want to delete all the fiscal event records of a particular Gram Panchayat, run a command like the sketch below.
For example, assume we have to delete the "LODHIPUR" GP (Gram Panchayat) details (that is, hierarchy level 6) in the DWSS department.
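The original command is not reproduced here; the sketch below is hypothetical, and the field paths in the filter are assumptions about the enriched fiscal event document structure. Inspect a document first (db.fiscal_event.findOne()) and adjust the paths before running it.

```javascript
// Hypothetical delete: remove all fiscal events of one GP.
// The field paths below are assumptions, not the documented schema.
db.fiscal_event.remove({
  "departmentEntity.code": "DWSS",
  "departmentEntity.hierarchyLevel": 6,
  "departmentEntity.name": "LODHIPUR"
});
```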
If you want to delete fiscal event records based on some other attributes, you have to write a custom Mongo delete query.
Connect to the playground pod and run the below command to connect with postgresDB.
psql -h <psql-host> -p <psql-port> -d <psql-database> -U <psql-username>
It prompts for a password. Enter the password.
Run the query below to delete all the data from fiscal_event_aggregated.
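The query itself is straightforward:

```sql
DELETE FROM fiscal_event_aggregated;
```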
If you want to delete particular Gram Panchayat details from the fiscal event aggregated records, run a query like the sketch below.
For example, assume we want to delete the "LODHIPUR" GP (Gram Panchayat) details (that is, hierarchy level 6) in the DWSS department.
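The original query is not reproduced here; the sketch below is hypothetical and assumes the table keeps the department entity details in a JSONB attributes column. The column names are assumptions, so check the actual fiscal_event_aggregated schema first.

```sql
-- Hypothetical: column names are assumptions, verify before running.
DELETE FROM fiscal_event_aggregated
WHERE attributes ->> 'departmentEntityName' = 'LODHIPUR'
  AND (attributes ->> 'hierarchyLevel')::int = 6
  AND attributes ->> 'departmentCode' = 'DWSS';
```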
Note: If you are not sure about deleting particular fiscal event aggregated records, you can delete all the records from the fiscal_event_aggregated table. Once the records are deleted, either run the fiscal event aggregation cron job manually to upsert all the records, or let the system upsert the records every midnight from Druid to Postgres.
iFIX uses Keycloak to manage clients. Click on this link for instructions on setting up a Keycloak server and managing clients.
The platform consists of the following core services.
The master data service manages the following master data:
Department
Expenditure
Project
| Title | Link |
| --- | --- |
| /departmentEntity/hierarchyLevel/v1/_create | |
| /departmentEntity/hierarchyLevel/v1/_search | |
| Title | Link |
| --- | --- |
| /departmentEntity/v1/_create | |
| /departmentEntity/v1/_search | |
| Title | Link |
| --- | --- |
| Swagger Yaml | |
| Postman collection | |