The Fiscal Event Service maintains the fiscal event flow activities of multiple projects. The root-level entity is a tenant, which is basically the State Government. A tenant has several projects that are tagged with multiple attributes like department, scheme, mission, hierarchy, etc. All the entity information and the COA (Chart of Account) are passed along with the amount details, which together constitute an array of fiscal event instances in the system.
Current version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
MongoDB instance
Required Service Dependencies:
iFIX Master Data Service
iFIX Fiscal Event Post Processor
Apache Kafka Server
This service provides two features: push fiscal event data and search fiscal event data.
The endpoints are secured; user info details are required to access them.
Each push request carries the fiscal details along with the mandatory tenant and project information under the attributes section; receiving it triggers the whole fiscal event flow and forwards the event to the post-processor service.
Before accepting the event, the service validates the referenced entities via the iFIX core Master Data Service.
Whenever a fiscal event instance is created in the system, both the ingestion time and the event occurrence timestamp are recorded, which keeps the event flow traceable over time.
After these steps, the service hands the fiscal event over for further processing by publishing the data to the Kafka topics "fiscal-event-request-validated" and "fiscal-event-mongodb-sink".
It also returns a snapshot of the enriched fiscal event data, which the source system can use for reference or further processing.
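Below is a hedged sketch of a push request to /events/v1/_push. The exact schema is defined in the Swagger YAML linked later on this page; the envelope and any field names beyond those described above (tenant, project attributes, COA, amounts, event type, reference ID, timestamps) are assumptions, and all values are placeholders.

```json
{
  "requestHeader": {
    "ts": 1627193067000,
    "version": "2.0.0",
    "msgId": "push-msg-placeholder",
    "userInfo": { "uuid": "user-uuid-placeholder" }
  },
  "events": [
    {
      "tenantId": "pb",
      "eventType": "Demand",
      "eventTime": 1627193067000,
      "referenceId": "reference-id-placeholder",
      "attributes": {
        "projectId": "project-id-placeholder",
        "departmentId": "department-id-placeholder",
        "schemeId": "scheme-id-placeholder"
      },
      "amountDetails": [
        { "amount": 1000.0, "coaId": "coa-id-placeholder" }
      ]
    }
  ]
}
```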
The search feature operates on fiscal events that have already been processed by the post-processor service and persisted in the MongoDB instance.
Search is mainly supported on eventType, tenantId, referenceId and the event time interval; the response contains the enriched data (IDs dereferenced into nested details).
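A hedged sketch of the corresponding search request to /events/v1/_search, using only the criteria named above; the envelope and the exact names of the time-interval parameters are assumptions:

```json
{
  "requestHeader": {
    "ts": 1627193067000,
    "version": "2.0.0",
    "msgId": "search-msg-placeholder"
  },
  "criteria": {
    "tenantId": "pb",
    "eventType": "Demand",
    "referenceId": "reference-id-placeholder",
    "fromEventTime": 1622505600000,
    "toEventTime": 1627193067000
  }
}
```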
Note: Kafka topics need to be configured with respect to the environment.
Update all the DB and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Make sure the Keycloak server is up and running and has been configured with the required client ID.
The Master Data Service maintains information about the Government and the Chart of Accounts. We can create these details and search for them based on the given parameters/request data.
Current version: 2.0.0
Before we proceed with the configuration, make sure the following pre-requisites are met:
Java 8
MongoDB instance
The endpoints are secured; an access token is required to create any master data.
The subsequent sections on this page discuss the service details maintained by the iFIX core Master Data Service.
This service provides the capabilities to maintain the Government details and allows users to create and search data. To create a Government, we need a unique ID and a name for it. Optionally, we can pass additional details as part of the attributes. For search, passing the unique ID(s) as search parameters returns all the details of the required Government(s).
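A hedged sketch of a create request to /government/v1/_create, based only on the fields described above (a unique ID, a name, optional attributes); the envelope is an assumption and all values are placeholders:

```json
{
  "requestHeader": {
    "ts": 1627193067000,
    "version": "2.0.0",
    "msgId": "govt-create-msg-placeholder"
  },
  "government": {
    "id": "pb",
    "name": "Punjab",
    "attributes": { "attribute-key-placeholder": "attribute-value-placeholder" }
  }
}
```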
This service provides the capabilities to maintain the Chart of Account (COA) details and supports create and search of COA. The following information is passed while creating a Chart of Accounts: Government ID, majorHead, subMajorHead, minorHead, subHead, groupHead, objectHead and the corresponding head names & types. A unique code named COACode is generated by concatenating majorHead, subMajorHead, minorHead, subHead, groupHead and objectHead with a hyphen ("-") and is stored with the given request. COA details can be searched by parameters such as the Chart of Account IDs, COACodes, Government ID, majorHead, subMajorHead, minorHead, subHead, groupHead and objectHead.
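As a worked example (a hedged sketch; the exact request schema is in the Swagger YAML linked below, and the envelope and head values are placeholders), creating a COA with the heads shown here:

```json
{
  "requestHeader": {
    "ts": 1627193067000,
    "version": "2.0.0",
    "msgId": "coa-create-msg-placeholder"
  },
  "chartOfAccount": {
    "tenantId": "pb",
    "majorHead": "0215",
    "subMajorHead": "01",
    "minorHead": "102",
    "subHead": "00",
    "groupHead": "00",
    "objectHead": "01"
  }
}
```

would generate the COACode 0215-01-102-00-00-01, i.e. the six heads joined with hyphens.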
No environment-specific variables are required (migration).
Update the DB and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
Make sure the Keycloak server is up and running and has been configured with the required client ID.
Follow the steps below to create iFIX master data. Refer to the individual service documentation for details.
Create the Government by providing valid government details. Once created, you get an ID along with the provided details in the response. We refer to this ID as the tenant ID.
Create the Chart of Account (COA) by providing valid details. While creating, you need to pass a valid tenant ID in the COA request. Once created, you get an ID along with all the provided details in the response. We refer to this ID as the COA ID and to the generated unique code as the COACode.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Key Domain Service Details
| Link |
| ---- |
| /events/v1/_push |
| /events/v1/_search |
| Key | Value | Description |
| --- | ----- | ----------- |
| fiscal-kafka-push-topic | fiscal-event-request-validated | Once the fiscal event data is validated and enriched, it is published to this topic for further processing. |
| fiscal.event.kafka.mongodb.topic | fiscal-event-mongodb-sink | Once the fiscal event data is validated and enriched, the details are pushed to the MongoDB sink via this topic. |
| Title | Link |
| ----- | ---- |
| Swagger Yaml | |
| Postman collection | |
| Title | Link |
| ----- | ---- |
| /government/v1/_create | |
| /government/v1/_search | |

| Title | Link |
| ----- | ---- |
| /chartOfAccount/v1/_create | |
| /chartOfAccount/v1/_search | |

| Title | Link |
| ----- | ---- |
| Swagger Yaml | |
| Postman collection | |
Fiscal Event Post Processor is a streaming pipeline for validated fiscal event data. Apache Kafka is used to stream the validated fiscal event data, which is dereferenced, unbundled, flattened, and finally pushed to the Druid data store.
Current version: 2.0.0
Before you proceed with the configuration, make sure the following pre-requisites are met:
Java 8
Apache Kafka and Kafka-Connect server should be up and running
Druid DB should be up and running
The following dependent services are required:
iFIX Master Data Service
iFIX Fiscal Event Service
Fiscal Event post-processor consumes the fiscal event validated data from Kafka topic named “fiscal-event-request-validated” and processes it by following the below steps:
The validated fiscal event data is dereferenced: service IDs such as the COA ID and tenant ID are passed to the Master Data Service to fetch the corresponding object(s). Once the fiscal event data is dereferenced, it is pushed to the dereferenced topic.
Unbundle consumers pick up the dereferenced fiscal event data from the dereferenced topic. The data is unbundled into one record per line item and then flattened (illustrated after these steps). Once flattening is complete, the data is pushed to the Druid sink topic.
Flattened fiscal event data is pushed to the Druid DB from the topic named fiscal-event-druid-sink.
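Illustratively, a dereferenced event carrying two amount line items (all field names here are assumptions, not the actual schema):

```json
{
  "tenantId": "pb",
  "eventType": "Demand",
  "amountDetails": [
    { "coaCode": "0215-01-102-00-00-01", "amount": 1000.0 },
    { "coaCode": "0215-01-103-00-00-01", "amount": 500.0 }
  ]
}
```

unbundles and flattens into one record per line item:

```json
[
  { "tenantId": "pb", "eventType": "Demand", "coaCode": "0215-01-102-00-00-01", "amount": 1000.0 },
  { "tenantId": "pb", "eventType": "Demand", "coaCode": "0215-01-103-00-00-01", "amount": 500.0 }
]
```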
Kafka Connect is used to push the data from a Kafka topic to MongoDB. Follow the steps below to start the connector.
Connect (port-forward) with the Kafka-connect server.
Create a new connector with a POST API call to localhost:8083/connectors.
The request body for that API call is kept in the file fiscal-event-mongodb-sink (an assumed sketch of its shape follows these steps).
Within that file, replace every ${---} placeholder with the actual value for the environment. Get ${mongo-db-authenticated-uri} from the configured secrets of the environment. (Optional) Verify and make changes to the topic names.
The connector is now ready. You can verify it using the API call GET localhost:8083/connectors.
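The actual request body lives in the fiscal-event-mongodb-sink file of the environment repo. The sketch below only assumes the usual shape of a MongoDB Kafka sink connector registration; the database and collection names are placeholders, and only the topic name and the ${mongo-db-authenticated-uri} secret are taken from this page:

```json
{
  "name": "fiscal-event-mongodb-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "fiscal-event-mongodb-sink",
    "connection.uri": "${mongo-db-authenticated-uri}",
    "database": "ifix-db-placeholder",
    "collection": "fiscal-event-collection-placeholder",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```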
The Druid console is used to start ingesting data from a Kafka topic to the Druid data store. Follow the steps below to start the Druid Supervisor.
Open the Druid console
Go to the Load Data section
Select Other
Click on Submit Supervisor
Copy and paste the JSON from the druid-ingestion-config.json file into the available text box.
Verify the Kafka topic name and the Kafka bootstrap server address before submitting the config.
Submit the config, and the data ingestion should start into the fiscal-event data source (a skeleton spec is sketched below).
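For orientation, here is a skeleton of a Kafka supervisor spec; the authoritative spec is the druid-ingestion-config.json file, and apart from the topic name and the fiscal-event data source (both named on this page) everything below, including the timestamp column, is an assumption:

```json
{
  "type": "kafka",
  "spec": {
    "dataSchema": {
      "dataSource": "fiscal-event",
      "timestampSpec": { "column": "eventTime", "format": "millis" },
      "dimensionsSpec": { "dimensions": [] }
    },
    "ioConfig": {
      "type": "kafka",
      "topic": "fiscal-event-druid-sink",
      "consumerProperties": { "bootstrap.servers": "kafka-broker-placeholder:9092" }
    },
    "tuningConfig": { "type": "kafka" }
  }
}
```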
Note: Kafka topics need to be configured with respect to the environment.
Update the DB, Kafka producer & consumer, and URI configurations in the dev.yaml, qa.yaml and prod.yaml files.
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.
| Key | Value | Description | Remarks |
| --- | ----- | ----------- | ------- |
| fiscal-event-kafka-push-topic | fiscal-event-request-validated | The Fiscal Event Post Processor consumes data from this topic. | The Kafka topic should be the same as the one configured in the Fiscal Event Service. |
| fiscal-event-kafka-dereferenced-topic | fiscal-event-request-dereferenced | Dereferenced fiscal event data is pushed to this topic. | NA |
| fiscal-event-kafka-flattened-topic | fiscal-event-line-item-flattened | NA | NA |
| fiscal-event-processor-kafka-druid-topic | fiscal-event-druid-sink | Flattened fiscal event data is pushed to this topic. | While ingesting fiscal events into Druid, make sure the same topic as mentioned here is used. |
| Title | Link |
| ----- | ---- |
| Swagger Yaml | |