To orchestrate your simple or complex logic in an easy way, you can use the Business Rules Maintenance Tool - a web application that helps the user define and also test the logic.
But the rule engine can also run in client/server mode. The server part of the rule engine waits for client connections. A client sends data to the server, the business rules are run on the server against that data, and the results are sent back to the client.
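As an illustration of this round trip, here is a minimal sketch. Note that it does not use JARE's actual client classes or wire protocol (which are not shown in this post); it simulates the pattern with a plain line-based exchange, and the "evaluation" step is a hypothetical stand-in for running the rule groups on the server.

```java
import java.io.*;
import java.net.*;

public class RuleServerSketch {

    // Hypothetical stand-in for the rule engine: pretend one rule fails
    // when the second CSV field (an amount) is greater than 100.
    static String evaluate(String csvRow) {
        String[] fields = csvRow.split(",");
        int amount = Integer.parseInt(fields[1].trim());
        int failed = amount > 100 ? 1 : 0;
        return "groupsFailed=" + failed + ",rulesFailed=" + failed;
    }

    // Client side: send one row to the server and return the summary line.
    static String sendRow(int port, String row) throws IOException {
        try (Socket socket = new Socket("localhost", port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println(row);
            return in.readLine();
        }
    }

    public static void main(String[] args) throws Exception {
        // Minimal single-request server standing in for the rule engine server.
        ServerSocket server = new ServerSocket(0); // ephemeral port
        Thread serverThread = new Thread(() -> {
            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()));
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                out.println(evaluate(in.readLine()));
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        serverThread.start();

        // Only the summary counts travel back to the client.
        String summary = sendRow(server.getLocalPort(), "order-1, 250");
        System.out.println(summary);
        serverThread.join();
        server.close();
    }
}
```

The point of the sketch is the shape of the exchange: the client ships a row of data and gets back only a compact summary, while the detailed evaluation stays on the server.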
More information is available on my GitHub account.
Readers of this blog will know that the business rules for the rule engine are captured in a project file: a zip file containing all the business logic for a certain project. When the server is started, it is provided with the relevant project file.
When the server is started, the output looks like this:
2017-11-18 12:16:59 - server start...
2017-11-18 12:16:59 - using properties from: server.properties
2017-11-18 12:16:59 - running rule engine file: /home/uwe/development/jare_server/travel discount_dev.zip
2017-11-18 12:16:59 - output with transformer: class com.datamelt.server.transform.log.LogTransformer
2017-11-18 12:16:59 - waiting on: 0.0.0.0/0.0.0.0, port: 9000 for connections
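To give an idea of what such a properties file could contain, here is a sketch based on the values visible in the log output above. The key names are hypothetical - the post does not show the real ones, so check the project documentation for the actual keys.

```properties
# hypothetical key names - consult the project documentation for the real ones
server.port=9000
ruleengine.projectfile=/home/uwe/development/jare_server/travel discount_dev.zip
server.transformer=com.datamelt.server.transform.log.LogTransformer
```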
The details for the server are in a properties file, which is processed at server start time. It also contains a reference to the project file used. The transformer specifies how the detailed results of the rule engine are output. Currently, the following outputs are possible:
- no output
- output to a log file
- output to a MongoDB collection
The client receives the results of the execution of the business rules in the form of numbers: how many rule groups failed, how many rule groups were skipped (dependent rule groups), how many rules failed, how many actions were executed, and so on. The detailed results are kept on the server only; they can be output as described above.
This is done for performance reasons: the data sent back to the client should be as minimal as possible. The client can then make further decisions based on the provided numbers, e.g. to only further process data where no rules or no rule groups failed.
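A client-side decision of that kind could be sketched as follows. The class and field names are illustrative - JARE's actual client classes are not shown in this post - but the idea is simply to wrap the returned numbers and derive a go/no-go decision from them.

```java
// Hypothetical wrapper for the numbers the server sends back;
// the names are illustrative, not JARE's real API.
public class RuleResultSummary {
    final long groupsFailed;
    final long groupsSkipped;
    final long rulesFailed;
    final long actionsExecuted;

    public RuleResultSummary(long groupsFailed, long groupsSkipped,
                             long rulesFailed, long actionsExecuted) {
        this.groupsFailed = groupsFailed;
        this.groupsSkipped = groupsSkipped;
        this.rulesFailed = rulesFailed;
        this.actionsExecuted = actionsExecuted;
    }

    // Example client-side decision: only continue processing a row
    // when none of the rule groups failed.
    public boolean readyForFurtherProcessing() {
        return groupsFailed == 0;
    }
}
```

A client would construct one such summary per processed row and route the row accordingly, without ever needing the detailed per-rule results.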
There is also a utility class to send messages to the server, which returns information about the status of the server, such as:
- the version of the rule engine in use
- uptime of the server
- rule engine project file in use
- how many rule groups are defined
- how many rows have been processed
- reload the rule engine project file
The last bullet point is especially interesting: reloading the rule engine project file allows you to change the business logic and load the updated file into the running server.
For the Pentaho ETL tool - Pentaho PDI - there is a plugin available that allows you to use the rule engine running in server mode within your ETL transformation. You could run one server or many servers and process the data that flows through the ETL process, without the need to hardcode logic in the ETL. That is a big advantage for keeping the ETL simple and straightforward, and thus for quality as such.
The business logic - orchestrated in the Business Rules Maintenance Tool - can be maintained by IT, but also by a superuser or user from the business department. As the competence for the business logic lies within the business, it makes sense to delegate the maintenance of the logic to the business and have IT concentrate on infrastructure, processes and automation.
Give it a try and let me know what you think.