Version: 23.04

Kafka Event Manager

Before starting

  • You can send events from a central server, a remote server or a poller.
  • By default, this stream connector sends host_status, service_status and ba_status events. The event format is shown below.
  • These events are fired each time a host or a service is checked. Various parameters let you filter out events.

Installation

Login as root on the Centreon central server using your favorite SSH client.

Run the following command:

```shell
dnf install centreon-stream-connector-kafka
```

Configuration

To configure your stream connector, go to the Configuration --> Poller --> Broker configuration menu. Select the central-broker-master configuration (or the appropriate broker configuration if a poller or a remote server will send the events) and click the Output tab when the broker form is displayed.

Add a new generic - stream connector output and set the following fields as follows:

| Field           | Value                                                  |
|-----------------|--------------------------------------------------------|
| Name            | Kafka events                                           |
| Path            | /usr/share/centreon-broker/lua/kafka-events-apiv2.lua  |
| Filter category | Neb,Bam                                                |

Add Kafka mandatory parameters

Each stream connector has a set of mandatory parameters. To add them you must click on the +Add a new entry button located below the filter category input.

| Type   | Name    | Value explanation                                      | Value example                               |
|--------|---------|--------------------------------------------------------|---------------------------------------------|
| string | topic   | The topic to which events are written                  | Monitoring                                  |
| string | brokers | Comma-separated list of brokers ready to receive data  | broker_address1:port1,broker_address2:port2 |

Add Kafka optional parameters

Some stream connectors have a set of optional parameters dedicated to the software they are associated with. To add them you must click on the +Add a new entry button located below the filter category input.

| Type   | Name      | Value explanation                            | Default value                                         |
|--------|-----------|----------------------------------------------|-------------------------------------------------------|
| string | logfile   | The file in which logs are written           | /var/log/centreon-broker/kafka-stream-connector.log   |
| number | log_level | Logging level, from 1 (errors) to 3 (debug)  | 1                                                     |

Standard parameters

All stream connectors can use a set of optional parameters that are made available through Centreon stream connectors lua modules.

All those parameters are documented here.

Some of them are overridden by this stream connector.

| Type   | Name                | Default value for the stream connector |
|--------|---------------------|----------------------------------------|
| string | accepted_categories | neb                                    |
| string | accepted_elements   | host_status,service_status             |

Librdkafka (library dependency) parameters

In addition to the stream connector parameters, a handful of parameters are available thanks to the librdkafka library. They are all documented in the librdkafka official documentation. To use them, you just need to add the _sc_kafka_ prefix.

With that in mind, the parameter sasl.mechanism becomes _sc_kafka_sasl.mechanism in your broker configuration.
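The renaming rule can be expressed as a one-line helper (hypothetical Python, written here only to illustrate the naming convention; the connector itself is Lua):

```python
# Illustrative only: a librdkafka option becomes a Broker input parameter
# by prepending the _sc_kafka_ prefix.
def to_broker_param(librdkafka_option: str) -> str:
    """Return the Centreon Broker parameter name for a librdkafka option."""
    return "_sc_kafka_" + librdkafka_option

print(to_broker_param("sasl.mechanism"))    # _sc_kafka_sasl.mechanism
print(to_broker_param("security.protocol"))  # _sc_kafka_security.protocol
```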

Note that the EL7 and EL8 repositories provide an old version of the librdkafka library.

Event bulking

This stream connector is compatible with event bulking, meaning that it can send more than one event in each call to the Kafka brokers.

To use this feature you must add the following parameter in your stream connector configuration.

| Type   | Name            | Value           |
|--------|-----------------|-----------------|
| number | max_buffer_size | more than 1     |
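The buffering behaviour can be sketched as follows (a minimal Python illustration under stated assumptions, not the connector's actual Lua code): events accumulate until max_buffer_size is reached, then one payload carrying all of them is sent.

```python
# Minimal sketch of event bulking: buffer events, flush them in ONE send
# call once max_buffer_size is reached. BulkSender is a hypothetical name.
import json

class BulkSender:
    def __init__(self, max_buffer_size, send):
        self.max_buffer_size = max_buffer_size
        self.send = send   # callable delivering one payload to Kafka
        self.buffer = []

    def add(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.max_buffer_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(json.dumps(self.buffer))  # one call, many events
            self.buffer = []

sent = []
sender = BulkSender(3, sent.append)
for state in ("OK", "WARNING", "CRITICAL"):
    sender.add({"host": "my_host", "state": state})
print(len(sent))  # 1  -> three events left in a single payload
```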

Event format

This stream connector sends events with the following format.

service_status event

```json
{
  "host": "my_host",
  "service": "my_service",
  "output": "CRITICAL: the wind broke my umbrella",
  "state": "CRITICAL"
}
```

host_status event

```json
{
  "host": "my_host",
  "output": "DOWN: putting gas in my electric car was not a good idea",
  "state": "DOWN"
}
```

ba_status event

```json
{
  "ba": "my_ba",
  "state": "CRITICAL"
}
```
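As an illustration of how a consumer could dispatch on these three shapes (the `describe` helper is hypothetical; the JSON fields come from the examples above):

```python
# Hypothetical consumer-side handler for the three event shapes above:
# service_status carries "service", ba_status carries "ba",
# host_status carries only "host".
import json

def describe(event: dict) -> str:
    if "service" in event:                 # service_status
        return f'{event["host"]}/{event["service"]} is {event["state"]}'
    if "ba" in event:                      # ba_status
        return f'BA {event["ba"]} is {event["state"]}'
    return f'{event["host"]} is {event["state"]}'  # host_status

payload = '{"host": "my_host", "service": "my_service", "state": "CRITICAL"}'
print(describe(json.loads(payload)))  # my_host/my_service is CRITICAL
```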

Custom event format

This stream connector allows you to change the format of the event to suit your needs. Only the event part of the json is customisable. It also allows you to handle event types that are not handled by default, such as acknowledgement events.

In order to use this feature you need to configure a json event format file and add a new stream connector parameter.

| Type   | Name        | Value                                          |
|--------|-------------|------------------------------------------------|
| string | format_file | /etc/centreon-broker/kafka-events-format.json  |

The event format configuration file must be readable by the centreon-broker user.
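Purely as an illustrative sketch (the `{...}` macros shown here are placeholders; the real macro names and file structure are defined in the templating documentation referenced below), a format file could resemble:

```json
{
  "host": "{cache.host.name}",
  "state": "{state}",
  "output": "{output}"
}
```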

To learn more about custom event formats and templating files, head over to the following documentation.

Test connection

Sending data to Kafka can be quite complicated because of all the parameters involved (from the stream connector itself or from the Kafka library).

To make things easier, a lua connection test script is available.

To install it, follow the installation procedure above and then run:

```shell
wget -O /tmp/kafka_test_connection.lua https://raw.githubusercontent.com/centreon/centreon-stream-connector-scripts/master/modules/tests/kafka_test_connexion.lua
```

Open the script and configure the Kafka options that you want to use from the librdkafka official documentation (you do not need to add the _sc_kafka_ prefix this time; just put the parameter inside the config[] brackets).

Some configurations are already set up as examples to guide you.

If it doesn't work, you should see an error message like the one below (with the appropriate error message). It is strongly advised to have access to Kafka so you can check whether a message is sent by the test script.

```
%3|1622459610.760|FAIL|rdkafka#producer-1| [thrd:sasl_plaintext://cps-kafkan:9093/bootstrap]: sasl_plaintext://cps-kafkan:9093/bootstrap: Failed to resolve 'cps-kafkan:9093': Name or service not known
```