If you are not familiar with Kafka, follow this short tour.

Kafka is a message broker in which you can produce (i.e. publish) and consume records. The standalone punch ships with a Kafka already running. All you have to do is create a Kafka topic, then try producing and consuming records.
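To make the produce/consume idea concrete before touching the real tools, here is a toy in-memory model of a topic. This is an illustration only, not the Kafka API: a topic behaves like an append-only log, producers append records, and consumers read them back in order from an offset.

```python
# Toy in-memory model of a Kafka topic (illustration, not the Kafka API).
class Topic:
    def __init__(self, name):
        self.name = name
        self.log = []  # a topic is an append-only record log

    def produce(self, record):
        """Append a record and return its offset in the log."""
        self.log.append(record)
        return len(self.log) - 1

    def consume(self, offset=0):
        """Consumers track their own position (offset) in the log."""
        return self.log[offset:]

topic = Topic("test_topic")
topic.produce('GET /index.html 200')
topic.produce('GET /about.html 404')
print(topic.consume())  # all records, in production order
```

Real Kafka adds persistence, partitioning, and replication on top of this log abstraction, but the produce/consume contract is the same.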

We will do that using the handy standalone tools. First let us create a topic:

```sh
--create \
  --kafkaCluster local \
  --topic test_topic \
  --replication-factor 1 \
  --partitions 1
```

Each topic is defined with a replication factor and a number of partitions; these are core Kafka concepts. Next, fill your topic with some Apache logs. To do that, first start the apache_httpd channel:
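Partitions spread a topic's records across brokers; replication keeps copies of each partition for fault tolerance. Kafka's default partitioner assigns a keyed record to a partition by hashing the key (Kafka uses murmur2; the sketch below uses md5 purely to stay self-contained), which guarantees that records sharing a key land on the same partition and keep their relative order:

```python
import hashlib

def pick_partition(key: str, num_partitions: int) -> int:
    # Hash the record key and map it onto one of the partitions.
    # Kafka's default partitioner uses murmur2; md5 here is only
    # to make the example deterministic and dependency-free.
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records with the same key always land on the same partition,
# which is how Kafka preserves per-key ordering.
assert pick_partition("host-42", 3) == pick_partition("host-42", 3)
```

With a single partition, as in the topic created above, all records end up in one ordered log.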

```sh
channelctl start --channel apache_httpd
```

then start injecting logs:

```sh
-c $PUNCHPLATFORM_CONF_DIR/resources/injectors/mytenant/apache_httpd_injector.json -n 100
```


Have a look at the apache_httpd_injector.json file; it is self-explanatory. The injector is extremely powerful: it lets you produce arbitrary data that you can in turn send to Kafka, Elasticsearch, punchlines, etc.
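To give a feel for what a template-driven injector does (this is an illustrative sketch, not the punch injector's actual configuration format), a generator can fill placeholders in a message template with random values and emit as many records as requested:

```python
import random

# Hypothetical template-driven log generator, sketching the idea
# behind the injector: render a template with random field values.
TEMPLATE = '{ip} - - [10/Oct/2024:13:55:36 +0000] "GET {path} HTTP/1.1" {status} {size}'

def generate(n):
    paths = ["/index.html", "/login", "/images/logo.png"]
    statuses = [200, 200, 200, 404, 500]
    for _ in range(n):
        yield TEMPLATE.format(
            ip="192.168.{}.{}".format(random.randint(0, 255), random.randint(1, 254)),
            path=random.choice(paths),
            status=random.choice(statuses),
            size=random.randint(200, 5000),
        )

for line in generate(3):
    print(line)
```

The real injector reads this kind of template and field definitions from the JSON file, and can target Kafka, Elasticsearch, or a plain socket.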

Now check that your logs are stored in your topic, as expected. You can again use the punch injector, this time in consumer mode:

```sh
--kafka-consumer \
  --topic mytenant_apache_httpd_archiving \
  -brokers local
```

It should show the expected number of records. Try it also with the -v (verbose) option.

Trying Kafka

To try punchlines sending and receiving data through Kafka, follow the dedicated tour. Using templates makes it easy to change a channel's setup from a single-punchline model to a dual-punchline-through-Kafka model.