If you are not familiar with Kafka, follow this short tour.

Kafka is a message broker to which you can produce (i.e. publish) records, and from which you can consume them. The standalone punch already has a Kafka broker running. All you have to do is create a Kafka topic, then try producing and consuming records.
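Before running the commands, it helps to picture what a topic is: conceptually, an append-only log that producers append records to, while each consumer keeps track of its own read offset. Here is a minimal, purely illustrative Python sketch of that model (this is not the Kafka client API; all names are made up for the example):

```python
# Illustrative model of a Kafka topic: an append-only log plus
# per-consumer read offsets. This is NOT the Kafka client API.

class Topic:
    def __init__(self):
        self.log = []       # records, in append order
        self.offsets = {}   # consumer name -> next offset to read

    def produce(self, record):
        """Append a record to the end of the log."""
        self.log.append(record)

    def consume(self, consumer):
        """Return the next unread record for this consumer, or None."""
        offset = self.offsets.get(consumer, 0)
        if offset >= len(self.log):
            return None
        self.offsets[consumer] = offset + 1
        return self.log[offset]

topic = Topic()
topic.produce("log line 1")
topic.produce("log line 2")

print(topic.consume("app-a"))  # log line 1
print(topic.consume("app-a"))  # log line 2
print(topic.consume("app-b"))  # log line 1 (each consumer has its own offset)
```

Note how consuming a record does not delete it: another consumer reading the same topic starts from its own offset and sees every record again.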
We will do that using the handy standalone tools. First let us create a topic:
```sh
punchplatform-kafka-topics.sh --create \
  --kafkaCluster local \
  --topic test_topic \
  --replication-factor 1 \
  --partitions 1
```
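The `--partitions` flag matters because Kafka routes each keyed record to one partition, and ordering is only guaranteed within a partition. The real client's default partitioner hashes the record key with murmur2; the sketch below uses a simple CRC purely to show the principle, and is not the actual Kafka algorithm:

```python
# Sketch of key-based partition assignment. Kafka's default
# partitioner hashes the record key (murmur2) modulo the partition
# count; we use zlib.crc32 here purely for illustration.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    return zlib.crc32(key) % num_partitions

# All records sharing a key land in the same partition,
# which is what preserves per-key ordering.
p1 = partition_for(b"host-42", 3)
p2 = partition_for(b"host-42", 3)
assert p1 == p2
```

With a single partition, as in the command above, all records go to the same log and total ordering is preserved.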
Each topic is defined with a replication factor and a number of partitions; both are Kafka concepts.

Next, fill your topic with some Apache logs. To do that, start the apache_httpd channel:
```sh
channelctl start --channel apache_httpd
```

Then inject 100 sample Apache logs:

```sh
punchplatform-log-injector.sh -c $PUNCHPLATFORM_CONF_DIR/resources/injectors/mytenant/apache_httpd_injector.json -n 100
```
Have a look at the apache_httpd_injector.json file; it is self-explanatory. The punchplatform-log-injector.sh tool is extremely powerful and lets you produce arbitrary data, which you can in turn send to Kafka, Elasticsearch, punchlines, etc.
Now check that your logs are stored in your topic, as expected. You can again use the punch injector, this time in consumer mode:
```sh
punchplatform-log-injector.sh \
  --kafka-consumer \
  --topic mytenant_apache_httpd_archiving \
  -brokers local \
  -earliest
```
It should show the expected number of records.

To try punchlines sending and receiving data through Kafka, follow the [Templates](./Templates.md) tour. Using templates makes it easy to switch a channel from a single-punchline setup to a dual-punchline-through-Kafka model.