public class RythmerNode
extends org.thales.punch.libraries.storm.api.BaseProcessingNode
To reproduce the original time pacing between subsequent messages, the RythmerNode uses a single-threaded busy-wait loop. It can replay 1000 events per second with a timing error of less than 10%.
Note that the RythmerNode expects to receive tuples sorted in time. If you need to replay some data, make sure you provide an ordered stream. This is easily achieved by preparing your data and making it available in a file or in a Kafka topic, for example.
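The single-threaded busy-wait pacing described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual RythmerNode implementation; the `PacingSketch` class and its `replay` and `emit` methods are hypothetical names:

```java
import java.util.List;

public class PacingSketch {

    // Replay a time-ordered list of timestamps (in milliseconds),
    // preserving the original inter-event gaps with a busy-wait loop.
    static void replay(List<Long> timestamps) {
        if (timestamps.isEmpty()) {
            return;
        }
        long wallStart = System.nanoTime();
        long eventStart = timestamps.get(0);
        for (long ts : timestamps) {
            // Offset of this event relative to the first one, in nanoseconds.
            long targetNanos = (ts - eventStart) * 1_000_000L;
            // Busy wait (spin) until the wall-clock offset catches up.
            while (System.nanoTime() - wallStart < targetNanos) {
                // spin; a single thread burns CPU in exchange for low jitter
            }
            emit(ts);
        }
    }

    // Stand-in for forwarding the tuple downstream.
    static void emit(long ts) {
        System.out.println("emit " + ts);
    }

    public static void main(String[] args) {
        replay(List.of(0L, 5L, 10L));
    }
}
```

Busy waiting trades CPU usage for precision: sleeping instead would be cheaper but introduces scheduler jitter well above the sub-millisecond budget needed to keep the replay error under 10%.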
The RythmerNode expects the following parameters:
field name | mandatory | type | default | comment
---|---|---|---|---
timestamp_field | no | string | timestamp | the Storm field where the input timestamp is expected
threshold | no | long | 0 | the number of input tuples required before starting the replay
date_format | no | string | "yyyy-MM-dd hh:mm:ss.SSSS" | the expected timestamp format
Modifier and Type | Field and Description
---|---
boolean | isRunning: used to check whether the consumer is already running
Constructor and Description |
---|
RythmerNode(org.thales.punch.libraries.storm.api.NodeSettings config) |
Modifier and Type | Method and Description
---|---
void | prepare(Map stormConf, org.apache.storm.task.TopologyContext context, org.apache.storm.task.OutputCollector collector)
void | process(org.apache.storm.tuple.Tuple tuple): before sending the tuple to the consumer, the time difference from the previous message is calculated.
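The inter-tuple delay computed by `process` can be illustrated as follows, assuming the default `date_format` of `"yyyy-MM-dd hh:mm:ss.SSSS"`. The `DeltaSketch` class and its `deltaMillis` helper are hypothetical names, not part of the RythmerNode API:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DeltaSketch {

    // The node's default timestamp format.
    static final SimpleDateFormat FORMAT =
            new SimpleDateFormat("yyyy-MM-dd hh:mm:ss.SSSS");

    // Milliseconds to wait between two consecutive tuples:
    // the difference between their parsed timestamps.
    static long deltaMillis(String previous, String current) throws ParseException {
        return FORMAT.parse(current).getTime() - FORMAT.parse(previous).getTime();
    }

    public static void main(String[] args) throws ParseException {
        long delta = deltaMillis("2023-01-01 10:00:00.0000",
                                 "2023-01-01 10:00:01.0500");
        System.out.println(delta); // 1 second + 500 ms
    }
}
```

This is the delay the busy-wait loop then enforces before emitting the current tuple, which is why the input must be sorted in time: a negative delta would mean an out-of-order tuple.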
public RythmerNode(org.thales.punch.libraries.storm.api.NodeSettings config)

Parameters:
config - the bolt configuration

public void prepare(Map stormConf, org.apache.storm.task.TopologyContext context, org.apache.storm.task.OutputCollector collector)

Specified by:
prepare in interface org.apache.storm.task.IBolt
Overrides:
prepare in class org.thales.punch.libraries.storm.api.BaseProcessingNode

public void process(org.apache.storm.tuple.Tuple tuple)

Overrides:
process in class org.thales.punch.libraries.storm.api.BaseProcessingNode
Parameters:
tuple - the input storm tuple.