CRAIG 5.6.1

Below is a summary of the JIRA issues addressed in the CRAIG-5.6.1 release of Punchplatform. For full documentation of the release, a guide to get started, and information about the project, see the Punchplatform project site.

Note about upgrades: Please review the upgrade documentation for this release carefully before upgrading your cluster. The upgrade notes discuss any critical information about incompatibilities and breaking changes, performance changes, and any other changes that might impact your production deployment of Punchplatform.

The documentation for the most recent release can be found at https://doc.punchplatform.com.

Release summary

This release fixes some minor issues.

Patch changes

No migration guide is needed for this release.

Release notes

Features

  • [PP-2809] - Create punchplatform-kafka-consumers.sh to provide offset management features
  • [PP-3625] - PML PySpark 1-node deployer compatibility (duplicates [PP-3646], [PP-3648])
  • [PP-3636] - Provide Punchplatform NiFi .jar and .nar archives

Internal

  • [PP-3611] - Improve logging on archiving (duplicates [PP-3612])
  • [PP-3621] - Finalize "at_least_once" FileBolt
  • [PP-3637] - Improve the CSV resource handling

Bug

  • [PP-3615] - Build of pp-pyspark fails if make flush is not run
  • [PP-3616] - Confusion between input ports of multiple syslog spouts in same topology
  • [PP-3623] - Syslog Spout NullPointerException (temporary workaround)
  • [PP-3627] - Missing metrics jar with dependencies in deployed mode
  • [PP-3635] - Doc: at_least_once FileBolt needs _ppf_partition_id and _ppf_partition_offset fields
  • [PP-3638] - Kibana plugin deployment uses invalid zookeeper_host variable instead of zookeeper_hosts
  • [PP-3639] - Wrong cluster address and compression rate in indexed record when archiving with Ceph
  • [PP-3657] - JVM memory of Shiva topologies does not take Storm settings into account
  • [PP-3658] - NullPointerException when using _ppf_errors stream in kafka spout (documentation workaround)
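Several of the items above (PP-3621, PP-3635, PP-3658) revolve around at-least-once delivery, where each record carries its Kafka partition id and offset in the _ppf_partition_id and _ppf_partition_offset stream fields so a sink can discard replayed duplicates. The following is a generic, self-contained sketch of that deduplication idea, not Punchplatform's actual FileBolt code; only the two field names are taken from the release notes, everything else is illustrative:

```python
# Generic sketch of an at-least-once sink: after a crash, the upstream
# Kafka spout may replay records already written, so the sink keeps the
# highest offset committed per partition and drops anything at or below it.
# NOT Punchplatform's FileBolt implementation; the field names mirror
# the _ppf_* stream fields mentioned in PP-3635.

class AtLeastOnceSink:
    def __init__(self):
        self.committed = {}   # partition id -> highest offset written
        self.output = []      # stands in for the archive file

    def write(self, record):
        part = record["_ppf_partition_id"]
        offset = record["_ppf_partition_offset"]
        if offset <= self.committed.get(part, -1):
            return False      # duplicate delivered by a replay: skip it
        self.output.append(record["data"])
        self.committed[part] = offset
        return True

sink = AtLeastOnceSink()
sink.write({"_ppf_partition_id": 0, "_ppf_partition_offset": 0, "data": "a"})
sink.write({"_ppf_partition_id": 0, "_ppf_partition_offset": 1, "data": "b"})
# A replay re-delivers offset 1 of partition 0; it is ignored:
sink.write({"_ppf_partition_id": 0, "_ppf_partition_offset": 1, "data": "b"})
print(sink.output)  # ['a', 'b']
```

The per-partition bookkeeping is what makes the two fields mandatory: an offset alone is ambiguous once a topic has more than one partition.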