Punchplatform Inspect Node¶
NAME¶
punchplatform-inspect-node.sh - generate the JSON description of custom punchline nodes by scanning a jar or pex archive
SYNOPSIS¶
punchplatform-inspect-node.sh --package PACKAGES --runtime RUNTIME --output-dir OUTPUT_DIR --pex PEX_ARCHIVE
# or
punchplatform-inspect-node.sh --package PACKAGES --runtime RUNTIME --output-dir OUTPUT_DIR --jar JAR_ARCHIVE
DESCRIPTION¶
To make an external node available through the Kibana Punchline Editor, a JSON
file containing basic information about your node is required. This CLI enables you to generate such a JSON file by scanning either a jar or a pex archive.
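For instance, a typical run (the paths and file names below are purely illustrative) scans a jar containing custom nodes and writes the resulting JSON description into the output directory:

# scan custom nodes packaged in a jar (illustrative paths)
punchplatform-inspect-node.sh --packages org.thales --runtime storm --jar /tmp/my_custom_node.jar --output-dir /tmp/node_descriptions
# review the generated JSON before exposing it to the Punchline Editor
ls /tmp/node_descriptions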
OPTIONS¶
- -r, --runtime : runtime targeted by the scanner, 'storm' or 'spark' (the pyspark examples below use 'pyspark'), e.g. --runtime spark
- --mllib : can only be used with '--runtime spark' and '--base-class org.apache.spark.ml.PipelineStage'; displays Spark MLlib nodes
- --base-class : base class of the nodes to scan; for the new API use '--base-class com.github.punch.api.node.PunchBaseNode', for Spark MLlib use '--base-class org.apache.spark.ml.PipelineStage'
- -p, --packages : comma-separated list of packages to scan, e.g. --packages org.thales.punch,org.apache.spark
- -j, --jar : jar added to this application's classpath, enabling it to produce a JSON document that the Kibana UI can read to display your custom node, e.g. --jar /tmp/my_custom_node.jar
- --pex : PEX archive added to this application's PEX_PATH, with the same purpose as --jar, e.g. --pex /tmp/my_custom_node.pex
- --output-dir : directory in which the generated JSON file(s) are written
- -v, --verbose : display additional information
- -h, --help : display this help menu
EXAMPLES¶
Spark, mllib and internal nodes¶
punchplatform-inspect-node.sh --packages org.thales --runtime spark --jar my_spark_nodes.jar
punchplatform-inspect-node.sh --packages org.thales --runtime spark --jar my_spark_nodes.jar --output-dir /tmp/my_folder
punchplatform-inspect-node.sh --packages org.thales --runtime spark --output-dir /tmp/my_folder --mllib --base-class org.apache.spark.ml.PipelineStage
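If your Spark nodes are written against the new node API, the --base-class option documented above can be added. This variant is a sketch assuming such nodes are packaged in my_spark_nodes.jar:

punchplatform-inspect-node.sh --packages org.thales --runtime spark --jar my_spark_nodes.jar --output-dir /tmp/my_folder --base-class com.github.punch.api.node.PunchBaseNode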
Pyspark and internal nodes¶
punchplatform-inspect-node.sh --packages punchline_python --runtime pyspark
punchplatform-inspect-node.sh --packages mypackagenode --runtime pyspark --pex mypysparknode.pex --output-dir /tmp/myfolder
Storm and internal nodes¶
punchplatform-inspect-node.sh --packages org.thales --runtime storm --jar my_storm_nodes.jar
punchplatform-inspect-node.sh --packages org.thales --runtime storm --jar my_storm_nodes.jar --output-dir /tmp/my_folder
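Whatever the runtime, the generated files are plain JSON documents. As a quick sanity check (the file name below is illustrative, and jq is optional), you can list and pretty-print them from the output directory:

ls /tmp/my_folder
# pretty-print one generated node description (file name is illustrative)
jq . /tmp/my_folder/my_custom_node.json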
LIBRARIES¶
punchplatform-inspect-node.sh uses the libraries located at:
- $PUNCHPLATFORM_INSTALL_DIR/lib/spark/punchplatform-spark-scanner*with-dependencies.jar
- $PUNCHPLATFORM_INSTALL_DIR/lib/pyspark/punchplatform-pyspark-*.pex
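If the command fails at startup, a first check is to verify that these scanner archives are actually present in your installation:

ls "$PUNCHPLATFORM_INSTALL_DIR"/lib/spark/punchplatform-spark-scanner*with-dependencies.jar
ls "$PUNCHPLATFORM_INSTALL_DIR"/lib/pyspark/punchplatform-pyspark-*.pex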
LOGGERS¶
The logging verbosity of punchplatform-inspect-node.sh is controlled by the following two files:
- $PUNCHPLATFORM_LOG4J_CONF_DIR/log4j2-punchline.xml
- $PUNCHPLATFORM_LOG4J_CONF_DIR/log4j2.properties
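To raise or lower the verbosity, edit the logger levels declared in these files; the exact logger names to tune depend on your deployment. The commands below only locate the existing level declarations:

grep -n "level" "$PUNCHPLATFORM_LOG4J_CONF_DIR"/log4j2-punchline.xml
grep -n "level" "$PUNCHPLATFORM_LOG4J_CONF_DIR"/log4j2.properties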
ENVIRONMENT¶
punchplatform-inspect-node.sh can only be executed within the operator terminal environment.