
HOWTO activate logging

Why do that

To debug a PML job, or to better understand its behavior.


You need a punchplatform-standalone installed with Spark. The easiest way to work with Spark (and PML) is to launch the job in foreground mode. For example, if you have a job defined in the job.pml file, use the following command:

$ punchlinectl --punchline job.pml -v

What to do

Configure Spark logging

Spark uses log4j. Its configuration file, log4j.properties, is located in:

$ punchplatform-standalone-*/external/spark-x.y.z-bin-hadoop2.7/conf/

There, activate the loggers you need. For example, should you need debug logging for the punch stages:

log4j.rootCategory=INFO, console
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %t %c: %m%n
log4j.logger.org.thales.punch=DEBUG

!!! warning By default the delivered configuration uses only the ERROR level, so as to limit the standard output to the most relevant Spark messages.

Important loggers

  • org.apache.spark : the Spark loggers
  • org.thales.punch : the various punchplatform loggers
  • : the legacy classes. These will progressively vanish.
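
To illustrate, here is a sketch of a log4j.properties fragment combining the loggers above. The WARN and DEBUG levels shown here are arbitrary examples, not recommended settings, and the log4j 1.x property syntax is assumed since that is what the Spark distribution ships with:

```
# Keep Spark's own output relatively quiet
log4j.logger.org.apache.spark=WARN
# Verbose output from the punchplatform code
log4j.logger.org.thales.punch=DEBUG
```

Restart the job after editing the file; the levels take effect at JVM startup.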