Goal: define dynamic parameters in the Log4j XML configuration file, and have a Spark job running in cluster mode read those parameters correctly.
1. Log4j configuration file: log4j2.xml
```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info" name="Log-Appender-Config">
    <Properties>
        <Property name="logServerHost">${sys:logServer}</Property>
        <Property name="logServerPort">${sys:logServerPort}</Property>
    </Properties>
    <Appenders>
        <Socket name="Socket" host="${logServerHost}" port="${logServerPort}">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} %-5p [%-t] - %m%n"/>
            <ThresholdFilter level="info"/>
        </Socket>
        <Async name="Async">
            <AppenderRef ref="Socket"/>
        </Async>
        <Console name="stdout" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} %-5p [%-7t] %F:%L - %m%n"/>
            <ThresholdFilter level="info"/>
        </Console>
    </Appenders>
    <Loggers>
        <Logger name="org.apache.spark" level="info" additivity="false">
            <AppenderRef ref="Socket"/>
        </Logger>
        <Root level="info">
            <AppenderRef ref="stdout"/>
        </Root>
    </Loggers>
</Configuration>
```
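As a side note, Log4j2's `${sys:...}` lookups also accept a default value after `:-`, which keeps the configuration usable when the JVM options are missing (e.g. during a local run). A possible variant of the `Properties` block; the fallback values here are illustrative, not part of the original setup:

```xml
<Properties>
    <!-- Fall back to local defaults if -DlogServer / -DlogServerPort are not set -->
    <Property name="logServerHost">${sys:logServer:-127.0.0.1}</Property>
    <Property name="logServerPort">${sys:logServerPort:-60201}</Property>
</Properties>
```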
Spark is pointed at this log4j2.xml, which declares two dynamic parameters, logServer and logServerPort. After the job is submitted, both the driver and the executors need these parameters in order to initialize their logging configuration.
2. Passing the parameters to the driver and executors
```bash
--conf "spark.driver.extraJavaOptions=-DlogServer=127.0.0.1 -DlogServerPort=60201"
--conf "spark.executor.extraJavaOptions=-DlogServer=127.0.0.1 -DlogServerPort=60201"
```
When submitting the job, set the dynamic parameter values by passing these extra JVM options to the driver and the executors.
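In cluster mode the log4j2.xml file itself must also reach the driver and executor containers. One common pattern (not spelled out in the original text) is to ship it with `--files` and point Log4j2 at it via the `log4j.configurationFile` system property; the master, class name `com.example.LogDemo`, and jar `my-app.jar` below are placeholders:

```bash
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files log4j2.xml \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configurationFile=log4j2.xml -DlogServer=127.0.0.1 -DlogServerPort=60201" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configurationFile=log4j2.xml -DlogServer=127.0.0.1 -DlogServerPort=60201" \
  --class com.example.LogDemo \
  my-app.jar
```

With `--files`, the configuration file is placed in each container's working directory, so the relative name `log4j2.xml` resolves on both the driver and the executors.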
Note:
Do not give the parameter keys a spark prefix; Spark treats such keys as its own internal configuration. In Spark, spark.executor.extraJavaOptions is meant for passing extra JVM options to the executor processes, and it must not contain Spark's internal configuration options.
A failing example:

```
Caused by: java.lang.Exception: spark.executor.extraJavaOptions is not allowed to set Spark options (was '-Dspark.log.server=127.0.0.1 -Dspark.log.server.port=60201'). Set them directly on a SparkConf or in a properties file when using ./bin/spark-submit.
```
Renaming the keys spark.log.server and spark.log.server.port above to anything that does not start with spark fixes the error.
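Since this mistake only surfaces at submit time, a small guard in the submit script can catch spark.-prefixed keys early. A minimal sketch, using the option string from the example above:

```bash
# Reject -D keys that start with "spark." before calling spark-submit,
# since extraJavaOptions is not allowed to carry Spark-internal options.
EXTRA_OPTS="-DlogServer=127.0.0.1 -DlogServerPort=60201"

for opt in $EXTRA_OPTS; do
  key=${opt#-D}      # strip the -D prefix
  key=${key%%=*}     # keep only the key part before '='
  case "$key" in
    spark.*)
      echo "invalid key: $key (must not start with spark.)" >&2
      exit 1
      ;;
  esac
done
echo "extra JVM options OK: $EXTRA_OPTS"
```

Running the check before submission fails fast with a clear message instead of a driver-side exception after the job has already been dispatched.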