Spark log4j logging configuration

1. Spark launch parameters

First, upload the log4j configuration file to HDFS: hdfs://R2/projects/log4j-debug.properties
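A minimal upload sketch, assuming the hdfs CLI is available and log4j-debug.properties sits in the current local directory (-f simply overwrites any earlier copy):

# Create the target directory if needed, then upload the DEBUG configuration
hdfs dfs -mkdir -p hdfs://R2/projects
hdfs dfs -put -f log4j-debug.properties hdfs://R2/projects/log4j-debug.properties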

Then add the following options to the spark-submit command:
--conf spark.yarn.dist.files=hdfs://R2/projects/log4j-debug.properties#log4j-first.properties \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j-first.properties" \
--conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdump.hprof -Dlog4j.configuration=file:log4j-first.properties" \

2. log4j.properties (INFO logging)

# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to INFO. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache=WARN
log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.deploy.yarn=INFO

log4j.logger.org.apache.hudi=INFO

log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStoreClient=INFO
log4j.logger.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient=INFO
log4j.logger.hive.metastore=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
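If it is unclear whether this file (rather than the log4j.properties shipped with Spark) was actually loaded, log4j 1.x prints its own initialization steps to stderr when the log4j.debug system property is set. A hedged example that extends the driver options from step 1:

--conf "spark.driver.extraJavaOptions=-Dlog4j.debug=true -Dlog4j.configuration=file:log4j-first.properties" \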

3. log4j-debug.properties (DEBUG logging)

# Set everything to be logged to the console
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to INFO. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache=WARN
log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.deploy.yarn=INFO

log4j.logger.org.apache.hudi=INFO

log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStoreClient=INFO
log4j.logger.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient=INFO
log4j.logger.hive.metastore=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
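Switching a job between the INFO and DEBUG configurations only requires changing which file spark.yarn.dist.files points at; because the alias after # stays log4j-first.properties, the extraJavaOptions do not change. The hdfs://R2/projects/log4j.properties path below is an assumed upload location for the INFO file, mirroring the DEBUG one:

# INFO run (assumed HDFS location for the INFO-level log4j.properties)
--conf spark.yarn.dist.files=hdfs://R2/projects/log4j.properties#log4j-first.properties \

# DEBUG run
--conf spark.yarn.dist.files=hdfs://R2/projects/log4j-debug.properties#log4j-first.properties \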