Limit LOGS in Spark with Scala code



I have installed Spark 3.3.0 on my machine. I want to limit the logging so that only error-level logs are shown in the console. I tried:

import org.apache.log4j.{Level, Logger}
Logger.getLogger("org").setLevel(Level.ERROR)

but I am still getting INFO logs in the console:

Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
22/10/21 18:04:39 INFO SparkContext: Running Spark version 3.3.0
22/10/21 18:04:39 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/10/21 18:04:39 INFO ResourceUtils: ==============================================================
22/10/21 18:04:39 INFO ResourceUtils: No custom resources configured for spark.driver.
22/10/21 18:04:39 INFO ResourceUtils: ==============================================================
22/10/21 18:04:39 INFO SparkContext: Submitted application: My first assignment
22/10/21 18:04:40 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
22/10/21 18:04:40 INFO ResourceProfile: Limiting resource is cpu

How do I control these logs? I am new to Spark.

Modify your spark-3.3.0/conf/log4j2.properties. Spark 3.3.x uses Log4j 2 (note the org/apache/spark/log4j2-defaults.properties line in your output), which is also why the old Log4j 1.x Logger.getLogger call may not take effect. Change the root logger level from

rootLogger.level = info

to

rootLogger.level = warn

(or error, if you only want error logs).
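
For reference, a minimal log4j2.properties that routes everything to the console looks roughly like this. It is a sketch based on Spark's bundled template, so compare it against your own copy:

# Root logger: only messages at this level or above are emitted.
rootLogger.level = warn
rootLogger.appenderRef.stdout.ref = console

# Console appender, matching the format of the lines quoted in the question.
appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n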

If log4j2.properties does not exist, create it from the bundled template:

cp spark-3.3.0/conf/log4j2.properties.template spark-3.3.0/conf/log4j2.properties
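
Alternatively, you can lower the level from Scala once the session is up. setLogLevel is a standard SparkContext method; the app name and master below are placeholders for whatever your job uses, so treat this as a minimal sketch:

import org.apache.spark.sql.SparkSession

object QuietLogs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("My first assignment") // placeholder: your app name
      .master("local[*]")             // placeholder: your master URL
      .getOrCreate()

    // Accepts "ALL", "DEBUG", "INFO", "WARN", "ERROR", "FATAL", "TRACE", "OFF";
    // everything below ERROR is silenced from this point on.
    spark.sparkContext.setLogLevel("ERROR")

    // ... your Spark job ...

    spark.stop()
  }
}

Note that this only takes effect after the SparkContext exists, so the startup INFO lines you quoted will still be printed; editing log4j2.properties as above is what silences those too.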
