DataNode error: secureMain exception in datanode.DataNode when configuring Hadoop

I am trying to configure Hadoop on Windows and I am getting this error:

org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
        at org.apache.hadoop.hdfs.server.datanode.checker.StorageLocationChecker.check(StorageLocationChecker.java:233)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2841)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2754)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2798)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2942)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2966)
2021-10-12 11:07:43,633 INFO util.ExitUtil: Exiting with status 1: org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
2021-10-12 11:07:43,641 INFO datanode.DataNode: SHUTDOWN_MSG:
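
This DiskChecker error means that the one directory configured in dfs.datanode.data.dir failed the DataNode's startup check (for example it does not exist, is not writable, or the path is malformed), and since volume failures tolerated is 0, the DataNode shuts down. On Windows a common cause is a data-directory path that Hadoop cannot parse. A minimal hdfs-site.xml sketch that points the DataNode at an explicit, valid directory; the path file:///C:/hadoop/data/datanode is an assumption for illustration, not from the original post:

<!-- hdfs-site.xml: point the DataNode at a directory that exists and is
     writable; the file:/// URI form avoids Windows path-parsing issues.
     The path below is an example only, adjust it to your installation. -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///C:/hadoop/data/datanode</value>
</property>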

I have already tried setting:

<property>
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <value>0</value>
</property>

However, that threw other errors and did not work (0 is already the default for dfs.datanode.failed.volumes.tolerated, so setting it changes nothing).

I finally solved the problem just by setting the following in hadoop-env.sh:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_CLASSPATH=/usr/lib/jvm/java-8-openjdk-amd64/lib/tools.jar
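
After editing hadoop-env.sh, a typical way to verify the fix on a single-node setup (assuming the Hadoop sbin scripts are on the PATH; on native Windows, use the stop-dfs.cmd / start-dfs.cmd equivalents) is to restart HDFS and confirm that the DataNode process stays up:

# Restart HDFS so the new environment variables take effect
stop-dfs.sh
start-dfs.sh

# jps (shipped with the JDK) should now list a DataNode process
jps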
