Why does HDFS report FileNotFoundException: File does not exist?



I am accessing HDFS from a K8s pod, but it cannot find the file on HDFS:

Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://192.168.65.2:8020/user/flink/[/user/flink/.flink/job-1234/1624538371951/systemShipFiles/examples-wc.jar]
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1309)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2030)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1999)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1975)
at org.apache.flink.kubernetes.entrypoint.KubernetesInitContainerEntrypoint.fetchRemoteDependencies(KubernetesInitContainerEntrypoint.java:121)
at org.apache.flink.kubernetes.entrypoint.KubernetesInitContainerEntrypoint.main(KubernetesInitContainerEntrypoint.java:72)
Error from server (BadRequest): container "flink-job-manager" in pod "job-1234-b8d68f956-c4g58" is waiting to start: PodInitializing

However, the hadoop command can find the file when I list it:

hadoop fs -ls /user/flink/.flink/job-1234/1624538371951/systemShipFiles/oceanus-examples-wc.jar
21/06/24 20:43:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
-rw-r--r--   1 flink supergroup      13981 2021-06-24 20:39 /user/flink/.flink/job-1234/1624538371951/systemShipFiles/examples-wc.jar

Hadoop is running in local mode.
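
To narrow this down, here is a minimal check (a sketch; the class name HdfsPathCheck is made up, and it assumes the Hadoop client jars are on the classpath) that performs the same getFileStatus lookup as the init container, using the NameNode address and path from the stack trace above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPathCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the NameNode explicitly; otherwise it falls back to
        // fs.defaultFS from core-site.xml, which defaults to file:/// when Hadoop
        // runs in local mode, so HDFS paths will not resolve.
        conf.set("fs.defaultFS", "hdfs://192.168.65.2:8020");
        FileSystem fs = FileSystem.get(conf);
        Path jar = new Path("/user/flink/.flink/job-1234/1624538371951/systemShipFiles/examples-wc.jar");
        // The same status lookup that FileUtil.copy performs in the stack trace
        System.out.println("exists: " + fs.exists(jar));
        System.out.println("status: " + fs.getFileStatus(jar));
    }
}

If this prints the file status but the init container still fails, the problem is the pod's Hadoop configuration rather than the file itself.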

Thanks for your help.

You could try `ls` with the full path, prefixed with hdfs://192.168.65.2:8020/user/..., exactly as it appears in the stack trace.
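
For example (using the NameNode address and jar path from the stack trace; adjust to your actual path):

hadoop fs -ls hdfs://192.168.65.2:8020/user/flink/.flink/job-1234/1624538371951/systemShipFiles/examples-wc.jar

If the fully qualified URI lists the file but the bare /user/... path does not, the client's default filesystem is not pointing at that NameNode.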

Also check in core-site.xml that the fs.defaultFS value is specified and correct.
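
For reference, a sketch of what that entry should look like, assuming the NameNode address from the stack trace is the intended default:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.65.2:8020</value>
  </property>
</configuration>

You can also print the value the client actually resolves from inside the pod:

hdfs getconf -confKey fs.defaultFS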
