"File not found" error when executing a sqoop job



When I execute my sqoop job, it throws a FileNotFoundException, as shown below:

18/05/29 06:18:59 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser/compile/0ce66d1f09ce960a71c165855afbe42c/QueryResult.jar
18/05/29 06:18:59 INFO mapreduce.ImportJobBase: Beginning query import.
18/05/29 06:18:59 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
18/05/29 06:18:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/29 06:18:59 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/05/29 06:19:01 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
18/05/29 06:19:01 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/05/29 06:19:01 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
18/05/29 06:19:01 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
18/05/29 06:19:01 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/app/hadoop/tmp/mapred/staging/hduser1354549662/.staging/job_local1354549662_0001
18/05/29 06:19:01 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://svn-server:54310/home/hduser/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/postgresql-9.2-1002-jdbc4.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:729)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:499)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:228)
    at org.apache.sqoop.tool.JobTool.run(JobTool.java:283)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

It should look for the jar and its other dependencies in the local sqoop/lib directory, but instead it is looking in HDFS, at a path identical to my local sqoop lib path. Per project requirements, I need sqoop to use my local libraries. How can I achieve this? Thanks.

Have you set the SQOOP_HOME and PATH environment variables for sqoop in your .bashrc file?

If not, open the file:

vi ~/.bashrc

and include the following lines:

export SQOOP_HOME=/your/path/to/the/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

Save the file and execute the following command:

source ~/.bashrc
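
Putting the steps above together, a minimal sketch (the sqoop install path here is taken from the log output in the question; substitute your own):

```shell
#!/bin/sh
# Set SQOOP_HOME and extend PATH, as the lines added to ~/.bashrc would.
# The directory below is the one that appears in the error log; it is only
# an example and must match your actual sqoop installation.
export SQOOP_HOME=/home/hduser/sqoop-1.4.6.bin__hadoop-2.0.4-alpha
export PATH="$PATH:$SQOOP_HOME/bin"

# Confirm the variables resolved as expected
echo "SQOOP_HOME=$SQOOP_HOME"
```

After sourcing ~/.bashrc, you can run `echo $SQOOP_HOME` or `which sqoop` in a new shell to verify the variables took effect.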

Hope this helps!
