PriviledgedActionException when running KMeans on Hadoop



I am trying to run KMeans on Hadoop by following this guide:

http://www.slideshare.net/titusdamaiyanti/hadoop-installation-k-means-clustering-mapreduce?qid=44b5881c-089d-474b-b01d-c35a2f91cc67&v=qf1&b=&from_search=1

I am running this in Eclipse Luna. When I execute it, both map and reduce report 100% complete, but I get no output. Instead, I get the following error at the end. Please help me solve this.

    15/03/20 11:29:44 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hduser/mapred/staging/hduser378797276/.staging/job_local378797276_0002
    15/03/20 11:29:44 ERROR security.UserGroupInformation: PriviledgedActionException as:hduser cause:java.io.IOException: No input paths specified in job
    Exception in thread "main" java.io.IOException: No input paths specified in job
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:193)
        at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:55)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1054)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1071)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
        at com.clustering.mapreduce.KMeansClusteringJob.main(KMeansClusteringJob.java:114)

You must provide the location of the input file before running the MapReduce program. There are two ways to do this:

  1. In Eclipse, open the Run Configurations dialog and supply the file paths as program arguments
  2. Package the program as a jar file and run the following command on the Hadoop cluster

    hadoop jar NameOfYourJarFile InputFileLocation OutputFileLocation
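Either way, the driver's `main` method must actually receive and use those paths. A minimal sketch of the argument handling (the class and paths here are hypothetical; the `FileInputFormat`/`FileOutputFormat` calls mentioned in the comment are the standard `org.apache.hadoop.mapreduce.lib` API):

```java
// Sketch: validate the input/output path arguments before configuring the job.
// If no input path is ever added to the job, submit() fails with the
// "No input paths specified in job" IOException seen in the question.
public class KMeansDriverArgs {

    static String[] validate(String[] args) {
        if (args.length < 2) {
            throw new IllegalArgumentException(
                "Usage: hadoop jar kmeans.jar <input path> <output path>");
        }
        return args;
    }

    public static void main(String[] args) {
        String[] io = validate(args);
        // In the real driver you would then pass these to Hadoop, e.g.:
        //   FileInputFormat.addInputPath(job, new Path(io[0]));
        //   FileOutputFormat.setOutputPath(job, new Path(io[1]));
        System.out.println("input=" + io[0] + " output=" + io[1]);
    }
}
```

Running the class with no arguments (which is what happens if the Eclipse run configuration is left empty) fails fast with a usage message instead of dying inside `JobClient.submitJobInternal`.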
