Here is my run method:
Configuration conf = new Configuration();
MongoConfigUtil.setOutputURI(conf, "mongodb://localhost/test/sensors");
System.out.println("Conf : " + conf);
conf.set("fs.defaultFS", "hdfs://localhost:9000");
@SuppressWarnings("deprecation")
Job job = new Job(conf, "sensor");
job.setJarByClass(HdfsToMongo.class);
job.setMapperClass(TokenzierMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
job.setInputFormatClass(TextInputFormat.class);
FileInputFormat.setInputPaths(job, new Path("In"));
// MongoOutputFormat writes the reduce output to the MongoDB URI configured above
job.setOutputFormatClass(MongoOutputFormat.class);
return job.waitForCompletion(true) ? 0 : 1;
I have a Hadoop job that connects to MongoDB and moves data into HDFS. When I run the code, I get this exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.LocalJobRunner.<init>(Lorg/apache/hadoop/conf/Configuration;)V
at org.apache.hadoop.mapred.LocalClientProtocolProvider.create(LocalClientProtocolProvider.java:42)
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1266)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1262)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1261)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1290)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
at tr.com.vedat.hadoop.HdfsToMongo.run(HdfsToMongo.java:86)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at tr.com.vedat.hadoop.HdfsToMongo.main(HdfsToMongo.java:90)
In the HdfsToMongo class, the error occurs on the job.waitForCompletion(true) ? 0 : 1; line. Can anyone point out what I am missing? Did I forget to write some code?
Be careful with the classes you import: you need to use the correct packages in your imports. You are probably importing the old API from the org.apache.hadoop.mapred package, but you should always import these classes from the org.apache.hadoop.mapreduce package.
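For reference, a minimal sketch of what the new-API imports could look like for the snippet above. The Hadoop packages are the standard mapreduce ones; the com.mongodb.hadoop paths are assumed from the mongo-hadoop connector layout, so adjust them to the connector version you actually use:

// New-API (mapreduce) imports -- a sketch, assuming the mongo-hadoop connector
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
// assumed mongo-hadoop connector packages (new-API variants, not com.mongodb.hadoop.mapred.*)
import com.mongodb.hadoop.MongoOutputFormat;
import com.mongodb.hadoop.util.MongoConfigUtil;
// Avoid the org.apache.hadoop.mapred.* (old API) versions of Job, input/output formats, etc.

Mixing classes from the two APIs (or compiling against one Hadoop version and running on another) is a common cause of NoSuchMethodError at job submission time.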