Error in addInputPath in a MapReduce driver



I am getting an error on the addInputPath call in my MapReduce driver. The error is:

"The method addInputPath(Job, Path) in the type FileInputFormat is not applicable for the arguments (JobConf, Path)"

Here is my driver code:

package org.myorg;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
public class WordCount extends Configured implements Tool{
    public int run(String[] args) throws Exception
    {
          //creating a JobConf object and assigning a job name for identification purposes
          JobConf conf = new JobConf(getConf(), org.myorg.WordCount.class);
          conf.setJobName("WordCount");
          //Setting configuration object with the Data Type of output Key and Value
          conf.setOutputKeyClass(Text.class);
          conf.setOutputValueClass(IntWritable.class);
          //Providing the mapper and reducer class names
          conf.setMapperClass(WordCountMapper.class);
          conf.setReducerClass(WordCountReducer.class);
          //the hdfs input and output directory to be fetched from the command line
          **FileInputFormat.addInputPath(conf, new Path(args[0]));**
          FileOutputFormat.setOutputPath(conf, new Path(args[1]));
          JobClient.runJob(conf);
          return 0;
    }
    public static void main(String[] args) throws Exception
    {
          int res = ToolRunner.run(new Configuration(), new WordCount(),args);
          System.exit(res);
    }
}

I am importing the correct org.apache.hadoop.mapred.FileOutputFormat.

My WordCountMapper correctly implements Mapper.

FileOutputFormat.setOutputPath works fine.

Why does addInputPath throw this error?

The problem is that you are mixing the old API (.mapred.) with the new API (.mapreduce.). The two APIs are not compatible: your FileInputFormat import comes from org.apache.hadoop.mapreduce.lib.input, so its addInputPath expects a Job, but you are passing it an old-API JobConf.

I suggest you use objects from the new API throughout and nothing from the old API. That is, do not use JobConf or JobClient; use Job and Configuration instead. Also make sure your Mapper, Reducer, and so on come from imports containing .mapreduce. rather than .mapred.
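As a minimal sketch, here is roughly what the driver could look like written entirely against the new API. This assumes Hadoop 2.x (where Job.getInstance(Configuration, String) is available) and assumes your WordCountMapper and WordCountReducer extend the new-API org.apache.hadoop.mapreduce.Mapper and Reducer classes:

package org.myorg;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
public class WordCount extends Configured implements Tool {
    public int run(String[] args) throws Exception
    {
          // Create a new-API Job from the Configuration supplied by ToolRunner
          Job job = Job.getInstance(getConf(), "WordCount");
          job.setJarByClass(WordCount.class);
          // Mapper and Reducer must be the new-API (org.apache.hadoop.mapreduce) versions
          job.setMapperClass(WordCountMapper.class);
          job.setReducerClass(WordCountReducer.class);
          // Data types of the output key and value
          job.setOutputKeyClass(Text.class);
          job.setOutputValueClass(IntWritable.class);
          // The HDFS input and output directories, fetched from the command line;
          // both FileInputFormat and FileOutputFormat here are the new-API classes and take a Job
          FileInputFormat.addInputPath(job, new Path(args[0]));
          FileOutputFormat.setOutputPath(job, new Path(args[1]));
          // Submit the job and wait for it to finish
          return job.waitForCompletion(true) ? 0 : 1;
    }
    public static void main(String[] args) throws Exception
    {
          int res = ToolRunner.run(new Configuration(), new WordCount(), args);
          System.exit(res);
    }
}

The key point is that addInputPath in org.apache.hadoop.mapreduce.lib.input.FileInputFormat takes a Job rather than a JobConf, which is exactly why the original call does not compile.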
