Is there a way to pass a system parameter (something like -Dmy_param=XXX) through to the map function in the Hadoop MapReduce framework? Submitting a job to the Hadoop cluster is done via .setJarByClass(). In the mapper I have to build a configuration, and I'd like to make it configurable, so the standard approach of a properties file seemed fine; I would just pass the parameter at the point where the properties are set. Another option would be to add the properties file to the submitted jar. Has anyone solved this kind of problem before?
If you are not already using them in your job, you can try GenericOptionsParser, Tool, and ToolRunner to run the Hadoop job.
Note: MyDriver extends Configured and implements Tool. To run your job, use this:
hadoop jar somename.jar MyDriver -D your.property=value arg1 arg2
For more details, check this link.
Here is some sample code I put together for you:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {

    public static class MyDriverMapper extends Mapper<LongWritable, Text, LongWritable, NullWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // In the mapper you can retrieve any configuration you've set
            // while starting the job from the terminal as shown above.
            Configuration conf = context.getConfiguration();
            String yourPropertyValue = conf.get("your.property");
        }
    }

    public static class MyDriverReducer extends Reducer<LongWritable, NullWritable, LongWritable, NullWritable> {
        @Override
        protected void reduce(LongWritable key, Iterable<NullWritable> values, Context context)
                throws IOException, InterruptedException {
            // --- some code ---
        }
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MyDriver(), args);
        System.exit(exitCode);
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // If you want, you can get/set values on conf here too.
        // your.property can also be a file location; after reading the file,
        // retrieve the properties and set them one by one on the conf object.
        // --- other code ---
        Job job = Job.getInstance(conf, "My Sample Job");
        // --- other code ---
        return job.waitForCompletion(true) ? 0 : 1;
    }
}
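Since the question also mentions a properties file, here is a minimal sketch of the "read a file, then set each property on the conf object" idea. It uses only the plain JDK (java.util.Properties), so it runs without Hadoop on the classpath; the file name job.properties is hypothetical, and the Hadoop-specific part (conf.set) is shown as a comment because it needs the cluster dependencies:

```java
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

public class PropertiesLoader {

    // Load key/value pairs from a standard .properties file.
    static Properties load(String path) throws IOException {
        Properties props = new Properties();
        try (Reader reader = Files.newBufferedReader(Paths.get(path))) {
            props.load(reader);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        // "job.properties" is just an example file name.
        Properties props = load(args.length > 0 ? args[0] : "job.properties");

        // Inside run() you would copy each entry into the Hadoop Configuration:
        // for (String name : props.stringPropertyNames()) {
        //     conf.set(name, props.getProperty(name));
        // }
        for (String name : props.stringPropertyNames()) {
            System.out.println(name + "=" + props.getProperty(name));
        }
    }
}
```

This keeps the job tunable without rebuilding the jar: you edit the properties file, and the driver pushes every entry into the Configuration before the job is submitted, so the mapper can read them via context.getConfiguration().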