Passing parameters to Mappers and Reducers

There might be a requirement to pass additional parameters to the mappers and reducers, besides the inputs which they process. Let's say we are interested in matrix multiplication and there are multiple ways/algorithms of doing it. We could send an input parameter to the mappers and reducers, based on which the appropriate way/algorithm is picked. There are multiple ways of doing this.

Setting the parameter:

1. Use the -D command line option to set the parameter while running the job.
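Note that -D works only when the driver is run through ToolRunner, which hands the command line to GenericOptionsParser before run() is called. A minimal sketch (the class name MatrixJob is illustrative; "test" is the parameter used throughout this post):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// The job class implements Tool so that ToolRunner/GenericOptionsParser
// can strip the -D options from the command line and put them into the
// Configuration before run() is invoked.
public class MatrixJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();     // already contains the -D values
        String param = conf.get("test");    // e.g. set via -D test=123
        // ... build and submit the Job using this conf ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MatrixJob(), args));
    }
}
```

Generic options such as -D must appear before the job-specific arguments, for example: hadoop jar matrix.jar MatrixJob -D test=123 input output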

2. Before launching the job using the old MR API

JobConf job = (JobConf) getConf();
job.set("test", "123");

3. Before launching the job using the new MR API

Configuration conf = new Configuration();
conf.set("test", "123");
Job job = Job.getInstance(conf); // new Job(conf) is deprecated in newer releases
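Put into a complete new-API driver, this might look like the following sketch (ParamDriver, ParamMapper, and the output types are placeholders for your own classes):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ParamDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("test", "123");   // must be set BEFORE the Job is created:
                                   // Job copies the Configuration, so later
                                   // changes to conf are not seen by the job
        Job job = Job.getInstance(conf, "parameter demo");
        job.setJarByClass(ParamDriver.class);
        job.setMapperClass(ParamMapper.class);   // placeholder mapper class
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```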

Getting the parameter:

1. Using the old API in the Mapper and Reducer. The JobConfigurable#configure method has to be implemented in the Mapper and Reducer classes.

private static Long N;
public void configure(JobConf job) {
    N = Long.parseLong(job.get("test"));
}

The variable N can then be used in the map and reduce functions.
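A complete old-API mapper following this pattern might look like the sketch below (OldApiMapper and the key/value types are illustrative; "test" is the parameter name from the example above, and the "0" default is an assumption):

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class OldApiMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, LongWritable> {

    private long n;  // parameter value, read once per task in configure()

    @Override
    public void configure(JobConf job) {
        // configure() is called once before any map() calls
        n = Long.parseLong(job.get("test", "0"));  // "0" is an assumed default
    }

    @Override
    public void map(LongWritable key, Text value,
                    OutputCollector<Text, LongWritable> out, Reporter reporter)
            throws IOException {
        out.collect(value, new LongWritable(n));  // n is available here
    }
}
```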

2. Using the new API in the Mapper and Reducer. The context is passed to the setup, map, reduce and cleanup functions.

Configuration conf = context.getConfiguration();
String param = conf.get("test");
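In the new API the idiomatic place to read the parameter is setup(), which runs once per task before map() is called. A sketch (NewApiMapper and the key/value types are illustrative; the "0" default is an assumption):

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class NewApiMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    private long n;  // parameter value, cached once in setup()

    @Override
    protected void setup(Context context) {
        n = Long.parseLong(context.getConfiguration().get("test", "0"));
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        context.write(value, new LongWritable(n));  // n is available here
    }
}
```

Reading the parameter once in setup() avoids a Configuration lookup on every map() call.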
