java - Hadoop setMapperClass and setReducerClass not working?
I'm new to Hadoop. I have three separate files: the mapper, the reducer, and the main (driver) code. The mapper and reducer files compile fine, but the main class throws a "cannot find symbol" error on the setMapperClass and setReducerClass calls. This is the main class:
```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemperature {
    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }
        Job job = new Job();
        job.setJarByClass(MaxTemperature.class);
        job.setJobName("Max temperature");
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setMapperClass(MaxTemperatureMapper.class);   // error: cannot find symbol
        job.setReducerClass(MaxTemperatureReducer.class); // error: cannot find symbol
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
    }
}
```
I'm using Hadoop 2.5.1 and compile with:

```
hdfs com.sun.tools.javac.Main /usr/local/hadoop/share/hadoop/mapreduce/MaxTemperature.java
```
I used the same command to compile the mapper and reducer programs.
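A "cannot find symbol" on `MaxTemperatureMapper` and `MaxTemperatureReducer` typically means those classes were not on the compiler's classpath when the driver was compiled, which happens when each file is compiled in isolation. A minimal sketch of compiling all three sources in one `javac` run and packaging them (the file names follow the question; `maxtemp.jar` is a name chosen here for illustration):

```shell
# Compile driver, mapper, and reducer together so they can resolve each other,
# with the Hadoop jars on the classpath via the standard `hadoop classpath` command.
# Assumes all three .java files are in the current directory.
javac -classpath "$(hadoop classpath)" \
    MaxTemperature.java MaxTemperatureMapper.java MaxTemperatureReducer.java

# Bundle the resulting .class files into a jar that can be run with `hadoop jar`.
jar cf maxtemp.jar *.class
```

Alternatively, compiling the driver alone works if the directory containing the already-compiled `MaxTemperatureMapper.class` and `MaxTemperatureReducer.class` is added to the `-classpath` as well.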
Tags: java, hadoop, mapreduce