Getting Started with Hadoop: The Hello World Program
When you first get your hands on Hadoop, running Hadoop's own Hello World program, wordcount, is a must. The example program already ships with an installed Hadoop cluster, but I also wanted to run it from Eclipse on Windows.
To run wordcount on Linux there is an official example: the jar hadoop-examples-2.0.0-mr1-cdh4.5.0.jar lives under hadoop-2.0.0-cdh4.5.0/share/hadoop/mapreduce1 (note: I am using CDH 4.5.0). First, prepare some input data:
echo "Hello World Hello Hadoop" > 1.txt echo "Hello Hadoop Bye " >2.txt
Then put the data into HDFS:
hadoop fs -mkdir /input
hadoop fs -put /root/1.txt /input
hadoop fs -put /root/2.txt /input
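If you want to confirm that both files actually landed in HDFS, a simple listing does the job:
hadoop fs -ls /input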
Then change into the directory that contains the jar:
cd hadoop-2.0.0-cdh4.5.0/share/hadoop/mapreduce1
Run the command:
hadoop jar hadoop-examples-2.0.0-mr1-cdh4.5.0.jar wordcount /input /output
Here, /output is the directory where the job writes its results.
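Keep in mind that MapReduce refuses to start if the output directory already exists, so if you want to re-run the job you first have to delete it, for example:
hadoop fs -rm -r /output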
At this point the Hello World job has run successfully, and you can view the results with hadoop fs -cat /output/part* (the results are written to one or more part files).
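With the two sample files above the counts work out to Hello 3, Hadoop 2, World 1 and Bye 1, so the output should look roughly like this:
Bye	1
Hadoop	2
Hello	3
World	1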
Next, let's look at how to run it from Eclipse on Windows.
First, the code:
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // Mapper: splits each line into words and emits (word, 1)
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer token = new StringTokenizer(line);
            while (token.hasMoreTokens()) {
                word.set(token.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sums the counts for each word
    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Important: without this line the job fails with a permission-denied error
        System.setProperty("HADOOP_USER_NAME", "root");
        Job job = new Job(conf);
        String[] ioArgs = new String[] { "hdfs://192.168.1.101:7001/input",
                "hdfs://192.168.1.101:7001/output" };
        String[] otherArgs = new GenericOptionsParser(conf, ioArgs).getRemainingArgs();
        job.setJarByClass(WordCount.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
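One thing worth noting about the code: a program launched from Eclipse gets no command-line arguments, so the HDFS input and output paths are hardcoded in the ioArgs array and handed to GenericOptionsParser instead. Adjust hdfs://192.168.1.101:7001 to the address of your own NameNode before running.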
Then simply click Run in Eclipse. If the JVM runs out of memory during execution, add the -Xmx1024M argument and run again.
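In Eclipse that flag goes into the launch configuration rather than the code: open Run > Run Configurations, pick the WordCount launch, and add it under the Arguments tab as a VM argument, e.g.:
-Xmx1024M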
Source: http://my.oschina.net/savez/blog/203542