Running the Hadoop WordCount Example on Windows
2021-03-09 04:29
Tags: host false blob split out inf complete dir directory
Original article: https://www.cnblogs.com/yjyyjy/p/12763173.html

1. Download the WordCount jar file

Download the folder and place it in a directory of your choice: https://github.com/yjy24/bigdata_learning/blob/master/hadoopMapRedSimple.zip
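The post does not show the source code inside MapReduceClient.jar. For reference, a standard Hadoop WordCount job usually looks like the sketch below; the class names, the entry point, and the mapping of the "wordcount" argument in step 3 to this driver are assumptions, not the jar's actual contents.

// Hypothetical sketch of the classic Hadoop WordCount job. The real jar may
// route the "wordcount" program name through its own driver class, so the
// argument handling in main() is illustrative only.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (token, 1) for every whitespace-separated token in a line.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer (also used as combiner): sum the 1s emitted for each token.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  // Driver: args[0] = HDFS input dir, args[1] = HDFS output dir, matching
  // "hadoop jar MapReduceClient.jar wordcount /input_dirs /output_dirs".
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The mapper emits (token, 1) for every whitespace-separated token and the reducer sums those counts per token, which is exactly the shape of the output shown at the end of step 3.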
2. Start Hadoop
C:\WINDOWS\system32>start-all.cmd
This script is Deprecated. Instead use start-dfs.cmd and start-yarn.cmd
starting yarn daemons

(datanode log, namenode log, YARN ResourceManager log, and YARN NodeManager log: screenshots omitted)
3. Run the program
C:\WINDOWS\system32>hadoop fs -mkdir /input_dirs
mkdir: Cannot create directory /input_dirs. Name node is in safe mode.

C:\WINDOWS\system32>hadoop dfsadmin -safemode leave
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

Safe mode is OFF

C:\WINDOWS\system32>hadoop fs -mkdir /input_dirs
C:\WINDOWS\system32>hadoop fs -put C:/Learning/lessons/Hadoop-class/hadoopMapRedSimple/inputfile.txt /input_dirs
C:\WINDOWS\system32>
C:\WINDOWS\system32>hadoop dfs -cat /input_dirs/inputfile.txt
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
23 23 27 43 24 25 26 26 26 26 25 26 25
26 27 28 28 28 30 31 31 31 30 30 30 29
31 32 32 32 33 34 35 36 36 34 34 34 34
39 38 39 39 39 41 42 43 40 39 38 38 40
38 39 39 39 39 41 41 41 28 40 39 39 45
C:\WINDOWS\system32>
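The directory creation, upload, and inspection above can also be done programmatically through the HDFS Java API. This is only a minimal sketch, assuming core-site.xml and hdfs-site.xml are on the classpath so that FileSystem.get(conf) resolves to this cluster; the class name HdfsPutAndCat is made up for illustration, and the paths are the ones used in the commands above.

// Hypothetical equivalent of the -mkdir, -put, and -cat commands above,
// using the HDFS Java API.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPutAndCat {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();   // picks up core-site.xml / hdfs-site.xml
    try (FileSystem fs = FileSystem.get(conf)) {
      // Equivalent of: hadoop fs -mkdir /input_dirs
      fs.mkdirs(new Path("/input_dirs"));

      // Equivalent of: hadoop fs -put <local file> /input_dirs
      fs.copyFromLocalFile(
          new Path("C:/Learning/lessons/Hadoop-class/hadoopMapRedSimple/inputfile.txt"),
          new Path("/input_dirs/inputfile.txt"));

      // Equivalent of: hadoop dfs -cat /input_dirs/inputfile.txt
      try (BufferedReader reader = new BufferedReader(new InputStreamReader(
          fs.open(new Path("/input_dirs/inputfile.txt")), StandardCharsets.UTF_8))) {
        String line;
        while ((line = reader.readLine()) != null) {
          System.out.println(line);
        }
      }
    }
  }
}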
C:\WINDOWS\system32>hadoop jar C:/Learning/lessons/Hadoop-class/hadoopMapRedSimple/MapReduceClient.jar wordcount /input_dirs /output_dirs
20/04/23 14:49:52 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
20/04/23 14:49:55 INFO input.FileInputFormat: Total input files to process : 1
20/04/23 14:49:55 INFO mapreduce.JobSubmitter: number of splits:1
20/04/23 14:49:56 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
20/04/23 14:49:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1587642121010_0001
20/04/23 14:49:58 INFO impl.YarnClientImpl: Submitted application application_1587642121010_0001
20/04/23 14:49:58 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1587642121010_0001/
20/04/23 14:49:58 INFO mapreduce.Job: Running job: job_1587642121010_0001
20/04/23 14:50:28 INFO mapreduce.Job: Job job_1587642121010_0001 running in uber mode : false
20/04/23 14:50:28 INFO mapreduce.Job: map 0% reduce 0%
20/04/23 14:50:47 INFO mapreduce.Job: map 100% reduce 0%
20/04/23 14:51:07 INFO mapreduce.Job: map 100% reduce 100%
20/04/23 14:51:08 INFO mapreduce.Job: Job job_1587642121010_0001 completed successfully
20/04/23 14:51:08 INFO mapreduce.Job: Counters: 49
......
C:\WINDOWS\system32>hadoop dfs -cat /output_dirs/*
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
23 12
24 6
25 18
26 36
27 12
28 24
29 6
30 24
31 24
32 18
33 6
34 30
35 6
36 12
38 24
39 66
40 18
41 24
42 6
43 12
45 6
C:\WINDOWS\system32>
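Each output line is a token followed by its total number of occurrences; for example, the token 39 is counted 66 times across the six repeated blocks of the input. To sanity-check the job result, the tokens can be recounted locally from the same input file; a minimal sketch in plain Java, with the hypothetical class name LocalWordCountCheck and the local path taken from the -put command above.

// Hypothetical local cross-check of the WordCount output: count the
// whitespace-separated tokens in the local input file and print each as
// "token<TAB>count", mirroring the reducer output above.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;

public class LocalWordCountCheck {
  public static void main(String[] args) throws IOException {
    Map<String, Integer> counts = new TreeMap<>();
    for (String line : Files.readAllLines(
        Paths.get("C:/Learning/lessons/Hadoop-class/hadoopMapRedSimple/inputfile.txt"))) {
      for (String token : line.trim().split("\\s+")) {
        if (!token.isEmpty()) {
          counts.merge(token, 1, Integer::sum);
        }
      }
    }
    counts.forEach((token, count) -> System.out.println(token + "\t" + count));
  }
}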