1. Download from the official site
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.0.0-alpha3/hadoop-3.0.0-alpha3.tar.gz
Note: download the binary tarball (it saves build time).
2. Extract
tar -zxvf hadoop-3.0.0-alpha3.tar.gz -C /opt/
cd /opt/
mv hadoop-3.0.0-alpha3/ hadoop3
3. Environment variables
vi /etc/profile
#Hadoop 3.0
export HADOOP_HOME=/opt/hadoop3
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
source /etc/profile
4. Configuration files (under hadoop3/etc/hadoop)
1) hadoop-env.sh
Note: /usr/java/jdk1.8 is the JDK installation path.
2) core-site.xml
Note: in hdfs://ha01:9000, ha01 is the master hostname.
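A minimal core-site.xml sketch, assuming the hdfs://ha01:9000 address from the note above; the hadoop.tmp.dir value is an assumption and can point anywhere writable:

```xml
<configuration>
  <!-- Default filesystem URI; ha01 is the master hostname -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://ha01:9000</value>
  </property>
  <!-- Hypothetical scratch directory under the install path -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop3/tmp</value>
  </property>
</configuration>
```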
3) hdfs-site.xml
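A minimal hdfs-site.xml sketch; the replication factor and the local storage directories below are assumptions, not values from the original post:

```xml
<configuration>
  <!-- Hypothetical replication factor for a small cluster -->
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <!-- Hypothetical local paths for NameNode and DataNode storage -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/opt/hadoop3/hdfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/opt/hadoop3/hdfs/data</value>
  </property>
</configuration>
```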
4) yarn-site.xml
Note: ha01 is the master hostname.
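A minimal yarn-site.xml sketch, assuming ha01 (the master hostname from the note above) runs the ResourceManager:

```xml
<configuration>
  <!-- ResourceManager runs on the master host ha01 -->
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>ha01</value>
  </property>
  <!-- Required so MapReduce shuffle works under YARN -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
```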
5) mapred-site.xml
Note: if mapreduce.admin.user.env and yarn.app.mapreduce.am.env are not set, MapReduce jobs may fail at runtime.
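A minimal mapred-site.xml sketch including the two properties named in the note above; the HADOOP_MAPRED_HOME values are an assumption based on the install path from step 2:

```xml
<configuration>
  <!-- Run MapReduce on YARN -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <!-- Without these two, MapReduce jobs may fail at runtime;
       the values assume HADOOP_HOME=/opt/hadoop3 from step 3 -->
  <property>
    <name>mapreduce.admin.user.env</name>
    <value>HADOOP_MAPRED_HOME=$HADOOP_HOME</value>
  </property>
  <property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>HADOOP_MAPRED_HOME=$HADOOP_HOME</value>
  </property>
</configuration>
```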
6) workers
This file plays the same role as the old slaves file: list the hostnames of the worker nodes, one per line.
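A sketch of the workers file, assuming ha02 and ha03 (the nodes the install is copied to in step 5) are the workers:

```
ha02
ha03
```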
5. Copy to the other nodes
scp -r /opt/hadoop3 root@ha02:/opt/
scp -r /opt/hadoop3 root@ha03:/opt/
6. Start
On the very first start, format the NameNode first:
hdfs namenode -format
start-dfs.sh
start-yarn.sh
hdfs dfs -ls /
7. Run wordcount
hdfs dfs -cat /logs/wordcount.log
hadoop jar /opt/hadoop3/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0-alpha3.jar wordcount /logs/wordcount.log /output/wordcount
Note: /opt/hadoop3 is the extract path, /logs/wordcount.log is the input file path, and /output/wordcount is the output path.
View the results: hdfs dfs -text /output/wordcount/part-r-00000
8. Web UI
NameNode:        http://ha01:9870/
ResourceManager: http://ha01:8088/