1. Add the host mapping (the same mapping as on the namenode)
Append the last line below:
[root@localhost ~]# su - root
[root@localhost ~]# vi /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.48.129 hadoop-master
[root@localhost ~]#
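The append-and-verify step can be sketched as below. To keep the sketch safe to run anywhere, it works on a scratch copy of the hosts file; on the real node you would edit /etc/hosts itself, as shown above.

```shell
# Append the master mapping and confirm it is present.
# IP/hostname pair taken from this walkthrough; adjust to your network.
cp /etc/hosts /tmp/hosts.demo
echo '192.168.48.129 hadoop-master' >> /tmp/hosts.demo
grep -w 'hadoop-master' /tmp/hosts.demo
```

On the real file, `getent hosts hadoop-master` is another quick way to confirm the name resolves.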
2. Create the hadoop user
Create the hadoop group.
Create the user: useradd -d /usr/hadoop -g hadoop -m hadoop (this creates the user hadoop with home directory /usr/hadoop, belonging to the hadoop group).
passwd hadoop sets the password for the hadoop user (set to hadoop here).
[root@localhost ~]# groupadd hadoop
[root@localhost ~]# useradd -d /usr/hadoop -g hadoop -m hadoop
[root@localhost ~]# passwd hadoop
3. Configure the JDK environment
This walkthrough installs hadoop-2.7.5, which requires JDK 7 or later. Skip this step if a suitable JDK is already installed.
For JDK installation steps, see: https://www.cnblogs.com/shihaiming/p/5809553.html
Alternatively, copy the JDK files directly from the master; this also helps keep the versions consistent.
[root@localhost java]# su - root
[root@localhost java]# mkdir -p /usr/java
[root@localhost java]# scp -r hadoop@hadoop-master:/usr/java/jdk1.7.0_79 /usr/java
[root@localhost java]# ll
total 12
drwxr-xr-x. 8 root root 4096 Feb 13 01:34 default
drwxr-xr-x. 8 root root 4096 Feb 13 01:34 jdk1.7.0_79
drwxr-xr-x. 8 root root 4096 Feb 13 01:34 latest
Set the Java and Hadoop environment variables.
Make sure /usr/java/jdk1.7.0_79 exists:
su - root
vi /etc/profile
Append the following lines at the end of the file:
unset i
unset -f pathmunge
JAVA_HOME=/usr/java/jdk1.7.0_79
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=/usr/hadoop/hadoop-2.7.5/bin:$JAVA_HOME/bin:$PATH
export JAVA_HOME CLASSPATH PATH
Make the settings take effect (important):
[root@localhost ~]# source /etc/profile
[root@localhost ~]#
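After sourcing the profile, the variables can be sanity-checked in the current shell. A minimal sketch using the paths from this walkthrough (adjust if your JDK or Hadoop lives elsewhere):

```shell
# Export the same variables as the /etc/profile snippet above and
# confirm they took effect in the current shell.
export JAVA_HOME=/usr/java/jdk1.7.0_79
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=/usr/hadoop/hadoop-2.7.5/bin:$JAVA_HOME/bin:$PATH
echo "JAVA_HOME=$JAVA_HOME"
```

If `echo $JAVA_HOME` prints nothing in a fresh login shell, the profile edit did not take effect.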
Confirm the JDK installation:
[hadoop@localhost ~]$ java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
[hadoop@localhost ~]$
4. Set up the Hadoop environment
Copy the already-configured hadoop directory from the namenode to this host:
[root@localhost ~]# su - hadoop
Last login: Sat Feb 24 14:04:55 CST 2018 on pts/1
[hadoop@localhost ~]$ pwd
/usr/hadoop
[hadoop@localhost ~]$ scp -r hadoop@hadoop-master:/usr/hadoop/hadoop-2.7.5 .
The authenticity of host 'hadoop-master (192.168.48.129)' can't be established.
ECDSA key fingerprint is 1e:cd:d1:3d:b0:5b:62:45:a3:63:df:c7:7a:0f:b8:7c.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop-master,192.168.48.129' (ECDSA) to the list of known hosts.
hadoop@hadoop-master's password:
[hadoop@localhost ~]$ ll
total 0
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Desktop
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Documents
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Downloads
drwxr-xr-x 10 hadoop hadoop 150 Feb 24 14:30 hadoop-2.7.5
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Music
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Pictures
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Public
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Templates
drwxr-xr-x  2 hadoop hadoop   6 Feb 24 11:32 Videos
[hadoop@localhost ~]$
At this point, the Hadoop client installation is complete and ready to use.
Running the hadoop command produces the following output:
[hadoop@localhost ~]$ hadoop
Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
CLASSNAME run the class named CLASSNAME
or
where COMMAND is one of:
fs run a generic filesystem user client
version print the version
jar <jar> run a jar file
note: please use "yarn jar" to launch
YARN applications, not this command.
checknative [-a|-h] check native hadoop and compression libraries availability
distcp <srcurl> <desturl> copy file or directories recursively
archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
classpath prints the class path needed to get the
Hadoop jar and the required libraries
credential interact with credential providers
daemonlog get/set the log level for each daemon
trace view and modify Hadoop tracing settings
Most commands print help when invoked w/o parameters.
[hadoop@localhost ~]$
5. Using Hadoop
Create a local file:
[hadoop@localhost ~]$ hdfs dfs -ls
Found 1 items
drwxr-xr-x - hadoop supergroup 0 2018-02-22 23:41 output
[hadoop@localhost ~]$ vi my-local.txt
hello boy!
yehyeh
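The same two-line file can also be created non-interactively, which is handy in scripts:

```shell
# Create my-local.txt without an editor; printf writes the two lines
# shown in the vi session above.
printf 'hello boy!\nyehyeh\n' > my-local.txt
cat my-local.txt
```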
Upload the local file to the cluster:
[hadoop@localhost ~]$ hdfs dfs -mkdir upload
[hadoop@localhost ~]$ hdfs dfs -ls upload
[hadoop@localhost ~]$ hdfs dfs -ls
Found 2 items
drwxr-xr-x   - hadoop supergroup          0 2018-02-22 23:41 output
drwxr-xr-x   - hadoop supergroup          0 2018-02-23 22:38 upload
[hadoop@localhost ~]$ hdfs dfs -ls upload
[hadoop@localhost ~]$ hdfs dfs -put my-local.txt upload
[hadoop@localhost ~]$ hdfs dfs -ls upload
Found 1 items
-rw-r--r-- 3 hadoop supergroup 18 2018-02-23 22:45 upload/my-local.txt
[hadoop@localhost ~]$ hdfs dfs -cat upload/my-local.txt
hello boy!
yehyeh
[hadoop@localhost ~]$
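The mkdir/put/ls sequence above can be wrapped in a small helper. This is a sketch of my own (the function name is not part of Hadoop); it assumes the hdfs CLI is on PATH, as arranged in step 3:

```shell
# upload_to_hdfs LOCALFILE DESTDIR
# Creates DESTDIR in HDFS if needed, uploads the file, then lists the
# directory to confirm the copy - mirroring the manual steps above.
upload_to_hdfs() {
    local localfile=$1 destdir=$2
    hdfs dfs -mkdir -p "$destdir"
    hdfs dfs -put "$localfile" "$destdir"
    hdfs dfs -ls "$destdir"
}

# Example (run on a node with a reachable cluster):
# upload_to_hdfs my-local.txt upload
```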
P.S.: Whether the local Java version must match the JAVA_HOME configured in hadoop-env.sh (under etc/hadoop/ in the directory copied from the master) has not been verified; in this walkthrough the two were kept identical.