1. Installing VMware Tools [root account]

VM menu > Install VMware Tools > copy VMwareTools.tar.gz from the mounted folder to /root

> Extract the archive: tar -xvf VMwareTools~.tar.gz

> cd vmware-tools-distrib > run the installer: ./vmware-install.pl > press Enter at every prompt, answer no to the GCC question
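The steps above can be sketched as a single shell session. This is a sketch, not a transcript from the post: the mount point of the VMware Tools virtual CD and the exact archive version vary by system.

```shell
# Assumes the VMware Tools virtual CD is already mounted (mount point varies,
# e.g. /run/media/root/VMware\ Tools or a manually mounted /mnt/cdrom).
cp /mnt/cdrom/VMwareTools-*.tar.gz /root/
cd /root
tar -xvf VMwareTools-*.tar.gz      # extract the installer
cd vmware-tools-distrib
./vmware-install.pl                # accept defaults with Enter; answer "no" to the GCC prompt
```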


2. Downloads [hadoop account]

- JDK : http://java.oracle.com > JavaSE > JDK1.8 > linux64/tar.gz

- Eclipse : http://www.eclipse.org > oxygen > linux64/tar.gz

- Hadoop : http://hadoop.apache.org > download > release > 2.8/binary > HTTP link


> After saving the files,

> switch to the root account,

> extract them under /usr/local (tar -xvf <file>),

> and change ownership (chown -R hadoop:hadoop <directory>).


[hadoop@localhost ~]$ su -

Password:

[root@localhost ~]# cd /usr/local

[root@localhost local]# tar -xvf /home/hadoop/다운로드/jdk-8u151-linux-x64.tar.gz 

[root@localhost local]# tar -xvf /home/hadoop/다운로드/hadoop-2.8.2.tar.gz 
[root@localhost local]# tar -xvf /home/hadoop/다운로드/eclipse-inst-linux64.tar.gz 

[root@localhost local]# chown -R hadoop:hadoop jdk1.8.0_151/

[root@localhost local]# chown -R hadoop:hadoop hadoop-2.8.2/

[root@localhost local]# chown -R hadoop:hadoop eclipse-installer/


3. Removing the preinstalled OpenJDK >> master, slave1, slave2

[root@localhost local]# java -version

openjdk version "1.8.0_131"

OpenJDK Runtime Environment (build 1.8.0_131-b12)

OpenJDK 64-Bit Server VM (build 25.131-b12, mixed mode)

[root@localhost local]# rpm -qa | grep jdk

copy-jdk-configs-2.2-3.el7.noarch

java-1.8.0-openjdk-1.8.0.131-11.b12.el7.x86_64

java-1.8.0-openjdk-headless-1.8.0.131-11.b12.el7.x86_64

[root@localhost local]# yum remove java-1.8.0-openjdk-1.8.0.131-11.b12.el7.x86_64

[root@localhost local]# yum remove java-1.8.0-openjdk-headless-1.8.0.131-11.b12.el7.x86_64
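After the two yum removals it is worth confirming that only the Oracle JDK remains. A quick check, assuming the same CentOS 7 machine as above:

```shell
# No java-1.8.0-openjdk-* packages should be listed anymore
# (copy-jdk-configs may legitimately remain):
rpm -qa | grep jdk

# "java -version" will fail at this point; it starts working again
# once JAVA_HOME and PATH are set to the Oracle JDK in step 4.
```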


4. Shell environment setup (.bash_profile) [hadoop account] >> master, slave1, slave2

[root@localhost local]# exit

logout

[hadoop@localhost ~]$ vi .bash_profile


export PATH=$PATH:$HOME/bin

export JAVA_HOME=/usr/local/jdk1.8.0_151

export HADOOP_INSTALL=/usr/local/hadoop-2.8.2

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_INSTALL/bin


[hadoop@localhost ~]$ source .bash_profile

[hadoop@localhost ~]$ hadoop version

[hadoop@localhost ~]$ java -version

[hadoop@localhost ~]$ javac -version


5. Checking IP addresses (/etc/hosts) >> master, slave1, slave2

[root@localhost ~]# vi /etc/hosts

[root@localhost ~]#

192.168.78.128 master

192.168.78.128 backup

192.168.78.129 slave1

192.168.78.130 slave2
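Once /etc/hosts is saved on every node, the new names should resolve immediately. A quick sanity check (any account, on each node):

```shell
# Confirm each hostname resolves to the intended address:
getent hosts master backup slave1 slave2

# And that the nodes are actually reachable:
ping -c 1 slave1
ping -c 1 slave2
```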


6. Firewall configuration >> master, slave1, slave2
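The post leaves this section empty. On CentOS 7 (which the el7 packages above suggest), a common lab-setup choice is either to disable firewalld entirely on all three nodes, or to open the ports Hadoop needs. A minimal sketch, assuming firewalld and default Hadoop 2.x port numbers (verify the port list against your own *-site.xml settings):

```shell
# Option A (lab shortcut): disable the firewall on master, slave1, slave2.
systemctl stop firewalld
systemctl disable firewalld

# Option B: keep firewalld and open commonly used Hadoop 2.x ports.
firewall-cmd --permanent --add-port=9000/tcp     # HDFS NameNode RPC (fs.defaultFS)
firewall-cmd --permanent --add-port=50070/tcp    # NameNode web UI
firewall-cmd --permanent --add-port=50010/tcp    # DataNode data transfer
firewall-cmd --permanent --add-port=8088/tcp     # YARN ResourceManager web UI
firewall-cmd --reload
```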

7. Passwordless login using an SSH RSA key pair

1) Generate the key pair

[hadoop@master ~]$ ssh-keygen -t rsa


2) Verify the generated keys

[hadoop@master ~]$ cd .ssh

[hadoop@master .ssh]$ ls -l

-rw-------. 1 hadoop hadoop 1679 Dec 30 23:26 id_rsa        // private key

-rw-r--r--. 1 hadoop hadoop  395 Dec 30 23:26 id_rsa.pub    // public key


3) Append master's public key to authorized_keys

[hadoop@master .ssh]$ cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys


4) Generate a key pair on slave1 as well (repeat on slave2)
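Step 4 has no transcript in the post; it is the same command as step 1, run on each slave:

```shell
# On slave1 and slave2, as the hadoop user:
ssh-keygen -t rsa      # press Enter at every prompt for a default, passphrase-less key
```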


5) Append slave1's public key to master's authorized_keys file (repeat for slave2 and backup)

[hadoop@master .ssh]$ ssh hadoop@slave1 cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

Are you sure you want to continue connecting (yes/no)? yes

hadoop@slave1's password:

[hadoop@master .ssh]$ ssh hadoop@backup cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

[hadoop@master .ssh]$ ssh hadoop@slave2 cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys


6) Redistribute the public keys to all nodes, so that every node shares every other node's public key

[hadoop@master .ssh]$ scp authorized_keys hadoop@slave1:~/.ssh/

[hadoop@master .ssh]$ scp authorized_keys hadoop@backup:~/.ssh/

[hadoop@master .ssh]$ scp authorized_keys hadoop@slave2:~/.ssh/

 

[hadoop@master .ssh]$ ssh-add    // only needs to be run on master

Identity added: /home/hadoop/.ssh/id_rsa (/home/hadoop/.ssh/id_rsa)


7) Change the permissions to 644

[hadoop@master .ssh]$ chmod 644 ~/.ssh/authorized_keys


[Practice]

$ ssh hadoop@master date

$ ssh hadoop@backup date

$ ssh hadoop@slave1 date

$ ssh hadoop@slave2 date

$ ssh slave1

$ ssh master
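The practice checks above can be rolled into one loop that fails loudly if any hop still asks for a password. A sketch, run from the hadoop account on master:

```shell
# "-o BatchMode=yes" makes ssh fail instead of prompting, so any node with a
# broken key setup is reported rather than hanging on a password prompt.
for host in master backup slave1 slave2; do
    if ssh -o BatchMode=yes hadoop@"$host" date >/dev/null 2>&1; then
        echo "$host: passwordless ssh OK"
    else
        echo "$host: FAILED (still prompting or unreachable)" >&2
    fi
done
```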









