### Download Hadoop 2.7 and extract it
wget http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
tar -zxvf hadoop-2.7.3.tar.gz
cp -R /tmp/hadoop-2.7.3 /usr/hadoop

### Set the environment variables
vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_131
export HADOOP_HOME=/usr/hadoop
export JAVA_LIBRARY_PATH=/usr/hadoop/lib/native
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$JAVA_HOME/bin

### Check whether the SSH service is installed
rpm -qa | grep ssh
The output should look something like this:
openssh-clients-6.6.1p1-33.el7_3.x86_64
openssh-6.6.1p1-33.el7_3.x86_64
openssh-server-6.6.1p1-33.el7_3.x86_64
libssh2-1.4.3-10.el7_2.1.x86_64

### Install it if it is missing
yum install -y openssh-clients
yum install -y openssh-server

### Generate a key with an empty passphrase; just press Enter at every prompt
ssh-keygen -t dsa
cd ~/.ssh/
cat id_dsa.pub >> authorized_keys

### Set the permissions
chmod 600 authorized_keys

### Edit the SSH configuration
vi /etc/ssh/sshd_config
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys

### Restart SSH
systemctl restart sshd

### Error: "The authenticity of host 0.0.0.0 can't be established." Fix:
ssh -o StrictHostKeyChecking=no 0.0.0.0

###################################################
### Changes to the Hadoop configuration files
###################################################

### Edit /usr/hadoop/etc/hadoop/core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/hadoop/temp</value>
  </property>
</configuration>

### Edit /usr/hadoop/etc/hadoop/hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
</configuration>

### Set the Java environment variable used by Hadoop
vi /usr/hadoop/etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.8.0_131

### Format the NameNode before the first start
/usr/hadoop/bin/hdfs namenode -format

### Then HDFS can be started and stopped with
/usr/hadoop/sbin/start-dfs.sh
/usr/hadoop/sbin/stop-dfs.sh
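After start-dfs.sh has run, a quick sanity check from the shell is possible. The sketch below assumes the /etc/profile settings above have been applied; the /test path is only an example name.

```bash
# Reload the environment so the hdfs command is on PATH
source /etc/profile

# NameNode, DataNode and SecondaryNameNode should appear in the process list
jps

# Minimal HDFS smoke test: create a directory, upload a small file, list it
hdfs dfs -mkdir -p /test               # /test is an arbitrary example path
hdfs dfs -put /etc/hosts /test/hosts   # upload any small local file
hdfs dfs -ls /test                     # should show the uploaded file
```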
After installation, check it through the web UI:
http://{YourServerIP}:50070/dfshealth.html#tab-overview
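If the page does not load in a browser, it helps to first confirm from the server itself that the NameNode web port is up, before blaming the network or firewall. A minimal check, assuming curl is available on the host:

```bash
# Ask the NameNode web UI for its status page locally
curl -sI http://localhost:50070/dfshealth.html | head -n 1   # expect HTTP/1.1 200 OK

# Verify that something is listening on port 50070
ss -tlnp | grep 50070
```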
Watch out for the firewall: systemctl stop firewalld
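Stopping firewalld disables the firewall entirely. A less drastic alternative, sketched here for firewalld on CentOS 7, is to open only the ports this setup uses: 50070 for the NameNode web UI and 9000 for fs.defaultFS as configured in core-site.xml above.

```bash
# Open only the required ports instead of disabling firewalld
firewall-cmd --permanent --add-port=50070/tcp   # NameNode web UI
firewall-cmd --permanent --add-port=9000/tcp    # HDFS RPC port from core-site.xml
firewall-cmd --reload
```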