Installing and Configuring Single-Node Hadoop on CentOS 7

If the jps command is not found, install the JDK development package:

  • yum install java-1.8.0-openjdk-devel.x86_64
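A quick way to confirm the JDK landed and to locate the install directory for JAVA_HOME later (the readlink trick is one common approach; the exact path varies with the package update level):

java -version
jps
readlink -f $(which java)    # prints something like /usr/lib/jvm/java-1.8.0-openjdk-<version>/jre/bin/java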

Install and configure passwordless SSH login

  • yum install openssh-server openssh-clients
  • ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
  • cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
  • chmod 0600 ~/.ssh/authorized_keys
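A quick sanity check, assuming the SSH daemon is already running on the default port: logging in to localhost should now succeed without a password prompt (the very first connection may still ask to confirm the host key):

systemctl status sshd        # make sure sshd is running
ssh localhost 'echo ok'      # should print "ok" without asking for a password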

Install Hadoop

[root@localhost hadoop-2.6.5]# cd ~
[root@localhost ~]# vim .bash_profile

Add the following to the file:

export HADOOP_HOME=/home/kkxmoye/Downloads/hadoop-2.6.5
PATH=$JAVA_HOME/bin:$PATH:$HOME/bin:$HADOOP_HOME/bin

Run   source .bash_profile   to make the configuration take effect.
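For reference, a complete fragment for .bash_profile with JAVA_HOME also exported; the JDK path below is an assumption (CentOS usually creates a /usr/lib/jvm/java-1.8.0-openjdk symlink) and should match the directory found with readlink above:

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk          # assumed path; adjust to your JDK install
export HADOOP_HOME=/home/kkxmoye/Downloads/hadoop-2.6.5
PATH=$JAVA_HOME/bin:$PATH:$HOME/bin:$HADOOP_HOME/bin
export PATH

After sourcing the file, hadoop version should print the release number if everything is on the PATH.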

  • cd  /home/kkxmoye/Downloads/hadoop-2.6.5
  • vim etc/hadoop/core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
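A quick way to confirm the setting is picked up (run from the Hadoop directory; no daemons need to be running):

bin/hdfs getconf -confKey fs.defaultFS
# expected output: hdfs://localhost:9000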
  • vim etc/hadoop/hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
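Optionally, the NameNode and DataNode storage locations can be pinned down in the same file instead of defaulting to hadoop.tmp.dir under /tmp; the directories below are illustrative assumptions and just need to be writable by the user running Hadoop:

    <property>
        <name>dfs.namenode.name.dir</name>
        <!-- assumed path -->
        <value>file:///home/kkxmoye/hadoopdata/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <!-- assumed path -->
        <value>file:///home/kkxmoye/hadoopdata/data</value>
    </property>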

 

  • vim etc/hadoop/mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.admin.user.env</name>
        <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
    </property>
</configuration>
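Note that Hadoop 2.x ships only a template for this file, so it may need to be created first (Hadoop 3.x includes mapred-site.xml directly, in which case this step is skipped):

cp etc/hadoop/mapred-site.xml.template etc/hadoop/mapred-site.xml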

 

  • vim etc/hadoop/yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
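Some guides also spell out the shuffle handler class explicitly. Recent releases resolve it automatically, so this extra property is optional and shown only for completeness:

    <property>
        <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>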

 

  • vim sbin/start-dfs.sh
    • Add the following at the top of the file:
HDFS_DATANODE_USER=root  
HDFS_DATANODE_SECURE_USER=hdfs  
HDFS_NAMENODE_USER=root  
HDFS_SECONDARYNAMENODE_USER=root 

  • vim sbin/stop-dfs.sh

Add the following at the top of the file:

HDFS_DATANODE_USER=root  
HDFS_DATANODE_SECURE_USER=hdfs  
HDFS_NAMENODE_USER=root  
HDFS_SECONDARYNAMENODE_USER=root 

  • vim sbin/start-yarn.sh

    • Add the following at the top of the file:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root

  • vim sbin/stop-yarn.sh

    • Add the following at the top of the file:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root
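These user checks only exist in Hadoop 3.x (2.x ignores the variables). As an alternative to editing all four scripts, the same variables can be exported once in etc/hadoop/hadoop-env.sh, which every start/stop script sources; a minimal sketch assuming everything runs as root:

export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root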

  • vim  etc/hadoop/hadoop-env.sh

    • Set JAVA_HOME (for example, as shown below)
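For example (the path is an assumption; use the JDK directory located earlier with readlink):

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk    # assumed OpenJDK path; adjust to your install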

Start Hadoop

  •  sbin/start-dfs.sh
  •  sbin/start-yarn.sh 
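Before the very first start, HDFS must be formatted; once the two start scripts have run, jps should list the daemons of a single-node setup:

bin/hdfs namenode -format    # run only once, before the first start; reformatting erases HDFS data
jps                          # expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager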

Stop Hadoop

  • sbin/stop-dfs.sh
  • sbin/stop-yarn.sh

Access the Hadoop web UI to test

  • http://192.168.48.133:8088/
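Port 8088 is the ResourceManager web UI; the NameNode web UI listens on 50070 in Hadoop 2.x (9870 in 3.x). A quick check from the shell, using the same address as above:

curl -s http://192.168.48.133:8088/cluster | head
curl -s http://192.168.48.133:50070/ | head    # use port 9870 on Hadoop 3.x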

WordCount test

  • Create local sample files

[root@localhost hadoop-3.0.0]# mkdir /home/kkxmoye/Downloads/file
[root@localhost hadoop-3.0.0]# cd ../file/
[root@localhost file]# echo "hello world" > file1.txt
[root@localhost file]# echo "hello hadoop" > file2.txt
[root@localhost file]# echo "hello mapreduce" >> file2.txt
[root@localhost file]# ls
file1.txt  file2.txt

  • Create the input directory on HDFS

[root@localhost file]# cd ../hadoop-3.0.0/
[root@localhost hadoop-3.0.0]# bin/hadoop fs -mkdir /hdfsinput
[root@localhost hadoop-3.0.0]# bin/hadoop fs -put /home/kkxmoye/Downloads/file/file* /hdfsinput
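Listing the directory confirms both files landed on HDFS:

bin/hadoop fs -ls /hdfsinput    # should show file1.txt and file2.txt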

Run the WordCount example that ships with Hadoop

  • bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar wordcount /hdfsinput /hdfsoutput

  • When the job completes successfully, inspect the output:

  • bin/hadoop fs -ls /hdfsoutput

  • bin/hadoop fs -cat /hdfsoutput/part-r-00000

  • Inspecting part-r-00000 shows that hello appears 3 times, while hadoop, mapreduce, and world each appear once (expected output sketched below).
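Given the three sample lines written earlier, the contents of part-r-00000 should look like the following (one tab-separated word/count pair per line, sorted by word):

hadoop	1
hello	3
mapreduce	1
world	1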