Hadoop 2.7 Pseudo-Distributed Setup on Ubuntu

I. Install Java

Download the Java archive and extract it:

sudo tar -zxvf 8.tar.gz

Configure the Java environment variables:

sudo vim ~/.bashrc
export JAVA_HOME=/usr/local/java-8-openjdk-amd64
export JRE_HOME=/usr/local/java-8-openjdk-amd64/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
source ~/.bashrc
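
A quick check that the variables took effect, assuming the JAVA_HOME path above matches where the JDK was actually extracted:

echo $JAVA_HOME   # should print /usr/local/java-8-openjdk-amd64
java -version     # should report the OpenJDK 8 version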

II. Passwordless SSH Login

First check whether SSH is installed by running ssh localhost. If you are prompted for a password, SSH is already installed.

If it is not installed, install it with the following command:

sudo apt-get install openssh-server

Log in with ssh localhost; at this point a password is still required.

Exit with exit.

Check for the .ssh directory under the home directory /home/hadoop; create it if it does not exist.

ls -a ~                                # check whether ~/.ssh already exists
cd                                     # go to the home directory
mkdir ./.ssh                           # create .ssh if it is missing
cd ./.ssh
ssh-keygen -t rsa
cat ./id_rsa.pub >> ./authorized_keys  # append the generated public key to authorized_keys
ssh -V                                 # verify the SSH installation
ssh localhost                          # should now log in without a password
exit                                   # log out
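
If ssh localhost still asks for a password, overly loose permissions on the key files are a common cause; a quick fix, assuming the default OpenSSH settings:

chmod 700 ~/.ssh                   # the .ssh directory must be accessible only by its owner
chmod 600 ~/.ssh/authorized_keys   # authorized_keys must not be writable by others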

III. Add an IP Mapping

For convenience, add an IP address mapping in /etc/hosts:

sudo vim /etc/hosts
127.0.0.1 localhost
192.168.153.133  hadoop
# IP address      hostname
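
To verify that the mapping resolves (using the hostname hadoop configured above; your IP address will differ):

ping -c 1 hadoop   # should reach 192.168.153.133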

IV. Install Hadoop

1. Put the archive under /usr/local/ and extract Hadoop

cd /usr/local
sudo tar -zxvf hadoop-2.7.5.tar.gz  # extract
sudo mv hadoop-2.7.5 hadoop         # rename

2. Grant ownership to the user

sudo chown -R hadoop ./hadoop
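
A quick check that the ownership change took effect (the user here is assumed to be hadoop, as above):

ls -ld /usr/local/hadoop   # the owner column should now show hadoop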

3. Check the Hadoop version

cd ./hadoop 
./bin/hadoop version

4. Add the Hadoop environment variables

sudo vim /etc/profile
export HADOOP_HOME=/usr/local/hadoop
export PATH=$HADOOP_HOME/bin:$PATH

Apply the environment variables:

source /etc/profile
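
With HADOOP_HOME on the PATH (assuming /etc/profile has been sourced in the current shell), the version check from step 3 should now work from any directory:

echo $HADOOP_HOME   # should print /usr/local/hadoop
hadoop version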

5. Edit the Hadoop configuration files

cd /usr/local/hadoop/etc/hadoop
sudo cp mapred-site.xml.template mapred-site.xml

core-site.xml

sudo vim core-site.xml
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/hadoop/tmp</value> <!-- temporary directory; any path under home works -->
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop:9000</value> <!-- change to your own hostname or IP address -->
    </property>
</configuration>

 hdfs-site.xml

sudo vim hdfs-site.xml
<configuration>

        <property>
             <name>dfs.replication</name>
             <value>1</value>
        </property>
        <property>
             <name>dfs.namenode.name.dir</name>
             <value>/home/hadoop/hadoop/dfs/name</value>
        </property>
        <property>
             <name>dfs.datanode.data.dir</name>
             <value>/home/hadoop/hadoop/dfs/data</value>
        </property>
        <property>
            <name>dfs.permissions</name>
           <value>false</value>
        </property>

</configuration>
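
The configs above reference several local directories (tmp, dfs/name, dfs/data). Hadoop normally creates them during format and startup, but creating them up front as the hadoop user avoids permission surprises; a sketch assuming the paths used above:

mkdir -p /home/hadoop/hadoop/tmp /home/hadoop/hadoop/dfs/name /home/hadoop/hadoop/dfs/data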

mapred-site.xml

sudo vim mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>


yarn-site.xml

sudo vim yarn-site.xml
<configuration>

    <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>hadoop</value>
    </property>
    <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
    </property>

</configuration>

Add the hostname or IP address in slaves:

sudo vim slaves
hadoop

Add the JAVA_HOME path in hadoop-env.sh:

sudo vim hadoop-env.sh
export JAVA_HOME=/usr/local/java-8-openjdk-amd64

6. Format the NameNode

cd /usr/local/hadoop/bin
./hadoop namenode -format
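
In Hadoop 2.x the hadoop namenode -format form still works but is marked deprecated; the equivalent hdfs command is:

./hdfs namenode -format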

7. Start Hadoop

cd /usr/local/hadoop/sbin
./start-all.sh
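
start-all.sh also still works in 2.x but is deprecated; the HDFS and YARN services can be started separately instead:

./start-dfs.sh    # NameNode, DataNode, SecondaryNameNode
./start-yarn.sh   # ResourceManager, NodeManager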

8. Check the processes with jps
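
If everything came up, running jps as the hadoop user should list roughly the following daemons (the PIDs will differ):

jps
# expected: NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, Jps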

V. Hadoop Web UIs

http://hadoop:8088    port 8088 shows the YARN status

http://hadoop:50070   port 50070 shows the HDFS status
