1. Install the JDK and configure environment variables
Download: http://yunpan.cn/csUMbjpYd6LIU (extraction code: 8e1d)
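A possible install sequence, assuming the download is Sun's self-extracting installer (the filename jdk-6u12-linux-i586.bin is an assumption) and /opt is the install directory used throughout this guide:
$ chmod u+x jdk-6u12-linux-i586.bin
$ ./jdk-6u12-linux-i586.bin        # unpacks into ./jdk1.6.0_12
$ mv jdk1.6.0_12 /opt/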
Edit /etc/profile:
export JAVA_HOME=/opt/jdk1.6.0_12
export PATH=$PATH:$JAVA_HOME/bin
Reboot so the changes take effect.
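To verify without rebooting, the profile can be re-read in the current shell (paths as set above):
$ source /etc/profile
$ java -version        # should report Java 1.6.0_12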
2. Download Hadoop 1.1.2
Download: http://yunpan.cn/csUMinfxTYFZr (extraction code: 7bf8)
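A minimal sketch for unpacking the release, assuming the downloaded archive is named hadoop-1.1.2.tar.gz and /opt is the install location used in the configuration below:
$ tar -zxvf hadoop-1.1.2.tar.gz -C /opt
$ cd /opt/hadoop-1.1.2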
Edit the configuration files
hadoop-env.sh
export JAVA_HOME=/opt/jdk1.6.0_12
core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop-1.1.2/tmp</value>
  </property>
</configuration>
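hadoop.tmp.dir points at a directory that may not exist yet; creating it up front avoids path or permission surprises (path taken from the value above):
$ mkdir -p /opt/hadoop-1.1.2/tmp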
hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <!-- Number of block replicas to keep -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
Passwordless SSH login
$ ssh-keygen -t rsa
$ cd $HOME/.ssh
$ cp id_rsa.pub authorized_keys
$ ssh localhost
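If ssh localhost still prompts for a password, the usual cause is loose permissions on the key files; a typical fix, assuming the default ~/.ssh layout:
$ chmod 700 $HOME/.ssh
$ chmod 600 $HOME/.ssh/authorized_keys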
Format HDFS: hadoop namenode -format
Start the Hadoop cluster: start-all.sh
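To confirm the pseudo-distributed cluster is up, check the running JVMs:
$ jps        # expect NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker (plus Jps itself)
The NameNode web UI should also be reachable at http://localhost:50070 and the JobTracker UI at http://localhost:50030.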
To uninstall Hadoop: since no environment variables were configured for it, simply deleting its files is enough.
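Since start-all.sh was used above, it is cleaner to stop the daemons first and then remove the directory (install path assumed to be /opt/hadoop-1.1.2, as configured earlier):
$ stop-all.sh
$ rm -rf /opt/hadoop-1.1.2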