Installing a Hadoop Single Node Cluster on Windows 10 WSL Ubuntu

2019/05/04

Categories: hadoop Tags: hadoop

Preparation

lsb_release -a                     # check the Ubuntu release and codename
sudo nano /etc/apt/sources.list    # optionally switch to a faster apt mirror
sudo apt update
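Editing sources.list is optional and only worthwhile if the default Ubuntu mirrors are slow for you. As a sketch, assuming the TUNA mirror and an Ubuntu 18.04 "bionic" release (substitute the codename that lsb_release -a reported), the main entries would look like the following; the sudo apt update above then pulls the package lists from the new mirror:

# hypothetical sources.list entries pointing at the TUNA mirror
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ bionic main restricted universe multiverse
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ bionic-updates main restricted universe multiverse
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ bionic-security main restricted universe multiverse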

Passwordless SSH login

Install

sudo apt install openssh-server
sudo service ssh restart

Test

which ssh
which sshd

Generate an SSH key

Generate a public/private key pair

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

Append the generated key to the authorized_keys file

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

command >> file appends the command's output to the file (here, adding the new public key to authorized_keys).
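If ssh localhost still asks for a password afterwards, the usual culprit is permissions: sshd ignores authorized_keys when the directory or file is group/world writable. Tightening them to the conventional modes is a safe extra step:

chmod 700 ~/.ssh                     # sshd ignores keys kept in an over-permissive directory
chmod 600 ~/.ssh/authorized_keys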

Test (if it fails, try restarting the SSH service: sudo service ssh restart)

ssh localhost

Install the JDK

java -version                  # check whether Java is already installed
sudo apt install default-jdk
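Note that the .bashrc below points JAVA_HOME at /usr/lib/jvm/java-8-openjdk-amd64, while on newer Ubuntu releases default-jdk may pull in Java 11 under a different path (Hadoop 3.1 is normally run on Java 8). It is worth confirming where Java actually landed before editing .bashrc, and installing openjdk-8-jdk instead if the paths do not match:

readlink -f $(which java)          # shows the real path of the active java binary
update-alternatives --list java    # lists every installed JVM
# sudo apt install openjdk-8-jdk   # alternative if default-jdk did not provide Java 8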

Download and install Hadoop

Download the binary from an Apache mirror

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.1.2/hadoop-3.1.2.tar.gz
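Optionally, verify the download before unpacking. This sketch assumes the Apache archive publishes a .sha512 file alongside the release (it usually does); compare its digest with what sha512sum prints locally:

wget https://archive.apache.org/dist/hadoop/common/hadoop-3.1.2/hadoop-3.1.2.tar.gz.sha512
sha512sum hadoop-3.1.2.tar.gz    # the digest should match the one in the .sha512 file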

Extract

sudo tar -zxvf hadoop-3.1.2.tar.gz

Move

sudo mv hadoop-3.1.2 /usr/local/hadoop

Check

ll /usr/local/hadoop

Set Hadoop environment variables

Edit .bashrc

nano ~/.bashrc
#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME 
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
#Hadoop variables end

Apply the settings

source ~/.bashrc

Test

echo $HADOOP_HOME

hadoop version

Hadoop component configuration


sudo nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh

Set the Java installation path

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

sudo nano /usr/local/hadoop/etc/hadoop/core-site.xml

Set the default HDFS name (filesystem URI)

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
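fs.default.name still works in Hadoop 3 but is flagged as deprecated; the current property name is fs.defaultFS, so the same setting can equivalently be written as:

    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>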

yarn-site.xml contains the YARN-related settings.

sudo nano /usr/local/hadoop/etc/hadoop/yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>

mapred-site.xml is used to configure MapReduce: it covers the JobTracker's assignment of map and reduce work and the TaskTrackers' execution status.

Set the MapReduce framework to YARN

sudo nano /usr/local/hadoop/etc/hadoop/mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
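On Hadoop 3.x, MapReduce jobs submitted to YARN sometimes fail with "Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster" because the framework cannot locate its own classpath. If you hit that, a common fix is to declare HADOOP_MAPRED_HOME for the application master and the tasks in this same mapred-site.xml (the path below assumes the /usr/local/hadoop install used in this guide):

    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=/usr/local/hadoop</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=/usr/local/hadoop</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=/usr/local/hadoop</value>
    </property>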

hdfs-site.xml configures the HDFS distributed file system.

sudo nano /usr/local/hadoop/etc/hadoop/hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>3</value>
    </property>
     <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/hadoop/hadoop_data/hdfs/namenode</value>
    </property>
     <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/hadoop/hadoop_data/hdfs/datanode</value>
    </property>
    <property>
        <name>dfs.http.address</name>
        <value>127.0.0.1:50070</value>
    </property>
</configuration>
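Two notes on this file: dfs.replication of 3 is the HDFS default, but on a machine that will only ever run one DataNode a value of 1 avoids permanent under-replication warnings; and dfs.http.address is the legacy property name, the current one being dfs.namenode.http-address. For a node that will stay single, the replication setting would simply be:

    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>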

Create and format the HDFS directories

sudo mkdir -p /usr/local/hadoop/hadoop_data/hdfs/namenode
sudo mkdir -p /usr/local/hadoop/hadoop_data/hdfs/datanode
sudo chown <username>:root -R /usr/local/hadoop    # replace <username> with your login user
hadoop namenode -format
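On Hadoop 3 the hadoop namenode -format form still works but prints a deprecation warning; the current equivalent is shown below. Either way, format only once: re-formatting an existing NameNode wipes the HDFS metadata and can leave DataNodes with a mismatched cluster ID.

hdfs namenode -format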

Start Hadoop

Start HDFS

start-dfs.sh

View the HDFS (NameNode) web UI at http://localhost:50070 (the address set by dfs.http.address above)

Start YARN

start-yarn.sh

View the ResourceManager web UI at http://localhost:8088

Check the running processes

jps
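With everything running, jps should list roughly the following daemons (PIDs will differ). A missing NameNode or DataNode usually points back to the format step or the directory ownership above:

NameNode
DataNode
SecondaryNameNode
ResourceManager
NodeManager
Jps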

Shut down

stop-all.sh
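stop-all.sh still works but is marked deprecated in Hadoop 3; the split scripts do the same job:

stop-yarn.sh
stop-dfs.sh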
