Required files
CentOS 7 image:
centos-7.9.2009-isos-x86_64安装包下载_开源镜像站-阿里云 (aliyun.com)
Java 8:
Java Downloads | Oracle
Hadoop 3.3.5:
Index of /dist/hadoop/common/hadoop-3.3.5 (apache.org)
First, download VMware and the CentOS 7 image to your local machine.
Choose the remaining options in the new-VM wizard according to your own needs.
After the virtual machine has been created,
replace this path with the path to your own image file.
You can then boot into the system; after completing the initial setup,
after downloading the JDK,
open a terminal window and switch to root.
mkdir /usr/local/java8
tar zxvf jdk-8u371-linux-x64.tar.gz -C /usr/local/java8/
cd /usr/local/java8/jdk1.8.0_371/
vi /etc/profile
## JDK8
export JAVA_HOME=/usr/local/java8/jdk1.8.0_371
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:$CLASSPATH
export JAVA_PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin
export PATH=$PATH:${JAVA_PATH}
source /etc/profile
After running vi /etc/profile, press i to enter insert mode and append the JDK 8 environment settings above to the end of the file. Then press Esc, type :wq, and press Enter to save and exit. Finally, run the source command to reload the configuration.
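If you prefer not to edit the file interactively in vi, the same block can be appended with a heredoc. A minimal sketch; /tmp/java8-profile.sh is a hypothetical scratch file used here so you can try it safely, not /etc/profile itself:

```shell
# Append the JDK 8 environment block non-interactively.
# TARGET is a hypothetical scratch file; on the real system you would
# append to /etc/profile (as root) instead.
TARGET=/tmp/java8-profile.sh
rm -f "$TARGET"                 # start from a clean file for this demo
cat >> "$TARGET" <<'EOF'
## JDK8
export JAVA_HOME=/usr/local/java8/jdk1.8.0_371
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:$CLASSPATH
export JAVA_PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin
export PATH=$PATH:${JAVA_PATH}
EOF
grep -c '^export' "$TARGET"     # -> 5
```

The quoted 'EOF' delimiter keeps the ${...} references from being expanded at write time, so the file receives the lines exactly as shown above.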
If running java -version prints the Java version information, the configuration succeeded.
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.5/hadoop-3.3.5.tar.gz
tar zxvf hadoop-3.3.5.tar.gz -C /usr/local/
cd /usr/local/hadoop-3.3.5/
vi /etc/profile
## Hadoop3.3.5
export HADOOP_HOME=/usr/local/hadoop-3.3.5
export PATH=$PATH:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
source /etc/profile
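After reloading the profile, you can sanity-check that the Hadoop directories actually ended up on PATH. A small sketch, assuming the install path used in this walkthrough (adjust it if you unpacked Hadoop elsewhere):

```shell
# Sanity check: confirm the Hadoop bin/sbin directories are on PATH.
# /usr/local/hadoop-3.3.5 is the install path used in this walkthrough.
HADOOP_HOME=/usr/local/hadoop-3.3.5
PATH=$PATH:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
case ":$PATH:" in
  *":${HADOOP_HOME}/bin:"*) echo "hadoop bin on PATH" ;;
  *)                        echo "hadoop bin missing from PATH" ;;
esac
```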
vi etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/local/java8/jdk1.8.0_371
export HADOOP_PID_DIR=${HADOOP_HOME}/pids
Append the two lines above to the end of hadoop-env.sh.
Next, edit the Hadoop configuration files.
vi etc/hadoop/core-site.xml
# Replace the <configuration> section with the following
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- use your own IP address here; do not copy this one verbatim -->
    <value>hdfs://192.168.15.130:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>131072</value>
  </property>
</configuration>

vi etc/hadoop/hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/mnt/data01/hadoop</value>
  </property>
  <property>
    <name>dfs.blocksize</name>
    <value>268435456</value>
  </property>
  <property>
    <name>dfs.namenode.handler.count</name>
    <value>100</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/mnt/data01/hdfs_dndata</value>
  </property>
</configuration>
hostnamectl set-hostname billsaifu   # you can pick any hostname
bash
echo "192.168.15.130 billsaifu" >> /etc/hosts   # your own hostname and IP address
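Before moving on it is worth confirming that the hostname/IP pair really landed in the hosts file. A sketch; /tmp/hosts.check is a hypothetical scratch copy standing in for /etc/hosts, and billsaifu / 192.168.15.130 are this walkthrough's example values (substitute your own):

```shell
# Verify the hostname -> IP mapping. /tmp/hosts.check stands in for
# /etc/hosts; billsaifu / 192.168.15.130 are the example values above.
HOSTS=/tmp/hosts.check
echo "192.168.15.130 billsaifu" > "$HOSTS"
if grep -Eq '^192\.168\.15\.130[[:space:]]+billsaifu$' "$HOSTS"; then
  echo "mapping present"
else
  echo "mapping missing"
fi
```

A stale or missing entry here is a common cause of the start-dfs.sh script failing to reach the namenode by hostname later on.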
You can check your IP address here (for example, with the ip addr command).
useradd xwp
su - xwp -c "ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa"
su - xwp -c "cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys"
su - xwp -c "chmod 0600 ~/.ssh/authorized_keys"
chown -R xwp:xwp /usr/local/hadoop-3.3.5
mkdir /usr/local/hadoop-3.3.5/pids
chown -R xwp:xwp /usr/local/hadoop-3.3.5/pids
mkdir -p /mnt/data01/hadoop
mkdir -p /mnt/data01/hdfs_dndata
mkdir -p /mnt/data01/yarn_nmdata
chown -R xwp:xwp /mnt/data01/hadoop
chown -R xwp:xwp /mnt/data01/hdfs_dndata
chown -R xwp:xwp /mnt/data01/yarn_nmdata
su - xwp
cd /usr/local/hadoop-3.3.5
bin/hdfs namenode -format
sbin/start-dfs.sh
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/xwp
hdfs dfs -mkdir input
hdfs dfs -put etc/hadoop/*.xml input
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.5.jar grep input output 'dfs[a-z.]+'
hdfs dfs -get output output
cat output/*
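The example job above extracts every string matching the regex dfs[a-z.]+ from the uploaded config files. You can preview what that pattern matches with plain grep before running the job; a sketch using a hypothetical one-line sample file in place of etc/hadoop/*.xml:

```shell
# Preview the dfs[a-z.]+ pattern used by the MapReduce grep example.
# /tmp/sample.xml is a hypothetical stand-in for etc/hadoop/*.xml.
cat > /tmp/sample.xml <<'EOF'
<property><name>dfs.replication</name><value>1</value></property>
EOF
grep -Eo 'dfs[a-z.]+' /tmp/sample.xml
# -> dfs.replication
```

-E enables the extended regex syntax the example jar uses, and -o prints only the matched substrings, which mirrors what ends up in the job's output files.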
vi etc/hadoop/mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>1536</value>
  </property>
  <property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx1024M</value>
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>3072</value>
  </property>
  <property>
    <name>mapreduce.reduce.java.opts</name>
    <value>-Xmx2560M</value>
  </property>
  <property>
    <name>mapreduce.task.io.sort.mb</name>
    <value>512</value>
  </property>
  <property>
    <name>mapreduce.task.io.sort.factor</name>
    <value>100</value>
  </property>
  <property>
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
  </property>
</configuration>
vi etc/hadoop/yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,HADOOP_MAPRED_HOME</value>
  </property>
  <property>
    <name>yarn.nodemanager.local-dirs</name>
    <value>/mnt/data01/yarn_nmdata</value>
  </property>
</configuration>

sbin/start-yarn.sh   # start YARN
bin/mapred --daemon start historyserver   # start the JobHistoryServer
Enter hadoop version.
If the corresponding version information is printed, the configuration succeeded.
Use jps to check whether the daemons started successfully.
Not yet started:
[xwp@billsaifu hadoop-3.3.5]$ jps
8129 Jps
Started successfully:
[xwp@billsaifu hadoop-3.3.5]$ ./sbin/start-dfs.sh
Starting namenodes on [billsaifu]
billsaifu: Warning: Permanently added 'billsaifu,192.168.15.130' (ECDSA) to the list of known hosts.
Starting datanodes
localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
Starting secondary namenodes [billsaifu]
[xwp@billsaifu hadoop-3.3.5]$ jps
9108 SecondaryNameNode
9259 Jps
8622 NameNode
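The jps output can also be checked mechanically instead of by eye. A sketch, using the sample output from this walkthrough in place of a live jps call; note grep -w, so that SecondaryNameNode does not falsely count as NameNode. Also note the sample output above contains no DataNode line, and the check flags exactly that:

```shell
# Check that the expected HDFS daemons appear in jps output.
# JPS_OUT is the sample output above; on a live system use: JPS_OUT="$(jps)"
JPS_OUT='9108 SecondaryNameNode
9259 Jps
8622 NameNode'
for d in NameNode DataNode SecondaryNameNode; do
  if echo "$JPS_OUT" | grep -qw "$d"; then
    echo "$d up"
  else
    echo "$d DOWN"
  fi
done
# prints: NameNode up / DataNode DOWN / SecondaryNameNode up
```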
Then open the corresponding web page (for Hadoop 3, the NameNode web UI listens on port 9870 by default).
A failed start looks like this: