For the Spark installation, see: notes on setting up Spark on this machine (Windows 11) link
Download the Hadoop archive and unpack it, then configure the environment variables: in the system environment variables, add HADOOP_HOME=C:\hadoop-3.2.4 and append %HADOOP_HOME%\bin to Path. Because the Hadoop release is built for Linux, you also need to put winutils.exe and hadoop.dll into the bin directory. Run hadoop version to confirm the installation succeeded, then edit the configuration files under %HADOOP_HOME%\etc\hadoop as follows:
core-site.xml:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/D:/ASoftware/Hadoop/hadoop-2.7.7/data/dfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/D:/ASoftware/Hadoop/hadoop-2.7.7/data/dfs/datanode</value>
  </property>
</configuration>
mapred-site.xml:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
yarn-site.xml:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>
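With the four files above in place, a typical next step (assuming winutils.exe and hadoop.dll are already in %HADOOP_HOME%\bin) is to format the NameNode once and start the daemons from an administrator command prompt; these are the standard Hadoop Windows scripts, not commands specific to this walkthrough:

```bat
@rem One-time format of the NameNode (this erases any existing HDFS data)
hdfs namenode -format

@rem Start HDFS (NameNode + DataNode) and YARN (ResourceManager + NodeManager)
%HADOOP_HOME%\sbin\start-dfs.cmd
%HADOOP_HOME%\sbin\start-yarn.cmd

@rem List the running JVM processes to verify the daemons are up
jps
```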
In hadoop-env.cmd, set JAVA_HOME:
@rem If JAVA_HOME is already set in the system environment, you can use:
@rem set JAVA_HOME=%JAVA_HOME%
set JAVA_HOME="C:\Program Files\Java\jdk1.8.0_201"
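Note that the line above is a common pitfall: in cmd, the quotes after set become part of the value, and the space in Program Files then breaks Hadoop's scripts. A usual workaround is the Windows 8.3 short name (assuming the same JDK path; adjust to your install):

```bat
@rem hadoop-env.cmd -- C:\PROGRA~1 is the 8.3 short name for C:\Program Files
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_201
```

Alternatively, install the JDK into a path without spaces (e.g. C:\Java\jdk1.8.0_201) and point JAVA_HOME there without quotes.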
In hive-env.sh:
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=D:\\hadoop
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=D:\hive\conf
# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=D:\hive\lib
(1) Modify the directory-related configuration in hive-site.xml:
<property>
  <name>hive.exec.scratchdir</name>
  <!-- <value>/tmp/hive</value> -->
  <value>/D:/hive-3.1.3-bin/my_hive/scratch_dir/</value>
  <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/D:/hive-3.1.3-bin/my_hive/resources_dir/${hive.session.id}_resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/D:/hive-3.1.3-bin/my_hive/querylog_dir/${system:user.name}</value>
  <description>Location of Hive run time structured log file</description>
</property>
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>/D:/hive-3.1.3-bin/my_hive/operation_logs_dir/${system:user.name}/operation_logs</value>
  <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
(2) Modify the MySQL-related configuration:
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>hive.druid.metadata.db.type</name>
  <value>mysql</value>
  <description>
    Expects one of the pattern in [mysql, postgresql, derby].
    Type of the metadata database.
  </description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
  <description>password to use against metastore database</description>
</property>
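The MySQL settings above cover the driver class, database type, and password, but the metastore also needs a JDBC connection URL and user name. A hedged sketch of the two remaining properties, assuming MySQL runs locally on port 3306 with a database named hive and user root (all of these are placeholders for your own setup):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <!-- host, port, and database name here are assumptions; adjust to your MySQL instance -->
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <!-- placeholder user; use a dedicated MySQL account in practice -->
  <value>root</value>
  <description>Username to use against metastore database</description>
</property>
```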