Add the following to Hive's hive-site.xml:

<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
</property>

<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>localhost</value>
</property>

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://localhost:9083</value>
</property>
Start the services (metastore first, then HiveServer2):

  nohup hive --service metastore &
  nohup hive --service hiveserver2 &
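After starting both services, it is worth checking that they are actually listening before pointing Spark at them. A minimal sketch (the ports assume the defaults from the hive-site.xml above; adjust if you changed them):

```python
import socket

def service_up(host, port, timeout=2.0):
    """Return True if a TCP service accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default ports from the hive-site.xml above:
#   9083  -> Hive metastore  (hive.metastore.uris)
#   10000 -> HiveServer2     (hive.server2.thrift.port)
for name, port in [("metastore", 9083), ("hiveserver2", 10000)]:
    print(name, "up" if service_up("localhost", port) else "down")
```

If either port reports "down", check the nohup.out logs before continuing.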
① If your Hive metadata is stored in MySQL, copy mysql-connector-java-8.0.18.jar into Spark's jars directory.
② Add the following to Spark's spark-env.sh:

  export HIVE_HOME=/root/bigdata/hive-2.3.6
  export HIVE_CONF_DIR=${HIVE_HOME}/conf
  export SPARK_CLASSPATH=$HIVE_HOME/lib:$SPARK_CLASSPATH

③ Copy hive-site.xml from Hive's conf directory into Spark's conf directory.
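Before restarting Spark, it helps to confirm that the copied hive-site.xml actually carries the properties Spark needs to reach the metastore. A minimal sketch that parses a <configuration> block into a dict (the REQUIRED set here is an illustrative assumption, not an exhaustive list):

```python
import xml.etree.ElementTree as ET

REQUIRED = {"hive.metastore.uris", "hive.server2.thrift.port"}

def load_props(xml_text):
    """Parse a hive-site.xml <configuration> block into a name->value dict."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.iter("property")}

# Inline sample standing in for the real file; in practice read
# Spark's conf/hive-site.xml from disk instead.
sample = """<configuration>
  <property><name>hive.metastore.uris</name>
            <value>thrift://localhost:9083</value></property>
  <property><name>hive.server2.thrift.port</name>
            <value>10000</value></property>
</configuration>"""

props = load_props(sample)
missing = REQUIRED - props.keys()
print("missing:", missing or "none")  # → missing: none
```

Any property reported missing means the copy in step ③ picked up the wrong file or an outdated one.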
① Configure Hive's hive-site.xml (to run Hive on the Spark execution engine):
<configuration>
  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>hive.enable.spark.execution.engine</name>
    <value>true</value>
  </property>

  <property>
    <name>spark.home</name>
    <value>/home/work/spark-2.3.0</value>
  </property>
  <property>
    <name>spark.master</name>
    <value>spark://vm10-38-248-149.ksc.com:7077</value>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
  </property>

  <property>
    <name>spark.executor.memory</name>
    <value>5g</value>
  </property>
  <property>
    <name>spark.driver.memory</name>
    <value>4g</value>
  </property>

  <property>
    <name>spark.executor.cores</name>
    <value>6</value>
  </property>
  <property>
    <name>spark.executor.instances</name>
    <value>4</value>
  </property>
  <property>
    <name>spark.sql.shuffle.partitions</name>
    <value>200</value>
  </property>

  <property>
    <name>spark.executor.extraJavaOptions</name>
    <value>-XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"</value>
  </property>
  <property>
    <name>hive.server2.enable.doAs</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>
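With the settings above, Spark will try to claim executor.instances × executor.memory of cluster RAM plus executor.instances × executor.cores CPU slots (not counting the driver or per-executor memory overhead). A quick back-of-the-envelope check, with values taken from the config above:

```python
executor_instances = 4  # spark.executor.instances
executor_memory_gb = 5  # spark.executor.memory
executor_cores = 6      # spark.executor.cores
driver_memory_gb = 4    # spark.driver.memory

total_executor_mem = executor_instances * executor_memory_gb  # 20 GB
total_cores = executor_instances * executor_cores             # 24 cores
print(f"executors need {total_executor_mem} GB RAM and {total_cores} cores;"
      f" driver needs {driver_memory_gb} GB more")
```

If the standalone cluster behind spark://...:7077 cannot provide that much, Hive queries will hang waiting for resources, so scale these values to your workers.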