
Configuring spark-on-hive and hive-on-spark environments (specifying HIVE_HOME in the Spark package)

Whether you run spark-on-hive or hive-on-spark, Hive's metastore service must be configured first.

① Configure Hive (hive-site.xml):

  <property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
  </property>
  <property>
    <name>hive.server2.thrift.bind.host</name>
    <value>localhost</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>

Start the services:

  nohup hive --service metastore &
  nohup hive --service hiveserver2 &
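A quick sanity check that both services came up (a minimal sketch; it assumes the ports configured above and that beeline is on your PATH):

  # metastore should listen on 9083 and hiveserver2 on 10000, per the config above
  netstat -tlnp | grep -E ':9083|:10000'
  # connect through beeline to confirm hiveserver2 accepts queries
  beeline -u jdbc:hive2://localhost:10000 -e "show databases;"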

spark-on-hive:

Spark cannot read Hive table data unless the Hive environment is configured in it. Configure it as follows:

① If your Hive metadata is stored in MySQL, copy mysql-connector-java-8.0.18.jar into Spark's jars directory:
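For example (the connector path is illustrative; adjust it and SPARK_HOME to your install):

  cp /path/to/mysql-connector-java-8.0.18.jar $SPARK_HOME/jars/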

② Add the following to Spark's spark-env.sh:

  export HIVE_HOME=/root/bigdata/hive-2.3.6
  export HIVE_CONF_DIR=${HIVE_HOME}/conf
  export SPARK_CLASSPATH=$HIVE_HOME/lib:$SPARK_CLASSPATH

③ Copy hive-site.xml from Hive's conf directory into Spark's conf directory, as shown below.
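Step ③ and a quick verification might look like this (a sketch assuming the HIVE_HOME from step ② and that SPARK_HOME points at your Spark install):

  # step ③: give Spark the same metastore configuration Hive uses
  cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/
  # spark-sql should now list the Hive databases via thrift://localhost:9083
  spark-sql -e "show databases;"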

hive-on-spark:

① Configure Hive's hive-site.xml:

  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>hive.enable.spark.execution.engine</name>
    <value>true</value>
  </property>
  <property>
    <name>spark.home</name>
    <value>/home/work/spark-2.3.0</value>
  </property>
  <property>
    <name>spark.master</name>
    <value>spark://vm10-38-248-149.ksc.com:7077</value>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
  </property>
  <property>
    <name>spark.executor.memory</name>
    <value>5g</value>
  </property>
  <property>
    <name>spark.driver.memory</name>
    <value>4g</value>
  </property>
  <property>
    <name>spark.executor.cores</name>
    <value>6</value>
  </property>
  <property>
    <name>spark.executor.instances</name>
    <value>4</value>
  </property>
  <property>
    <name>spark.sql.shuffle.partitions</name>
    <value>200</value>
  </property>
  <property>
    <name>spark.executor.extraJavaOptions</name>
    <value>-XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"</value>
  </property>
  <property>
    <name>hive.server2.enable.doAs</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
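After restarting Hive, you can confirm the engine switch from the command line (a sketch; your_table is a hypothetical placeholder for any existing table):

  # prints hive.execution.engine=spark if the setting took effect
  hive -e "set hive.execution.engine;"
  # an aggregate query should now run as a Spark job on the configured master
  hive -e "select count(*) from your_table;"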

 
