SparkSQL, JDBC, MySQL: what is the SparkSQL SQL query that writes to a JDBC table?

This question is about using the org.apache.spark.sql.jdbc data source from SQL queries in Spark.

For reads, we can register a JDBC source with:

CREATE TEMPORARY TABLE jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (dbtable ...);
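For reference, a fuller version of that read registration, driven through HiveContext.sql(...) in Scala, might look like the sketch below; the connection URL, table name, and driver values are placeholders, not part of the original question, and the MySQL JDBC driver jar is assumed to be on the Spark classpath.

// Minimal read sketch (Spark 1.x Scala); URL, table, and driver are placeholder values.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("jdbc-sql-example"))
val hc = new HiveContext(sc)

// Register the remote MySQL table as a temporary table backed by the JDBC data source.
hc.sql("""
  CREATE TEMPORARY TABLE jdbcTable
  USING org.apache.spark.sql.jdbc
  OPTIONS (
    url     "jdbc:mysql://dbhost:3306/mydb?user=myuser&password=mypassword",
    dbtable "source_table",
    driver  "com.mysql.jdbc.Driver"
  )
""")

// Plain SQL against the registered name now reads from MySQL.
hc.sql("SELECT * FROM jdbcTable LIMIT 10").show()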

For writes, what is the query that writes data to the remote JDBC table using SQL?

NOTE: I want it to be a SQL query.

Please provide a pure "SQL query" that can write to JDBC when run through SparkSQL's HiveContext.sql(...).

Solution

An INSERT OVERWRITE TABLE statement will write to your database over the JDBC connection:

DROP TABLE IF EXISTS jdbcTemp;

CREATE TABLE jdbcTemp
USING org.apache.spark.sql.jdbc
OPTIONS (...);

INSERT OVERWRITE TABLE jdbcTemp
SELECT * FROM my_spark_data;

DROP TABLE jdbcTemp;
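Run through HiveContext.sql(...), the whole flow might look like the following Scala sketch, reusing the HiveContext hc from the read sketch above; the JDBC URL, target table, and driver values are again placeholders rather than part of the original answer.

// Write sketch: register a JDBC-backed table, overwrite it from Spark data, then drop the definition.
hc.sql("DROP TABLE IF EXISTS jdbcTemp")

hc.sql("""
  CREATE TABLE jdbcTemp
  USING org.apache.spark.sql.jdbc
  OPTIONS (
    url     "jdbc:mysql://dbhost:3306/mydb?user=myuser&password=mypassword",
    dbtable "target_table",
    driver  "com.mysql.jdbc.Driver"
  )
""")

// Every row returned by the SELECT is written to the remote MySQL table over JDBC.
hc.sql("INSERT OVERWRITE TABLE jdbcTemp SELECT * FROM my_spark_data")

// Removes only the Spark-side table definition; the rows written to MySQL remain.
hc.sql("DROP TABLE jdbcTemp")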
