
Demystifying Spark's Connection to a Kerberized Hive Metastore

1. After Kerberos is enabled on the Hive metastore, running spark-sql fails with:

Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        ... 55 more
Caused by: KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
        ... 58 more

Solution: the "Server not found in Kerberos database" error means the KDC has no entry for the service principal that Spark requested a ticket for, typically because Spark does not know (or mis-resolves) the metastore's Kerberos principal. Point Spark at the correct login principal, keytab, and metastore service principal (spark-defaults.conf format):

spark.kerberos.principal hdfs/hadoop001@EXAMPLE.cn
spark.kerberos.keytab /BigData/run/hadoop/etc/hadoop/hdfs.keytab
spark.hadoop.hive.metastore.kerberos.principal hdfs/hadoop001@EXAMPLE.cn
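
For completeness, here is a minimal sketch of supplying the same settings when building a SparkSession programmatically. It assumes Spark 3.x (where the spark.kerberos.* keys replaced the older spark.yarn.* names) and reuses the example principal and keytab path from above, which are placeholders for your own environment.

import org.apache.spark.sql.SparkSession

// Minimal sketch (assumes Spark 3.x): the same three properties as above,
// set when the SparkSession is built. The principal and keytab values are
// the example values from this post -- replace them with your own.
object KerberizedMetastoreDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-sql against a kerberized metastore")
      // Principal/keytab that Spark itself logs in with
      .config("spark.kerberos.principal", "hdfs/hadoop001@EXAMPLE.cn")
      .config("spark.kerberos.keytab", "/BigData/run/hadoop/etc/hadoop/hdfs.keytab")
      // Service principal the Hive metastore runs as; without this the client
      // may request a ticket for a principal the KDC does not know about
      .config("spark.hadoop.hive.metastore.kerberos.principal", "hdfs/hadoop001@EXAMPLE.cn")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("show databases").show()
    spark.stop()
  }
}

In practice the login principal and keytab are usually passed at submit time (spark-submit --principal / --keytab, or spark-defaults.conf as shown above), because the Kerberos login happens before application code runs; the programmatic form is shown only to make the mapping between keys and values explicit.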
