10/10/25 17:01:46 INFO security.UserGroupInformation: Login successful for user hdfs/fully.qualified.domain.name@YOUR-REALM.COM using keytab file /etc/hadoop/hdfs.keytab
10/10/25 17:01:52 INFO security.UserGroupInformation: Login successful for user host/fully.qualified.domain.name@YOUR-REALM.COM using keytab file /etc/hadoop/hdfs.keytab
10/10/25 17:01:52 INFO http.HttpServer: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
10/10/25 17:01:57 INFO http.HttpServer: Adding Kerberos filter to getDelegationToken
10/10/25 17:01:57 INFO http.HttpServer: Adding Kerberos filter to renewDelegationToken
10/10/25 17:01:57 INFO http.HttpServer: Adding Kerberos filter to cancelDelegationToken
10/10/25 17:01:57 INFO http.HttpServer: Adding Kerberos filter to fsck
10/10/25 17:01:57 INFO http.HttpServer: Adding Kerberos filter to getimage
Regarding the error:
12/06/13 13:24:43 WARN ipc.Server: Auth failed for 127.0.0.1:63202:null
12/06/13 13:24:43 WARN ipc.Server: Auth failed for 127.0.0.1:63202:null
12/06/13 13:24:43 INFO ipc.Server: IPC Server listener on 9000: readAndProcess threw exception javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)] from client 127.0.0.1. Count of bytes read: 0
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)]
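This "AES256 ... is not supported/enabled" failure typically means the JVM's cryptography policy caps AES at 128 bits while the Kerberos tickets use AES-256; on older JDKs the fix is to install the JCE Unlimited Strength policy files (or to remove aes256 from the enabled enctypes). A minimal sketch of a check for this condition, using the standard `Cipher.getMaxAllowedKeyLength` API:

```java
import javax.crypto.Cipher;

public class JceCheck {
    // true when the running JVM permits 256-bit AES keys; on older JDKs
    // without the JCE Unlimited Strength policy files this is capped at 128,
    // which produces the SASL/GSSException failure shown above
    static boolean aes256Enabled() throws Exception {
        return Cipher.getMaxAllowedKeyLength("AES") >= 256;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(aes256Enabled()
            ? "AES-256 available in this JVM"
            : "AES-256 NOT available: install the JCE Unlimited Strength policy files");
    }
}
```

Run this with the same JDK that starts the Hadoop daemons, since different installed JVMs can have different policy files.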
12/06/13 13:23:21 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.
11/01/04 12:08:12 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException:
GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Bad connection to FS. command aborted. exception: Call to nn-host/10.0.0.2:8020 failed on local exception: java.io.IOException:
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
11/03/21 12:46:57 ERROR datanode.DataNode: java.lang.RuntimeException: Cannot start secure cluster without privileged resources. In a secure cluster, the DataNode must
be started from within jsvc. If using Cloudera packages, please install the hadoop-0.20-sbin package.
For development purposes ONLY you may override this check by setting dfs.datanode.require.secure.ports to false. *** THIS WILL OPEN A SECURITY HOLE AND MUST NOT BE
USED FOR A REAL CLUSTER ***.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:306)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:280)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1533)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1473)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1491)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1616)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1626)
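The correct fix is the one the message names: start the DataNode under jsvc (with Cloudera packages, install hadoop-0.20-sbin) so it can bind privileged ports. For development only, the check can be overridden with the property the error itself names; as the message warns, this opens a security hole and must not be used on a real cluster. A sketch of the hdfs-site.xml fragment:

```xml
<!-- hdfs-site.xml: DEVELOPMENT ONLY, per the warning in the error above -->
<property>
  <name>dfs.datanode.require.secure.ports</name>
  <value>false</value>
</property>
```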
Note: In the taskcontroller.cfg file, the default setting for the banned.users property is mapred, hdfs, and bin, to prevent jobs from being submitted via those user accounts. The default setting for the min.user.id property is 1000, to prevent jobs from being submitted with a user ID less than 1000, which conventionally belong to Unix super users. Note that some operating systems, such as CentOS 5, start ordinary user IDs at 500 rather than 1000. If this is the case on your system, change the setting for the min.user.id property to 500. If there are user accounts on your cluster with a user ID less than the value specified for the min.user.id property, the TaskTracker returns an error code of 255.
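The check described above can be sketched as follows. This is an illustrative model, not the actual TaskController source: jobs from banned accounts or from UIDs below min.user.id are rejected with error code 255.

```java
import java.util.Arrays;
import java.util.List;

public class TaskUserCheck {
    // defaults from taskcontroller.cfg as described above
    static final List<String> BANNED_USERS = Arrays.asList("mapred", "hdfs", "bin");
    static final int MIN_USER_ID = 1000; // lower to 500 on e.g. CentOS 5

    // returns 0 if the user may submit jobs, 255 otherwise
    static int checkUser(String user, int uid) {
        if (BANNED_USERS.contains(user) || uid < MIN_USER_ID) {
            return 255;
        }
        return 0;
    }

    public static void main(String[] args) {
        System.out.println(checkUser("alice", 1001));  // 0
        System.out.println(checkUser("mapred", 1005)); // 255 (banned account)
        System.out.println(checkUser("bob", 500));     // 255 (uid below min.user.id)
    }
}
```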
FATAL mapred.JobTracker: org.apache.hadoop.security.AccessControlException: The systemdir hdfs://nn.hadoop.local:9000/hadoop_data/tmp/mapred/system is not owned by mapred
ERROR mapred.TaskTracker: Can not start task tracker because java.io.IOException: Login failure for mapred/srv143.madeforchina.co@for_hadoop from keytab /usr/local/hadoop/mapred.keytab
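One thing worth checking in a login failure like this (an assumption about this particular log, since a missing or mismatched keytab produces the same symptom): the realm in the principal, for_hadoop, is lowercase, and Kerberos realms are case-sensitive and conventionally all-uppercase. A hedged sketch of a quick sanity check on principal names:

```java
public class PrincipalCheck {
    // true when the realm part of primary/instance@REALM is all uppercase,
    // the conventional (and usually required) form for Kerberos realms
    static boolean realmIsUppercase(String principal) {
        int at = principal.lastIndexOf('@');
        if (at < 0 || at == principal.length() - 1) return false;
        String realm = principal.substring(at + 1);
        return realm.equals(realm.toUpperCase());
    }

    public static void main(String[] args) {
        System.out.println(realmIsUppercase("hdfs/host.example.com@EXAMPLE.COM"));
        System.out.println(realmIsUppercase("mapred/srv143.madeforchina.co@for_hadoop"));
    }
}
```

If the realm casing is at fault, the principal must be re-created with the uppercase realm and a fresh keytab generated; the check above only flags the mismatch.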