Error setting up Kerberos for Hadoop

ZongtianHou zongtianhou at icloud.com
Sat Jun 23 00:35:23 EDT 2018


I am setting up Kerberos for HDFS, but when I run hdfs dfs -ls /, the error below occurs, and I can't figure out the reason after searching the internet for a while. Can anyone give me some advice?
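For reference, the basic client-side sanity checks for a kerberized HDFS would look roughly like this (a sketch, assuming the MIT krb5 client tools are installed and the hdfs command is on the PATH; the principal is the one from my cluster):

    # confirm the current user holds a valid TGT
    klist

    # re-obtain a ticket if the TGT is missing or expired
    kinit gpadmin@CW.COM

    # show which NameNode principal the client configuration resolves to
    hdfs getconf -confKey dfs.namenode.kerberos.principal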
Below is the full Hadoop debug log:

bash-4.2$ HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls /
18/06/23 04:21:27 DEBUG util.Shell: setsid exited with exit code 0
18/06/23 04:21:28 DEBUG conf.Configuration: parsing URL jar:file:/usr/hdp/2.5.3.0-37/hadoop/hadoop-common-2.7.1-SNAPSHOT.jar!/core-default.xml
18/06/23 04:21:28 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4dfa3a9d
18/06/23 04:21:28 DEBUG conf.Configuration: parsing URL file:/etc/hadoop/conf/core-site.xml
18/06/23 04:21:28 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@2d6eabae
18/06/23 04:21:28 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
18/06/23 04:21:28 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
18/06/23 04:21:28 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
18/06/23 04:21:28 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
18/06/23 04:21:28 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
18/06/23 04:21:28 DEBUG security.Groups:  Creating new Groups object
18/06/23 04:21:28 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/06/23 04:21:28 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
18/06/23 04:21:28 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
18/06/23 04:21:28 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
18/06/23 04:21:28 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/06/23 04:21:28 DEBUG security.UserGroupInformation: hadoop login
18/06/23 04:21:28 DEBUG security.UserGroupInformation: hadoop login commit
18/06/23 04:21:28 DEBUG security.UserGroupInformation: using kerberos user:gpadmin@CW.COM
18/06/23 04:21:28 DEBUG security.UserGroupInformation: Using user: "gpadmin@CW.COM" with name gpadmin@CW.COM
18/06/23 04:21:28 DEBUG security.UserGroupInformation: User entry: "gpadmin@CW.COM"
18/06/23 04:21:28 DEBUG security.UserGroupInformation: UGI loginUser:gpadmin@CW.COM (auth:KERBEROS)
18/06/23 04:21:28 DEBUG security.UserGroupInformation: Found tgt Ticket (hex) =
0000: 61 82 01 44 30 82 01 40   A0 03 02 01 05 A1 08 1B  a..D0..@........
0010: 06 43 57 2E 43 4F 4D A2   1B 30 19 A0 03 02 01 02  .CW.COM..0......
0020: A1 12 30 10 1B 06 6B 72   62 74 67 74 1B 06 43 57  ..0...krbtgt..CW
0030: 2E 43 4F 4D A3 82 01 10   30 82 01 0C A0 03 02 01  .COM....0.......
0040: 10 A1 03 02 01 01 A2 81   FF 04 81 FC 9B 42 41 5C  .............BA\
0050: E1 A6 AD A1 EC 27 39 38   04 5C F3 CD AC 79 E5 6B  .....'98.\...y.k
0060: D1 DC EE 90 1C 86 00 9D   AC C4 95 AD FD 53 8D F4  .............S..
0070: 78 34 DB BC BC F3 1F 1B   90 8E C3 2A 12 DB 54 7B  x4.........*..T.
0080: 3E D3 9F 1A DB 0E 72 37   41 3E 5E 2D 31 A9 89 DB  >.....r7A>^-1...
0090: B1 25 31 BC 7F 27 5F 40   34 24 B2 28 B7 E3 EA 1C  .%1..'_@4$.(....
00A0: D4 C8 39 40 D6 AF 0E B6   83 81 84 8E 61 4B F6 EC  ..9@........aK..
00B0: 40 07 C9 7B 8A 16 6B 42   EF 2C DE B1 6F 1C 2C 36  @.....kB.,..o.,6
00C0: 26 25 94 2B 11 5F B5 23   E7 45 F0 81 CC 5E 19 B6  &%.+._.#.E...^..
00D0: C1 03 39 0F B9 9F A9 E4   79 71 AF 9E F2 52 63 B1  ..9.....yq...Rc.
00E0: D3 1A 65 F8 E5 F3 3B 3E   BD E4 F7 03 A4 5D 99 39  ..e...;>.....].9
00F0: 21 6E 23 87 08 32 29 9D   16 FA 19 D5 3D 83 52 8D  !n#..2).....=.R.
0100: 73 AA E4 96 AF 42 18 BC   60 E0 E4 BA 2E 9C 2E 7C  s....B..`.......
0110: 35 D6 7B A5 FF AF D6 61   EF E8 EF D2 A6 30 1B 4B  5......a.....0.K
0120: 05 2D 9C 88 19 C8 0A 04   2C 8E 0F E9 DF 8E 15 28  .-......,......(
0130: 69 F5 33 C7 22 97 80 E2   BC 0C 2D 53 96 12 97 1F  i.3.".....-S....
0140: 2B 60 E6 B2 EF 1C C6 60                            +`.....`

Client Principal = gpadmin@CW.COM
Server Principal = krbtgt/CW.COM@CW.COM
Session Key = EncryptionKey: keyType=16 keyBytes (hex dump)=
0000: A2 08 4C A2 6D DF E0 3E   EC C2 54 B5 52 D6 70 2F  ..L.m..>..T.R.p/
0010: 7A AD 01 5E D9 9E 62 E6                            z..^..b.


Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Sat Jun 23 04:15:10 UTC 2018
Start Time = Sat Jun 23 04:15:10 UTC 2018
End Time = Sat Jun 23 14:15:10 UTC 2018
Renew Till = Sun Jun 24 04:15:10 UTC 2018
Client Addresses  Null
18/06/23 04:21:28 DEBUG security.UserGroupInformation: Current time is 1529727688897
18/06/23 04:21:28 DEBUG security.UserGroupInformation: Next refresh is 1529756110000
18/06/23 04:21:29 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/06/23 04:21:29 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = true
18/06/23 04:21:29 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/06/23 04:21:29 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
18/06/23 04:21:29 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
18/06/23 04:21:29 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2eae8e6e
18/06/23 04:21:29 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@5d0bf09b
18/06/23 04:21:29 DEBUG azure.NativeAzureFileSystem: finalize() called.
18/06/23 04:21:29 DEBUG azure.NativeAzureFileSystem: finalize() called.
18/06/23 04:21:29 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@4c4ff400: starting with interruptCheckPeriodMs = 60000
18/06/23 04:21:29 DEBUG shortcircuit.DomainSocketFactory: The short-circuit local reads feature is enabled.
18/06/23 04:21:29 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol using SaslPropertiesResolver, configured QOP dfs.data.transfer.protection = integrity, configured class dfs.data.transfer.saslproperties.resolver.class = class org.apache.hadoop.security.SaslPropertiesResolver
18/06/23 04:21:29 DEBUG ipc.Client: The ping interval is 60000 ms.
18/06/23 04:21:29 DEBUG ipc.Client: Connecting to hou-1/10.0.0.6:9000
18/06/23 04:21:29 DEBUG security.UserGroupInformation: PrivilegedAction as:gpadmin@CW.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:758)
18/06/23 04:21:29 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

18/06/23 04:21:29 DEBUG security.SaslRpcClient: Received SASL message state: NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge: "realm=\"default\",nonce=\"JcOFbsn6cSCW0FZqHPZt6hu/NJ57Rp5p08NAECYQ\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "gpadmin"
  serverId: "CW.COM"
}

18/06/23 04:21:29 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
18/06/23 04:21:29 DEBUG security.SaslRpcClient: Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal)
18/06/23 04:21:29 DEBUG security.SaslRpcClient: RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is gpadmin/cw.com@CW.COM
18/06/23 04:21:29 DEBUG security.SaslRpcClient: Creating SASL GSSAPI(KERBEROS)  client to authenticate to service at CW.COM
18/06/23 04:21:29 DEBUG security.SaslRpcClient: Use KERBEROS authentication for protocol ClientNamenodeProtocolPB
18/06/23 04:21:29 DEBUG security.UserGroupInformation: PrivilegedActionException as:gpadmin@CW.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)]
18/06/23 04:21:29 DEBUG security.UserGroupInformation: PrivilegedAction as:gpadmin@CW.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:683)
18/06/23 04:21:29 DEBUG ipc.Client: Exception encountered while connecting to the server :
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:414)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:595)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:397)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:762)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:758)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:758)
	at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1618)
	at org.apache.hadoop.ipc.Client.call(Client.java:1449)
	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
	at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1435)
	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
	at org.apache.hadoop.fs.Globber.glob(Globber.java:265)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1674)
	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
	at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)
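
The "Server not found in Kerberos database" failure refers to the server principal the client derived, gpadmin/cw.com@CW.COM (see the "RPC Server's Kerberos principal name" line above). A minimal sketch of how one could check whether that principal actually exists in the KDC, assuming MIT Kerberos and shell access on the KDC host:

    # request a service ticket for the derived principal; this fails with
    # the same "not found in Kerberos database" error if the KDC does not know it
    kvno gpadmin/cw.com@CW.COM

    # or query the principal directly on the KDC host
    kadmin.local -q "getprinc gpadmin/cw.com@CW.COM"

If the KDC does not know the principal, the value of dfs.namenode.kerberos.principal (or the hostname it expands to) may be resolving to a service name that was never created in the KDC.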



