com.facebook.presto.spi.PrestoException: Error opening Hive split hdfs://longzhou-hdpnn.lz.dscc:11000/user/dt/hive-warehouse/dt_general_statistics.db/events_parquet_300m/part-r-00274-3a704be0-bdc2-4a41-a430-60222e023e9c.gz.parquet (offset=0, length=51756767): GSS initiate failed

GitHub | mouendless | 5 months ago
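
A "GSS initiate failed" from the HDFS client almost always means the calling process had no valid Kerberos credentials (no TGT, or an expired one) at the moment it issued the RPC. For Presto, the Hive connector's HDFS access is governed by catalog properties such as hive.hdfs.authentication.type, hive.hdfs.presto.principal, and hive.hdfs.presto.keytab. Before touching Presto config, it can help to reproduce the failure outside Presto. A minimal diagnostic sketch, assuming a hypothetical principal and keytab path on the worker host:

{code}
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use Kerberos instead of the default SIMPLE authentication.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Hypothetical principal/keytab -- substitute whatever the Presto workers use.
        UserGroupInformation.loginUserFromKeytab("presto@EXAMPLE.COM",
                "/etc/security/keytabs/presto.keytab");
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://longzhou-hdpnn.lz.dscc:11000"), conf)) {
            // Stat the table directory that failed; a GSS error here
            // reproduces the problem outside Presto.
            System.out.println(fs.getFileStatus(new Path(
                    "/user/dt/hive-warehouse/dt_general_statistics.db/events_parquet_300m")));
        }
    }
}
{code}
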
  1.

    GitHub comment 5709#234126944

    GitHub | 5 months ago | mouendless
    com.facebook.presto.spi.PrestoException: Error opening Hive split hdfs://longzhou-hdpnn.lz.dscc:11000/user/dt/hive-warehouse/dt_general_statistics.db/events_parquet_300m/part-r-00274-3a704be0-bdc2-4a41-a430-60222e023e9c.gz.parquet (offset=0, length=51756767): GSS initiate failed
  2.

    I tried setting the xd.customModule.home property to point to a Kerberized Hadoop cluster with all the usual security config settings provided. It failed with the following exception:
    {code}
    org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'moduleRegistry' defined in class path resource [META-INF/spring-xd/internal/repositories.xml]: Cannot create inner bean 'org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean#19f459aa' of type [org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean] while setting constructor argument with key [1]; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean#19f459aa' defined in class path resource [META-INF/spring-xd/internal/repositories.xml]: Invocation of init method failed; nested exception is org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:313) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:122) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:382) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:157) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:648) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:140) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1139) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1042) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:504) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:757) ~[spring-context-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480) ~[spring-context-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:686) [spring-boot-1.2.3.RELEASE.jar:1.2.3.RELEASE]
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:320) [spring-boot-1.2.3.RELEASE.jar:1.2.3.RELEASE]
        at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:139) [spring-boot-1.2.3.RELEASE.jar:1.2.3.RELEASE]
        at org.springframework.boot.builder.SpringApplicationBuilder.run(SpringApplicationBuilder.java:129) [spring-boot-1.2.3.RELEASE.jar:1.2.3.RELEASE]
        at org.springframework.xd.dirt.server.admin.AdminServerApplication.run(AdminServerApplication.java:95) [spring-xd-dirt-1.2.0.BUILD-SNAPSHOT.jar:1.2.0.BUILD-SNAPSHOT]
        at org.springframework.xd.dirt.server.admin.AdminServerApplication.main(AdminServerApplication.java:79) [spring-xd-dirt-1.2.0.BUILD-SNAPSHOT.jar:1.2.0.BUILD-SNAPSHOT]
    Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean#19f459aa' defined in class path resource [META-INF/spring-xd/internal/repositories.xml]: Invocation of init method failed; nested exception is org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1574) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:299) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        ... 22 common frames omitted
    Caused by: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_67]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_67]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_67]
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2755) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859) ~[hadoop-hdfs-2.6.0.jar:na]
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1817) ~[hadoop-common-2.6.0.jar:na]
        at org.springframework.xd.dirt.module.ExtendedResource$HdfsExtendedResource.mkdirs(ExtendedResource.java:127) ~[spring-xd-dirt-1.2.0.BUILD-SNAPSHOT.jar:1.2.0.BUILD-SNAPSHOT]
        at org.springframework.xd.dirt.module.WritableResourceModuleRegistry.afterPropertiesSet(WritableResourceModuleRegistry.java:123) ~[spring-xd-dirt-1.2.0.BUILD-SNAPSHOT.jar:1.2.0.BUILD-SNAPSHOT]
        at org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean.afterPropertiesSet(CustomModuleRegistryFactoryBean.java:79) ~[spring-xd-dirt-1.2.0.BUILD-SNAPSHOT.jar:1.2.0.BUILD-SNAPSHOT]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1633) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570) ~[spring-beans-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        ... 25 common frames omitted
    Caused by: org.apache.hadoop.ipc.RemoteException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
        at org.apache.hadoop.ipc.Client.call(Client.java:1468) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1399) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.0.jar:na]
        at com.sun.proxy.$Proxy79.mkdirs(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539) ~[hadoop-hdfs-2.6.0.jar:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.0.jar:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.0.jar:na]
        at com.sun.proxy.$Proxy80.mkdirs(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753) ~[hadoop-hdfs-2.6.0.jar:na]
        ... 37 common frames omitted
    2015-06-10T14:49:20-0400 1.2.0.SNAP ERROR main boot.SpringApplication - Application startup failed
    {code}
    (A kerberized-client sketch for this error follows after this entry.)

    Spring JIRA | 1 year ago | Thomas Risberg
    org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'moduleRegistry' defined in class path resource [META-INF/spring-xd/internal/repositories.xml]: Cannot create inner bean 'org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean#19f459aa' of type [org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean] while setting constructor argument with key [1]; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.xd.dirt.module.CustomModuleRegistryFactoryBean#19f459aa' defined in class path resource [META-INF/spring-xd/internal/repositories.xml]: Invocation of init method failed; nested exception is org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
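
    The root failure here is the same family as above: the client spoke SIMPLE authentication to a NameNode that only accepts TOKEN or KERBEROS. The usual cause is a client-side Configuration that never picked up hadoop.security.authentication=kerberos (e.g. the cluster's core-site.xml is not on the classpath). A minimal sketch of a correctly kerberized client performing the mkdirs() that failed in the trace, with hypothetical principal, keytab, and target path:

    {code}
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberizedMkdirs {
        public static void main(String[] args) throws Exception {
            // Equivalent of hadoop.security.authentication=kerberos in core-site.xml.
            // Left at the default "simple", the NameNode rejects the RPC with
            // "SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]".
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            // Hypothetical principal and keytab for the XD admin process.
            UserGroupInformation.loginUserFromKeytab("springxd@EXAMPLE.COM",
                    "/etc/security/keytabs/springxd.keytab");
            // Assumes fs.defaultFS is set, i.e. the cluster config is on the classpath.
            FileSystem fs = FileSystem.get(conf);
            fs.mkdirs(new Path("/xd/custom-modules")); // hypothetical module registry dir
        }
    }
    {code}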
  3.

    Druid - Hadoop Remote Exception

    Google Groups | 8 months ago | Jagadeesh M
    org.apache.hadoop.ipc.RemoteException: User: eng/e...@UNIX.company.COM is not allowed to impersonate eng
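
    "is not allowed to impersonate" is the NameNode rejecting a proxy-user (doAs) request: the authenticated principal is not whitelisted to act on behalf of the end user. The fix is server-side, in the NameNode's core-site.xml (the hadoop.proxyuser.<shortname>.hosts and hadoop.proxyuser.<shortname>.groups properties). For reference, the client side of impersonation looks like this (a sketch with hypothetical principal, keytab, and path):

    {code}
    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ImpersonationSketch {
        public static void main(String[] args) throws Exception {
            // The service authenticates as its own principal first (hypothetical names).
            UserGroupInformation service = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "eng/host.example.com@UNIX.COMPANY.COM", "/etc/security/keytabs/eng.keytab");
            // It then acts as plain user "eng". The NameNode allows this only if
            // hadoop.proxyuser.eng.hosts / hadoop.proxyuser.eng.groups permit it;
            // otherwise it throws "User ... is not allowed to impersonate eng".
            UserGroupInformation proxy = UserGroupInformation.createProxyUser("eng", service);
            boolean exists = proxy.doAs((PrivilegedExceptionAction<Boolean>) () ->
                    FileSystem.get(new Configuration()).exists(new Path("/user/eng")));
            System.out.println(exists);
        }
    }
    {code}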
  4.

    To reproduce:
    1. Have Kerberos-enabled security set up in CDAP.
    2. Reduce the HDFS token max-lifetime to a low value, like 10 or 15 minutes.
    3. Start the CDAP services. After this duration of 10-15 minutes, kill one of the containers (such as tx.service).
    4. The AM will restart the container, but the container will not have the credentials needed to read the credentials file from HDFS.
    {code}
    2016-04-27 03:14:00,762 - ERROR [main:o.a.t.i.ServiceMain@109] - Exception thrown from service TwillContainerService [FAILED].
    java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): token (HDFS_DELEGATION_TOKEN token 525 for cdap) is expired
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:294) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:281) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116) ~[com.google.guava.guava-13.0.1.jar:na]
        at org.apache.twill.internal.ServiceMain.doMain(ServiceMain.java:105) ~[co.cask.cdap.cdap-app-fabric-3.4.0-SNAPSHOT.jar:na]
        at org.apache.twill.internal.container.TwillContainerMain.main(TwillContainerMain.java:103) [org.apache.twill.twill-yarn-0.7.0-incubating.jar:0.7.0-incubating]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
        at org.apache.twill.launcher.TwillLauncher.main(TwillLauncher.java:89) [launcher.b29b37e4-5334-4688-94de-5128a7b4d5af.jar:na]
    java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): token (HDFS_DELEGATION_TOKEN token 525 for cdap) is expired
        at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[com.google.guava.guava-13.0.1.jar:na]
        at co.cask.tephra.TransactionManager.doStart(TransactionManager.java:218) ~[co.cask.tephra.tephra-core-0.7.1-SNAPSHOT.jar:na]
        at com.google.common.util.concurrent.AbstractService.start(AbstractService.java:170) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.AbstractService.startAndWait(AbstractService.java:220) ~[com.google.guava.guava-13.0.1.jar:na]
        at co.cask.tephra.distributed.TransactionServiceThriftHandler.init(TransactionServiceThriftHandler.java:175) ~[co.cask.tephra.tephra-core-0.7.1-SNAPSHOT.jar:na]
        at co.cask.tephra.rpc.ThriftRPCServer.startUp(ThriftRPCServer.java:175) ~[co.cask.tephra.tephra-core-0.7.1-SNAPSHOT.jar:na]
        at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47) ~[com.google.guava.guava-13.0.1.jar:na]
        at java.lang.Thread.run(Thread.java:745) ~[na:1.7.0_67]
    Caused by: org.apache.hadoop.ipc.RemoteException: token (HDFS_DELEGATION_TOKEN token 525 for cdap) is expired
        at org.apache.hadoop.ipc.Client.call(Client.java:1466) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1403) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2095) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1214) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1210) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1210) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1409) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at co.cask.tephra.persist.HDFSTransactionStateStorage.setupStorage(HDFSTransactionStateStorage.java:284) ~[co.cask.tephra.tephra-core-0.7.1-SNAPSHOT.jar:na]
        at co.cask.tephra.TransactionManager.doStart(TransactionManager.java:216) ~[co.cask.tephra.tephra-core-0.7.1-SNAPSHOT.jar:na]
        ... 6 common frames omitted
    {code}
    (A keytab-relogin sketch follows after this entry.)

    Cask Community Issue Tracker | 7 months ago | Ali Anwar
    java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): token (HDFS_DELEGATION_TOKEN token 525 for cdap) is expired
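
    Delegation tokens have a hard ceiling (dfs.namenode.delegation.token.max-lifetime); renewal extends them only up to that ceiling, after which every holder gets InvalidToken, exactly as in this report. Long-running services therefore need to re-authenticate from a keytab rather than rely on the original tokens. A minimal periodic-relogin sketch, assuming the process originally logged in via loginUserFromKeytab (it is a no-op for processes that only hold delegation tokens, which is the harder Twill/YARN case above):

    {code}
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabRelogin {
        public static void scheduleRelogin() {
            ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
            exec.scheduleAtFixedRate(() -> {
                try {
                    // Does nothing unless the TGT is close to expiry; requires a keytab login.
                    UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
                } catch (Exception e) {
                    e.printStackTrace(); // swallow so the renewal thread stays alive
                }
            }, 1, 1, TimeUnit.MINUTES);
        }
    }
    {code}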
  5.

    Setup: Kerberos-enabled, fully HA, 6-node CM cluster. Set the following two parameters to 600000 (10 minutes):
    - dfs.namenode.delegation.token.renew-interval
    - dfs.namenode.delegation.token.max-lifetime
    The transaction service (and other containers) will fail; the error log is pasted at the bottom. Relevant Hadoop JIRA: https://issues.apache.org/jira/browse/HDFS-9276 Workaround in Spark (which didn't work for us): https://github.com/apache/spark/pull/7069
    {code}
    2016-04-27 19:02:20,746 - ERROR [message-callback:o.a.t.i.y.AbstractYarnTwillService@96] - Failed to update secure store.
    org.apache.hadoop.ipc.RemoteException: token (HDFS_DELEGATION_TOKEN token 2710 for cdap) is expired
        at org.apache.hadoop.ipc.Client.call(Client.java:1466) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1403) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:254) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source) ~[na:na]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1258) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1245) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1233) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:302) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:268) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:260) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1564) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:308) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:304) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:304) ~[hadoop-hdfs-2.6.0-cdh5.5.2.jar:na]
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:775) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
        at org.apache.twill.filesystem.HDFSLocation.getInputStream(HDFSLocation.java:74) ~[org.apache.twill.twill-yarn-0.7.0-incubating.jar:0.7.0-incubating]
        at org.apache.twill.internal.yarn.AbstractYarnTwillService.handleSecureStoreUpdate(AbstractYarnTwillService.java:86) ~[org.apache.twill.twill-yarn-0.7.0-incubating.jar:0.7.0-incubating]
        at org.apache.twill.internal.container.TwillContainerService.onReceived(TwillContainerService.java:88) [org.apache.twill.twill-yarn-0.7.0-incubating.jar:0.7.0-incubating]
        at org.apache.twill.internal.AbstractTwillService.handleMessage(AbstractTwillService.java:314) [org.apache.twill.twill-core-0.7.0-incubating.jar:na]
        at org.apache.twill.internal.AbstractTwillService.access$900(AbstractTwillService.java:83) [org.apache.twill.twill-core-0.7.0-incubating.jar:na]
        at org.apache.twill.internal.AbstractTwillService$4.onSuccess(AbstractTwillService.java:265) [org.apache.twill.twill-core-0.7.0-incubating.jar:na]
        at org.apache.twill.internal.AbstractTwillService$4.onSuccess(AbstractTwillService.java:245) [org.apache.twill.twill-core-0.7.0-incubating.jar:na]
        at com.google.common.util.concurrent.Futures$6.run(Futures.java:799) [com.google.guava.guava-13.0.1.jar:na]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_67]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_67]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_67]
    {code}
    (A token-inspection sketch follows after this entry.)

    Cask Community Issue Tracker | 7 months ago | Ali Anwar
    org.apache.hadoop.ipc.RemoteException: token (HDFS_DELEGATION_TOKEN token 2710 for cdap) is expired
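
    Same root cause as the previous entry: the secure-store update hands the container a credentials file, but the tokens inside have already passed dfs.namenode.delegation.token.max-lifetime (see HDFS-9276 linked above). When diagnosing, it helps to print which tokens a process actually holds and when they hit their ceiling. A sketch (assumes the identifiers are standard AbstractDelegationTokenIdentifier subclasses; decodeIdentifier() returns null for unknown kinds):

    {code}
    import java.util.Date;

    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.hadoop.security.token.Token;
    import org.apache.hadoop.security.token.TokenIdentifier;
    import org.apache.hadoop.security.token.delegation.AbstractDelegationTokenIdentifier;

    public class TokenDump {
        public static void main(String[] args) throws Exception {
            for (Token<? extends TokenIdentifier> token :
                    UserGroupInformation.getCurrentUser().getCredentials().getAllTokens()) {
                TokenIdentifier id = token.decodeIdentifier();
                StringBuilder line = new StringBuilder(
                        token.getKind() + " for " + token.getService());
                if (id instanceof AbstractDelegationTokenIdentifier) {
                    // maxDate is the hard cutoff set by dfs.namenode.delegation.token.max-lifetime.
                    line.append(" maxDate=").append(
                            new Date(((AbstractDelegationTokenIdentifier) id).getMaxDate()));
                }
                System.out.println(line);
            }
        }
    }
    {code}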

    Root Cause Analysis

    1. org.apache.hadoop.ipc.RemoteException

      GSS initiate failed

      at org.apache.hadoop.ipc.Client.call()
    2. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.ipc.Client.call(Client.java:1476)
      2. org.apache.hadoop.ipc.Client.call(Client.java:1407)
      3. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
      3 frames
    3. com.sun.proxy
      $Proxy170.getFileInfo
      1. com.sun.proxy.$Proxy170.getFileInfo(Unknown Source)
      1 frame
    4. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.getFileInfo
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
      1 frame
    5. Java RT
      Method.invoke
      1. sun.reflect.GeneratedMethodAccessor57.invoke(Unknown Source)
      2. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      3. java.lang.reflect.Method.invoke(Method.java:498)
      3 frames
    6. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    7. com.sun.proxy
      $Proxy171.getFileInfo
      1. com.sun.proxy.$Proxy171.getFileInfo(Unknown Source)
      1 frame
    8. Apache Hadoop HDFS
      DistributedFileSystem$22.doCall
      1. org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
      2. org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
      3. org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
      3 frames
    9. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      1 frame
    10. Apache Hadoop HDFS
      DistributedFileSystem.getFileStatus
      1. org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
      1 frame
    11. com.facebook.presto
      ClassLoaderSafeConnectorPageSourceProvider.createPageSource
      1. com.facebook.presto.hive.parquet.HdfsParquetDataSource.buildHdfsParquetDataSource(HdfsParquetDataSource.java:96)
      2. com.facebook.presto.hive.parquet.ParquetHiveRecordCursor.createParquetRecordReader(ParquetHiveRecordCursor.java:414)
      3. com.facebook.presto.hive.parquet.ParquetHiveRecordCursor.<init>(ParquetHiveRecordCursor.java:246)
      4. com.facebook.presto.hive.parquet.ParquetRecordCursorProvider.createHiveRecordCursor(ParquetRecordCursorProvider.java:96)
      5. com.facebook.presto.hive.HivePageSourceProvider.getHiveRecordCursor(HivePageSourceProvider.java:129)
      6. com.facebook.presto.hive.HivePageSourceProvider.createPageSource(HivePageSourceProvider.java:107)
      7. com.facebook.presto.spi.connector.classloader.ClassLoaderSafeConnectorPageSourceProvider.createPageSource(ClassLoaderSafeConnectorPageSourceProvider.java:44)
      7 frames
    12. presto-main
      TaskExecutor$Runner.run
      1. com.facebook.presto.split.PageSourceManager.createPageSource(PageSourceManager.java:48)
      2. com.facebook.presto.operator.TableScanOperator.createSourceIfNecessary(TableScanOperator.java:268)
      3. com.facebook.presto.operator.TableScanOperator.isFinished(TableScanOperator.java:210)
      4. com.facebook.presto.operator.Driver.processInternal(Driver.java:375)
      5. com.facebook.presto.operator.Driver.processFor(Driver.java:301)
      6. com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:618)
      7. com.facebook.presto.execution.TaskExecutor$PrioritizedSplitRunner.process(TaskExecutor.java:529)
      8. com.facebook.presto.execution.TaskExecutor$Runner.run(TaskExecutor.java:665)
      8 frames
    13. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames