java.lang.LinkageError: ClassCastException: attempting to castjar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.class

Stack Overflow | hbabbar | 7 months ago
  1.

    ClassCastException on Drop table query in apache spark hive

    Stack Overflow | 7 months ago | hbabbar
    java.lang.LinkageError: ClassCastException: attempting to castjar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.class
  2.

    Get a java.lang.LinkageError: ClassCastException when use spark sql hivesql on yarn

    Stack Overflow | 11 months ago | cxco
    java.lang.LinkageError: ClassCastException: attempting to castjar:file:/mnt/hadoop/yarn/local/filecache/18/spark-assembly-1.6.0-hadoop2.6.0.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/mnt/hadoop/yarn/local/filecache/18/spark-assembly-1.6.0-hadoop2.6.0.jar!/javax/ws/rs/ext/RuntimeDelegate.class
  3.

    java.lang.LinkageError: ClassCastException while using spark-submit

    Stack Overflow | 1 month ago | shashank kulkarni
    java.lang.LinkageError: ClassCastException: attempting to castjar:file:/usr/local/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/usr/local/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/javax/ws/rs/ext/RuntimeDelegate.class
  4.

    LinkageError after upgrading dropwizard to 0.8.1

    Stack Overflow | 2 years ago | user2536290
    java.lang.LinkageError: ClassCastException: attempting to castjar:file:/home/.gradle/caches/modules-2/files-2.1/javax.ws.rs/jsr311-api/1.1.1/59033da2a1afd56af1ac576750a8d0b1830d59e6/jsr311-api-1.1.1.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/home/.gradle/caches/modules-2/files-2.1/javax.ws.rs/jsr311-api/1.1.1/59033da2a1afd56af1ac576750a8d0b1830d59e6/jsr311-api-1.1.1.jar!/javax/ws/rs/ext/RuntimeDelegate.class
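
All of these reports fail at the same place: the JSR-311 RuntimeDelegate bootstrap. The message prints the same jar path twice because two copies of javax/ws/rs/ext/RuntimeDelegate.class end up defined by different class loaders, so the instanceof check inside RuntimeDelegate.findDelegate fails and is rethrown as a LinkageError. A minimal diagnostic sketch (assuming Scala can be pasted into the affected driver, e.g. spark-shell) to confirm where the class is coming from:

    // Hedged diagnostic sketch: confirm which jar and which class loader
    // javax.ws.rs.ext.RuntimeDelegate is resolved from in the running driver.
    import javax.ws.rs.ext.RuntimeDelegate

    object WhichJaxRs {
      def main(args: Array[String]): Unit = {
        val cls = classOf[RuntimeDelegate]
        // Jar the class file was actually read from (null only for bootstrap classes)
        println("loaded from: " + cls.getProtectionDomain.getCodeSource.getLocation)
        // Loader that defined it; a second copy defined by another loader is
        // what turns the delegate cast into the LinkageError shown above
        println("loaded by:   " + cls.getClassLoader)
      }
    }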


    Root Cause Analysis

    1. java.lang.LinkageError

      ClassCastException: attempting to castjar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.classtojar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/javax/ws/rs/ext/RuntimeDelegate.class

      at javax.ws.rs.ext.RuntimeDelegate.findDelegate()
    2. JavaEE 7
      MediaType.<clinit>
      1. javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:116)
      2. javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:91)
      3. javax.ws.rs.core.MediaType.<clinit>(MediaType.java:44)
      3 frames
    3. jersey-core
      MessageBodyFactory.init
      1. com.sun.jersey.core.header.MediaTypes.<clinit>(MediaTypes.java:64)
      2. com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:182)
      3. com.sun.jersey.core.spi.factory.MessageBodyFactory.initReaders(MessageBodyFactory.java:175)
      4. com.sun.jersey.core.spi.factory.MessageBodyFactory.init(MessageBodyFactory.java:162)
      4 frames
    4. jersey-client
      Client$1.f
      1. com.sun.jersey.api.client.Client.init(Client.java:342)
      2. com.sun.jersey.api.client.Client.access$000(Client.java:118)
      3. com.sun.jersey.api.client.Client$1.f(Client.java:191)
      4. com.sun.jersey.api.client.Client$1.f(Client.java:187)
      4 frames
    5. jersey-core
      Errors.processWithErrors
      1. com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193)
      1 frame
    6. jersey-client
      Client.<init>
      1. com.sun.jersey.api.client.Client.<init>(Client.java:187)
      2. com.sun.jersey.api.client.Client.<init>(Client.java:170)
      2 frames
    7. hadoop-yarn-client
      TimelineClientImpl.serviceInit
      1. org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.serviceInit(TimelineClientImpl.java:340)
      1 frame
    8. Hadoop
      AbstractService.init
      1. org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
      1 frame
    9. Hive Query Language
      ATSHook.<init>
      1. org.apache.hadoop.hive.ql.hooks.ATSHook.<init>(ATSHook.java:67)
      1 frame
    10. Java RT
      Class.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      5. java.lang.Class.newInstance(Class.java:442)
      5 frames
    11. Hive Query Language
      Driver.run
      1. org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:60)
      2. org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1309)
      3. org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1293)
      4. org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1347)
      5. org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
      6. org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
      7. org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
      7 frames
    12. org.apache.spark
      ClientWrapper.runSqlHive
      1. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:495)
      2. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
      3. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
      4. org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
      5. org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
      6. org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:279)
      7. org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:484)
      8. org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:474)
      8 frames
    13. Spark Project Hive
      DropTable.run
      1. org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:613)
      2. org.apache.spark.sql.hive.execution.DropTable.run(commands.scala:89)
      2 frames
    14. Spark Project SQL
      SparkPlan$$anonfun$execute$5.apply
      1. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
      2. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
      3. org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
      4. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
      5. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
      5 frames
    15. Spark
      RDDOperationScope$.withScope
      1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      1 frame
    16. Spark Project SQL
      SQLContext.sql
      1. org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
      2. org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
      3. org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
      4. org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
      5. org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
      6. org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
      7. org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
      7 frames
    17. com.accenture.aa
      UserJourneyBuilder$$anonfun$buildUserJourney$1.apply$mcVI$sp
      1. com.accenture.aa.dmah.spark.core.QueryExecutor.executeQuery(QueryExecutor.scala:35)
      2. com.accenture.aa.dmah.attribution.transformer.MulltipleUserJourneyTransformer.transform(MulltipleUserJourneyTransformer.scala:32)
      3. com.accenture.aa.dmah.attribution.userjourney.UserJourneyBuilder$$anonfun$buildUserJourney$1.apply$mcVI$sp(UserJourneyBuilder.scala:31)
      3 frames
    18. Scala
      Range.foreach$mVc$sp
      1. scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
      1 frame
    19. com.accenture.aa
      BootstrapObj.main
      1. com.accenture.aa.dmah.attribution.userjourney.UserJourneyBuilder.buildUserJourney(UserJourneyBuilder.scala:29)
      2. com.accenture.aa.dmah.attribution.core.AttributionHub.executeAttribution(AttributionHub.scala:47)
      3. com.accenture.aa.dmah.attribution.jobs.AttributionJob.process(AttributionJob.scala:33)
      4. com.accenture.aa.dmah.core.DMAHJob.processJob(DMAHJob.scala:73)
      5. com.accenture.aa.dmah.core.DMAHJob.execute(DMAHJob.scala:27)
      6. com.accenture.aa.dmah.core.JobRunner.<init>(JobRunner.scala:17)
      7. com.accenture.aa.dmah.core.ApplicationInstance.initilize(ApplicationInstance.scala:48)
      8. com.accenture.aa.dmah.core.Bootstrap.boot(Bootstrap.scala:112)
      9. com.accenture.aa.dmah.core.BootstrapObj$.main(Bootstrap.scala:134)
      10. com.accenture.aa.dmah.core.BootstrapObj.main(Bootstrap.scala)
      10 frames
    20. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:498)
      4 frames
    21. Scala Compiler
      MainGenericRunner.main
      1. scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:71)
      2. scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
      3. scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:139)
      4. scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:71)
      5. scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:139)
      6. scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:28)
      7. scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
      8. scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:35)
      9. scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
      10. scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
      11. scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:96)
      12. scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:105)
      13. scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
      13 frames
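
Reading the breakdown bottom-up: the DROP TABLE statement issued through SQLContext.sql (item 16) reaches Hive's Driver.run (item 11), which instantiates the configured execution hooks by reflection (item 10). On HDP that hook list includes org.apache.hadoop.hive.ql.hooks.ATSHook (item 9), whose YARN timeline client (item 7) initializes a Jersey Client (items 3-6) and thereby the JAX-RS MediaType/RuntimeDelegate machinery (item 2), where the duplicated class finally fails (item 1). Below is a hedged workaround sketch for Spark 1.6, under the assumption that the Hive execution hooks can simply be cleared for the session; the table name and app name are placeholders.

    // Hedged sketch: clear the Hive execution hooks so ATSHook (item 9 above)
    // is never constructed and the Jersey/JAX-RS initialization never runs.
    // hive.exec.*.hooks are standard Hive properties; whether emptying them is
    // acceptable (it disables ATS lineage reporting) depends on the cluster.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object DropTableWithoutAtsHook {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("drop-table-repro"))
        val hc = new HiveContext(sc)

        // Emptying the hook lists keeps Driver.getHooks()/HookUtils.getHooks()
        // (item 11) from reflectively instantiating ATSHook (item 10).
        hc.setConf("hive.exec.pre.hooks", "")
        hc.setConf("hive.exec.post.hooks", "")
        hc.setConf("hive.exec.failure.hooks", "")

        // The same kind of statement that produced the trace above.
        hc.sql("DROP TABLE IF EXISTS some_table")

        sc.stop()
      }
    }

Another workaround often reported for this error is disabling the timeline client entirely (yarn.timeline-service.enabled=false), but that is a cluster-level change rather than something the failing job controls.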