java.lang.ClassNotFoundException: com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1$$anonfun$4$$anonfun$5

GitHub | VeenitShah | 9 months ago
  1. spark-sql gives java.lang.ClassNotFoundException: com.databricks.spark.avro.AvroRelation when querying multiple avro files

     GitHub | 9 months ago | VeenitShah
     java.lang.ClassNotFoundException: com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1$$anonfun$4$$anonfun$5
  2. SolrRelationUtil error in Zeppelin %sql

     GitHub | 8 months ago | x10ba
     java.lang.ClassNotFoundException: com.lucidworks.spark.util.SolrRelationUtil$$anonfun$1$$anonfun$apply$2
  3. Apache Spark Developers List - A confusing ClassNotFoundException error

     nabble.com | 2 years ago
     java.lang.ClassNotFoundException: cn.zhaishidan.trans.service.SparkHiveService$$anonfun$mapHandle$1$1$$anonfun$apply$1
  4. Apache Spark User List - java.lang.ClassNotFoundException: TestT$$anonfun$buildLabeledPoints$3$$anonfun$apply$1

     nabble.com | 2 years ago
     java.lang.ClassNotFoundException: TestT$$anonfun$buildLabeledPoints$3$$anonfun$apply$1
  5. [SPARK-8368] ClassNotFoundException in closure for map - ASF JIRA

     apache.org | 2 years ago
     java.lang.ClassNotFoundException: com.yhd.ycache.magic.Model$$anonfun$9$$anonfun$10
Root Cause Analysis

  1. java.lang.ClassNotFoundException

    com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1$$anonfun$4$$anonfun$5

    at java.net.URLClassLoader$1.run()
  2. Java RT
    Class.forName
    1. java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    2. java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    3. java.security.AccessController.doPrivileged(Native Method)
    4. java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    5. java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    6. sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    7. java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    8. java.lang.Class.forName0(Native Method)
    9. java.lang.Class.forName(Class.java:278)
    9 frames
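    The frames above show `Class.forName` delegating to the JVM's `URLClassLoader`, which cannot resolve the compiled anonymous-closure class. A minimal standalone illustration of that mechanism (the class names here are used only as probes; nothing from the original trace is executed):

    ```java
    // Sketch of how Class.forName raises ClassNotFoundException when a class
    // is absent from the classpath -- the same mechanism as the frames above.
    public class CnfeDemo {
        // Returns the loaded class's name, or null if the class cannot be found.
        static String tryLoad(String className) {
            try {
                return Class.forName(className).getName();
            } catch (ClassNotFoundException e) {
                return null;
            }
        }

        public static void main(String[] args) {
            // A JDK class resolves fine...
            System.out.println(tryLoad("java.util.ArrayList"));
            // ...but a class from a jar that is not on the classpath does not
            // (null here unless spark-avro happens to be on the classpath).
            System.out.println(tryLoad("com.databricks.spark.avro.AvroRelation"));
        }
    }
    ```

    In this trace the lookup happens on the driver inside Spark's ClosureCleaner, which loads inner closure classes by name while cleaning the `buildScan` closure, so a jar that is missing or mismatched on the driver classpath is enough to trigger it.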
  3. Spark
    InnerClosureFinder$$anon$4.visitMethodInsn
    1. org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:435)
    1 frame
  4. Apache XBean :: ASM 5 shaded (repackaged)
    ClassReader.accept
    1. org.apache.xbean.asm5.ClassReader.a(Unknown Source)
    2. org.apache.xbean.asm5.ClassReader.b(Unknown Source)
    3. org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    4. org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    4 frames
  5. Spark
    RDD.mapPartitions
    1. org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:84)
    2. org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:187)
    3. org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    4. org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
    5. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:707)
    6. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1.apply(RDD.scala:706)
    7. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    8. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    9. org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    10. org.apache.spark.rdd.RDD.mapPartitions(RDD.scala:706)
    10 frames
  6. com.databricks.spark
    AvroRelation$$anonfun$buildScan$1.apply
    1. com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1.apply(AvroRelation.scala:126)
    2. com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1.apply(AvroRelation.scala:120)
    2 frames
  7. Scala
    ArrayOps$ofRef.map
    1. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    2. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    3. scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    4. scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    5. scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    6. scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
    6 frames
  8. com.databricks.spark
    AvroRelation.buildScan
    1. com.databricks.spark.avro.AvroRelation.buildScan(AvroRelation.scala:120)
    1 frame
  9. Spark Project SQL
    HadoopFsRelation.buildInternalScan
    1. org.apache.spark.sql.sources.HadoopFsRelation.buildScan(interfaces.scala:762)
    2. org.apache.spark.sql.sources.HadoopFsRelation.buildScan(interfaces.scala:790)
    3. org.apache.spark.sql.sources.HadoopFsRelation.buildInternalScan(interfaces.scala:821)
    4. org.apache.spark.sql.sources.HadoopFsRelation.buildInternalScan(interfaces.scala:661)
    4 frames
  10. org.apache.spark
    DataSourceStrategy$.apply
    1. org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:131)
    2. org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:131)
    3. org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:292)
    4. org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:291)
    5. org.apache.spark.sql.execution.datasources.DataSourceStrategy$.pruneFilterProjectRaw(DataSourceStrategy.scala:370)
    6. org.apache.spark.sql.execution.datasources.DataSourceStrategy$.pruneFilterProject(DataSourceStrategy.scala:287)
    7. org.apache.spark.sql.execution.datasources.DataSourceStrategy$.apply(DataSourceStrategy.scala:127)
    7 frames
  11. Spark Project Catalyst
    QueryPlanner$$anonfun$1.apply
    1. org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    2. org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    2 frames
  12. Scala
    Iterator$$anon$13.hasNext
    1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    1 frame
  13. Spark Project Catalyst
    QueryPlanner.planLater
    1. org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    2. org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54)
    2 frames
  14. Spark Project SQL
    SparkStrategies$BasicOperators$.apply
    1. org.apache.spark.sql.execution.SparkStrategies$BasicOperators$.apply(SparkStrategies.scala:349)
    1 frame
  15. Spark Project Catalyst
    QueryPlanner$$anonfun$1.apply
    1. org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    2. org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    2 frames
  16. Scala
    Iterator$$anon$13.hasNext
    1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    1 frame
  17. Spark Project Catalyst
    QueryPlanner.plan
    1. org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    1 frame
  18. Spark Project SQL
    DataFrame.take
    1. org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:47)
    2. org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:45)
    3. org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:52)
    4. org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:52)
    5. org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2095)
    6. org.apache.spark.sql.DataFrame.head(DataFrame.scala:1374)
    7. org.apache.spark.sql.DataFrame.take(DataFrame.scala:1456)
    7 frames
  19. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  20. org.apache.zeppelin
    FIFOScheduler$1.run
    1. org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:297)
    2. org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:144)
    3. org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
    4. org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
    5. org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:300)
    6. org.apache.zeppelin.scheduler.Job.run(Job.java:169)
    7. org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:134)
    7 frames
  21. Java RT
    Thread.run
    1. java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    2. java.util.concurrent.FutureTask.run(FutureTask.java:262)
    3. java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
    4. java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
    5. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    6. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    7. java.lang.Thread.run(Thread.java:745)
    7 frames
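The trace enters through Zeppelin's SparkSqlInterpreter, and the missing class is an anonymous closure compiled into the spark-avro jar, so the usual cause is that the jar is not uniformly on the classpath (or that two spark-avro builds, e.g. for different Scala versions, are mixed). A hedged sketch of the common fix; the artifact coordinates below are an assumption and must match your Spark and Scala versions:

```shell
# Pull spark-avro onto the driver and executor classpaths via --packages
# (coordinates are illustrative; pick the build matching your Scala version).
spark-shell --packages com.databricks:spark-avro_2.10:2.0.1

# For Zeppelin, the equivalent is typically set in conf/zeppelin-env.sh:
export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.10:2.0.1"
```

Also check that only one spark-avro jar is present: a jar built against a different Scala version than the cluster's produces exactly this kind of `$$anonfun` ClassNotFoundException, because the compiled closure-class names differ between builds.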