java.lang.ClassNotFoundException: scala.Function0

Stack Overflow | Max | 4 months ago
  1. Cannot run spark jobs locally using sbt, but works in IntelliJ
     Stack Overflow | 4 months ago | Max
     java.lang.ClassNotFoundException: scala.Function0

  2. Trouble building on Windows
     GitHub | 1 year ago | nikhilsinha
     java.lang.ClassNotFoundException: scala.xml.Properties

  3. ClassNotFoundException trying to run my unit tests with SBT
     Stack Overflow | 3 months ago | Phil
     java.lang.ClassNotFoundException: org.myproj.client.ClientAndServerTest

  4. scalding unit tests for scala 2.10
     Google Groups | 3 years ago | Koert
     cascading.tuple.TupleException: unable to load class named: scala.Tuple3

  5. Heavily Modded MC 1.6.2, client. Crash
     minecraftforge.net | 9 months ago
     java.lang.NoClassDefFoundError: scala/Function0

    Root Cause Analysis

    1. java.lang.ClassNotFoundException: scala.Function0
      at sbt.classpath.ClasspathFilter.loadClass()
    2. SBT
      ClasspathFilter.loadClass
      1. sbt.classpath.ClasspathFilter.loadClass(ClassLoaders.scala:63)
      1 frame
    3. Java RT
      Class.forName
      1. java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      2. java.lang.Class.forName0(Native Method)
      3. java.lang.Class.forName(Class.java:348)
      3 frames
    4. chill-java
      KryoBase$$anonfun$1.apply
      1. com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
      2. com.twitter.chill.KryoBase$$anonfun$1.apply(KryoBase.scala:41)
      2 frames
    5. Scala
      AbstractTraversable.map
      1. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
      2. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
      3. scala.collection.immutable.Range.foreach(Range.scala:166)
      4. scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
      5. scala.collection.AbstractTraversable.map(Traversable.scala:104)
      5 frames
    6. chill-java
      EmptyScalaKryoInstantiator.newKryo
      1. com.twitter.chill.KryoBase.<init>(KryoBase.scala:41)
      2. com.twitter.chill.EmptyScalaKryoInstantiator.newKryo(ScalaKryoInstantiator.scala:57)
      2 frames
    7. Spark
      RDD$$anonfun$dependencies$2.apply
      1. org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:86)
      2. org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:274)
      3. org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:259)
      4. org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:175)
      5. org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects$lzycompute(KryoSerializer.scala:182)
      6. org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects(KryoSerializer.scala:178)
      7. org.apache.spark.shuffle.sort.SortShuffleManager$.canUseSerializedShuffle(SortShuffleManager.scala:187)
      8. org.apache.spark.shuffle.sort.SortShuffleManager.registerShuffle(SortShuffleManager.scala:99)
      9. org.apache.spark.ShuffleDependency.<init>(Dependency.scala:90)
      10. org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:91)
      11. org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:235)
      12. org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:233)
      12 frames
    8. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:121)
      1 frame
    9. Spark
      DAGScheduler$$anonfun$visit$1$1.apply
      1. org.apache.spark.rdd.RDD.dependencies(RDD.scala:233)
      2. org.apache.spark.scheduler.DAGScheduler.visit$2(DAGScheduler.scala:418)
      3. org.apache.spark.scheduler.DAGScheduler.getAncestorShuffleDependencies(DAGScheduler.scala:433)
      4. org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getShuffleMapStage(DAGScheduler.scala:288)
      5. org.apache.spark.scheduler.DAGScheduler$$anonfun$visit$1$1.apply(DAGScheduler.scala:394)
      6. org.apache.spark.scheduler.DAGScheduler$$anonfun$visit$1$1.apply(DAGScheduler.scala:391)
      6 frames
    10. Scala
      List.foreach
      1. scala.collection.immutable.List.foreach(List.scala:381)
      1 frame
    11. Spark
      EventLoop$$anon$1.run
      1. org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:391)
      2. org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:403)
      3. org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:304)
      4. org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:339)
      5. org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:849)
      6. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1626)
      7. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
      8. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
      9. org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
      9 frames
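
    Suggested workaround

    The frames above show chill's KryoBase registering the scala.FunctionN closure classes via Class.forName while sbt's filtering classloader (sbt.classpath.ClasspathFilter) is the thread's context classloader, so scala.Function0 cannot be resolved when the Spark job runs inside the sbt JVM. A commonly suggested workaround is to fork a separate JVM for run and test, so the full Scala library sits on an ordinary application classpath. The build.sbt sketch below is illustrative only; the project name and the Scala and Spark versions are placeholders, not values taken from the reports above.

    // build.sbt -- a minimal sketch; name and versions are hypothetical, adjust to your build
    name := "spark-local-example"
    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"

    // Fork a separate JVM for `run` and `test` so sbt's filtering classloader
    // (sbt.classpath.ClasspathFilter, frame 1 above) is no longer the context
    // classloader when chill/Kryo resolves scala.Function0 via Class.forName.
    fork in run := true
    fork in Test := true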