java.lang.ExceptionInInitializerError

Stack Overflow | J.K | 4 months ago
  1. Submitting Spark application on YARN from Eclipse IDE

     Stack Overflow | 3 months ago | marjan
     java.lang.IllegalStateException: Library directory '/Users/marjanasgari/Desktop/MyThesis/EclipseIDE_Spark/First_Samples/Spark_MLlib/MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
  2. Exception in thread "main" java.lang.IllegalStateException: Library directory '/Users/dbl/spark/lib_managed/jars' does not exist

     Stack Overflow | 1 year ago | dbl001
     java.lang.IllegalStateException: Library directory '/Users/davidlaxer/spark/lib_managed/jars' does not exist.
  3. running pyspark.mllib on Ubuntu

     Stack Overflow | 5 months ago | childishwitch
     java.lang.IllegalStateException: Library directory '/home/user/spark/lib_managed/jars' does not exist.
  4. Apache Spark Developers List - [VOTE] Release Apache Spark 1.5.0 (RC1)

     nabble.com | 5 months ago
     java.lang.IllegalStateException: Library directory '/home/lresende/dev/spark/source/releases/spark-1.5.0/lib_managed/jars' does not exist.


    Root Cause Analysis

    1. java.lang.IllegalStateException

      Library directory '/opt/hadoop/tmp/nm-local-dir/usercache/hadoop/appcache/application_1471514504287_0021/container_1471514504287_0021_01_000002/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.

      at org.apache.spark.launcher.CommandBuilderUtils.checkState()
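    The frame above is a simple precondition guard: before building the YARN container command, Spark's launcher verifies that a local jars directory exists and otherwise throws this IllegalStateException. A minimal sketch of that guard pattern (a hypothetical re-implementation for illustration, not Spark's actual source; the class and method names merely mirror the trace):

    ```java
    import java.io.File;

    // Sketch of the precondition-check pattern behind the frame above
    // (hypothetical re-implementation; names mirror the stack trace).
    public class CheckStateDemo {

        /** Throws IllegalStateException with a formatted message when the condition is false. */
        static void checkState(boolean condition, String fmt, Object... args) {
            if (!condition) {
                throw new IllegalStateException(String.format(fmt, args));
            }
        }

        /** Returns the jars directory, or fails the same way the launcher does. */
        static File findJarsDir(File sparkHome) {
            File jars = new File(sparkHome, "assembly/target/scala-2.11/jars");
            checkState(jars.isDirectory(),
                    "Library directory '%s' does not exist; make sure Spark is built.", jars);
            return jars;
        }

        public static void main(String[] args) {
            try {
                findJarsDir(new File("/nonexistent/spark-home"));
            } catch (IllegalStateException e) {
                // Prints the same style of message seen in the trace above.
                System.out.println(e.getMessage());
            }
        }
    }
    ```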
    2. org.apache.spark
      YarnCommandBuilderUtils$.findJarsDir
      1. org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
      2. org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
      3. org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
      3 frames
    3. Spark Project YARN Stable API
      Client.submitApplication
      1. org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500)
      2. org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
      3. org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
      3 frames
    4. Spark
      SparkContext.<init>
      1. org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
      2. org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
      3. org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
      3 frames
    5. org.apache.spark
      CollaborativeFilteringSpark$$anonfun$main$1.apply
      1. org.apache.spark.mllib.learning.recommend.CollaborativeFilteringSpark$.<init>(CollaborativeFilteringSpark.scala:16)
      2. org.apache.spark.mllib.learning.recommend.CollaborativeFilteringSpark$.<clinit>(CollaborativeFilteringSpark.scala)
      3. org.apache.spark.mllib.learning.recommend.CollaborativeFilteringSpark$$anonfun$main$1.apply(CollaborativeFilteringSpark.scala:64)
      4. org.apache.spark.mllib.learning.recommend.CollaborativeFilteringSpark$$anonfun$main$1.apply(CollaborativeFilteringSpark.scala:62)
      4 frames
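    The `<clinit>` frame above shows the Scala object's static initializer running on an executor, inside the `foreach` closure; because that initializer constructs the SparkContext (line 16), it fails there, and the JVM reports the failure as the top-level ExceptionInInitializerError. The mechanism can be shown without Spark at all (hypothetical class names, plain JVM):

    ```java
    // Demonstrates how a failing static initializer surfaces as
    // java.lang.ExceptionInInitializerError on first use of the class.
    // BrokenSingleton is a hypothetical stand-in for the
    // CollaborativeFilteringSpark$ object referenced in the trace.
    public class InitErrorDemo {

        static class BrokenSingleton {
            // Runs once, on first access of the class; mimics a SparkContext
            // being constructed in an object's initializer on the executor side.
            static final String RESOURCE = init();

            static String init() {
                throw new IllegalStateException("Library directory does not exist");
            }
        }

        public static void main(String[] args) {
            try {
                System.out.println(BrokenSingleton.RESOURCE); // triggers <clinit>
            } catch (ExceptionInInitializerError e) {
                System.out.println(e.getClass().getName());
                System.out.println("caused by: " + e.getCause());
            }
        }
    }
    ```

    The original IllegalStateException is preserved as the error's cause, which is why both exceptions appear in the report.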
    6. Scala
      Iterator$class.foreach
      1. scala.collection.Iterator$class.foreach(Iterator.scala:893)
      1 frame
    7. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
      2. org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.apply(RDD.scala:875)
      3. org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.apply(RDD.scala:875)
      4. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1897)
      5. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1897)
      6. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
      7. org.apache.spark.scheduler.Task.run(Task.scala:85)
      8. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
      8 frames
    8. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
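    The launcher only falls back to `assembly/target/scala-2.11/jars` when it cannot resolve Spark's jars another way, so the usual remedies are to run against a fully built distribution (SPARK_HOME pointing at an install whose `jars/` directory exists) or to pre-stage the jars and set `spark.yarn.jars`. A hedged sketch of assembling such a submit command (the application jar, main class, and HDFS location below are placeholders, not values from this report):

    ```java
    import java.util.Arrays;
    import java.util.List;

    // Sketch: building a spark-submit invocation that points the YARN client
    // at a pre-staged copy of Spark's jars via spark.yarn.jars, so the
    // launcher never needs a local assembly/target/.../jars directory.
    public class SubmitCommandSketch {

        static List<String> buildCommand(String appJar, String mainClass, String stagedJars) {
            return Arrays.asList(
                    "spark-submit",
                    "--master", "yarn",
                    "--deploy-mode", "client",
                    "--class", mainClass,
                    // Pre-staged Spark jars (e.g. uploaded to HDFS once):
                    "--conf", "spark.yarn.jars=" + stagedJars,
                    appJar);
        }

        public static void main(String[] args) {
            List<String> cmd = buildCommand(
                    "my-app.jar",
                    "org.apache.spark.mllib.learning.recommend.CollaborativeFilteringSpark",
                    "hdfs:///spark/jars/*.jar");
            System.out.println(String.join(" ", cmd));
        }
    }
    ```

    The sketch only builds the argument list; running it still requires a working `spark-submit` and the jars actually staged at the given location.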