java.lang.IllegalStateException: Library directory '/Users/marjanasgari/Desktop/MyThesis/EclipseIDE_Spark/First_Samples/Spark_MLlib/MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.

Stack Overflow | marjan | 5 months ago
  1. Submitting Spark application on YARN from Eclipse IDE

     Stack Overflow | 5 months ago | marjan

    Root Cause Analysis

    java.lang.IllegalStateException: Library directory '/Users/marjanasgari/Desktop/MyThesis/EclipseIDE_Spark/First_Samples/Spark_MLlib/MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
        at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
        at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500)
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
        at SVM.main(SVM.java:21)
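
    What the trace suggests: before the YARN application is submitted, org.apache.spark.launcher.CommandBuilderUtils.findJarsDir looks for Spark's jars under SPARK_HOME. The path in the message indicates that SPARK_HOME resolved to the Eclipse project directory rather than a Spark installation, so the launcher fell back to assembly/target/scala-2.11/jars and failed the check. Two remedies commonly suggested for this error: set SPARK_HOME in the Eclipse run configuration to a pre-built Spark 2.x binary distribution (so that $SPARK_HOME/jars exists), or set spark.yarn.jars (or spark.yarn.archive), since the Spark 2.x YARN client only scans a local jars directory when neither of those properties is set. Below is a minimal sketch in Java, reusing the SVM class name from the trace; the distribution path and HDFS location are placeholders, not taken from the question.

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;

        public class SVM {
            public static void main(String[] args) {
                // Option 1 (outside the code): in the Eclipse run configuration, set the
                // SPARK_HOME environment variable to a pre-built Spark distribution,
                // e.g. /opt/spark-2.1.0-bin-hadoop2.7 (placeholder path), so that
                // $SPARK_HOME/jars exists and findJarsDir succeeds.

                SparkConf conf = new SparkConf()
                        .setAppName("SVM on YARN")
                        .setMaster("yarn")
                        // Option 2: point the YARN client directly at Spark's jars so it
                        // does not fall back to scanning a local jars directory under
                        // SPARK_HOME. The HDFS path is a placeholder; use the location
                        // where the Spark jars were uploaded on your cluster.
                        .set("spark.yarn.jars", "hdfs:///spark/jars/*.jar");

                JavaSparkContext sc = new JavaSparkContext(conf);

                // ... the MLlib code that produced the stack trace above ...

                sc.stop();
            }
        }

    Note that submitting in yarn mode from an IDE also requires HADOOP_CONF_DIR or YARN_CONF_DIR to point at the cluster's client-side configuration directory, otherwise the client cannot locate the ResourceManager and fails for a different reason.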