Searched Google with the first line of a Java stack trace?

Paste your entire stack trace, including the exception message, and we can recommend more relevant solutions and speed up your debugging. Try a sample exception.

Recommended solutions based on your search

Samebug tips

  1. Expert tip

    This might be an issue with the file location in the spark-submit command. Try it with the file's full path (a sketch of the same fix from inside the application follows these tips):

    spark-submit --master spark://master:7077 \
         hello_world_from_pyspark.py {file location}
    
    
  2. Expert tip

    Check whether you've set a name under Application -> Run. If you haven't, the generated XML will be missing information, and this exception will be thrown.
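
Both tips address the same root cause: a java.lang.ClassNotFoundException on an executor usually means the class named in the trace was never shipped to the worker JVMs. Below is a minimal Scala sketch of the programmatic fix, assuming a hypothetical application jar at target/app.jar and the master URL from the first tip; it is an illustration, not the poster's actual code.

    import org.apache.spark.{SparkConf, SparkContext}

    object HelloWorld {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("hello-world")          // an unset name leaves generated config incomplete (tip 2)
          .setMaster("spark://master:7077")
          .setJars(Seq("target/app.jar"))     // hypothetical path: ships the jar to every executor
        val sc = new SparkContext(conf)
        // Closures below compile to synthetic classes such as HelloWorld$$anonfun$1,
        // which executors can only load from the jar shipped above.
        println(sc.parallelize(1 to 10).map(_ * 2).sum())
        sc.stop()
      }
    }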

Solutions on the web

via Stack Overflow by pzq317, 1 year ago (controllers.util.SimpleCompute$$anonfun$4)
via Stack Overflow by Amalo, 9 months ago
via GitHub by PZaytsevUSC, 8 months ago
via Stack Overflow by Jose Fonseca, 1 year ago (controllers.Application$$anonfun$test$1$$anonfun$2)
via GitHub by PZaytsevUSC, 1 year ago
via GitHub by owlgvt, 1 year ago (com.databricks.spark.csv.CsvRelation$$anonfun$firstLine$1)
java.lang.ClassNotFoundException: controllers.util.SimpleCompute$$anonfun$4
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:85)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
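
Reading the trace bottom-up: an executor thread (Executor$TaskRunner) is deserializing a ResultTask, and JavaDeserializationStream.resolveClass fails because controllers.util.SimpleCompute$$anonfun$4 is not on the executor's classpath. That name is the synthetic class the Scala compiler (2.11 and earlier) generates for one of the anonymous functions inside SimpleCompute; the driver serialized the closure fine, but the executor JVM has no jar containing the class. A hypothetical reconstruction of the code shape that produces such names (the real SimpleCompute is not shown in the trace):

    import org.apache.spark.SparkContext

    // Hypothetical reconstruction, for illustration only.
    object SimpleCompute {
      def compute(sc: SparkContext): Double =
        sc.parallelize(1 to 100)
          .filter(_ % 2 == 0)   // compiles to a class like SimpleCompute$$anonfun$1
          .map(_ * 1.5)         // compiles to SimpleCompute$$anonfun$2, and so on
          .sum()
    }

Whichever lambda the number refers to, the fix is the same: package these classes into the application jar and make sure spark-submit (or setJars, as sketched above) delivers that jar to every executor.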