org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
    length = read_int(stream)
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
    raise EOFError
EOFError
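
The EOFError itself is only a symptom: worker.py is trying to read the next length-prefixed pickled payload from the JVM, and the stream ends before a 4-byte length prefix arrives, which usually means the JVM-side writer thread died first (see the Java frames further down). Below is a simplified sketch of what read_int in serializers.py is doing and how a truncated stream turns into EOFError; it is an approximate reconstruction of the PySpark 1.6 internals named in the traceback, not a verbatim copy.

    import io
    import struct

    def read_int(stream):
        # Read a 4-byte big-endian length prefix, as the PySpark worker does
        # before each pickled payload; an empty read means the peer closed
        # the stream, which surfaces as EOFError.
        length = stream.read(4)
        if not length:
            raise EOFError
        return struct.unpack("!i", length)[0]

    # Simulate the JVM writer thread dying before it sends the next batch:
    # the worker's input stream simply ends, and read_int raises EOFError.
    try:
        read_int(io.BytesIO(b""))
    except EOFError:
        print("EOFError: stream ended before a length prefix arrived")

So when this trace appears, the Python-side frames rarely point at the real bug; the interesting part is whatever killed the JVM-side writer.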

Solutions on the web

via GitHub by leninlal, 1 year ago
Traceback (most recent call last): File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/jez/Downloads/spark/spark-1.6.0/python/lib
via incubator-spark-user by Anoop Shiralige, 1 year ago
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python
via spark-user by Anoop Shiralige, 1 year ago
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
    length = read_int(stream)
  File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
    raise EOFError
EOFError
at net.razorvine.pickle.Pickler.put_javabean(Pickler.java:705)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:323)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_collection(Pickler.java:335)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:314)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at pyspark_util.ListPickler$.pickle(Pickling.scala:244)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at pyspark_util.StructPickler$class.pickle(Pickling.scala:149)
at pyspark_cassandra.UDTValuePickler$.pickle(Pickling.scala:66)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_collection(Pickler.java:335)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:314)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at pyspark_util.ListPickler$.pickle(Pickling.scala:244)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_collection(Pickler.java:335)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:314)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at pyspark_util.ListPickler$.pickle(Pickling.scala:244)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at pyspark_util.StructPickler$class.pickle(Pickling.scala:149)
at pyspark_cassandra.PlainRowPickler$.pickle(Pickling.scala:56)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:493)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.dump(Pickler.java:107)
at pyspark_util.BatchPickler$$anonfun$apply$1.apply(Pickling.scala:131)
at pyspark_util.BatchPickler$$anonfun$apply$1.apply(Pickling.scala:131)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1741)
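
Read from the bottom up, the Java frames show the failure occurring in Spark's Python writer thread (PythonRDD.writeIteratorToStream) while pyspark_cassandra's custom picklers (PlainRowPickler, UDTValuePickler, ListPickler from pyspark_util) serialize Cassandra rows for the Python worker. The innermost frame, Pickler.put_javabean, is Pyrolite's fallback for a value with no registered pickler, which suggests that some field inside a UDT is of a type the pickling layer does not know how to encode. When that writer thread dies, the Python worker sees a truncated stream and reports the EOFError shown at the top. If the offending UDT or collection type cannot easily be changed, one way to sidestep this RDD pickling path entirely is to read the table through the Spark SQL data source instead. The sketch below assumes the DataStax spark-cassandra-connector is on the classpath; the keyspace and table names are placeholders.

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SQLContext

    conf = SparkConf().setAppName("cassandra-udt-read")
    sc = SparkContext(conf=conf)
    sqlContext = SQLContext(sc)

    # Read the table via the Spark SQL data source (spark-cassandra-connector)
    # instead of pyspark_cassandra's RDD API, so rows are not pushed through
    # Pyrolite's pickler on their way to the Python worker.
    df = (sqlContext.read
          .format("org.apache.spark.sql.cassandra")
          .options(keyspace="my_keyspace", table="my_table")  # placeholders
          .load())

    # UDT columns arrive as Spark SQL structs and can be inspected or
    # flattened field by field.
    df.printSchema()
    df.show(5)

This is only a workaround sketch; the underlying fix would be for the pickling layer (pyspark_cassandra/pyspark_util) to handle the field type that currently falls through to put_javabean.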
