org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
    length = read_int(stream)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
    raise EOFError
EOFError
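This EOFError is a secondary failure: read_int hit end-of-file because the JVM side of the worker stream was closed after an exception on the JVM side, so the real cause has to be dug out of the executor logs rather than the Python traceback. The helper that raises it in pyspark/serializers.py is essentially the following, reconstructed as a sketch from the Spark 1.6 sources:

import struct

def read_int(stream):
    length = stream.read(4)
    if not length:
        # End of stream: the JVM peer closed the connection before
        # sending a length prefix, usually after its own exception.
        raise EOFError
    return struct.unpack("!i", length)[0]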

Solutions on the web

via GitHub by bsphere, 11 months ago
Traceback (most recent call last): File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/gbenhaim/dev/spark-1.6.3-bin
via GitHub by leninlal, 1 year ago
Traceback (most recent call last): File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/jez/Downloads/spark/spark-1.6.0/python/lib
via spark-user by Anoop Shiralige, 1 year ago
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python
The JVM-side stack trace below shows the actual root cause. The Python worker's EOFError is only a symptom: the JVM writer thread (PythonRunner$WriterThread) died while pickling results for Python, closing the stream the worker was reading from. Here Pyrolite's pickler fails to serialize an object produced by the Riak Spark connector:

net.razorvine.pickle.PickleException: couldn't introspect javabean: java.lang.IllegalArgumentException: wrong number of arguments
at net.razorvine.pickle.Pickler.put_javabean(Pickler.java:705)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:323)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:499)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at com.basho.riak.spark.util.python.TuplePickler$.pickle(PicklingUtils.scala:136)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:493)
at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
at net.razorvine.pickle.Pickler.save(Pickler.java:137)
at net.razorvine.pickle.Pickler.dump(Pickler.java:107)
at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:122)
at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:121)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1817)
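
Pyrolite (net.razorvine.pickle) can pickle primitives, strings, arrays, and collections, and falls back to JavaBean introspection (put_javabean) for anything else; here that introspection fails with "wrong number of arguments" while reflectively invoking an accessor on an object inside a Riak tuple, the writer thread dies, and the Python worker sees the EOFError above. A common way around this class of failure is to keep connector-specific beans off the JVM/Python boundary, for example by reading through the DataFrame API so that only plain Row fields get pickled. A minimal sketch, assuming Spark 1.6's SQLContext and a hypothetical data source name and option key for the Riak connector (check the connector's documentation for the real ones):

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="riak-df-read")
sqlContext = SQLContext(sc)

# Hypothetical read path: the format name and option key below are
# assumptions, not verified against the Riak Spark connector's docs.
df = (sqlContext.read
      .format("org.apache.spark.sql.riak")
      .option("spark.riak.connection.host", "127.0.0.1:8087")
      .load("my-bucket"))

# Only simple Row fields cross the JVM/Python boundary here, so Pyrolite
# never has to introspect a connector bean.
df.select("key").show()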
