org.apache.spark.api.python.PythonException

Traceback (most recent call last):
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
    length = read_int(stream)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
    raise EOFError
EOFError
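
For context, this EOFError comes from PySpark's framed protocol between the JVM and the Python worker: every payload is preceded by a 4-byte big-endian length, and read_int fails when the stream ends before those bytes arrive, i.e. when the JVM writer thread has already died. A minimal JVM-side sketch of the same framing (an illustration under that assumption, not Spark source; the empty stream stands in for a crashed writer):

    import java.io.{ByteArrayInputStream, DataInputStream}

    object ReadIntDemo {
      def main(args: Array[String]): Unit = {
        // PySpark prefixes every frame with a 4-byte big-endian length.
        // An exhausted stream models a writer thread that died before
        // sending the next frame.
        val in = new DataInputStream(new ByteArrayInputStream(new Array[Byte](0)))
        in.readInt() // throws java.io.EOFException, the JVM twin of the EOFError above
      }
    }

The practical consequence: the Python-side EOFError is only a symptom, and the real failure is whatever killed the writer on the JVM side, which the "Caused by" clause in the full stack trace below reveals.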


Solutions on the web (114)

via GitHub
Traceback (most recent call last): File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/gbenhaim/dev/spark-1.6.3-bin

via GitHub by jpdna, 4 months ago
Traceback (most recent call last): File "/home/paschallj/Spark/1.6.3/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/paschallj/Spark/1.6.3/spark

via spark-user by Anoop Shiralige, 1 year ago
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python

via GitHub by leninlal, 1 year ago
Traceback (most recent call last): File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/jez/Downloads/spark/spark-1.6.0/python/lib

via incubator-spark-user by Anoop Shiralige, 1 year ago
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python

via Stack Overflow by user5147250, 1 year ago
Traceback (most recent call last): File "/ephemeral/usr/hdp/2.3.4.33-1/spark/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/ephemeral/usr/hdp/2.3.4.33-1/spark/python/lib

via Stack Overflow by IcedNecro, 1 year ago
Traceback (most recent call last): File "/home/roman/dev/spark-1.4.0-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/roman/dev/spark-1.4.0-bin-hadoop2.6

Stack trace

org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
    length = read_int(stream)
  File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
    raise EOFError
EOFError
    at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
    at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: net.razorvine.pickle.PickleException: couldn't introspect javabean: java.lang.IllegalArgumentException: wrong number of arguments
    at net.razorvine.pickle.Pickler.put_javabean(Pickler.java:705)
    at net.razorvine.pickle.Pickler.dispatch(Pickler.java:323)
    at net.razorvine.pickle.Pickler.save(Pickler.java:137)
    at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:499)
    at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
    at net.razorvine.pickle.Pickler.save(Pickler.java:137)
    at com.basho.riak.spark.util.python.TuplePickler$.pickle(PicklingUtils.scala:136)
    at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
    at net.razorvine.pickle.Pickler.save(Pickler.java:137)
    at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:493)
    at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
    at net.razorvine.pickle.Pickler.save(Pickler.java:137)
    at net.razorvine.pickle.Pickler.dump(Pickler.java:107)
    at net.razorvine.pickle.Pickler.dumps(Pickler.java:92)
    at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:122)
    at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:121)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
    at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1817)
    at org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:239)
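
Reading the trace bottom-up explains the chain of failure: the executor's WriterThread was pickling Riak results for the Python worker (com.basho.riak.spark.util.python.BatchPickler / TuplePickler) when Pyrolite (net.razorvine.pickle) hit an element with no registered pickler and fell back to javabean introspection (Pickler.put_javabean). That fallback reflectively invokes get*/is* methods with no arguments, so a method such as get(int) or getField(int) on the value being pickled fails with IllegalArgumentException: wrong number of arguments; the writer thread dies, the worker's input stream ends, and the Python side surfaces the EOFError above. A minimal sketch of that failure mode against Pyrolite (AwkwardValue is a hypothetical stand-in for the offending connector class, and the exact fallback behavior depends on the Pyrolite version on the classpath):

    import net.razorvine.pickle.Pickler

    // Hypothetical: not a well-formed JavaBean, because its get-prefixed
    // method takes a parameter.
    class AwkwardValue {
      def getField(index: Int): String = s"field-$index"
    }

    object JavabeanPickleDemo {
      def main(args: Array[String]): Unit = {
        val pickler = new Pickler()
        // No pickler is registered for AwkwardValue, so Pyrolite falls back
        // to javabean introspection and invokes getField() with zero
        // arguments, producing "PickleException: couldn't introspect
        // javabean: java.lang.IllegalArgumentException: wrong number of arguments".
        pickler.dumps(new AwkwardValue)
      }
    }

Typical remedies are therefore on the JVM side: keep only plain values (strings, numbers, collections, well-formed beans) in the RDD that crosses into Python, register a custom Pyrolite pickler for the offending class, or upgrade the riak-spark connector to a build whose picklers cover it.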
