org.apache.spark.api.python.PythonException


  • GitHub comment 108#234905107 (via GitHub, by leninlal)
    org.apache.spark.api.python.PythonException: Traceback (most recent call last):
      File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
        command = pickleSer._read_with_length(infile)
      File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 156, in _read_with_length
        length = read_int(stream)
      File "/home/gbenhaim/dev/spark-1.6.3-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/serializers.py", line 545, in read_int
        raise EOFError
    EOFError
        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
        at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: net.razorvine.pickle.PickleException: couldn't introspect javabean: java.lang.IllegalArgumentException: wrong number of arguments
        at net.razorvine.pickle.Pickler.put_javabean(Pickler.java:705)
        at net.razorvine.pickle.Pickler.dispatch(Pickler.java:323)
        at net.razorvine.pickle.Pickler.save(Pickler.java:137)
        at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:499)
        at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
        at net.razorvine.pickle.Pickler.save(Pickler.java:137)
        at com.basho.riak.spark.util.python.TuplePickler$.pickle(PicklingUtils.scala:136)
        at net.razorvine.pickle.Pickler.dispatch(Pickler.java:248)
        at net.razorvine.pickle.Pickler.save(Pickler.java:137)
        at net.razorvine.pickle.Pickler.put_arrayOfObjects(Pickler.java:493)
        at net.razorvine.pickle.Pickler.dispatch(Pickler.java:205)
        at net.razorvine.pickle.Pickler.save(Pickler.java:137)
        at net.razorvine.pickle.Pickler.dump(Pickler.java:107)
        at net.razorvine.pickle.Pickler.dumps(Pickler.java:92)
        at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:122)
        at com.basho.riak.spark.util.python.BatchPickler$$anonfun$apply$1.apply(PicklingUtils.scala:121)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
        at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1817)
        at org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:239)
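
Reading the trace: the Python-side EOFError in read_int is only a symptom. The real failure is in the Caused by: section, where the executor's PythonRunner$WriterThread dies because Pyrolite's Pickler, invoked from the Riak connector's BatchPickler/TuplePickler, falls back to JavaBean introspection and fails with "couldn't introspect javabean: java.lang.IllegalArgumentException: wrong number of arguments". Once that writer thread dies, the Python worker reads a truncated stream and raises EOFError. The sketch below shows the kind of PySpark read that goes through com.basho.riak.spark.util.python.BatchPickler; the pyspark_riak module, riak_context(), riakBucket() and queryAll() names reflect my understanding of the spark-riak-connector's Python API and the bucket name is invented, so treat it as an illustration, not the exact code behind the linked GitHub comment.

    # Sketch of a spark-riak-connector read from PySpark 1.6.x that exercises the
    # JVM-side pickling path (BatchPickler -> Pyrolite Pickler) seen in the trace.
    # pyspark_riak, riak_context(), riakBucket() and queryAll() are assumptions
    # based on the connector's Python API; "test-bucket" is a made-up bucket name.
    import pyspark
    import pyspark_riak

    conf = pyspark.SparkConf().setAppName("riak-read-sketch")
    sc = pyspark.SparkContext(conf=conf)

    # Assumed helper that patches the SparkContext with riakBucket()/saveToRiak()
    pyspark_riak.riak_context(sc)

    # If the values coming back from Riak are arbitrary Java objects rather than
    # simple tuples/maps/strings, Pyrolite falls back to JavaBean introspection on
    # the JVM side, which is where "couldn't introspect javabean" can be raised.
    rdd = sc.riakBucket("test-bucket", "default").queryAll()
    print(rdd.collect())

If this matches your setup, a useful first check is which value types the connector hands to Pyrolite (anything that is not a plain map, tuple, string or number ends up in the JavaBean fallback), and whether a newer spark-riak-connector release behaves differently.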