org.apache.spark.api.python.PythonException: UnpicklingError: NEWOBJ class argument has NULL tp_new

Solutions on the web

via Stack Overflow by gsamaras, 1 year ago:
File "/grid/0/tmp/yarn-local/usercache/gsamaras/appcache/application_1470212406507_56888/container_e04_1470212406507_56888_01_000009/pyspark.zip/pyspark/serializers.py", line 422, in loads return pickle.loads(obj) UnpicklingError: NEWOBJ class argument has NULL tp_new

via Stack Overflow by FLFLFLFL, 1 year ago:
Traceback (most recent call last): File "/home/ubuntu/spark-1.6.1-bin-hadoop2.6/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/ubuntu/spark-1.6.1-bin-hadoop2.6/python/lib

via Stack Overflow by Zoroastro, 5 months ago:
Traceback (most recent call last): File "/opt/spark/python/lib/pyspark.zip/pyspark/worker.py", line 163, in main func, profiler, deserializer, serializer = read_command(pickleSer, infile) File "/opt/spark/python/lib/pyspark.zip/pyspark

via GitHub by leninlal, 1 year ago:
Traceback (most recent call last): File "/home/jez/Downloads/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/home/jez/Downloads/spark/spark-1.6.0/python/lib

via spark-user by Anoop Shiralige, 1 year ago:
Traceback (most recent call last): File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main command = pickleSer._read_with_length(infile) File "/disk2/spark6/spark-1.6.0-bin-hadoop2.4/python

org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/grid/0/tmp/yarn-local/usercache/gsamaras/appcache/application_1470212406507_56888/container_e04_1470212406507_56888_01_000009/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/grid/0/tmp/yarn-local/usercache/gsamaras/appcache/application_1470212406507_56888/container_e04_1470212406507_56888_01_000009/pyspark.zip/pyspark/serializers.py", line 164, in _read_with_length
    return self.loads(obj)
  File "/grid/0/tmp/yarn-local/usercache/gsamaras/appcache/application_1470212406507_56888/container_e04_1470212406507_56888_01_000009/pyspark.zip/pyspark/serializers.py", line 422, in loads
    return pickle.loads(obj)
UnpicklingError: NEWOBJ class argument has NULL tp_new
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
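
The worker is failing inside pickle.loads while rebuilding the task closure it received from the driver: whatever was captured in that closure could be serialized on the driver but cannot be reconstructed by the plain pickle module on the executor. A frequent trigger of "NEWOBJ class argument has NULL tp_new" (not confirmed as the cause in this particular report) is referencing an object backed by a C-extension or JVM handle inside a map/filter lambda. The sketch below illustrates the usual workaround, assuming a Spark 1.6-era PySpark setup like the one in the paths above; HeavyHelper and the other names are hypothetical stand-ins, not taken from the report.

from pyspark import SparkContext

sc = SparkContext(appName="unpickling-error-sketch")

class HeavyHelper(object):
    """Hypothetical stand-in for an object that plain pickle cannot rebuild
    on the worker (e.g. a SWIG/ctypes handle, or anything holding a JVM
    gateway reference such as the SparkContext itself)."""
    def __init__(self):
        self.offset = 42

    def transform(self, x):
        return x + self.offset

# Risky pattern: capturing a driver-side instance in the closure sends the
# whole object through pickle, and worker.py must rebuild it with
# pickle.loads -- the step that raises the UnpicklingError in the trace above.
# helper = HeavyHelper()
# rdd = sc.parallelize(range(10)).map(lambda x: helper.transform(x))

# Safer pattern: ship only plain-Python data and construct the helper on the
# executors, once per partition, so it never goes through pickle at all.
def transform_partition(rows):
    local_helper = HeavyHelper()  # built on the worker, never serialized
    for row in rows:
        yield local_helper.transform(row)

print(sc.parallelize(range(10)).mapPartitions(transform_partition).collect())
sc.stop()

If the helper's state genuinely has to reach every executor, a common alternative is to broadcast only its picklable data (sc.broadcast) and rebuild the wrapper from that inside the task.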
