org.apache.spark.api.python.PythonException

Traceback (most recent call last):
  File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\worker.py", line 172, in main
  File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\worker.py", line 167, in process
  File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 2371, in pipeline_func
    return func(split, prev_func(split, iterator))
  File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 2371, in pipeline_func
    return func(split, prev_func(split, iterator))
  File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 317, in func
    return f(iterator)
  File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 1792, in combineLocally
    merger.mergeValues(iterator)
  File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\shuffle.py", line 236, in mergeValues
    for k, v in iterator:
  File "E:/Work/Python1/work/spark/streamexample.py", line 159, in <lambda>
    with_hash = stream.map(lambda po : createmd5Hash(po)).reduceByKey(lambda s1,s2:s1)
  File "E:/Work/Python1/work/spark/streamexample.py", line 31, in createmd5Hash
    data = json.loads(input_line)
  File "C:\Python34\lib\json\__init__.py", line 318, in loads
    return _default_decoder.decode(s)
  File "C:\Python34\lib\json\decoder.py", line 343, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Python34\lib\json\decoder.py", line 361, in raw_decode
    raise ValueError(errmsg("Expecting value", s, err.value)) from None
ValueError: Expecting value: line 1 column 1 (char 0)
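The exception at the bottom of the trace is the real failure: `json.loads` inside `createmd5Hash` received a stream line that does not begin with a JSON value. An empty string produces exactly this message, and a plain-text line fails the same way. A minimal reproduction, independent of Spark:

```python
import json

# An empty (or otherwise non-JSON) input line raises the same
# ValueError that kills the Spark task above.
try:
    json.loads("")
except ValueError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0)
```

Because the lambda runs inside a Spark worker, the Python `ValueError` is wrapped in `org.apache.spark.api.python.PythonException` on the JVM side, which is why the trace looks like a Spark error rather than a JSON one.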


Solutions on the web (112)

  • via Stack Overflow by Backtrack, 9 months ago:
    _default_decoder.decode(s)
      File "C:\Python34\lib\json\decoder.py", line 343, in decode
        obj, end = self.raw_decode(s, idx=_w(s, 0).end())
      File "C:\Python34\lib\json\decoder.py", line 361, in raw_decode
        raise ValueError(errmsg("Expecting value", s, err.value)) from None
    ValueError: Expecting value: line 1 column 1 (char 0)
  • (iterator) File "/usr/local/spark/python/lib/pyspark.zip/pyspark/shuffle.py", line 236, in mergeValues for k, v in iterator: File "<stdin>", line 1, in <lambda> File "./TargetHolding_pyspark-cassandra-0.3.5.jar/pyspark_cassandra/types.py", line 130
  • Traceback (most recent call last): File "/home/ubuntu/spark/python/lib/pyspark.zip/pyspark/worker.py", line 174, in main process() File "/home/ubuntu/spark/python/lib/pyspark.zip/pyspark/worker.py", line 169, in process
  • Stack trace

    • org.apache.spark.api.python.PythonException: Traceback (most recent call last):
        File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\worker.py", line 172, in main
        File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\worker.py", line 167, in process
        File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 2371, in pipeline_func
          return func(split, prev_func(split, iterator))
        File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 2371, in pipeline_func
          return func(split, prev_func(split, iterator))
        File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 317, in func
          return f(iterator)
        File "E:\Work\spark\installtion\spark\python\pyspark\rdd.py", line 1792, in combineLocally
          merger.mergeValues(iterator)
        File "E:\Work\spark\installtion\spark\python\lib\pyspark.zip\pyspark\shuffle.py", line 236, in mergeValues
          for k, v in iterator:
        File "E:/Work/Python1/work/spark/streamexample.py", line 159, in <lambda>
          with_hash = stream.map(lambda po : createmd5Hash(po)).reduceByKey(lambda s1,s2:s1)
        File "E:/Work/Python1/work/spark/streamexample.py", line 31, in createmd5Hash
          data = json.loads(input_line)
        File "C:\Python34\lib\json\__init__.py", line 318, in loads
          return _default_decoder.decode(s)
        File "C:\Python34\lib\json\decoder.py", line 343, in decode
          obj, end = self.raw_decode(s, idx=_w(s, 0).end())
        File "C:\Python34\lib\json\decoder.py", line 361, in raw_decode
          raise ValueError(errmsg("Expecting value", s, err.value)) from None
      ValueError: Expecting value: line 1 column 1 (char 0)
        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
        at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.api.python.PairwiseRDD.compute(PythonRDD.scala:390)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
        at org.apache.spark.scheduler.Task.run(Task.scala:85)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
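A common remedy is to make the parsing step tolerant of malformed lines instead of letting one bad record fail the whole task. Below is a minimal sketch, assuming a hypothetical `safe_md5_hash` helper; the function name, the `(digest, data)` key/value shape, and the `filter` step are illustrative, not taken from the original `streamexample.py`:

```python
import hashlib
import json

def safe_md5_hash(input_line):
    """Parse a stream line as JSON; return None for lines that are not
    valid JSON so the caller can filter them out instead of crashing."""
    try:
        data = json.loads(input_line)
    except ValueError:
        return None
    digest = hashlib.md5(input_line.encode("utf-8")).hexdigest()
    return (digest, data)

# In the streaming job this would be used roughly as (untested sketch,
# mirroring the pipeline from the trace above):
#   with_hash = (stream.map(safe_md5_hash)
#                      .filter(lambda kv: kv is not None)
#                      .reduceByKey(lambda s1, s2: s1))

print(safe_md5_hash("not json"))       # None
print(safe_md5_hash('{"id": 1}')[1])   # {'id': 1}
```

An alternative is to pre-filter with `stream.filter(lambda line: line.strip().startswith(("{", "[")))`, but catching `ValueError` at the parse site also handles lines that look like JSON yet fail to decode.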
