java.lang.Exception: Subprocess exited with status 1

Stack Overflow | geoalgo | 5 months ago
  1. Getting the error of subprocess to the master when using rdd.pipe

     Stack Overflow | 5 months ago | geoalgo
     java.lang.Exception: Subprocess exited with status 1

  2. Pyspark saveAsTextFile exceptions

     apache.org | 1 year ago
     java.lang.Exception: Subprocess exited with status 134

  3. Spark cluster computing framework

     gmane.org | 1 year ago
     org.apache.spark.api.python.PythonException: Traceback (most recent call last):
       File "/home/rajesh/spark-1.2.0/python/pyspark/worker.py", line 90, in main
         command = pickleSer._read_with_length(infile)
       File "/home/rajesh/spark-1.2.0/python/pyspark/serializers.py", line 145, in _read_with_length
         length = read_int(stream)
       File "/home/rajesh/spark-1.2.0/python/pyspark/serializers.py", line 511, in read_int
         raise EOFError
     EOFError

  4. Pyspark saveAsTextFile exceptions

     spark-user | 2 years ago | Madabhattula Rajesh Kumar
     org.apache.spark.api.python.PythonException: Traceback (most recent call last):
       File "/home/rajesh/spark-1.2.0/python/pyspark/worker.py", line 90, in main
         command = pickleSer._read_with_length(infile)
       File "/home/rajesh/spark-1.2.0/python/pyspark/serializers.py", line 145, in _read_with_length
         length = read_int(stream)
       File "/home/rajesh/spark-1.2.0/python/pyspark/serializers.py", line 511, in read_int
         raise EOFError
     EOFError

  5. GitHub comment 311#207632229

     GitHub | 8 months ago | marekhorst
     java.lang.Exception: Subprocess exited with status 127
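
    The reports above differ only in the exit status of the piped subprocess: 1, 134, and 127. These statuses come from the piped command's environment, not from Spark itself; by the usual POSIX shell convention (an assumption about the worker's shell, not something Spark defines), 127 means the shell could not find the command, values above 128 mean the process died from signal (status - 128), and 1 is a generic failure. A small sketch of that decoding:

    ```python
    import signal

    # Hedged sketch: map the exit statuses seen in the reports above to
    # likely causes, following the POSIX shell convention (assumed, since
    # the piped command runs under the worker's shell, not under Spark).
    def describe_exit_status(status: int) -> str:
        """Return a human-readable guess at why the subprocess exited."""
        if status == 0:
            return "success"
        if status == 127:
            return "command not found (check PATH and the script's location on every worker node)"
        if status > 128:
            # 128 + N means the process was killed by signal N,
            # e.g. 134 = 128 + 6 = SIGABRT.
            try:
                name = signal.Signals(status - 128).name
            except ValueError:
                name = f"signal {status - 128}"
            return f"killed by {name}"
        return f"command exited with error code {status}"
    ```

    For example, `describe_exit_status(134)` reports a SIGABRT kill, which matches the crash-style failure in report 2, while `describe_exit_status(127)` points at a missing command on the worker, matching report 5.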


    Root Cause Analysis

    1. java.lang.Exception

      Subprocess exited with status 1

      at org.apache.spark.rdd.PipedRDD$$anon$1.hasNext()
    2. Spark
      PipedRDD$$anon$1.hasNext
      1. org.apache.spark.rdd.PipedRDD$$anon$1.hasNext(PipedRDD.scala:161)
      1 frame
    3. Scala
      Iterator$class.foreach
      1. scala.collection.Iterator$class.foreach(Iterator.scala:727)
      1 frame
    4. Spark
      PipedRDD$$anon$1.foreach
      1. org.apache.spark.rdd.PipedRDD$$anon$1.foreach(PipedRDD.scala:153)
      1 frame
    5. Scala
      TraversableOnce$class.to
      1. scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
      2. scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
      3. scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
      4. scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
      4 frames
    6. Spark
      PipedRDD$$anon$1.to
      1. org.apache.spark.rdd.PipedRDD$$anon$1.to(PipedRDD.scala:153)
      1 frame
    7. Scala
      TraversableOnce$class.toBuffer
      1. scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
      1 frame
    8. Spark
      PipedRDD$$anon$1.toBuffer
      1. org.apache.spark.rdd.PipedRDD$$anon$1.toBuffer(PipedRDD.scala:153)
      1 frame
    9. Scala
      TraversableOnce$class.toArray
      1. scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
      1 frame
    10. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.PipedRDD$$anon$1.toArray(PipedRDD.scala:153)
      2. org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
      3. org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
      4. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
      5. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
      6. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
      7. org.apache.spark.scheduler.Task.run(Task.scala:70)
      8. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
      8 frames
    11. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
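
    The trace shows the failure point: `PipedRDD$$anon$1.hasNext` (PipedRDD.scala:161) raises after the output iterator is drained, because Spark checks the piped subprocess's exit code only once the process finishes and throws `Subprocess exited with status N` if it is nonzero. A standalone sketch of that same check in plain Python (no Spark required; `pipe_lines` is a hypothetical helper for illustration, not Spark's API):

    ```python
    import subprocess
    import sys

    def pipe_lines(lines, command):
        """Feed `lines` to `command` via stdin and return its stdout lines.

        Mirrors the shape of PipedRDD's behavior: the output is consumed
        first, then the exit status is inspected, and a nonzero status
        raises - the analogue of java.lang.Exception: Subprocess exited
        with status N thrown from hasNext().
        """
        proc = subprocess.Popen(
            command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
        )
        out, _ = proc.communicate("\n".join(lines))
        if proc.returncode != 0:
            # Spark raises at this point from the piped iterator's hasNext()
            raise Exception(f"Subprocess exited with status {proc.returncode}")
        return out.splitlines()

    # Usage: a well-behaved command succeeds...
    upper = [sys.executable, "-c",
             "import sys; sys.stdout.write(sys.stdin.read().upper())"]
    print(pipe_lines(["a", "b"], upper))

    # ...while a command exiting nonzero surfaces as the exception above.
    failing = [sys.executable, "-c", "import sys; sys.exit(1)"]
    try:
        pipe_lines([], failing)
    except Exception as e:
        print(e)
    ```

    The practical consequence, and the usual fix for the reports above, is that the subprocess's own stderr is not part of this exception; it goes to the executor's stderr log, so the real error message must be read from the worker logs (or the piped script must redirect stderr to stdout) rather than from the driver-side stack trace.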