java.lang.NumberFormatException: For input string: "9000/input/plik2.txt"

Stack Overflow | DamianOS.MP5 | 8 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Colons in Apache Spark application path

     Stack Overflow | 8 months ago | DamianOS.MP5
     java.lang.NumberFormatException: For input string: "9000/input/plik2.txt"

  2. Getting error when parsing spark driver host in hadoop

     Stack Overflow | 3 years ago | user3928837
     java.lang.NumberFormatException: For input string: "57831'"

  3. 'Advanced Analytics With Spark' error when calling .count()

     Stack Overflow | 4 months ago | lars
     java.lang.NumberFormatException: For input string: "lwM#lwMUx����ɮ4Mҭ��u'"
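
All three questions fail at the same point: a string that is not a plain decimal integer reaches Integer.parseInt, in the Spark cases via Scala's StringOps.toInt. The sketch below is a minimal reproduction of that parse failure using the exact input string from this trace, plus a defensive scala.util.Try-based alternative; it only illustrates the parse error and is not a fix for the underlying Spark/YARN configuration problem.

    import scala.util.Try

    object PortParseDemo {
      def main(args: Array[String]): Unit = {
        val raw = "9000/input/plik2.txt" // the exact input string from the trace

        // StringOps.toInt delegates to Integer.parseInt, which rejects anything
        // that is not a plain decimal integer.
        try {
          raw.toInt
        } catch {
          case e: NumberFormatException =>
            println(s"same failure as in the trace: ${e.getMessage}")
        }

        // Defensive alternative: treat an unparsable port as absent instead of crashing.
        val maybePort: Option[Int] = Try(raw.toInt).toOption
        println(s"safe parse: $maybePort") // prints: safe parse: None
      }
    }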

Root Cause Analysis

  1. java.lang.NumberFormatException

    For input string: "9000/input/plik2.txt"

    at java.lang.NumberFormatException.forInputString()
  2. Java RT
    Integer.parseInt
    1. java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    2. java.lang.Integer.parseInt(Integer.java:580)
    3. java.lang.Integer.parseInt(Integer.java:615)
    3 frames
  3. Scala
    StringOps.toInt
    1. scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
    2. scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
    2 frames
  4. Spark
    Utils$.parseHostPort
    1. org.apache.spark.util.Utils$.parseHostPort(Utils.scala:935)
    1 frame
  5. Spark Project YARN Stable API
    ApplicationMaster$$anonfun$main$1.apply$mcV$sp
    1. org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:547)
    2. org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:405)
    3. org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:247)
    4. org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:749)
    4 frames
  6. Spark
    SparkHadoopUtil$$anon$1.run
    1. org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:71)
    2. org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:70)
    2 frames
  7. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:422)
    2 frames
  8. Hadoop
    UserGroupInformation.doAs
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    1 frame
  9. Spark
    SparkHadoopUtil.runAsSparkUser
    1. org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:70)
    1 frame
  10. Spark Project YARN Stable API
    ExecutorLauncher.main
    1. org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:747)
    2. org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:774)
    3. org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
    3 frames
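
Read from the last group upward, the trace shows the YARN ExecutorLauncher running the ApplicationMaster as the Spark user, ApplicationMaster.waitForSparkDriver handing a host:port string to Utils.parseHostPort, and the StringOps.toInt call inside it receiving "9000/input/plik2.txt". A plausible scenario, assumed here rather than proven by the trace, is that a value containing a full HDFS URI such as hdfs://namenode:9000/input/plik2.txt was treated as a bare host:port pair, so everything after the last ':' was taken as the port. The sketch below approximates that kind of split; parseHostPortSketch is a hypothetical helper written for illustration, not Spark's actual implementation.

    object HostPortSketch {
      // Hypothetical approximation of a host:port split: everything after the
      // last ':' is treated as the port. This is NOT Spark's real code, only an
      // illustration of how the failing string can arise.
      def parseHostPortSketch(hostPort: String): (String, Int) = {
        val idx = hostPort.lastIndexOf(':')
        require(idx >= 0, s"no ':' found in '$hostPort'")
        val host = hostPort.substring(0, idx)
        val port = hostPort.substring(idx + 1).toInt // throws if a path trails the port number
        (host, port)
      }

      def main(args: Array[String]): Unit = {
        // A plain host:port value parses fine.
        println(parseHostPortSketch("namenode:9000")) // (namenode,9000)

        // Assumed failure scenario: a colon-bearing URI where only host:port is
        // expected leaves "9000/input/plik2.txt" as the "port".
        parseHostPortSketch("hdfs://namenode:9000/input/plik2.txt")
        // java.lang.NumberFormatException: For input string: "9000/input/plik2.txt"
      }
    }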