java.lang.NumberFormatException: For input string: ""

GitHub | Azuaron | 8 months ago
  1. GitHub comment 239#236705052

     GitHub | 8 months ago | Azuaron
     java.lang.NumberFormatException: For input string: ""
  2. GitHub comment 98#248926427

     GitHub | 6 months ago | barrybecker4
     java.lang.NumberFormatException: For input string: "1.9"
  3. GitHub comment 98#132465994

     GitHub | 2 years ago | svravitej
     java.lang.NumberFormatException: For input string: "\N"
  4. java.lang.NumberFormatException: For input string: ""

     GitHub | 3 years ago | rushtehrani
     java.lang.NumberFormatException: For input string: ""

  1. harshg 1 time, last 1 month ago
  2. rp 4 times, last 1 month ago
  3. Nikolay Rybak 1 time, last 2 months ago
  4. Mark 6 times, last 7 months ago
  5. kid 1 time, last 7 months ago
36 more registered users
59 unregistered visitors
Root Cause Analysis

  1. java.lang.NumberFormatException

    For input string: ""

    at java.lang.NumberFormatException.forInputString()
  2. Java RT
    Integer.parseInt
    1. java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    2. java.lang.Integer.parseInt(Integer.java:592)
    3. java.lang.Integer.parseInt(Integer.java:615)
    3 frames
  3. Scala
    StringOps.toInt
    1. scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
    2. scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
    2 frames
  4. com.databricks.spark
    CsvRelation$$anonfun$buildScan$2.apply
    1. com.databricks.spark.csv.util.TypeCast$.castTo(TypeCast.scala:61)
    2. com.databricks.spark.csv.CsvRelation$$anonfun$buildScan$2.apply(CsvRelation.scala:120)
    3. com.databricks.spark.csv.CsvRelation$$anonfun$buildScan$2.apply(CsvRelation.scala:106)
    3 frames
  5. Scala
    Iterator$$anon$11.hasNext
    1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    2. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    3. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    4. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    4 frames
  6. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:148)
    2. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
    3. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    4. org.apache.spark.scheduler.Task.run(Task.scala:89)
    5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    5 frames
  7. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
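The trace shows spark-csv's TypeCast$.castTo handing an empty CSV field to Scala's StringOps.toInt, which delegates to Integer.parseInt; an empty string (and likewise values such as "1.9" or "\N" from the related reports above) is not a valid integer literal, so the whole Spark executor task fails. A minimal sketch of the failure and a defensive parse is below; safeParseInt is a hypothetical helper for illustration, not part of spark-csv's API:

```java
public class EmptyFieldParse {

    // Hypothetical helper: return null for blank or non-integer fields
    // instead of letting Integer.parseInt throw mid-task.
    static Integer safeParseInt(String raw) {
        if (raw == null || raw.trim().isEmpty()) {
            return null; // empty CSV field -> treat as missing
        }
        try {
            return Integer.parseInt(raw.trim());
        } catch (NumberFormatException e) {
            return null; // e.g. "1.9" or "\N" are not valid int literals
        }
    }

    public static void main(String[] args) {
        // Reproduces the exception in the trace above.
        try {
            Integer.parseInt("");
        } catch (NumberFormatException e) {
            System.out.println("caught: " + e.getMessage());
        }
        System.out.println(safeParseInt(""));    // null
        System.out.println(safeParseInt("42"));  // 42
        System.out.println(safeParseInt("1.9")); // null
    }
}
```

In a Spark job the cleaner fixes are usually to sanitize the input data, read the troublesome column as a string and convert it afterwards, or (depending on the spark-csv version) use options such as treatEmptyValuesAsNulls so blank fields become nulls rather than parse failures.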