java.lang.IllegalArgumentException: Can not create a Path from an empty string

GitHub | allixender | 2 years ago
  1. GitHub comment 1099#103358217

     GitHub | 2 years ago | allixender
     java.lang.IllegalArgumentException: Can not create a Path from an empty string

  2. Can't load multiple files in nested directories from S3

     GitHub | 3 months ago | Nath5
     java.lang.IllegalArgumentException: Can not create a Path from an empty string

  3. Loading nested csv files from S3 with Spark

     Stack Overflow | 3 months ago | Nathan Case
     java.lang.IllegalArgumentException: Can not create a Path from an empty string

  4. Branch 1.3 by hxquangnhat · Pull Request #6635 · apache/spark · GitHub

     github.com | 1 year ago
     java.lang.IllegalArgumentException: Can not create a Path from an empty string
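Every result above reports the same failure: an empty string reached the `org.apache.hadoop.fs.Path` constructor, which rejects it before any URI parsing. A minimal sketch of that validation (a hypothetical mirror of the check the trace shows in `Path.checkPathArg`, not the real Hadoop class) makes the trigger easy to see:

```java
// Hypothetical stand-in for the validation at org.apache.hadoop.fs.Path.checkPathArg
// (Path.java:127 in the trace): null and empty strings are rejected up front.
public class PathArgCheck {
    public static void checkPathArg(String path) {
        if (path == null) {
            throw new IllegalArgumentException("Can not create a Path from a null string");
        }
        if (path.length() == 0) {
            throw new IllegalArgumentException("Can not create a Path from an empty string");
        }
    }

    public static void main(String[] args) {
        checkPathArg("s3://bucket/data/file.csv"); // a non-empty path passes
        try {
            checkPathArg(""); // an empty string is exactly what the trace shows
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints "Can not create a Path from an empty string"
        }
    }
}
```

In the reports above, the empty string usually comes from splitting a user-supplied input argument, so the fix belongs in the caller, not in Hadoop.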
Root Cause Analysis

  1. java.lang.IllegalArgumentException

    Can not create a Path from an empty string

    at org.apache.hadoop.fs.Path.checkPathArg()
  2. Hadoop
    StringUtils.stringToPath
    1. org.apache.hadoop.fs.Path.checkPathArg(Path.java:127)
    2. org.apache.hadoop.fs.Path.<init>(Path.java:135)
    3. org.apache.hadoop.util.StringUtils.stringToPath(StringUtils.java:241)
    3 frames
  3. Hadoop
    FileInputFormat.setInputPaths
    1. org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:454)
    1 frame
  4. geotrellis.spark.io
    package$HadoopSparkContextMethodsWrapper.hadoopGeoTiffRDD
    1. geotrellis.spark.io.hadoop.HdfsUtils$.putFilesInConf(HdfsUtils.scala:58)
    2. geotrellis.spark.io.hadoop.package$HadoopConfigurationWrapper.withInputDirectory(package.scala:55)
    3. geotrellis.spark.io.hadoop.HadoopSparkContextMethods$class.hadoopGeoTiffRDD(HadoopSparkContextMethods.scala:29)
    4. geotrellis.spark.io.hadoop.package$HadoopSparkContextMethodsWrapper.hadoopGeoTiffRDD(package.scala:43)
    4 frames
  5. geotrellis.spark.ingest
    CassandraIngestCommand$$anonfun$main$1.apply
    1. geotrellis.spark.ingest.CassandraIngestCommand$$anonfun$main$1.apply(CassandraIngestCommand.scala:41)
    2. geotrellis.spark.ingest.CassandraIngestCommand$$anonfun$main$1.apply(CassandraIngestCommand.scala:40)
    2 frames
  6. geotrellis.spark.io
    Cassandra$.withSession
    1. geotrellis.spark.io.cassandra.Cassandra$.withSession(Cassandra.scala:19)
    1 frame
  7. geotrellis.spark.ingest
    CassandraIngestCommand$.main
    1. geotrellis.spark.ingest.CassandraIngestCommand$.main(CassandraIngestCommand.scala:40)
    2. geotrellis.spark.ingest.CassandraIngestCommand$.main(CassandraIngestCommand.scala:29)
    2 frames
  8. com.quantifind.sumac
    ArgMain$class.main
    1. com.quantifind.sumac.ArgMain$class.mainHelper(ArgApp.scala:45)
    2. com.quantifind.sumac.ArgMain$class.main(ArgApp.scala:34)
    2 frames
  9. geotrellis.spark.ingest
    CassandraIngestCommand.main
    1. geotrellis.spark.ingest.CassandraIngestCommand$.main(CassandraIngestCommand.scala:29)
    2. geotrellis.spark.ingest.CassandraIngestCommand.main(CassandraIngestCommand.scala)
    2 frames
  10. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  11. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    2. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    3. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    3 frames
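The trace shows the empty string travelling from the ingest command's input argument through `HdfsUtils.putFilesInConf` into `FileInputFormat.setInputPaths`, which builds a `Path` from each comma-separated piece. A defensive sketch (a hypothetical helper, not part of Hadoop or GeoTrellis) that drops blank pieces before they reach that call:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical guard: sanitize a comma-separated input-path argument before
// handing it to FileInputFormat.setInputPaths. A trailing comma or a doubled
// comma produces an empty piece, which is exactly what triggers
// "Can not create a Path from an empty string".
public class InputPathGuard {
    public static List<String> nonEmptyPaths(String commaSeparated) {
        return Arrays.stream(commaSeparated.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty()) // drop the pieces Path would reject
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // The doubled and trailing commas yield empty pieces, which are filtered out.
        System.out.println(nonEmptyPaths("s3://bucket/a.csv,,s3://bucket/b.csv,"));
    }
}
```

Filtering at the argument-parsing boundary keeps the error from surfacing deep inside the Hadoop input-format machinery, where the stack trace gives no hint of which argument was blank.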