java.lang.IllegalArgumentException: Can not create a Path from an empty string

java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:127)
at org.apache.hadoop.fs.Path.<init>(Path.java:135)
at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(SparkContext.scala:1016)
at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(SparkContext.scala:1016)
at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
at scala.Option.map(Option.scala:145)
at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:176)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:200)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.rdd.ZippedWithIndexRDD.<init>(ZippedWithIndexRDD.scala:44)
at org.apache.spark.rdd.RDD$$anonfun$zipWithIndex$1.apply(RDD.scala:1246)
at org.apache.spark.rdd.RDD$$anonfun$zipWithIndex$1.apply(RDD.scala:1246)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
at org.apache.spark.rdd.RDD.zipWithIndex(RDD.scala:1245)
at org.apache.spark.api.java.JavaRDDLike$class.zipWithIndex(JavaRDDLike.scala:321)
at com.github.sparkbwa.BwaInterpreter.loadFastq(BwaInterpreter.java:152)
at com.github.sparkbwa.BwaInterpreter.handlePairedReadsSorting(BwaInterpreter.java:239)
at com.github.sparkbwa.BwaInterpreter.runBwa(BwaInterpreter.java:333)
at com.github.sparkbwa.SparkBWA.main(SparkBWA.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
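
Reading the trace from the bottom up: SparkBWA's BwaInterpreter.loadFastq calls zipWithIndex, which forces Spark to compute the underlying HadoopRDD's partitions. That runs the closure created by SparkContext.hadoopFile, which constructs an org.apache.hadoop.fs.Path from the input path string, and Path.checkPathArg rejects the empty string. Because RDDs are evaluated lazily, the exception surfaces at zipWithIndex rather than at the call that received the empty path; in practice this usually means one of the FASTQ input arguments passed to SparkBWA was blank (for example, an unset shell variable in the spark-submit line).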

Write tip

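A fail-fast guard keeps this class of error close to its cause: check the path string before handing it to Spark, so the job dies with a message naming the bad argument instead of an IllegalArgumentException from deep inside getPartitions(). Below is a minimal sketch in Java; loadTextFile, the argument names, and the error message are hypothetical (not SparkBWA's API), and the same check applies equally to textFile, hadoopFile, or newAPIHadoopFile.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class PathGuardExample {

    // Hypothetical helper: reject blank paths up front with a message that
    // names the offending argument, instead of letting
    // org.apache.hadoop.fs.Path.checkPathArg fail later during partition
    // computation.
    static JavaRDD<String> loadTextFile(JavaSparkContext sc, String argName, String path) {
        if (path == null || path.trim().isEmpty()) {
            throw new IllegalArgumentException(
                    "Input path '" + argName + "' is null or empty; check the arguments passed to spark-submit");
        }
        return sc.textFile(path);
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("PathGuardExample");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // For paired-end input, a missing second argument is an easy mistake:
            // if the shell variable holding it is unset, spark-submit simply
            // drops the argument and args[1] never exists.
            String fastq1 = args.length > 0 ? args[0] : "";
            String fastq2 = args.length > 1 ? args[1] : "";
            long total = loadTextFile(sc, "fastq1", fastq1).count()
                       + loadTextFile(sc, "fastq2", fastq2).count();
            System.out.println("Total lines across both inputs: " + total);
        }
    }
}

If the path comes from a parsed options object rather than the command line, the same check belongs where the value is read; with lazy RDDs, the distance between the bad value and the eventual failure can span several classes, as the trace above shows.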