FileUtil.copy() has thrown an IllegalArgumentException

We found this stack trace prefix on 7 web pages:
  java.lang.IllegalArgumentException
      at org.apache.hadoop.fs.FileSystem.checkPath()
      at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile()
      at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus()
      at org.apache.hadoop.fs.FilterFileSystem.getFileStatus()
      at org.apache.hadoop.fs.FileUtil.copy()
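FileSystem.checkPath() raises this IllegalArgumentException ("Wrong FS: ..., expected: ...") when the scheme or authority of a Path does not match the FileSystem instance it was handed to, and FileUtil.copy() reaches it as soon as it asks the source FileSystem for the file's status. A minimal sketch of the failure mode, assuming a stock configuration whose fs.defaultFS is file:/// (the class name, host, and paths below are illustrative, not taken from any one report):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileUtil;
    import org.apache.hadoop.fs.Path;

    public class WrongFsCrash {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // With fs.defaultFS left at its default (file:///) this is the local
            // FileSystem, a FilterFileSystem wrapping RawLocalFileSystem.
            FileSystem localFs = FileSystem.get(conf);

            // An hdfs:// path handed to the local FileSystem: its scheme does not
            // match file:///, so checkPath() rejects it.
            Path src = new Path("hdfs://namenode:9000/history/job_201104291518_0001_root");
            Path dst = new Path("/tmp/job-history-copy");

            // FileUtil.copy() first calls srcFS.getFileStatus(src), which walks the
            // frames in the trace above and throws:
            //   java.lang.IllegalArgumentException: Wrong FS: hdfs://..., expected: file:///
            FileUtil.copy(localFs, src, localFs, dst, false, conf);
        }
    }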
Children (4)

  1. eco-release-metadata/RELEASENOTES.1.2.0.md at master · aw-was-here/eco-release-metadata · GitHub
     First: 2 years ago
     Last: 2 years ago
  2. IllegalArgumentException
     FileSystem.copyFromLocalFile()
     Crashes: 6
     Projects: 0
     Web Pages: 3
     Error Reports: 0
     First: 10 months ago
     Last: 7 months ago
  3. IllegalArgumentException
     RunJar.main()
     Crashes: 3
     Projects: 0
     Web Pages: 2
     Error Reports: 0
     First: 4 months ago
     Last: 4 months ago
  4. getting Wrong FS: file while running hive query
     First: 5 years ago
     Last: 3 years ago
     Author: Rohit
Typical messages (6)

  Message (number of crashes)
  Wrong FS: file://Hive-mongo/target/hive-mongo-0.0.1-SNAPSHOT-jar-with-dependencies.jar, expected: file:/// (2)
  Wrong FS: hdfs://10.18.52.146:9000/history/job_201104291518_0001_root, expected: file:/// (1)
  Wrong FS: file://usr/lib/hive/lib/CustomUDFint.jar, expected: file:/// (1)
  Wrong FS: file://Users/joshrosen/Documents/Spark/examples/target/scala-2.10/spark-examples_2.10-1.1.2-SNAPSHOT.jar, expected: file:/// (1)
  Wrong FS: hdfs://10.18.52.146:9000/history/job_201104291518_0001_root, expected: file:/// (1)
  Wrong FS: hdfs://localhost:12123/tmp/hadoop-mark/mapred/system/job_201201060323_0005/job.jar, expected: file:/// (1)
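Two patterns recur in these messages. A file:// URI written with only two slashes (file://Hive-mongo/..., file://usr/lib/...) turns its first path segment into a URI authority, while the local FileSystem expects an empty authority, i.e. file:///; and an hdfs:// path fails whenever it is handed to a FileSystem created for file:///. A hedged sketch of the usual workarounds, assuming the HDFS client libraries are on the classpath (class name and paths are illustrative):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WrongFsFixes {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Two slashes make "usr" a URI authority; three slashes (or a plain
            // absolute path) keep it a directory, which is what file:/// expects.
            Path bad  = new Path("file://usr/lib/hive/lib/CustomUDFint.jar");
            Path good = new Path("file:///usr/lib/hive/lib/CustomUDFint.jar");
            System.out.println(bad.toUri().getAuthority());   // "usr" -> Wrong FS
            System.out.println(good.toUri().getAuthority());  // null  -> accepted by the local FS

            // For a path that may live on HDFS, ask the Path for its own FileSystem
            // instead of reusing the default (local) one.
            Path jobJar = new Path("hdfs://localhost:12123/tmp/hadoop-mark/mapred/system/job.jar");
            FileSystem fs = jobJar.getFileSystem(conf);   // an HDFS client, not the local FS
            System.out.println(fs.getUri());              // hdfs://localhost:12123
        }
    }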