java.lang.IllegalArgumentException: Wrong FS: hdfs:/*****/hadoop-2.6.1.****.tar.gz, > expected: file:///

lens-user | amareshwarisr . | 4 months ago
  1. 0

    Re: Lens Query Failed to Execute

    lens-user | 4 months ago | amareshwarisr .
    java.lang.IllegalArgumentException: Wrong FS: hdfs:/*****/hadoop-2.6.1.****.tar.gz, > expected: file:///
  2. 0

    Config not being read on driver and/or executor

    GitHub | 2 years ago | srowen
    java.lang.IllegalArgumentException: Wrong FS: file://xxxxx.cloudera.com:8020/tmp/Oryx/data, expected: hdfs://sssss.cloudera.com:8020
  3. 0

    Hive will not write to aws s3

    Stack Overflow | 1 year ago | NW0428
    java.lang.IllegalArgumentException: Wrong FS: s3a://bucket/folder/.hive-staging_hive_2015-07-06_09-22-10_351_9216807769834089982-3/-ext-10002, expected: hdfs://quickstart.cloudera:8020
  4. 0

    In HA HDFS, uploading a file to streams (on the order of a few MB) and reading the stream back gives the following exception:
    {code}
    java.lang.IllegalArgumentException: Wrong FS: hdfs://prodnameservice1:8020/<PATH>, expected: hdfs://prodnameservice1
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645) ~[hadoop-common-2.5.0-cdh5.3.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:192) ~[hadoop-hdfs.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:104) ~[hadoop-hdfs.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$32.doCall(DistributedFileSystem.java:1569) ~[hadoop-hdfs.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$32.doCall(DistributedFileSystem.java:1565) ~[hadoop-hdfs.jar:na]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.5.0-cdh5.3.2.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.isFileClosed(DistributedFileSystem.java:1565) ~[hadoop-hdfs.jar:na]
        at co.cask.cdap.common.io.Locations$9.size(Locations.java:365) ~[co.cask.cdap.cdap-common-3.2.1.jar:na]
        at co.cask.cdap.common.io.Locations$11.size(Locations.java:406) ~[co.cask.cdap.cdap-common-3.2.1.jar:na]
        at co.cask.cdap.common.io.DFSSeekableInputStream.size(DFSSeekableInputStream.java:51) ~[co.cask.cdap.cdap-common-3.2.1.jar:na]
        at co.cask.cdap.data.stream.StreamDataFileReader.createEventTemplate(StreamDataFileReader.java:344) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.StreamDataFileReader.readHeader(StreamDataFileReader.java:305) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.StreamDataFileReader.init(StreamDataFileReader.java:280) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.StreamDataFileReader.doOpen(StreamDataFileReader.java:252) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.StreamDataFileReader.initialize(StreamDataFileReader.java:139) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.LiveStreamFileReader$StreamPositionTransformFileReader.initialize(LiveStreamFileReader.java:169) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.LiveStreamFileReader.renewReader(LiveStreamFileReader.java:81) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.file.LiveFileReader.initialize(LiveFileReader.java:42) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.MultiLiveStreamFileReader$StreamEventSource.initialize(MultiLiveStreamFileReader.java:175) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.MultiLiveStreamFileReader.initialize(MultiLiveStreamFileReader.java:72) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.service.StreamFetchHandler.createReader(StreamFetchHandler.java:286) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at co.cask.cdap.data.stream.service.StreamFetchHandler.fetch(StreamFetchHandler.java:124) ~[co.cask.cdap.cdap-data-fabric-3.2.1.jar:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
    {code}
    This happens because in HA mode the URI returned from org.apache.hadoop.fs.FileContext is not compatible with the one expected by org.apache.hadoop.hdfs.DistributedFileSystem: FileContext is not HA aware and always appends the port, whereas DistributedFileSystem uses the logical nameservice name. Related HDFS JIRA: https://issues.apache.org/jira/browse/HADOOP-9617. Until that JIRA is fixed, CDAP needs a workaround to strip out the port when running in HA mode (a hedged sketch of such a workaround is included after the last entry in this list).

    Cask Community Issue Tracker | 1 year ago | Sreevatsan Raman
    java.lang.IllegalArgumentException: Wrong FS: hdfs://prodnameservice1:8020/<PATH>, expected: hdfs://prodnameservice1
  5. 0

    loci loading from a file seems to fail when given a `file:///` path while running on YARN

    GitHub | 8 months ago | timodonnell
    java.lang.IllegalArgumentException: Wrong FS: file:/hpc/users/odonnt02/sinai/git/projects/guacamole-validation/joint-caller-runs/aocs/runs/aocs-034-wgs/aocs_034_called_loci.txt, expected: hdfs://demeter-nn1.demeter.hpc.mssm.edu:8020

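    The port-stripping workaround mentioned in the Cask report above can be pictured with a small helper. The sketch below is only an illustration, not CDAP's actual code: the class and method names are made up, and it simply rebuilds a Path without its port when the target file system is addressed by an HA logical nameservice (a URI with no port), so that FileSystem.checkPath() no longer rejects it.

    {code}
    import java.net.URI;
    import java.net.URISyntaxException;

    import org.apache.hadoop.fs.Path;

    public class HaPathFixer {

      // Drop the port from a path when the file system URI is an HA logical
      // nameservice (same scheme, no port), e.g. hdfs://prodnameservice1.
      public static Path stripPortIfHa(Path path, URI fsUri) throws URISyntaxException {
        URI pathUri = path.toUri();
        boolean sameScheme = pathUri.getScheme() != null
            && pathUri.getScheme().equalsIgnoreCase(fsUri.getScheme());
        if (sameScheme && pathUri.getPort() != -1 && fsUri.getPort() == -1) {
          URI rewritten = new URI(pathUri.getScheme(), pathUri.getUserInfo(), pathUri.getHost(),
              -1, pathUri.getPath(), pathUri.getQuery(), pathUri.getFragment());
          return new Path(rewritten);
        }
        return path;
      }

      public static void main(String[] args) throws URISyntaxException {
        // Hypothetical stream file path; the real issue uses a redacted <PATH>.
        Path withPort = new Path("hdfs://prodnameservice1:8020/streams/events/part-0");
        URI haFsUri = URI.create("hdfs://prodnameservice1");
        System.out.println(stripPortIfHa(withPort, haFsUri));
        // prints: hdfs://prodnameservice1/streams/events/part-0
      }
    }
    {code}
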
    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      Wrong FS: hdfs:/*****/hadoop-2.6.1.****.tar.gz, > expected: file:///

      at org.apache.hadoop.fs.FileSystem.checkPath()
    2. Hadoop
      FileSystem.checkPath
      1. org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
      1 frame
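
    The "Wrong FS" message itself is thrown by FileSystem.checkPath(), which rejects any Path whose scheme (and authority) does not match the FileSystem instance handling it. The minimal sketch below reproduces the "expected: file:///" variant from the original report without a cluster, assuming only hadoop-common on the classpath and a hypothetical namenode host and path; the commented-out lines show the usual fix of resolving the FileSystem from the path instead of relying on fs.defaultFS.

    {code}
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WrongFsDemo {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();   // no cluster config => fs.defaultFS is file:///
        FileSystem localFs = FileSystem.get(conf);  // LocalFileSystem

        // Hypothetical HDFS path for illustration only.
        Path hdfsPath = new Path("hdfs://namenode.example.com:8020/dist/hadoop-2.6.1.tar.gz");
        try {
          localFs.open(hdfsPath);                   // checkPath() rejects the hdfs:// scheme
        } catch (IllegalArgumentException e) {
          System.out.println(e.getMessage());       // Wrong FS: hdfs://..., expected: file:///
        }

        // Fix: resolve the FileSystem from the path itself (needs hadoop-hdfs on the
        // classpath and a reachable NameNode):
        // FileSystem fs = hdfsPath.getFileSystem(conf);
        // fs.open(hdfsPath);
      }
    }
    {code}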