java.lang.IllegalArgumentException: Wrong FS: hdfs://stampy/user/jianshuang/.sparkStaging/application_1404410683830_531767/datanucleus-api-jdo-3.2.6.jar, expected: file:///

github.com | 5 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1.

    eco-release-metadata/RELEASENOTES.1.2.0.md at master · aw-was-here/eco-release-metadata · GitHub

    github.com | 5 months ago
    java.lang.IllegalArgumentException: Wrong FS: hdfs://stampy/user/jianshuang/.sparkStaging/application_1404410683830_531767/datanucleus-api-jdo-3.2.6.jar, expected: file:///
  2.

    Mahout Sequence file conversion

    Stack Overflow | 3 days ago | Shubham Sinha
    java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:50070/mahout_data/december.txt, expected: file:///
  3.

    [SOLR-1301] Add a Solr contrib that allows for building Solr indexes via Hadoop's Map-Reduce. - ASF JIRA

    apache.org | 1 year ago
    java.io.IOException: java.lang.IllegalArgumentException: Wrong FS: hdfs://mi-prod-app01.ec2.biz360.com:9000/user/hadoop/solr/_attempt_201001212110_2841_r_000001_0.1.index-a, expected:
  4.

    solr-dev.lucene.apache.org - [jira] Commented: (SOLR-1301) Solr + Hadoop - msg#00006 - Recent Discussion OSDir.com

    osdir.com | 1 year ago
    java.io.IOException: java.lang.IllegalArgumentException: Wrong FS: hdfs://mi-prod-app01.ec2.biz360.com:9000/user/hadoop/solr/_attempt_201001212110_2841_r_000001_0.1.index-a, expected: file:///
  5.

    How to debug Hadoop 2.2.0 programs in Eclipse on Windows 7 - Java, Eclipse, Hadoop - ITeye Forum

    iteye.com | 8 months ago
    java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.130.54:19000/user/hmail/output/part-00000, expected: file:///
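    Every report above follows the same pattern: an `hdfs://` path is handed to a `FileSystem` instance bound to the local filesystem, so the path check fails with `expected: file:///`. The following is a minimal, self-contained sketch of that scheme comparison — a hypothetical mirror of what `org.apache.hadoop.fs.FileSystem.checkPath` does, written against `java.net.URI` only, not the actual Hadoop code:

    ```java
    import java.net.URI;

    public class WrongFsDemo {
        // Hypothetical mirror of the check in FileSystem.checkPath: a
        // filesystem bound to one scheme (here "file") rejects any path
        // whose URI names a different scheme.
        static void checkPath(String expectedScheme, URI path) {
            String scheme = path.getScheme();
            if (scheme != null && !scheme.equals(expectedScheme)) {
                throw new IllegalArgumentException(
                        "Wrong FS: " + path + ", expected: " + expectedScheme + ":///");
            }
        }

        public static void main(String[] args) {
            URI hdfsPath = URI.create("hdfs://localhost:50070/mahout_data/december.txt");
            try {
                // A local ("file") filesystem is given an hdfs:// URI.
                checkPath("file", hdfsPath);
            } catch (IllegalArgumentException e) {
                System.out.println(e.getMessage());
            }
            // The same URI passes when the filesystem's scheme matches.
            checkPath("hdfs", hdfsPath);
            System.out.println("ok");
        }
    }
    ```

    The sketch makes the failure mode concrete: the exception is not about the file being missing, but about which `FileSystem` implementation was asked to resolve the path.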


    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      Wrong FS: hdfs://stampy/user/jianshuang/.sparkStaging/application_1404410683830_531767/datanucleus-api-jdo-3.2.6.jar, expected: file:///

      at org.apache.hadoop.fs.FileSystem.checkPath()
    2. Hadoop
      FilterFileSystem.getFileStatus
      1. org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:643)
      2. org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:79)
      3. org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:506)
      4. org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
      5. org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
      6. org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
      6 frames
    3. Spark Project YARN Stable API
      ClientBase$$anonfun$prepareLocalResources$5.apply
      1. org.apache.spark.deploy.yarn.ClientDistributedCacheManager.addResource(ClientDistributedCacheManager.scala:67)
      2. org.apache.spark.deploy.yarn.ClientBase$$anonfun$prepareLocalResources$5.apply(ClientBase.scala:257)
      3. org.apache.spark.deploy.yarn.ClientBase$$anonfun$prepareLocalResources$5.apply(ClientBase.scala:242)
      3 frames
    4. Scala
      Option.foreach
      1. scala.Option.foreach(Option.scala:236)
      1 frame
    5. Spark Project YARN Stable API
      Client.submitApplication
      1. org.apache.spark.deploy.yarn.ClientBase$class.prepareLocalResources(ClientBase.scala:242)
      2. org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:35)
      3. org.apache.spark.deploy.yarn.ClientBase$class.createContainerLaunchContext(ClientBase.scala:350)
      4. org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:35)
      5. org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:80)
      5 frames
    6. Spark
      SparkContext.<init>
      1. org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
      2. org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)
      3. org.apache.spark.SparkContext.<init>(SparkContext.scala:335)
      3 frames
    7. Spark REPL
      SparkILoop.createSparkContext
      1. org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
      1 frame
    8. Unknown
      $iwC.<init>
      1. $iwC$$iwC.<init>(<console>:9)
      2. $iwC.<init>(<console>:18)
      2 frames
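    Read bottom-up, the trace says: the Spark shell builds a `SparkContext`, the YARN client prepares the `.sparkStaging` resources, and the distributed-cache manager calls `getFileStatus()` on a `FilterFileSystem` wrapping `RawLocalFileSystem` — that is, the client resolved the default filesystem as `file:///` and then fed it the `hdfs://` jar path. This usually means the Hadoop client configuration was not visible to the driver, so `fs.defaultFS` fell back to the local filesystem. A sketch of the expected `core-site.xml` entry follows — the `stampy` authority is taken from the trace above; everything else is an assumption about this particular cluster:

    ```xml
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://stampy</value>
      </property>
    </configuration>
    ```

    Programmatically, code that must handle paths on arbitrary filesystems can sidestep the default entirely by resolving the filesystem from the path itself — `new Path(uri).getFileSystem(conf)` — instead of `FileSystem.get(conf)`, which only ever returns whatever `fs.defaultFS` names.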