java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist

Stack Overflow | ys0123 | 5 months ago
  1. 0

    KiteSdk 1.1.0 csv-import IOError

    Stack Overflow | 5 months ago | ys0123
    java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist
  2. 0

    Ambari HDP 2.2: Port 8020 Connection refused

    Stack Overflow | 2 years ago | Tanny
    java.io.FileNotFoundException: File does not exist: hdfs://<fqdn for the namenode>:8020/hdp/apps/2.2.6.3-1/mapreduce/mapreduce.tar.gz
  3. 0

    How to submit applications to yarn-cluster so jars in packages are also copied?

    Stack Overflow | 2 years ago | Cody Canning
    java.io.FileNotFoundException: File does not exist: hdfs://172.31.13.205:9000/home/hadoop/.ivy2/jars/spark-csv_2.10.jar
  4. 0

    Error launching spark-submit

    GitHub | 4 months ago | dmcarba
    java.io.FileNotFoundException: File does not exist: hdfs://lambda-pluralsight:9000/spark/spark-assembly-1.6.1-hadoop2.6.0.jar
  5. 0

    Dynmap not working

    GitHub | 5 years ago | Freirec
    java.io.FileNotFoundException: plugins\dynmap\texturepacks\standard\terrain.png (The system cannot find the file specified)
    13:01:32 [SEVERE] [dynmap] Error loading texture pack 'standard' - not found
    13:01:32 [SEVERE] [dynmap] Error: shader 'stdtexture-mcr-grid' cannot load texture pack 'standard'
    13:01:34 [SEVERE] [dynmap] Exception during render job: world=Minecraft Server_the_end, map=null
    13:01:34 [SEVERE] java.lang.NullPointerException
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.TexturePackHDShader$ShaderState.<init>(TexturePackHDShader.java:94)
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.TexturePackHDShader$ShaderState.<init>(TexturePackHDShader.java:69)
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.TexturePackHDShader.getStateInstance(TexturePackHDShader.java:229)
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.HDMapManager.getShaderStateForTile(HDMapManager.java:125)
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.IsoHDPerspective.render(IsoHDPerspective.java:1198)
    13:01:34 [SEVERE] 	at org.dynmap.hdmap.HDMapTile.render(HDMapTile.java:96)
    13:01:34 [SEVERE] 	at org.dynmap.MapManager$FullWorldRenderState.processTile(MapManager.java:620)
    13:01:34 [SEVERE] 	at org.dynmap.MapManager$FullWorldRenderState.run(MapManager.java:566)
    13:01:34 [SEVERE] 	at org.dynmap.MapManager$DynmapScheduledThreadPoolExecutor$1.run(MapManager.java:180)
    13:01:34 [SEVERE] 	at org.dynmap.MapManager$DynmapScheduledThreadPoolExecutor$2.run(MapManager.java:196)
    13:01:34 [SEVERE] 	at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    13:01:34 [SEVERE] 	at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
    13:01:34 [SEVERE] 	at java.util.concurrent.FutureTask.run(Unknown Source)


    Root Cause Analysis

    1. java.io.FileNotFoundException

      File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist

      at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus()
    2. Hadoop
      FileContext.resolvePath
      1. org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:624)
      2. org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:850)
      3. org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:614)
      4. org.apache.hadoop.fs.DelegateToFileSystem.getFileStatus(DelegateToFileSystem.java:125)
      5. org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:468)
      6. org.apache.hadoop.fs.FilterFs.resolvePath(FilterFs.java:158)
      7. org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2195)
      8. org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2191)
      9. org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
      10. org.apache.hadoop.fs.FileContext.resolve(FileContext.java:2191)
      11. org.apache.hadoop.fs.FileContext.resolvePath(FileContext.java:603)
      11 frames
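    The frames above show the framework path being resolved by RawLocalFileSystem rather than HDFS: the resolved URI is `file:/hdp/apps/...`, so the local filesystem is consulted and the lookup fails. A minimal sketch of why that happens, using only the standard `java.net.URI` class (the paths mirror the trace; treating Hadoop's path resolution as plain URI resolution against `fs.defaultFS` is an assumption for illustration):

    ```java
    import java.net.URI;

    public class SchemeResolution {
        // Resolve a path the way a filesystem client does: a path that
        // already carries a scheme keeps it; a scheme-less path inherits
        // the default filesystem's scheme.
        static String resolvedScheme(String defaultFs, String path) {
            URI base = URI.create(defaultFs);
            URI resolved = base.resolve(path);
            String scheme = resolved.getScheme();
            return scheme != null ? scheme : "file"; // no scheme anywhere -> local FS
        }

        public static void main(String[] args) {
            // An explicit file: scheme always wins -> local filesystem lookup.
            System.out.println(resolvedScheme("hdfs://namenode:8020/",
                    "file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz")); // prints "file"
            // A scheme-less path inherits fs.defaultFS.
            System.out.println(resolvedScheme("hdfs://namenode:8020/",
                    "/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz")); // prints "hdfs"
        }
    }
    ```

    In other words, a `file:` prefix (or a misconfigured `fs.defaultFS`) in the configured framework path is enough to send the stat call to the local disk instead of the namenode.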
    3. Hadoop
      Job$10.run
      1. org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:457)
      2. org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
      3. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
      4. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
      4 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    5. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
      1 frame
    6. Hadoop
      Job.submit
      1. org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
      1 frame
    7. org.apache.crunch
      MRExecutor$1.run
      1. org.apache.crunch.hadoop.mapreduce.lib.jobcontrol.CrunchControlledJob.submit(CrunchControlledJob.java:329)
      2. org.apache.crunch.hadoop.mapreduce.lib.jobcontrol.CrunchJobControl.startReadyJobs(CrunchJobControl.java:204)
      3. org.apache.crunch.hadoop.mapreduce.lib.jobcontrol.CrunchJobControl.pollJobStatusAndStartNewOnes(CrunchJobControl.java:238)
      4. org.apache.crunch.impl.mr.exec.MRExecutor.monitorLoop(MRExecutor.java:112)
      5. org.apache.crunch.impl.mr.exec.MRExecutor.access$000(MRExecutor.java:55)
      6. org.apache.crunch.impl.mr.exec.MRExecutor$1.run(MRExecutor.java:83)
      6 frames
    8. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:745)
      1 frame
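    Given the trace, job submission fails because the MapReduce framework tarball resolves to a local `file:` path that does not exist. A commonly suggested remedy (a sketch, not taken from this page: the property name is from Hadoop's mapred-site.xml, the path and version string are the ones in the trace, and your cluster's values may differ) is to make sure the tarball sits in HDFS and that the framework path is an `hdfs:` URI:

    ```xml
    <!-- mapred-site.xml: resolve the MR framework tarball against HDFS,
         not the local filesystem (adjust path/version to your cluster). -->
    <property>
      <name>mapreduce.application.framework.path</name>
      <value>hdfs:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz#mr-framework</value>
    </property>
    ```

    The tarball itself must actually exist at that HDFS location (it can be copied up with `hdfs dfs -put`); otherwise the same FileNotFoundException reappears with an `hdfs://` prefix, as in the similar issues listed above.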