org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://devuser:password@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt' : Invalid absolute URI "hdfs://devuser:***@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt".

Pentaho BI Platform Tracking | Alexander Buloichik | 1 year ago
  1.

    {noformat}
    2015/09/14 06:41:05 - Hadoop File Output.0 - ERROR (version 6.0.0.0-287, build 1 from 2015-09-09 22.13.23 by buildguy) : Couldn't open file hdfs://devuser:password@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt.txt
    2015/09/14 06:41:05 - Hadoop File Output.0 - ERROR (version 6.0.0.0-287, build 1 from 2015-09-09 22.13.23 by buildguy) : org.pentaho.di.core.exception.KettleException:
    2015/09/14 06:41:05 - Hadoop File Output.0 - Error opening new file : org.pentaho.di.core.exception.KettleFileException:
    2015/09/14 06:41:05 - Hadoop File Output.0 - Unable to get VFS File object for filename 'hdfs://devuser:password@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt' : Invalid absolute URI "hdfs://devuser:***@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt".
    2015/09/14 06:41:05 - Hadoop File Output.0 -   at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.openNewFile(TextFileOutput.java:654)
    2015/09/14 06:41:05 - Hadoop File Output.0 -   at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.init(TextFileOutput.java:754)
    2015/09/14 06:41:05 - Hadoop File Output.0 -   at org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
    2015/09/14 06:41:05 - Hadoop File Output.0 -   at java.lang.Thread.run(Thread.java:745)
    {noformat}

    Pentaho BI Platform Tracking | 1 year ago | Alexander Buloichik
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://devuser:password@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt' : Invalid absolute URI "hdfs://devuser:***@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt".
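
    The giveaway in the message is the doubled scheme: the cluster's base URL (hdfs://...:8020) has been prepended to a path that is already a fully qualified hdfs:// URL, so the result cannot be parsed as a URI. A minimal sketch of why such a string fails, using java.net.URI for illustration (Kettle's VFS layer has its own resolver, and the hostname below is shortened to "cluster" for readability):

    {code:java}
    import java.net.URI;
    import java.net.URISyntaxException;

    public class DoubledUriDemo {
        public static void main(String[] args) throws URISyntaxException {
            // Base cluster URL concatenated with an already-absolute URL,
            // mirroring the log above (host shortened for readability).
            String doubled = "hdfs://devuser:password@cluster:8020"
                           + "hdfs://cluster:8020/user/devuser/output_mt.txt";

            URI uri = new URI(doubled);          // parses leniently (registry authority)
            System.out.println(uri.getHost());   // null: no usable host was recovered

            // Forcing server-based parsing exposes the defect: the authority
            // runs to the first '/', so the "port" is "8020hdfs:", not a number.
            uri.parseServerAuthority();          // throws URISyntaxException
        }
    }
    {code}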
  2.

    When using the browse button in the Hadoop Copy Files step and pressing Connect to connect to the cluster, you receive 'Could not resolve file "hdfs://localhost/".' There is no additional information in the Spoon log. When manually specifying an HDFS folder path instead of browsing and selecting, you receive the following stack trace when you run the job:

    {noformat}
    2012/01/31 14:50:39 - Hadoop Copy Files - Processing row source File/folder source : [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] ... destination file/folder : [hdfs://localhost:9000/user/pdi/weblogs/raw]... wildcard : [^.*\.txt]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Can not copy file/folder [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] to [hdfs://localhost:9000/user/pdi/weblogs/raw]. Exception : [
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : ]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : org.pentaho.di.core.exception.KettleFileException:
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:528)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:667)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:393)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.run(Job.java:313)
    {noformat}

    Use the http://wiki.pentaho.com/display/BAD/Loading+Data+into+HDFS how-to for a test case. This works on CDH3u2.

    Pentaho BI Platform Tracking | 5 years ago | Chris Deptula
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
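
    Note a detail in the error text: the requested URI carries port 9000, but the URI in "Could not resolve file" has lost it (hdfs://localhost/...), which points at the PDI/VFS layer rather than the cluster itself. A quick way to confirm the NameNode is reachable at the expected port outside Spoon is a plain Hadoop client call (a sketch, assuming Hadoop client jars matching the cluster version are on the classpath):

    {code:java}
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsProbe {
        public static void main(String[] args) throws Exception {
            // Connect to the NameNode directly, bypassing Kettle VFS entirely.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000/"), conf);

            // If this prints without an exception, the cluster side is fine and
            // the problem is PDI's Hadoop distribution support (shim) instead.
            System.out.println(fs.exists(new Path("/user/pdi/weblogs/raw")));
        }
    }
    {code}

    If the probe succeeds while the job still fails, that matches the report's closing note: the same job works against CDH3u2, so the mismatch lies between PDI 4.3.0's Hadoop support and the cluster version under test.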
  3.

    Pentaho unable to copy files to Hadoop HDFS file system 1.0.3

    Stack Overflow | 1 year ago | Harinath Arasu
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://notroot/hadoop123@192.168.139.128:8020/input' : Could not resolve file "hdfs://notroot/hadoop123@192.168.139.128:8020/input".
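
    Here the URI itself is malformed: the user and password are separated by '/' instead of ':', so "notroot" is parsed as the host name and everything after the first slash becomes the path. A small illustration with java.net.URI (a sketch; PDI's VFS layer does its own parsing, but the authority rules are the same):

    {code:java}
    import java.net.URI;

    public class CredentialSyntaxDemo {
        public static void main(String[] args) {
            // '/' between user and password: "notroot" is taken as the host.
            String wrong = "hdfs://notroot/hadoop123@192.168.139.128:8020/input";
            // ':' between user and password: the IP is the host, as intended.
            String right = "hdfs://notroot:hadoop123@192.168.139.128:8020/input";

            System.out.println(URI.create(wrong).getHost()); // notroot
            System.out.println(URI.create(right).getHost()); // 192.168.139.128
        }
    }
    {code}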


    Root Cause Analysis

    1. org.pentaho.di.core.exception.KettleFileException

      Unable to get VFS File object for filename 'hdfs://devuser:password@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt' : Invalid absolute URI "hdfs://devuser:***@svqxbdcn6cdh54secn1.pentahoqa.com:8020hdfs://svqxbdcn6cdh54secn1.pentahoqa.com:8020/user/devuser/test3/wordcount/wordcount-hdfs-output1/output_mt09-14-2015-06-41-05-272.txt".

      at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.openNewFile()
    2. org.pentaho.di
      StepInitThread.run
      1. org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.openNewFile(TextFileOutput.java:654)
      2. org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.init(TextFileOutput.java:754)
      3. org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
      3 frames
    3. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:745)
      1 frame
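
    The frames place the failure in TextFileOutput.openNewFile, i.e. while the step resolves the output file name and before anything touches the cluster, which is consistent with the cluster base URL having been prepended to an already fully qualified hdfs:// URL. A defensive guard along these lines would avoid the doubling (a hypothetical helper for illustration, not Kettle API; host shortened to "cluster"):

    {code:java}
    public class HdfsUrlGuard {
        // Hypothetical helper: prepend the cluster base URL only when the
        // file name is not already a fully qualified hdfs:// URL.
        static String qualify(String baseUrl, String fileName) {
            return fileName.startsWith("hdfs://") ? fileName : baseUrl + fileName;
        }

        public static void main(String[] args) {
            String base = "hdfs://devuser:password@cluster:8020";
            // Already absolute: returned unchanged instead of being doubled.
            System.out.println(qualify(base, "hdfs://cluster:8020/user/devuser/out.txt"));
            // Relative to the cluster root: the base URL is prepended.
            System.out.println(qualify(base, "/user/devuser/out.txt"));
        }
    }
    {code}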