org.pentaho.di.core.exception.KettleFileException:
2017/01/10 12:31:21 - Hadoop Copy Files -
2017/01/10 12:31:21 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://172.16.30.99:8020/test/' : Unknown message with code "Error connecting to filesystem hdfs://172.16.30.99:8020/".
2017/01/10 12:31:21 - Hadoop Copy Files -
2017/01/10 12:31:21 - Hadoop Copy Files -
2017/01/10 12:31:21 - Hadoop Copy Files -

Stack Overflow | Priyanka pal | 2 weeks ago
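
In PDI, an 'Unknown message with code "Error connecting to filesystem ..."' at this point usually means the VFS layer never established a connection to the NameNode at all: either the address/port is wrong, or the active Big Data shim does not match the cluster. One way to separate the two is to try the same URI with the plain Hadoop client API, outside of Kettle. A minimal sketch, assuming hadoop-client is on the classpath (the address and path are the ones from the trace above):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Standalone connectivity check: if this fails too, the problem is the
    // cluster address/port or the network, not Kettle's VFS configuration.
    public class HdfsConnectivityCheck {
        public static void main(String[] args) throws Exception {
            URI nameNode = URI.create("hdfs://172.16.30.99:8020/");
            // Open a FileSystem handle against the NameNode; this is the same
            // kind of connection the Hadoop Copy Files step needs to make.
            try (FileSystem fs = FileSystem.get(nameNode, new Configuration())) {
                // Probe the destination path from the failing job entry.
                System.out.println("/test exists: " + fs.exists(new Path("/test/")));
            }
        }
    }

If this check succeeds, the cluster is reachable and the shim configuration in PDI is the next thing to look at (see the note at the end of this page).
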
  1.

    Unable to get VFS File object for filename

    Stack Overflow | 2 weeks ago | Priyanka pal
    org.pentaho.di.core.exception.KettleFileException:
    2017/01/10 12:31:21 - Hadoop Copy Files -
    2017/01/10 12:31:21 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://172.16.30.99:8020/test/' : Unknown message with code "Error connecting to filesystem hdfs://172.16.30.99:8020/".
    2017/01/10 12:31:21 - Hadoop Copy Files -
    2017/01/10 12:31:21 - Hadoop Copy Files -
    2017/01/10 12:31:21 - Hadoop Copy Files -
  2.

    When using the browse button in the Hadoop Copy Files step and pressing Connect to connect to the cluster, you receive 'Could not resolve file "hdfs://localhost/".' There is no additional information in the Spoon log. When manually specifying an HDFS folder path instead of browsing and selecting, you receive the following stack trace when you run the job (a VFS scheme check sketch follows this entry):

    2012/01/31 14:50:39 - Hadoop Copy Files - Processing row source File/folder source : [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] ... destination file/folder : [hdfs://localhost:9000/user/pdi/weblogs/raw]... wildcard : [^.*\.txt]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Can not copy file/folder [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] to [hdfs://localhost:9000/user/pdi/weblogs/raw]. Exception : [
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : ]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : org.pentaho.di.core.exception.KettleFileException:
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:528)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:667)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.execute(Job.java:393)
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : at org.pentaho.di.job.Job.run(Job.java:313)

    Use the http://wiki.pentaho.com/display/BAD/Loading+Data+into+HDFS how-to for a test case. This works on CDH3u2.

    Pentaho BI Platform Tracking | 5 years ago | Chris Deptula
    org.pentaho.di.core.exception.KettleFileException:
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :
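
    Note how the port is dropped between the requested filename ('hdfs://localhost:9000/user/pdi/weblogs/raw') and the URI in the error ("hdfs://localhost/user/pdi/weblogs/raw"). One plausible reading is that no working VFS provider is registered for the hdfs scheme, so resolution falls back and fails; in PDI that usually points at the Big Data plugin/shim not being loaded. A sketch of how one might check the scheme registration outside of Spoon, using the commons-vfs2 API (PDI 4.3 itself shipped an older VFS, so treat this as illustrative):

        import org.apache.commons.vfs2.FileObject;
        import org.apache.commons.vfs2.FileSystemManager;
        import org.apache.commons.vfs2.VFS;

        // Sketch: verify that a provider for the "hdfs" scheme is registered
        // before resolving, instead of letting resolveFile() fail with a bare
        // "Could not resolve file" message.
        public class VfsSchemeCheck {
            public static void main(String[] args) throws Exception {
                FileSystemManager manager = VFS.getManager();
                if (!manager.hasProvider("hdfs")) {
                    // In PDI this typically means the plugin that contributes
                    // the hdfs provider is not on the classpath or not loaded.
                    System.err.println("No VFS provider registered for scheme 'hdfs'");
                    return;
                }
                FileObject dir = manager.resolveFile("hdfs://localhost:9000/user/pdi/weblogs/raw");
                System.out.println("exists: " + dir.exists());
            }
        }
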
  3.

    Hadoop Copy files Step in Job [Archive] - Pentaho Community Forums

    pentaho.com | 3 months ago
    org.pentaho.di.core.exception.KettleFileException:
    2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) :
    2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) : Unable to get VFS File object for filename 'hdfs://localhost:9000/usr/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/usr/pdi/weblogs/raw".
    2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) :
    2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) :
  4.

    Pentaho unable to copy files to Hadoop HDFS file system 1.0.3

    Stack Overflow | 1 year ago | Harinath Arasu
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://notroot/hadoop123@192.168.139.128:8020/input' : Could not resolve file "hdfs://notroot/hadoop123@192.168.139.128:8020/input".
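
    Note the credential syntax in this URI: in 'hdfs://notroot/hadoop123@192.168.139.128:8020/input' the first '/' ends the authority, so "notroot" is parsed as the hostname and the real address becomes part of the path, which is exactly the file VFS then cannot resolve. Commons VFS expects scheme://user:password@host:port/path. A small java.net.URI sketch showing the difference (user, password, and address are the values from the entry):

        import java.net.URI;

        // Compare how the two spellings of the same credentials parse.
        public class UriParseDemo {
            public static void main(String[] args) {
                URI broken = URI.create("hdfs://notroot/hadoop123@192.168.139.128:8020/input");
                // host = "notroot"; the credentials and the real address have
                // ended up inside the path component.
                System.out.println(broken.getHost() + " | " + broken.getPath());

                URI fixed = URI.create("hdfs://notroot:hadoop123@192.168.139.128:8020/input");
                // userInfo = "notroot:hadoop123", host = "192.168.139.128", port = 8020
                System.out.println(fixed.getUserInfo() + " @ " + fixed.getHost() + ":" + fixed.getPort());
            }
        }
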
  5.


    Pentaho BI Platform Tracking | 5 years ago | Chris Deptula
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".


    Root Cause Analysis

    1. org.pentaho.di.core.exception.KettleFileException

      2017/01/10 12:31:21 - Hadoop Copy Files -
      2017/01/10 12:31:21 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://172.16.30.99:8020/test/' : Unknown message with code "Error connecting to filesystem hdfs://172.16.30.99:8020/".
      2017/01/10 12:31:21 - Hadoop Copy Files -
      2017/01/10 12:31:21 - Hadoop Copy Files -
      2017/01/10 12:31:21 - Hadoop Copy Files -

      at org.pentaho.di.core.vfs.KettleVFS.getFileObject()
    2. org.pentaho.di
      Job.run
      1. org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:159)
      2. org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:106)
      3. org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:428)
      4. org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:382)
      5. org.pentaho.di.job.Job.execute(Job.java:724)
      6. org.pentaho.di.job.Job.execute(Job.java:865)
      7. org.pentaho.di.job.Job.execute(Job.java:546)
      8. org.pentaho.di.job.Job.run(Job.java:436)
      8 frames
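
    Across these reports the failing frame is the same (KettleVFS.getFileObject), and for PDI releases that ship the pentaho-big-data-plugin the first thing to check is that the active shim matches the cluster. A hedged example of the relevant setting (the path layout and the value "cdh510" are illustrative; the folders actually available under hadoop-configurations/ depend on the PDI release):

        # data-integration/plugins/pentaho-big-data-plugin/plugin.properties
        # Pick the folder name under hadoop-configurations/ that matches the
        # cluster distribution; "cdh510" is an example value only.
        active.hadoop.configuration=cdh510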