org.pentaho.di.core.exception.KettleFileException: 2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) : Unable to get VFS File object for filename 'hdfs://localhost:9000/usr/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/usr/pdi/weblogs/raw".

pentaho.com | 1 month ago
  1. Hadoop Copy files Step in Job [Archive] - Pentaho Community Forums

     pentaho.com | 1 month ago
     org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://localhost:9000/usr/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/usr/pdi/weblogs/raw".
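
     Note that the resolved URL in this report loses its port ("hdfs://localhost:9000/..." is reported back as "hdfs://localhost/..."), which suggests the URL never reached a working HDFS driver. A minimal connectivity probe, independent of Kettle, is sketched below; it assumes Hadoop client jars matching the cluster are on the classpath and reuses the host, port, and path from the report. If this also fails, the problem is the cluster or the URL rather than PDI:

         import java.net.URI;

         import org.apache.hadoop.conf.Configuration;
         import org.apache.hadoop.fs.FileSystem;
         import org.apache.hadoop.fs.Path;

         public class HdfsProbe {
             public static void main(String[] args) throws Exception {
                 // Use exactly the scheme://host:port entered in the Copy Files step.
                 FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000/"),
                                                new Configuration());
                 // The Hadoop client reports connection and path problems far more
                 // specifically than the VFS "Could not resolve file" wrapper does.
                 System.out.println(fs.exists(new Path("/usr/pdi/weblogs/raw")));
                 fs.close();
             }
         }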
  2. When using the browse button in the Hadoop Copy Files step and pressing Connect to connect to the cluster, you receive 'Could not resolve file "hdfs://localhost/".' There is no additional information in the Spoon log. When manually specifying an HDFS folder path instead of browsing and selecting, you receive the following stack trace when you run the job:

     2012/01/31 14:50:39 - Hadoop Copy Files - Processing row source File/folder source : [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] ... destination file/folder : [hdfs://localhost:9000/user/pdi/weblogs/raw]... wildcard : [^.*\.txt]
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Can not copy file/folder [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] to [hdfs://localhost:9000/user/pdi/weblogs/raw]. Exception : [
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : ]
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.Job.execute(Job.java:528)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.Job.execute(Job.java:667)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.Job.execute(Job.java:393)
     2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) :   at org.pentaho.di.job.Job.run(Job.java:313)

     Use the http://wiki.pentaho.com/display/BAD/Loading+Data+into+HDFS how-to for a test case. This works on CDH3u2. (A VFS scheme check is sketched after this entry.)

    Pentaho BI Platform Tracking | 5 years ago | Chris Deptula
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
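
     A plausible first check for this error is whether the running VFS manager has an "hdfs" provider registered at all (for example, whether the Big Data plugin loaded): if it does not, no hdfs:// URL can resolve, whatever the state of the cluster. A minimal sketch against the Commons VFS 2 API; Kettle 4.x bundled VFS 1, where the same calls live under org.apache.commons.vfs:

         import org.apache.commons.vfs2.FileSystemManager;
         import org.apache.commons.vfs2.VFS;

         public class VfsSchemeCheck {
             public static void main(String[] args) throws Exception {
                 FileSystemManager mgr = VFS.getManager();
                 // Every scheme this manager can resolve; "hdfs" must appear
                 // here before any hdfs:// URL stands a chance.
                 for (String scheme : mgr.getSchemes()) {
                     System.out.println(scheme);
                 }
                 System.out.println("hdfs registered: " + mgr.hasProvider("hdfs"));
             }
         }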
  3. Pentaho unable to copy files to Hadoop HDFS file system 1.0.3

    Stack Overflow | 1 year ago | Harinath Arasu
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'hdfs://notroot/hadoop123@192.168.139.128:8020/input' : Could not resolve file "hdfs://notroot/hadoop123@192.168.139.128:8020/input".
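
     Unlike the localhost reports above, this URL is malformed rather than unreachable: in a URI, credentials belong in the authority as user:password@host, so in 'hdfs://notroot/hadoop123@192.168.139.128:8020/input' the literal "notroot" parses as the host and everything after it as the path. A quick demonstration with java.net.URI; the corrected form assumes the poster intended notroot/hadoop123 as a username/password pair:

         import java.net.URI;

         public class UriCheck {
             public static void main(String[] args) {
                 URI broken = URI.create("hdfs://notroot/hadoop123@192.168.139.128:8020/input");
                 System.out.println(broken.getHost()); // notroot
                 System.out.println(broken.getPort()); // -1 (no port at all)
                 System.out.println(broken.getPath()); // /hadoop123@192.168.139.128:8020/input

                 URI fixed = URI.create("hdfs://notroot:hadoop123@192.168.139.128:8020/input");
                 System.out.println(fixed.getUserInfo()); // notroot:hadoop123
                 System.out.println(fixed.getHost());     // 192.168.139.128
                 System.out.println(fixed.getPort());     // 8020
             }
         }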
  4. I offered updates for pash to the marketplace, and these were merged: https://github.com/pentaho/marketplace-metadata/pull/314 Among these is one with this <version>:

        <version>
          <branch>MONDRIAN-2263</branch>
          <version>0.14 / MONDRIAN-2263</version>
          <name>Beta</name>
          <package_url>https://github.com/rpbouman/pash/raw/MONDRIAN-2263/bin/pash.zip</package_url>
          <description>Pash version 0.14 / MONDRIAN-2263</description>
          <min_parent_version>5.3</min_parent_version>
          <development_stage>
            <lane>Community</lane>
            <phase>3</phase>
          </development_stage>
        </version>

     This one, but not the others, fails to install. The server log shows:

     2015/04/14 12:22:19 - download_and_install_plugin - Start of job execution
     2015/04/14 12:22:19 - download_and_install_plugin - exec(0, 0, START.0)
     2015/04/14 12:22:19 - START - Starting job entry
     2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Download Plugin]
     2015/04/14 12:22:19 - download_and_install_plugin - exec(1, 0, Download Plugin.0)
     2015/04/14 12:22:19 - Download Plugin - Starting job entry
     2015/04/14 12:22:19 - Download Plugin - Start of HTTP job entry.
     2015/04/14 12:22:19 - Download Plugin - Connecting to URL: https://github.com/rpbouman/pash/raw/MONDRIAN-2263/bin/pash.zip
     2015/04/14 12:22:19 - Download Plugin - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Error getting file from HTTP :
     2015/04/14 12:22:19 - Download Plugin - java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
     2015/04/14 12:22:19 - Download Plugin - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleFileException: java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:302)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.job.entries.http.JobEntryHTTP.execute(JobEntryHTTP.java:471)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.job.Job.execute(Job.java:716)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.job.Job.execute(Job.java:859)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.job.Job.execute(Job.java:532)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.job.Job.run(Job.java:424)
     2015/04/14 12:22:19 - Download Plugin - Caused by: java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:268)
     2015/04/14 12:22:19 - Download Plugin - at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:300)
     2015/04/14 12:22:19 - Download Plugin - ... 5 more
     2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Unzip]
     2015/04/14 12:22:19 - download_and_install_plugin - exec(2, 0, Unzip.0)
     2015/04/14 12:22:19 - Unzip - Starting job entry
     2015/04/14 12:22:19 - Unzip - Starting ...
     2015/04/14 12:22:19 - Unzip - Processing row source File/folder source : [zip:${downloadDestination}] ... destination file/folder : [${stagingDestination}]... wildcard : [null]
     2015/04/14 12:22:19 - Unzip - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Can not copy file/folder [zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip] to [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging]. Exception : [
     2015/04/14 12:22:19 - Unzip - Unable to get VFS File object for filename 'zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip' : Could not replicate "file:///home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip" as it does not exist.
     2015/04/14 12:22:19 - Unzip - ]
     2015/04/14 12:22:19 - Unzip - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip' : Could not replicate "file:///home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip" as it does not exist.
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:154)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:102)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:362)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:315)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.Job.execute(Job.java:716)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.Job.execute(Job.java:859)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.Job.execute(Job.java:859)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.Job.execute(Job.java:532)
     2015/04/14 12:22:19 - Unzip - at org.pentaho.di.job.Job.run(Job.java:424)
     2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Unzipped Plugin Exists]
     2015/04/14 12:22:19 - download_and_install_plugin - exec(3, 0, Unzipped Plugin Exists.0)
     2015/04/14 12:22:19 - Unzipped Plugin Exists - Starting job entry
     2015/04/14 12:22:19 - Unzipped Plugin Exists - File [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging/pash] does not exist!
     2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Cleanup Staging]
     2015/04/14 12:22:19 - download_and_install_plugin - exec(4, 0, Cleanup Staging.0)
     2015/04/14 12:22:19 - Cleanup Staging - Starting job entry
     2015/04/14 12:22:19 - Cleanup Staging - Processing folder [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging]
     2015/04/14 12:22:19 - Cleanup Staging - Folder /home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging was deleted. Total deleted files = 1
     2015/04/14 12:22:19 - Cleanup Staging - =======================================
     2015/04/14 12:22:19 - Cleanup Staging - Number of errors : 0
     2015/04/14 12:22:19 - Cleanup Staging - Number of deleted folders : 1
     2015/04/14 12:22:19 - Cleanup Staging - =======================================
     2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Abort job]
     2015/04/14 12:22:19 - download_and_install_plugin - exec(5, 0, Abort job.0)
     2015/04/14 12:22:19 - Abort job - Starting job entry
     2015/04/14 12:22:19 - Abort job - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Plugin did not contain ID
     2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Abort job] (result=[false])
     2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Cleanup Staging] (result=[false])
     2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Unzipped Plugin Exists] (result=[false])
     2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Unzip] (result=[false])
     2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Download Plugin] (result=[false])
     2015/04/14 12:22:19 - download_and_install_plugin - Job execution finished

     The server log indicates that the "/" in the version number ends up in the server file path. This is most likely the cause of the failure. (A path-sanitizing sketch follows this entry.)

    Pentaho BI Platform Tracking | 2 years ago | Roland Bouman
    org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename 'zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip' : Could not replicate "file:///home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip" as it does not exist.
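
     The reporter's diagnosis is that the raw <version> string "0.14 / MONDRIAN-2263" is used verbatim to build the download directory, so the embedded "/" and its surrounding spaces split the cache path in two. The sketch below illustrates the kind of sanitizing the report implies; the class and method names are hypothetical, not Pentaho's:

         import java.io.File;

         public class VersionPathSanitizer {
             // Replace path separators and whitespace runs with "-" so a
             // marketplace version string can serve as one directory name.
             static String sanitize(String version) {
                 return version.trim().replaceAll("[/\\\\\\s]+", "-");
             }

             public static void main(String[] args) {
                 String version = "0.14 / MONDRIAN-2263";
                 System.out.println(sanitize(version)); // 0.14-MONDRIAN-2263
                 File dir = new File("plugin-cache/downloads/pash-" + sanitize(version));
                 System.out.println(dir.getPath()); // a single, creatable directory
             }
         }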


    Root Cause Analysis

    1. org.pentaho.di.core.exception.KettleFileException

      2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) : Unable to get VFS File object for filename 'hdfs://localhost:9000/usr/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/usr/pdi/weblogs/raw".

      at org.pentaho.di.core.vfs.KettleVFS.getFileObject()
    2. org.pentaho.di
      Job.run
      1. org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
      2. org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
      3. org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
      4. org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
      5. org.pentaho.di.job.Job.execute(Job.java:528)
      6. org.pentaho.di.job.Job.execute(Job.java:667)
      7. org.pentaho.di.job.Job.execute(Job.java:393)
      8. org.pentaho.di.job.Job.run(Job.java:313)
      8 frames
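
      Every trace on this page bottoms out in KettleVFS.getFileObject, so the resolution step can be reproduced in a scratch class without running the whole job. A sketch, assuming the PDI 4.x core jars and the Big Data plugin are on the classpath; KettleEnvironment.init() initializes the plugin registry so plugin-provided VFS drivers get registered:

          import org.apache.commons.vfs.FileObject;

          import org.pentaho.di.core.KettleEnvironment;
          import org.pentaho.di.core.vfs.KettleVFS;

          public class ResolveProbe {
              public static void main(String[] args) throws Exception {
                  KettleEnvironment.init(); // load plugins before resolving
                  FileObject fo = KettleVFS.getFileObject("hdfs://localhost:9000/usr/pdi/weblogs/raw");
                  System.out.println(fo.exists());
              }
          }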