org.pentaho.di.core.exception.KettleFileException

  • When using the Browse button in the Hadoop Copy Files step and pressing Connect to connect to the cluster, you receive 'Could not resolve file "hdfs://localhost/".' There is no additional information in the Spoon log. When manually specifying an HDFS folder path instead of browsing and selecting, you receive the following stack trace when you run the job:

    2012/01/31 14:50:39 - Hadoop Copy Files - Processing row source File/folder source : [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] ... destination file/folder : [hdfs://localhost:9000/user/pdi/weblogs/raw]... wildcard : [^.*\.txt]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Can not copy file/folder [file:///share/Cloudera/Sample Data/weblogs_rebuild.txt] to [hdfs://localhost:9000/user/pdi/weblogs/raw]. Exception : [
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : ]
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : org.pentaho.di.core.exception.KettleFileException:
    2012/01/31 14:50:39 - Hadoop Copy Files - ERROR (version 4.3.0, build 16295 from 2012-01-27 15.53.26 by tomcat) : Unable to get VFS File object for filename 'hdfs://localhost:9000/user/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/user/pdi/weblogs/raw".
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
      at org.pentaho.di.job.Job.execute(Job.java:528)
      at org.pentaho.di.job.Job.execute(Job.java:667)
      at org.pentaho.di.job.Job.execute(Job.java:393)
      at org.pentaho.di.job.Job.run(Job.java:313)

    Use the http://wiki.pentaho.com/display/BAD/Loading+Data+into+HDFS how-to for a test case. This works on CDH3u2.
    by Chris Deptula
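
    A possible angle on this one: the error says the hdfs:// URL cannot be resolved at all (note that the URL in the error message has even lost the :9000 port), which usually points at the HDFS VFS provider or the Hadoop client libraries not being available to Kettle, or the NameNode not being reachable, rather than at the path itself. A minimal sketch for exercising the resolution step outside the job, assuming the PDI big-data plugin and matching Hadoop client jars are on the classpath and a NameNode listens on localhost:9000; the class names come from the stack trace above, and on PDI 5.x+ the FileObject import is org.apache.commons.vfs2 instead of org.apache.commons.vfs:

      // HdfsResolveCheck.java - reproduces the KettleVFS resolution step in isolation.
      // If the hdfs:// provider is missing or the cluster is unreachable, getFileObject()
      // throws the same KettleFileException ("Could not resolve file ...") as the job.
      import org.apache.commons.vfs.FileObject;   // org.apache.commons.vfs2.FileObject on PDI 5.x+
      import org.pentaho.di.core.KettleEnvironment;
      import org.pentaho.di.core.vfs.KettleVFS;

      public class HdfsResolveCheck {
        public static void main(String[] args) throws Exception {
          KettleEnvironment.init();   // registers the VFS providers shipped with PDI and its plugins
          FileObject raw = KettleVFS.getFileObject("hdfs://localhost:9000/user/pdi/weblogs/raw");
          System.out.println(raw.getName().getURI() + " exists=" + raw.exists());
        }
      }

    If this small check fails with the same message, the fix is likely on the environment side (Hadoop client libraries and configuration, NameNode address and port), not in the Copy Files job itself.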
  • I offered updates for pash to the marketplace, and these were merged: https://github.com/pentaho/marketplace-metadata/pull/314 Among these is one with this <version>:

    <version>
      <branch>MONDRIAN-2263</branch>
      <version>0.14 / MONDRIAN-2263</version>
      <name>Beta</name>
      <package_url>https://github.com/rpbouman/pash/raw/MONDRIAN-2263/bin/pash.zip</package_url>
      <description>Pash version 0.14 / MONDRIAN-2263</description>
      <min_parent_version>5.3</min_parent_version>
      <development_stage>
        <lane>Community</lane>
        <phase>3</phase>
      </development_stage>
    </version>

    This one, but not the others, fails to install. The server log shows:

    2015/04/14 12:22:19 - download_and_install_plugin - Start of job execution
    2015/04/14 12:22:19 - download_and_install_plugin - exec(0, 0, START.0)
    2015/04/14 12:22:19 - START - Starting job entry
    2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Download Plugin]
    2015/04/14 12:22:19 - download_and_install_plugin - exec(1, 0, Download Plugin.0)
    2015/04/14 12:22:19 - Download Plugin - Starting job entry
    2015/04/14 12:22:19 - Download Plugin - Start of HTTP job entry.
    2015/04/14 12:22:19 - Download Plugin - Connecting to URL: https://github.com/rpbouman/pash/raw/MONDRIAN-2263/bin/pash.zip
    2015/04/14 12:22:19 - Download Plugin - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Error getting file from HTTP :
    2015/04/14 12:22:19 - Download Plugin - java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
    2015/04/14 12:22:19 - Download Plugin - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleFileException:
    2015/04/14 12:22:19 - Download Plugin - java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
      at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:302)
      at org.pentaho.di.job.entries.http.JobEntryHTTP.execute(JobEntryHTTP.java:471)
      at org.pentaho.di.job.Job.execute(Job.java:716)
      at org.pentaho.di.job.Job.execute(Job.java:859)
      at org.pentaho.di.job.Job.execute(Job.java:532)
      at org.pentaho.di.job.Job.run(Job.java:424)
    Caused by: java.io.IOException: Error creating output file! Parent directory [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 ] does not exist.
      at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:268)
      at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:300)
      ... 5 more
    2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Unzip]
    2015/04/14 12:22:19 - download_and_install_plugin - exec(2, 0, Unzip.0)
    2015/04/14 12:22:19 - Unzip - Starting job entry
    2015/04/14 12:22:19 - Unzip - Starting ...
    2015/04/14 12:22:19 - Unzip - Processing row source File/folder source : [zip:${downloadDestination}] ... destination file/folder : [${stagingDestination}]... wildcard : [null]
    2015/04/14 12:22:19 - Unzip - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Can not copy file/folder [zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip] to [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging]. Exception : [
    2015/04/14 12:22:19 - Unzip - Unable to get VFS File object for filename 'zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip' : Could not replicate "file:///home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip" as it does not exist.
    2015/04/14 12:22:19 - Unzip - ]
    2015/04/14 12:22:19 - Unzip - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleFileException:
    2015/04/14 12:22:19 - Unzip - Unable to get VFS File object for filename 'zip:/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip' : Could not replicate "file:///home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/downloads/pash-0.14 / MONDRIAN-2263_1429006939885.zip" as it does not exist.
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:154)
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:102)
      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:362)
      at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:315)
      at org.pentaho.di.job.Job.execute(Job.java:716)
      at org.pentaho.di.job.Job.execute(Job.java:859)
      at org.pentaho.di.job.Job.execute(Job.java:859)
      at org.pentaho.di.job.Job.execute(Job.java:532)
      at org.pentaho.di.job.Job.run(Job.java:424)
    2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Unzipped Plugin Exists]
    2015/04/14 12:22:19 - download_and_install_plugin - exec(3, 0, Unzipped Plugin Exists.0)
    2015/04/14 12:22:19 - Unzipped Plugin Exists - Starting job entry
    2015/04/14 12:22:19 - Unzipped Plugin Exists - File [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging/pash] does not exist!
    2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Cleanup Staging]
    2015/04/14 12:22:19 - download_and_install_plugin - exec(4, 0, Cleanup Staging.0)
    2015/04/14 12:22:19 - Cleanup Staging - Starting job entry
    2015/04/14 12:22:19 - Cleanup Staging - Processing folder [/home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging]
    2015/04/14 12:22:19 - Cleanup Staging - Folder /home/roland/pentaho-ce/biserver-ce-5.3.0.0-213/biserver-ce/pentaho-solutions/system/plugin-cache/staging was deleted. Total deleted files = 1
    2015/04/14 12:22:19 - Cleanup Staging - =======================================
    2015/04/14 12:22:19 - Cleanup Staging - Number of errors : 0
    2015/04/14 12:22:19 - Cleanup Staging - Number of deleted folders : 1
    2015/04/14 12:22:19 - Cleanup Staging - =======================================
    2015/04/14 12:22:19 - download_and_install_plugin - Starting entry [Abort job]
    2015/04/14 12:22:19 - download_and_install_plugin - exec(5, 0, Abort job.0)
    2015/04/14 12:22:19 - Abort job - Starting job entry
    2015/04/14 12:22:19 - Abort job - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Plugin did not contain ID
    2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Abort job] (result=[false])
    2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Cleanup Staging] (result=[false])
    2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Unzipped Plugin Exists] (result=[false])
    2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Unzip] (result=[false])
    2015/04/14 12:22:19 - download_and_install_plugin - Finished job entry [Download Plugin] (result=[false])
    2015/04/14 12:22:19 - download_and_install_plugin - Job execution finished

    The server log indicates that the "/" in the version number ends up in the server file path. This is most likely the cause of the failure.
    by Roland Bouman
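
    The last lines make the mechanism visible: the marketplace installer builds the download directory name from the literal <version> string, so "0.14 / MONDRIAN-2263" turns into a path containing an extra "/" (plus the surrounding spaces), and the parent directory the HTTP entry expects ("pash-0.14 ") never gets created. Until the installer sanitizes that value, the practical workaround is to keep "/" and other path separators out of <version>. A hypothetical sanitizing step, purely to illustrate the idea (this is not the marketplace plugin's actual code; the class and method names are made up):

      import java.util.regex.Pattern;

      public class VersionPathSanitizer {
        // Collapse slashes and whitespace runs into a single dash so the version
        // string is safe to use as a directory name.
        private static final Pattern UNSAFE = Pattern.compile("[/\\s]+");

        // "0.14 / MONDRIAN-2263" -> "0.14-MONDRIAN-2263"
        static String toDirectoryName(String version) {
          return UNSAFE.matcher(version.trim()).replaceAll("-");
        }

        public static void main(String[] args) {
          System.out.println(toDirectoryName("0.14 / MONDRIAN-2263"));
        }
      }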
  • Unable to get VFS File object for filename
    via Stack Overflow by Priyanka pal
  • Getting Error in Pentaho 7.0
    via GitHub by sumit140
  • See the attached image for a screenshot of the transformation, the shared files field and the error. This happened when I executed the transformation.

    There was an error while reading the shared objects (continuing load) :
    org.pentaho.di.core.exception.KettleXMLException:
    Unexpected problem reading shared objects from XML file : /../../res/shared.xml
    Unable to get VFS File object for filename '/../../res/shared.xml' : Invalid relative file name.
      at org.pentaho.di.shared.SharedObjects.<init>(SharedObjects.java:183)
      at org.pentaho.di.trans.TransMeta.readSharedObjects(TransMeta.java:3319)
      at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2877)
      at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2781)
      at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2741)
      at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2727)
      at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2702)
      at org.pentaho.di.trans.Trans.<init>(Trans.java:478)
      at org.pentaho.di.ui.spoon.trans.TransGraph.start(TransGraph.java:3313)
      at org.pentaho.di.ui.spoon.delegates.SpoonTransformationDelegate.executeTransformation(SpoonTransformationDelegate.java:922)
      at org.pentaho.di.ui.spoon.Spoon$31$1.run(Spoon.java:7619)
      at org.eclipse.swt.widgets.RunnableLock.run(Unknown Source)
      at org.eclipse.swt.widgets.Synchronizer.runAsyncMessages(Unknown Source)
      at org.eclipse.swt.widgets.Display.runAsyncMessages(Unknown Source)
      at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
      at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1221)
      at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7044)
      at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:8304)
      at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:580)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)
    Caused by: org.pentaho.di.core.exception.KettleFileException: Unable to get VFS File object for filename '/../../res/shared.xml' : Invalid relative file name.
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
      at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:99)
      at org.pentaho.di.shared.SharedObjects.<init>(SharedObjects.java:105)
      ... 23 more
    by Curtis Boyden
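
    Here the path itself is the problem: '/../../res/shared.xml' starts at the filesystem root and then tries to climb above it with "..", which Commons VFS rejects as an invalid relative file name before shared.xml is ever read. A relative shared-objects path normally needs something to resolve against, for example Kettle's internal variable ${Internal.Transformation.Filename.Directory}, or simply an absolute path. A small sketch of the distinction, assuming an initialized Kettle environment (the concrete paths are made up for illustration):

      import org.pentaho.di.core.KettleEnvironment;
      import org.pentaho.di.core.vfs.KettleVFS;

      public class SharedObjectsPathCheck {
        public static void main(String[] args) throws Exception {
          KettleEnvironment.init();

          // Fails with KettleFileException ("Invalid relative file name"):
          // the leading "/" makes the path absolute, and ".." cannot go above the root.
          // KettleVFS.getFileObject("/../../res/shared.xml");

          // Resolvable forms (illustrative paths):
          System.out.println(KettleVFS.getFileObject("/home/user/project/res/shared.xml").exists());
          System.out.println(KettleVFS.getFileObject("file:///home/user/project/res/shared.xml").exists());
        }
      }

    In Spoon the same idea applies to the "Shared objects file" setting: point it at an absolute location, or at a path built on ${Internal.Transformation.Filename.Directory} so it resolves relative to the transformation's own directory.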
    • org.pentaho.di.core.exception.KettleFileException:
      2012/05/03 12:13:15 - Hadoop Copy Files - ERROR (version 4.3.0-GA, build 16753 from 2012-04-18 21.39.30 by buildguy) : Unable to get VFS File object for filename 'hdfs://localhost:9000/usr/pdi/weblogs/raw' : Could not resolve file "hdfs://localhost/usr/pdi/weblogs/raw".
        at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:161)
        at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:104)
        at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:376)
        at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:324)
        at org.pentaho.di.job.Job.execute(Job.java:528)
        at org.pentaho.di.job.Job.execute(Job.java:667)
        at org.pentaho.di.job.Job.execute(Job.java:393)
        at org.pentaho.di.job.Job.run(Job.java:313)