org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: The Specified XMI File is invalid: C:\Program Files\pentaho\server\biserver-ee\pentaho-solutions\steel-wheels\metadata.xmi

Pentaho BI Platform Tracking | Li Deng | 4 years ago
  1.

    Creating a Metadata data source doesn't work for a crosstab. Repro steps:

    1. Drag and drop the crosstab icon onto the detail canvas.
    2. Click the green plus sign on the Crosstab Data Source dialog and select Metadata.
    3. Select a metadata XML file, enter a solution name, and edit the query. Notice that no business domains are available.

    Error message:

    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: The Specified XMI File is invalid: C:\Program Files\pentaho\server\biserver-ee\pentaho-solutions\steel-wheels\metadata.xmi
        at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdConnectionProvider.getMetadataDomainRepository(PmdConnectionProvider.java:104)
        at org.pentaho.reporting.ui.datasources.pmd.util.LoadRepositoryRunnable.buildDomainRepository(LoadRepositoryRunnable.java:79)
        at org.pentaho.reporting.ui.datasources.pmd.util.LoadRepositoryRunnable.run(LoadRepositoryRunnable.java:59)
        at java.lang.Thread.run(Unknown Source)
    Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceKeyCreationException: The derived entry does not exist in this bundle.
        at org.pentaho.reporting.libraries.docbundle.bundleloader.RepositoryResourceBundleLoader.deriveKey(RepositoryResourceBundleLoader.java:205)
        at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.deriveKey(BundleResourceManagerBackend.java:111)
        at org.pentaho.reporting.libraries.resourceloader.ResourceManager.deriveKey(ResourceManager.java:192)
        at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdConnectionProvider.getMetadataDomainRepository(PmdConnectionProvider.java:92)
        ... 3 more

    Pentaho BI Platform Tracking | 4 years ago | Li Deng
    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: The Specified XMI File is invalid: C:\Program Files\pentaho\server\biserver-ee\pentaho-solutions\steel-wheels\metadata.xmi
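    A quick way to narrow this down is to check whether metadata.xmi parses at all outside the report designer. The sketch below is a hedged, standalone diagnostic: it assumes the pentaho-metadata library's XmiParser and Domain classes are on the classpath, and it reuses the file path from the stack trace above; treat it as a rough check rather than a confirmed recipe.

    import java.io.FileInputStream;
    import java.io.InputStream;

    import org.pentaho.metadata.model.Domain;
    import org.pentaho.metadata.util.XmiParser;

    // Hedged diagnostic sketch: parse the XMI file directly with XmiParser.
    // If this throws, the file itself (or its encoding) is broken; if it
    // parses cleanly, the failure is more likely in how the designer derives
    // the resource key for the document bundle.
    public class XmiSanityCheck {
      public static void main(String[] args) throws Exception {
        String path = "C:\\Program Files\\pentaho\\server\\biserver-ee\\"
            + "pentaho-solutions\\steel-wheels\\metadata.xmi"; // path taken from the ticket
        try (InputStream in = new FileInputStream(path)) {
          Domain domain = new XmiParser().parseXmi(in);
          System.out.println("Parsed domain with "
              + domain.getLogicalModels().size() + " logical model(s)");
        }
      }
    }

    If the parse succeeds, the next thing to compare is the path the designer actually hands to PmdConnectionProvider (relative to the report bundle) against the absolute path shown in the error.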

  2.

    Something happens in PRD that causes it to be unable to read data from, or otherwise use, a PDI (i.e., .ktr) data source. I haven't been able to figure out exactly when it happens, though I'll add to this ticket if I can figure it out. Here is the setup: (1) I have a report with a main query and parameters sourced from a PDI (i.e., .ktr) data source. (2) Periodically, when something causes PRD to refresh the report (i.e., reload the data from the main query and reload the parameter values), I receive a popup error with the following stack trace. Once this has happened I cannot access the PDI data sources for metadata or data unless I restart PRD:

    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Unable to load Kettle-Transformation
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:159)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQuery(AbstractKettleTransformationProducer.java:203)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:121)
        at org.pentaho.reporting.ui.datasources.kettle.KettleDataSourceDialog$KettlePreviewWorker.run(KettleDataSourceDialog.java:592)
        at java.lang.Thread.run(Thread.java:744)
    Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceKeyCreationException: Unable to create key: No loader was able to handle the given key data: get_metertype_options.ktr
        at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.createKey(DefaultResourceManagerBackend.java:74)
        at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.createKey(BundleResourceManagerBackend.java:89)
        at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.createKey(BundleResourceManagerBackend.java:89)
        at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.createKey(BundleResourceManagerBackend.java:89)
        at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:151)
        at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:137)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.createKey(KettleTransFromFileProducer.java:118)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:138)
        ... 4 more

    Pentaho BI Platform Tracking | 2 years ago | Bryce Lobdell
    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Unable to load Kettle-Transformation
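    The "No loader was able to handle the given key data" cause suggests the bare transformation name is being resolved without the report's own context key (for example, when the report hasn't been saved or its bundle no longer contains the entry). The sketch below illustrates that difference with the libloader ResourceManager; the class and method names match the frames in the trace, but the paths are hypothetical and the exact behaviour depends on the libloader version in use.

    import java.io.File;

    import org.pentaho.reporting.libraries.resourceloader.ResourceKey;
    import org.pentaho.reporting.libraries.resourceloader.ResourceManager;

    // Hedged sketch of the key resolution that fails above. A bare name such
    // as "get_metertype_options.ktr" only resolves when it can be derived
    // from the report's context key; with no usable context, no loader
    // accepts the relative name and ResourceKeyCreationException is thrown.
    public class KtrKeyResolution {
      public static void main(String[] args) throws Exception {
        ResourceManager mgr = new ResourceManager();
        mgr.registerDefaults();

        // Context key for the saved report file (hypothetical path).
        ResourceKey reportKey = mgr.createKey(new File("C:/reports/meter-report.prpt"));

        // Expected to work: the relative .ktr name is derived against the
        // report's location, provided the file actually exists there.
        ResourceKey derived = mgr.deriveKey(reportKey, "get_metertype_options.ktr");
        System.out.println("Derived key: " + derived.getIdentifier());

        // Expected to fail with "No loader was able to handle the given key
        // data" when the bare name is used without any context key.
        mgr.createKey("get_metertype_options.ktr");
      }
    }

    If the report has lost its context (an unsaved file, a moved folder, or a .prpt bundle that no longer contains the entry), every refresh of the Kettle data source would hit the same exception, which may be why the dialog keeps failing until PRD is restarted and the report is re-opened from disk.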

  3.

    Please see the attached video: https://pentaho.box.com/s/kaw9b9gt69uzw1nulh62zc68q0sdodf1
    In the video you can see the following elements loaded in the same folder: ktr_query.ktr and Sample KTR Query.prpt. The report returns data when executed on demand, but when it is scheduled to run in the background, pentaho.log shows the following stack trace:

    {noformat}
    ERROR [org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor] 1923623266: Report processing failed.
    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Unable to load Kettle-Transformation
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:159)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQuery(AbstractKettleTransformationProducer.java:273)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:110)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:205)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
        at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:505)
        at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryStatic(CachingDataFactory.java:181)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:200)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:69)
        at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQueryData(DefaultFlowController.java:296)
        at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQuery(DefaultFlowController.java:217)
        at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.initializeForMasterReport(ProcessState.java:331)
        at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:481)
        at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1713)
        at org.pentaho.reporting.platform.plugin.output.FastStreamJcrHtmlOutput.generate(FastStreamJcrHtmlOutput.java:51)
        at org.pentaho.reporting.platform.plugin.SimpleReportingAction._execute(SimpleReportingAction.java:920)
        at org.pentaho.reporting.platform.plugin.SimpleReportingAction.execute(SimpleReportingAction.java:838)
        at org.pentaho.platform.scheduler2.quartz.ActionAdapterQuartzJob$1.call(ActionAdapterQuartzJob.java:241)
        at org.pentaho.platform.scheduler2.quartz.ActionAdapterQuartzJob$1.call(ActionAdapterQuartzJob.java:181)
        at org.pentaho.platform.engine.security.SecurityHelper.runAsUser(SecurityHelper.java:173)
        at org.pentaho.platform.engine.security.SecurityHelper.runAsUser(SecurityHelper.java:162)
        at org.pentaho.platform.scheduler2.quartz.ActionAdapterQuartzJob.invokeAction(ActionAdapterQuartzJob.java:267)
        at org.pentaho.platform.scheduler2.quartz.ActionAdapterQuartzJob.execute(ActionAdapterQuartzJob.java:140)
        at org.pentaho.platform.scheduler2.quartz.BlockingQuartzJob.execute(BlockingQuartzJob.java:39)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:199)
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:546)
    Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceKeyCreationException: Unable to create key: No loader was able to handle the given key data: ktr_query.ktr
        at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.createKey(DefaultResourceManagerBackend.java:74)
        at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.createKey(BundleResourceManagerBackend.java:89)
        at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:151)
        at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:137)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.createKey(KettleTransFromFileProducer.java:118)
        at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:138)
        ... 26 more
    {noformat}

    Please find attached a sample report and transformation to replicate the issue.

    Pentaho BI Platform Tracking | 1 year ago | Carlos Lopez
    org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Unable to load Kettle-Transformation
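    Because the same relative ktr_query.ktr reference works on demand and only breaks under the scheduler, one way to isolate it is to run the .prpt entirely outside the platform, from a local folder that also contains the .ktr. The sketch below uses the Classic engine bootstrap, resource-manager loading, and HTML stream output from the reporting engine; the file paths are hypothetical, and the HTML target is just a convenient way to force the Kettle query to execute.

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.OutputStream;

    import org.pentaho.reporting.engine.classic.core.ClassicEngineBoot;
    import org.pentaho.reporting.engine.classic.core.MasterReport;
    import org.pentaho.reporting.engine.classic.core.modules.output.table.html.HtmlReportUtil;
    import org.pentaho.reporting.libraries.resourceloader.ResourceManager;

    // Hedged sketch: run the report standalone from a folder that also holds
    // ktr_query.ktr. If this succeeds while the scheduled run fails, the
    // difference is most likely where the relative .ktr reference resolves
    // (local folder versus JCR repository path), not the transformation
    // itself. Paths are hypothetical.
    public class StandaloneRun {
      public static void main(String[] args) throws Exception {
        ClassicEngineBoot.getInstance().start();

        ResourceManager mgr = new ResourceManager();
        File prpt = new File("C:/reports/Sample KTR Query.prpt");
        MasterReport report = (MasterReport)
            mgr.createDirectly(prpt, MasterReport.class).getResource();

        try (OutputStream out = new FileOutputStream("C:/reports/sample-output.html")) {
          HtmlReportUtil.createStreamHTML(report, out);
        }
      }
    }

    If the standalone run produces output, that would suggest the scheduled run is resolving ktr_query.ktr against a different location inside the repository, rather than a problem in the transformation or the report definition.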

    Root Cause Analysis

    1. org.pentaho.reporting.libraries.resourceloader.ResourceKeyCreationException

      The derived entry does not exist in this bundle.

      at org.pentaho.reporting.libraries.docbundle.bundleloader.RepositoryResourceBundleLoader.deriveKey()
    2. org.pentaho.reporting
      LoadRepositoryRunnable.run
      1. org.pentaho.reporting.libraries.docbundle.bundleloader.RepositoryResourceBundleLoader.deriveKey(RepositoryResourceBundleLoader.java:205)
      2. org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.deriveKey(BundleResourceManagerBackend.java:111)
      3. org.pentaho.reporting.libraries.resourceloader.ResourceManager.deriveKey(ResourceManager.java:192)
      4. org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdConnectionProvider.getMetadataDomainRepository(PmdConnectionProvider.java:92)
      5. org.pentaho.reporting.ui.datasources.pmd.util.LoadRepositoryRunnable.buildDomainRepository(LoadRepositoryRunnable.java:79)
      6. org.pentaho.reporting.ui.datasources.pmd.util.LoadRepositoryRunnable.run(LoadRepositoryRunnable.java:59)
      6 frames
    3. Java RT
      Thread.run
      1. java.lang.Thread.run(Unknown Source)
      1 frame