java.io.IOException: No table was provided

Pentaho BI Platform Tracking | Chris Deptula | 4 years ago
  1.

    When using the org.pentaho.hadoop.mapred.PentahoTableInputFormat described at http://wiki.pentaho.com/display/BAD/Processing+HBase+data+in+Pentaho+MapReduce+using+TableInputFormat, the job fails with the following error when run from a Windows machine. The same job runs successfully on a Linux machine. (Each line of the log carried the prefix "2013/02/11 08:11:47 - Pentaho MapReduce - ERROR (version 4.4.0-GA, build 17542 from 2012-11-01 20.06.29 by buildguy) :", trimmed below for readability.)

    No table was provided
    java.io.IOException: No table was provided
        at org.apache.hadoop.hbase.mapred.TableInputFormatBase.getSplits(TableInputFormatBase.java:120)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
        at org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:891)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:844)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:844)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:818)
        at org.pentaho.hadoop.shim.common.CommonHadoopShim.submitJob(CommonHadoopShim.java:201)
        at org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:806)
        at org.pentaho.di.job.Job.execute(Job.java:589)
        at org.pentaho.di.job.Job.execute(Job.java:728)
        at org.pentaho.di.job.Job.execute(Job.java:443)
        at org.pentaho.di.job.Job.run(Job.java:363)

    Pentaho BI Platform Tracking | 4 years ago | Chris Deptula
    java.io.IOException: No table was provided
  2.

    hbase integration with pyspark

    Stack Overflow | 1 year ago | user3359790
    java.io.IOException: No table was provided.
  3.

    Standalone HBase with Spark, HBaseTest.scala is giving error

    Stack Overflow | 2 years ago
    java.io.IOException: No table was provided.
  4.

    PySpark HBase/Phoenix integration

    Stack Overflow | 1 year ago | Ranic
    java.io.IOException: No table was provided.

    Root Cause Analysis

    1. java.io.IOException

      No table was provided

      at org.apache.hadoop.hbase.mapred.TableInputFormatBase.getSplits()
    2. HBase
      TableInputFormatBase.getSplits
      1. org.apache.hadoop.hbase.mapred.TableInputFormatBase.getSplits(TableInputFormatBase.java:120)
      1 frame
    3. Hadoop
      JobClient$2.run
      1. org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
      2. org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
      3. org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
      4. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:891)
      5. org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:844)
      5 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Unknown Source)
      2 frames
    5. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
      1 frame
    6. Hadoop
      JobClient.submitJob
      1. org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:844)
      2. org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:818)
      2 frames
    7. org.pentaho.hadoop
      CommonHadoopShim.submitJob
      1. org.pentaho.hadoop.shim.common.CommonHadoopShim.submitJob(CommonHadoopShim.java:201)
      1 frame
    8. org.pentaho.di
      Job.run
      1. org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:806)
      2. org.pentaho.di.job.Job.execute(Job.java:589)
      3. org.pentaho.di.job.Job.execute(Job.java:728)
      4. org.pentaho.di.job.Job.execute(Job.java:443)
      5. org.pentaho.di.job.Job.run(Job.java:363)
      5 frames
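Taken together, the reports and the root-cause chain above all describe the same failure: getSplits() runs before any HBase table name has reached the input format. With the old org.apache.hadoop.hbase.mapred API (the Pentaho case) the table name is derived from the job's input path; with the newer org.apache.hadoop.hbase.mapreduce.TableInputFormat (the Spark/PySpark cases) it is read from the hbase.mapreduce.inputtable configuration property. A minimal pure-Python sketch of that guard and its fix follows; this is not the real HBase API, and the table and host names are placeholders, but the configuration key is the one the newer TableInputFormat actually reads.

```python
# Sketch of the guard in TableInputFormatBase.getSplits(): the input format
# refuses to compute splits when the job configuration never named a table.
INPUT_TABLE = "hbase.mapreduce.inputtable"  # real key read by the mapreduce-API TableInputFormat

def get_splits(conf):
    """Mimic TableInputFormatBase.getSplits(): fail fast when no table was set."""
    table = conf.get(INPUT_TABLE)
    if not table:
        raise IOError("No table was provided")
    # The real implementation contacts HBase and emits one split per region;
    # a single placeholder (table, start_row, stop_row) split is enough here.
    return [(table, b"", b"")]

# Reproduces the error from the reports above: the conf never names a table.
broken_conf = {"hbase.zookeeper.quorum": "zk-host"}
try:
    get_splits(broken_conf)
except IOError as e:
    print(e)  # No table was provided

# The fix: name the table in the configuration before submitting the job.
fixed_conf = {
    "hbase.zookeeper.quorum": "zk-host",   # placeholder host
    INPUT_TABLE: "my_table",               # the piece missing in every report above
}
print(get_splits(fixed_conf))
```

In PySpark, for instance, the equivalent fix is to include "hbase.mapreduce.inputtable" in the conf dict passed to sc.newAPIHadoopRDD; in a Pentaho MapReduce job using the mapred-API input format, the HBase table must be supplied where the wiki page above configures the job's input.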