tool.ImportTool: Encountered IOException running import job: java.io.IOException: NoSuchObjectException(message:test.user1 table not found)

Similar reports:

  1. how to create multi level partition in hive using sqoop
     Stack Overflow | 4 months ago | Mahebub A Sayyed
     tool.ImportTool: Encountered IOException running import job: java.io.IOException: NoSuchObjectException(message:test.user1 table not found)

  2. Writing Data from MS-SQL server to HDFS using Sqoop
     Stack Overflow | 3 years ago | Bhagwant Bhobe
     tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not start Java compiler.

  3. Sqoop Incremental Import and CURRENT_TIMESTAMP
     Stack Overflow | 1 year ago | Sanjiv
     tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not get current time from database
  4. Re: sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
     sqoop-user | 11 months ago | Jarek Jarcec Cecho
     tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column row_version

  5. RE: Hortonworks Connector for Teradata + Sqoop Query
     sqoop-user | 2 years ago | Dhandapani, Karthik
     tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception running Teradata import job

Root Cause Analysis

tool.ImportTool: Encountered IOException running import job: java.io.IOException: NoSuchObjectException(message:test.user1 table not found)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
    at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
    at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
    at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)