java.io.IOException: Hive exited with status 9

Cloudera Open Source | Arvind Prabhakar | 6 years ago
  1.

    When importing data into Hive for a table that already exists, the import fails because the CREATE TABLE statement generated by Sqoop fails. Sqoop should instead allow the data to be loaded into the existing table when the --hive-overwrite option is specified. Relevant stack trace:

    {noformat}
    Sqoop import statement:
    sqoop import --driver com.teradata.jdbc.TeraDriver \
      --connect jdbc:teradata://****.com/***_V --username **** -P \
      --table Table_1 --split-by TABLENAME --num-mappers 1 \
      --warehouse-dir /userdata/***/sqoop/***/qatest \
      --hive-import --hive-overwrite --hive-table DB_1_Table_1 --verbose

    Sqoop console log:
    11/05/23 08:54:38 INFO mapreduce.ImportJobBase: Transferred 245 bytes in 11.3367 seconds (21.6112 bytes/sec)
    11/05/23 08:54:38 INFO mapreduce.ImportJobBase: Retrieved 4 records.
    11/05/23 08:54:38 INFO hive.HiveImport: Removing temporary files from import process: /***/***/sqoop/***/qatest/***/_logs
    11/05/23 08:54:38 INFO hive.HiveImport: Loading uploaded data into Hive
    11/05/23 08:54:38 DEBUG hive.HiveImport: Hive.inputTable: Table_1
    11/05/23 08:54:38 DEBUG hive.HiveImport: Hive.outputTable: DB_1_Table_1
    11/05/23 08:54:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Table_1 AS t WHERE 1=0
    11/05/23 08:54:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Table_1 AS t WHERE 1=0
    11/05/23 08:54:38 WARN hive.TableDefWriter: Column LOAD_START_DTTM had to be cast to a less precise type in Hive
    11/05/23 08:54:38 WARN hive.TableDefWriter: Column LOAD_END_DTTM had to be cast to a less precise type in Hive
    11/05/23 08:54:38 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE `DB_1_Table_1` ( `TABLENAME` STRING, `LOAD_START_DTTM` STRING, `LOAD_END_DTTM` STRING) COMMENT 'Imported by sqoop on 2011/05/23 08:54:38' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
    11/05/23 08:54:38 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://****/qatest/Table_1' INTO TABLE `DB_1_Table_1`
    11/05/23 08:54:38 DEBUG hive.HiveImport: Using external Hive process.
    11/05/23 08:54:40 INFO hive.HiveImport: Hive history file=/tmp/bejoys/hive_job_log_bejoys_201105230854_1483614299.txt
    11/05/23 08:54:42 INFO hive.HiveImport: FAILED: Error in metadata: AlreadyExistsException(message:Table DB_1_Table_1 already exists)
    11/05/23 08:54:42 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
    11/05/23 08:54:42 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9
        at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326)
        at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276)
        at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218)
        at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
        at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)
    {noformat}

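    The failure above points at an obvious fix direction: when --hive-overwrite is given, the generated DDL could tolerate a pre-existing table (`CREATE TABLE IF NOT EXISTS`) and the load could replace its contents (`LOAD DATA ... OVERWRITE INTO TABLE`), both of which are standard HiveQL. The sketch below is illustrative only; `HiveDdlSketch` and its methods are hypothetical names, not actual Sqoop source:

    ```java
    // Hypothetical sketch of overwrite-aware DDL generation for a Hive import.
    // Not Sqoop's real TableDefWriter; it only shows the shape of the fix.
    public class HiveDdlSketch {

        // With overwrite enabled, tolerate an existing table instead of
        // failing with AlreadyExistsException from the Hive metastore.
        public static String createStatement(String table, boolean hiveOverwrite) {
            String ifNotExists = hiveOverwrite ? "IF NOT EXISTS " : "";
            return "CREATE TABLE " + ifNotExists + "`" + table + "` "
                 + "(`TABLENAME` STRING, `LOAD_START_DTTM` STRING, `LOAD_END_DTTM` STRING) "
                 + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' "
                 + "LINES TERMINATED BY '\012' STORED AS TEXTFILE";
        }

        // With overwrite enabled, OVERWRITE replaces the table's existing
        // rows; without it, the load appends.
        public static String loadStatement(String path, String table, boolean hiveOverwrite) {
            String overwrite = hiveOverwrite ? "OVERWRITE " : "";
            return "LOAD DATA INPATH '" + path + "' " + overwrite
                 + "INTO TABLE `" + table + "`";
        }

        public static void main(String[] args) {
            System.out.println(createStatement("DB_1_Table_1", true));
            System.out.println(loadStatement("hdfs://nn/qatest/Table_1", "DB_1_Table_1", true));
        }
    }
    ```

    Until such a change lands, a manual workaround is to drop or truncate the Hive table before re-running the import.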
  2.

    [Sqoop-user] MySQLNonTransientConnectionException - Grokbase

    grokbase.com | 1 year ago
    java.io.IOException: Destination file hdfs://hadoop-namenode-2XXXXXXXXXXXX/user/hive/warehouse/hive_table1/data-00000 already exists
        at com.cloudera.sqoop.io.SplittingOutputStream.openNextFile(SplittingOutputStream.java:99)
        at com.cloudera.sqoop.io.SplittingOutputStream.<init>(SplittingOutputStream.java:80)
        at com.cloudera.sqoop.util.DirectImportUtils.createHdfsSink(DirectImportUtils.java:90)
        at com.cloudera.sqoop.manager.DirectPostgresqlManager.importTable(DirectPostgresqlManager.java:379)
        at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:382)
  3.

    Error on Last Modified Incremental import from Postgres

    incubator-sqoop-user | 5 years ago | Mark Roddy
    java.io.IOException: Could not get current time from database
  5.

    Re: Error on Last Modified Incremental import from Postgres

    incubator-sqoop-user | 5 years ago | bleeapache bleeapache
    java.io.IOException: Could not get current time from database
  6.

    Error on Last Modified Incremental import from Postgres

    sqoop-user | 5 years ago | Mark Roddy
    java.io.IOException: Could not get current time from database


    Root Cause Analysis

    1. java.io.IOException

      Hive exited with status 9

      at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript()
    2. com.cloudera.sqoop
      Sqoop.run
      1. com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326)
      2. com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276)
      3. com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218)
      4. com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
      5. com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
      6. com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
      6 frames
    3. Hadoop
      ToolRunner.run
      1. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
      2. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
      2 frames
    4. com.cloudera.sqoop
      Sqoop.main
      1. com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
      2. com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
      3. com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)
      3 frames