org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException Unrecognized file format in STORED AS clause: 'TEXTILE'

Stack Overflow | AKC | 2 months ago
Similar reports:

  1. CREATE TABLE of HIVE in SPARK/JAVA Program is giving error

     Stack Overflow | 2 months ago | AKC
     org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException Unrecognized file format in STORED AS clause: 'TEXTILE'
  2. [HIVE-10990] Compatibility Hive-1.2 an hbase-1.0.1.1 - ASF JIRA

     apache.org | 11 months ago
     org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
  3. locking hive table from spark HiveContext

     Stack Overflow | 8 months ago | Vivek Kumar
     org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. lock Table LockManager not specified
  4. [SPARK-2706] Enable Spark to support Hive 0.13 - ASF JIRA

     apache.org | 1 year ago
     org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10072]: Database does not exist:
  5. [SPARK-2706] Enable Spark to support Hive 0.13 - ASF JIRA

     apache.org | 1 year ago
     org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10072]: Database does not exist: default

    Root Cause Analysis

    1. org.apache.spark.sql.execution.QueryExecutionException

      FAILED: SemanticException Unrecognized file format in STORED AS clause: 'TEXTILE'

      at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply()
    2. org.apache.spark
      ClientWrapper.runSqlHive
      1. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:499)
      2. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
      3. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
      4. org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
      5. org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
      6. org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:279)
      7. org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:484)
      8. org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:474)
      8 frames
    3. Spark Project Hive
      HiveNativeCommand.run
      1. org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:605)
      2. org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
      2 frames
    4. Spark Project SQL
      SparkPlan$$anonfun$execute$5.apply
      1. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
      2. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
      3. org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
      4. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
      5. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
      5 frames
    5. Spark
      RDDOperationScope$.withScope
      1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      1 frame
    6. Spark Project SQL
      SQLContext.sql
      1. org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
      2. org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
      3. org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
      4. org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
      5. org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
      6. org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
      7. org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
      7 frames
    7. com.comcast.emm
      LegalDemand.main
      1. com.comcast.emm.vodip.Viper2.LegalDemand.main(LegalDemand.java:84)
      1 frame
    8. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    9. Spark
      SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain
      1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
      1 frame
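
The failure comes from the DDL itself rather than from Spark or Hive internals: Hive's semantic analyzer does not recognize 'TEXTILE' as a file format, so the STORED AS clause almost certainly contains a typo for TEXTFILE. Below is a minimal sketch of a corrected statement issued from a Spark 1.x HiveContext, matching the stack above; the class, table, and column names are placeholders, not taken from the original LegalDemand program.

    // Minimal sketch, assuming Spark 1.x with a HiveContext as in the trace above.
    // Table and column names are placeholders, not from the original program.
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.hive.HiveContext;

    public class CreateTextTable {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("CreateTextTable");
            JavaSparkContext jsc = new JavaSparkContext(conf);
            HiveContext hive = new HiveContext(jsc.sc());

            // STORED AS must name a format Hive recognizes; 'TEXTILE' is rejected
            // during semantic analysis, while TEXTFILE (plain delimited text) is valid.
            hive.sql("CREATE TABLE IF NOT EXISTS demo_table (id INT, name STRING) "
                   + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' "
                   + "STORED AS TEXTFILE");

            jsc.stop();
        }
    }

Any file format Hive supports (for example TEXTFILE, SEQUENCEFILE, RCFILE, ORC, or PARQUET, depending on the Hive version) is accepted in STORED AS; any other token fails with the same SemanticException before the table is created.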