Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message. Try a sample exception.

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by AKC, 1 year ago
FAILED: SemanticException Unrecognized file format in STORED AS clause: 'TEXTILE'
via Stack Overflow by Zanarou, 1 year ago
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
via Stack Overflow by Vivek Kumar, 1 year ago
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. lock Table LockManager not specified
via apache.org by Unknown author, 2 years ago
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
via Stack Overflow by aa8y, 2 years ago
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Unable to create database path file:/user/hive/warehouse/foo.db, failed to create database foo)
org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException Unrecognized file format in STORED AS clause: 'TEXTILE'
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:499)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:484)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:290)
	at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:237)
	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:236)
	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:279)
	at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:484)
	at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:474)
	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:605)
	at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
	at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
	at com.comcast.emm.vodip.Viper2.LegalDemand.main(LegalDemand.java:84)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
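
The root cause in this trace is a typo in the Hive DDL: 'TEXTILE' should be 'TEXTFILE' in the STORED AS clause. Below is a minimal sketch of a corrected statement, assuming a Spark 1.x HiveContext as in the trace above; the class name, table name, and columns are hypothetical, not taken from the original job.

	import org.apache.spark.SparkConf;
	import org.apache.spark.api.java.JavaSparkContext;
	import org.apache.spark.sql.hive.HiveContext;

	public class StoredAsTextfileExample {
	    public static void main(String[] args) {
	        SparkConf conf = new SparkConf().setAppName("StoredAsTextfileExample");
	        JavaSparkContext sc = new JavaSparkContext(conf);
	        // HiveContext routes native DDL like CREATE TABLE through Hive,
	        // which is where the SemanticException in the trace is raised.
	        HiveContext hiveContext = new HiveContext(sc.sc());

	        // 'TEXTFILE' is the valid keyword; the misspelling 'TEXTILE' fails
	        // semantic analysis with "Unrecognized file format in STORED AS clause".
	        hiveContext.sql(
	            "CREATE TABLE IF NOT EXISTS sample_table (id INT, payload STRING) "
	          + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' "
	          + "STORED AS TEXTFILE");

	        sc.stop();
	    }
	}

As a general note, Hive only accepts a fixed set of keywords in STORED AS (TEXTFILE, SEQUENCEFILE, RCFILE, ORC, PARQUET, and, depending on the Hive version, AVRO); anything else is rejected during semantic analysis before the query runs.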