Recommended solutions

Samebug tips

  1. via playframework.com by Unknown author

     Add guice to the dependencies:

     libraryDependencies += guice
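     This line normally lives in the project's build.sbt. A minimal sketch, assuming the sbt Play plugin (which defines the guice key) is enabled; the project name and Scala version are hypothetical:

     // build.sbt (sketch): assumes the Play sbt plugin, which provides `guice`
     lazy val root = (project in file("."))
       .enablePlugins(PlayScala)              // or PlayJava for a Java project
       .settings(
         name := "my-play-app",               // hypothetical project name
         scalaVersion := "2.12.8",            // example version
         libraryDependencies += guice         // the tip: add Play's Guice DI support
       )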
    

Solutions on the web

via DataStax JIRA by Piotr Kołaczkowski, 2 years ago
[1.2] failure: ``with'' expected but identifier CREATE found CREATE TEMPORARY TABLE tmpTable USING org.apache.spark.sql.cassandra OPTIONS ( c_table "test1", keyspace "sql_ds_test", push_down "true", spark_cassandra_input_page_row_size "10", spark_cassandra_output_consistency_level "ONE", spark_cassandra_connection_timeout_ms "1000", spark_cassandra_connection_host "127.0.0.1" ) ^
via Stack Overflow by yatsukav, 1 year ago
[1.44] failure: ``union'' expected but ErrorToken(end of input) found SELECT *, " \" " as quoteCol FROM tempdf ^
via GitHub by rvm-xx, 1 year ago
[3.35] failure: ``)'' expected but `(' found FROM (SELECT *, row_number() OVER (ORDER BY `user`) as rownumber ^
via Stack Overflow by Green, 1 year ago
[1.9] failure: ``union'' expected but `right' found select right(Phone_number,4) from mytable1 ^
via GitHub by wangli86, 1 year ago
[1.1] failure: ``with'' expected but identifier CREATE found CREATE DATABASE IF NOT EXISTS mydata ^
java.lang.RuntimeException: [1.2] failure: ``with'' expected but identifier CREATE found
 
  CREATE TEMPORARY TABLE tmpTable USING org.apache.spark.sql.cassandra OPTIONS (  c_table "test1",  keyspace "sql_ds_test",  push_down "true",  spark_cassandra_input_page_row_size "10",  spark_cassandra_output_consistency_level "ONE",  spark_cassandra_connection_timeout_ms "1000",  spark_cassandra_connection_host "127.0.0.1"  )       
  ^
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
	at org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:145)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:145)
	at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
	at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:94)
	at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
	at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
	at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
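The statement that fails here is Spark SQL's CREATE TEMPORARY TABLE ... USING ... OPTIONS DDL for the DataStax Cassandra data source. When the parser rejects it like this, a common workaround is to register the temporary table through the DataFrame reader API instead of the DDL string. The sketch below is only an approximation for Spark 1.x with the spark-cassandra-connector on the classpath; the object name and the local master URL are hypothetical, and the keyspace, table, and host are copied from the failing statement.

    // Sketch only: registers the same Cassandra table as a temp table via the
    // DataFrame reader instead of CREATE TEMPORARY TABLE ... USING ... OPTIONS.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object CassandraTempTable {                          // hypothetical object name
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("cassandra-temp-table")
          .setMaster("local[*]")                         // adjust for your cluster
          .set("spark.cassandra.connection.host", "127.0.0.1")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)

        // Load the Cassandra table as a DataFrame and expose it to SQL.
        val df = sqlContext.read
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "sql_ds_test", "table" -> "test1"))
          .load()
        df.registerTempTable("tmpTable")

        sqlContext.sql("SELECT * FROM tmpTable").show()
        sc.stop()
      }
    }

The remaining OPTIONS entries (push-down, consistency level, timeouts) map to connector settings that can be supplied either as reader options or as spark.cassandra.* entries on the SparkConf; the exact key names vary by connector version, so check the documentation for the release in use.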