java.lang.RuntimeException

[1.2] failure: ``with'' expected but identifier CREATE found
CREATE TEMPORARY TABLE tmpTable USING org.apache.spark.sql.cassandra OPTIONS ( c_table "test1", keyspace "sql_ds_test", push_down "true", spark_cassandra_input_page_row_size "10", spark_cassandra_output_consistency_level "ONE", spark_cassandra_connection_timeout_ms "1000", spark_cassandra_connection_host "127.0.0.1" )
^

Solutions on the web

  • via DataStax JIRA by Piotr Kołaczkowski, 1 year ago
    [1.2] failure: ``with'' expected but identifier CREATE found CREATE TEMPORARY TABLE tmpTable USING org.apache.spark.sql.cassandra OPTIONS ( c_table "test1", keyspace "sql_ds_test", push_down "true", spark_cassandra_input_page_row_size "10", spark_cassandra_output_consistency_level "ONE", spark_cassandra_connection_timeout_ms "1000", spark_cassandra_connection_host "127.0.0.1" ) ^
  • via GitHub by wangli86, 8 months ago
    [1.1] failure: ``with'' expected but identifier CREATE found CREATE DATABASE IF NOT EXISTS mydata ^
  • Stack trace

    • java.lang.RuntimeException: [1.2] failure: ``with'' expected but identifier CREATE found
      CREATE TEMPORARY TABLE tmpTable USING org.apache.spark.sql.cassandra OPTIONS ( c_table "test1", keyspace "sql_ds_test", push_down "true", spark_cassandra_input_page_row_size "10", spark_cassandra_output_consistency_level "ONE", spark_cassandra_connection_timeout_ms "1000", spark_cassandra_connection_host "127.0.0.1" )
      ^
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
        at org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
        at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:145)
        at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:145)
        at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
        at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:94)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
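    The trace shows the statement reaching DefaultParserDialect, whose grammar does not accept the `CREATE TEMPORARY TABLE ... USING ... OPTIONS` DDL form, so the parser fails at the `CREATE` token. One workaround for this class of error is to bypass the SQL DDL path and register the temporary table through the Spark 1.x DataFrame reader API instead. A minimal sketch, assuming Spark 1.3+ with the spark-cassandra-connector on the classpath (the table/keyspace names are taken from the failing statement above):

    ```scala
    // Sketch of a workaround: register the Cassandra table via the
    // DataFrame API instead of CREATE TEMPORARY TABLE ... USING ... DDL.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    val conf = new SparkConf()
      .setAppName("cassandra-temp-table")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Load the Cassandra table as a DataFrame; the option keys mirror
    // the OPTIONS clause from the failing statement.
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map(
        "table"    -> "test1",
        "keyspace" -> "sql_ds_test"))
      .load()

    // Register it under the same name the DDL would have created,
    // so later SQL queries can reference tmpTable.
    df.registerTempTable("tmpTable")

    sqlContext.sql("SELECT * FROM tmpTable").show()
    ```

    Because registration happens through the DataFrame API, no DDL ever reaches the SQL parser, which sidesteps the ``with'' expected but identifier CREATE found failure.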
