java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier update found update wikipediaEnTemp set requests=0 where article='!K7_Records' ^

Stack Overflow | Mike Pakhomov | 5 months ago
  1. 0

    Is it possible to do an update using SparkSQL?

    Stack Overflow | 5 months ago | Mike Pakhomov
    java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier update found update wikipediaEnTemp set requests=0 where article='!K7_Records' ^
  2. 0

    SparkSql service in Bluemix: error while firing a query using window functions

    Stack Overflow | 6 months ago | Neha Setia
    java.lang.RuntimeException: [1.92] failure: ``union'' expected but `(' found select b.used as used, b.en_cap as en_cap,(b.used/b.en_cap)*100 as util, avg(b.used) over ( partition ^
  3. 0

    dplyr::sample_n fails with wrong SQL when non-Hive context is available

    GitHub | 2 months ago | rvm-xx
    java.lang.RuntimeException: [3.35] failure: ``)'' expected but `(' found FROM (SELECT *, row_number() OVER (ORDER BY `user`) as rownumber ^
  4. 0

    Spark SQL single quote error

    Stack Overflow | 3 weeks ago | yatsukav
    java.lang.RuntimeException: [1.44] failure: ``union'' expected but ErrorToken(end of input) found SELECT *, " \" " as quoteCol FROM tempdf ^
  5. 0

    GitHub comment 66#262864139

    GitHub | 2 weeks ago | wangli86
    java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier CREATE found CREATE DATABASE IF NOT EXISTS mydata ^
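
Several of the reports above fail for the same underlying reason as the UPDATE at the top of the page: in Spark 1.x the plain SQLContext goes through the limited Catalyst parser (DefaultParserDialect), which does not understand window functions (items 2 and 3) or DDL such as CREATE DATABASE (item 5). A minimal sketch of the usual workaround, assuming a local Spark 1.x job and a made-up metrics data set, is to use a HiveContext, whose HiveQL parser does accept OVER (PARTITION BY ...) and CREATE DATABASE:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object WindowFunctionSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("window-fn-sketch").setMaster("local[*]"))

        // HiveContext parses HiveQL, so window functions and CREATE DATABASE work;
        // the plain SQLContext parser rejects both.
        val hiveCtx = new HiveContext(sc)
        import hiveCtx.implicits._

        // The "metrics" table and its sample rows are invented for this sketch,
        // standing in for the Bluemix data behind item 2.
        val df = sc.parallelize(Seq(("a", 10.0, 100.0), ("a", 20.0, 100.0), ("b", 5.0, 50.0)))
          .toDF("grp", "used", "en_cap")
        df.registerTempTable("metrics")

        // The OVER (PARTITION BY ...) clause that the SQLContext parser rejects parses here.
        hiveCtx.sql(
          """SELECT grp, used, en_cap, (used / en_cap) * 100 AS util,
            |       avg(used) OVER (PARTITION BY grp) AS avg_used
            |FROM metrics""".stripMargin).show()

        sc.stop()
      }
    }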

  1. rp 1 time, last 1 week ago
  2. Handemelindo 1 time, last 3 weeks ago
  3. rp 4 times, last 2 months ago
  4. balintn 3 times, last 2 months ago
  5. max_samebug 2 times, last 2 months ago
13 more registered users
70 unregistered visitors

Root Cause Analysis

  1. java.lang.RuntimeException

    [1.1] failure: ``with'' expected but identifier update found update wikipediaEnTemp set requests=0 where article='!K7_Records' ^

    at scala.sys.package$.error()
  2. Scala
    package$.error
    1. scala.sys.package$.error(package.scala:27)
    1 frame
  3. Spark Project Catalyst
    DefaultParserDialect.parse
    1. org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
    2. org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
    2 frames
  4. Spark Project SQL
    SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply
    1. org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
    2. org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
    3. org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
    4. org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:113)
    4 frames
  5. scala-parser-combinators
    Parsers$$anon$2$$anonfun$apply$14.apply
    1. scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:137)
    2. scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    3. scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:237)
    4. scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:237)
    5. scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:217)
    6. scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:249)
    7. scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:249)
    8. scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:197)
    9. scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:249)
    10. scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:249)
    11. scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:217)
    12. scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:882)
    13. scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:882)
    13 frames
  6. Scala
    DynamicVariable.withValue
    1. scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    1 frame
  7. scala-parser-combinators
    PackratParsers$$anon$1.apply
    1. scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:881)
    2. scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
    2 frames
  8. Spark Project Catalyst
    AbstractSparkSQLParser.parse
    1. org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
    1 frame
  9. Spark Project SQL
    SQLContext$$anonfun$1.apply
    1. org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
    2. org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
    2 frames
  10. org.apache.spark
    DDLParser.parse
    1. org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
    1 frame
  11. Spark Project SQL
    SQLContext.sql
    1. org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
    2. org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
    2 frames