org.apache.spark.SparkException: Task failed while writing rows

Stack Overflow | Legolas | 5 months ago
  1. com.mongodb.MongoQueryException: Query failed with error code 2 and error message 'bad sort specification'

    Google Groups | 4 months ago | srinivas seema
    com.mongodb.MongoQueryException: Query failed with error code 2 and error message 'bad sort specification' on server localhost:27017
  2. VersionOne Collector MongoDB Query Issue

    GitHub | 12 months ago | aniketvsawant
    org.springframework.data.mongodb.UncategorizedMongoDbException: Query failed with error code 17287 and error message 'Can't canonicalize query: BadValue unknown top level operator: $query' on server localhost:27017; nest…
  3. GitHub comment 220#164967695

    GitHub | 12 months ago | amitmawkin
    org.springframework.data.mongodb.UncategorizedMongoDbException: Query failed with error code 17287 and error message 'Can't canonicalize query: BadValue unknown top level operator: $query' on server ccloud-tomcat04200.kdc.capitalone.com:11500; nested exception is com.mongodb.MongoQueryException: Query failed with error code 17287 and error message 'Can't canonicalize query: BadValue unknown top level operator: $query' on server ccloud-tomcat04200.kdc.capitalone.com:11500
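
    Both Spring Data entries above report error 17287 ('Can't canonicalize query: BadValue unknown top level operator: $query'), which typically means a legacy $query modifier document was passed to the server as if it were a plain filter. A minimal sketch of that failure mode in pymongo; the connection string, database, and collection names are assumptions, not taken from those reports:

      # Sketch only: provoke "unknown top level operator: $query" by handing the
      # legacy $query wrapper to find() as if it were an ordinary filter.
      from pymongo import MongoClient
      from pymongo.errors import OperationFailure

      client = MongoClient("mongodb://localhost:27017")  # hypothetical server
      coll = client["testdb"]["testcoll"]                # hypothetical namespace

      try:
          # $query belongs to the old OP_QUERY modifier syntax; as a top-level key
          # in a filter the server treats it as an unknown query operator.
          list(coll.find({"$query": {"status": "open"}}))
      except OperationFailure as exc:
          print(exc)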

    Root Cause Analysis

    1. com.mongodb.MongoQueryException

      Query failed with error code 16493 and error message 'Tried to create string longer than 16MB' on server xx.xxx.xx.xx:27018

      at com.mongodb.connection.ProtocolHelper.getQueryFailureException()
    2. MongoDB Java Driver
      MongoBatchCursorAdapter.hasNext
      1. com.mongodb.connection.ProtocolHelper.getQueryFailureException(ProtocolHelper.java:131)
      2. com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:96)
      3. com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:49)
      4. com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:159)
      5. com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:286)
      6. com.mongodb.connection.DefaultServerConnection.getMore(DefaultServerConnection.java:251)
      7. com.mongodb.operation.QueryBatchCursor.getMore(QueryBatchCursor.java:218)
      8. com.mongodb.operation.QueryBatchCursor.hasNext(QueryBatchCursor.java:103)
      9. com.mongodb.MongoBatchCursorAdapter.hasNext(MongoBatchCursorAdapter.java:46)
      9 frames
    3. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
      2. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      3. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      4. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      4 frames
    4. Spark
      SerDeUtil$AutoBatchedPickler.next
      1. org.apache.spark.api.python.SerDeUtil$AutoBatchedPickler.next(SerDeUtil.scala:118)
      2. org.apache.spark.api.python.SerDeUtil$AutoBatchedPickler.next(SerDeUtil.scala:110)
      2 frames
    5. Scala
      Iterator$class.foreach
      1. scala.collection.Iterator$class.foreach(Iterator.scala:727)
      1 frame
    6. Spark
      PythonRunner$WriterThread.run
      1. org.apache.spark.api.python.SerDeUtil$AutoBatchedPickler.foreach(SerDeUtil.scala:110)
      2. org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
      3. org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
      4. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1801)
      5. org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:239)
      5 frames
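
    The root cause above (error 16493, 'Tried to create string longer than 16MB') is returned by the MongoDB server while the Java driver fetches the next cursor batch (QueryBatchCursor.getMore), and it surfaces in Spark's Python writer thread as "Task failed while writing rows". Below is a minimal PySpark sketch of a read that goes through the same frames, assuming the job uses the mongo-hadoop connector; the URI, namespace, and field names are placeholders, not taken from the original report:

      # Sketch only: PySpark reading MongoDB via the mongo-hadoop connector, the
      # path visible in the trace (MongoInputFormat -> QueryBatchCursor.getMore
      # -> SerDeUtil$AutoBatchedPickler -> PythonRunner$WriterThread).
      from pyspark import SparkContext

      sc = SparkContext(appName="mongo-read-sketch")

      mongo_conf = {
          # Hypothetical connection string; replace with the real host and namespace.
          "mongo.input.uri": "mongodb://localhost:27018/mydb.mycollection",
          # If the connector version in use supports a projection option, excluding
          # very large string fields up front is one way to keep oversized values
          # from being built while the cursor is drained (key name assumed here).
          "mongo.input.fields": '{"hugeTextField": 0}',
      }

      rdd = sc.newAPIHadoopRDD(
          inputFormatClass="com.mongodb.hadoop.MongoInputFormat",
          keyClass="org.apache.hadoop.io.Text",
          valueClass="org.apache.hadoop.io.MapWritable",
          conf=mongo_conf,
      )

      # Any action that drains the cursor (count, collect, save) drives the getMore
      # calls where the MongoQueryException was raised and then rethrown through the
      # Scala iterator wrappers into the Python writer thread.
      print(rdd.count())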