org.apache.spark.SparkException: Task failed while writing rows

Solutions on the web

via Stack Overflow by fingerspitzen, 1 year ago
Task failed while writing rows.
via DataStax JIRA by Dmytro Popovych, 1 year ago
Task failed while writing rows.
via Stack Overflow by Newbie, 1 year ago
via Apache's JIRA Issue Tracker by Naden Franciscus, 1 year ago
Task failed while writing rows.
via Stack Overflow by Hello lad, 2 years ago
org.apache.spark.SparkException: Task failed while writing rows
at com.mongodb.connection.ProtocolHelper.getQueryFailureException(ProtocolHelper.java:131)
at com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:96)
at com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:49)
at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:159)
at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:286)
at com.mongodb.connection.DefaultServerConnection.getMore(DefaultServerConnection.java:251)
at com.mongodb.operation.QueryBatchCursor.getMore(QueryBatchCursor.java:218)
at com.mongodb.operation.QueryBatchCursor.hasNext(QueryBatchCursor.java:103)
at com.mongodb.MongoBatchCursorAdapter.hasNext(MongoBatchCursorAdapter.java:46)
at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at org.apache.spark.api.python.SerDeUtil$AutoBatchedPickler.next(SerDeUtil.scala:118)
at org.apache.spark.api.python.SerDeUtil$AutoBatchedPickler.foreach(SerDeUtil.scala:110)
at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
at org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1801)
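The frames above show a generic SparkException wrapping a MongoDB cursor failure: the `GetMoreProtocol.execute` call raises a query-failure exception (built in `ProtocolHelper.getQueryFailureException`) while the PySpark writer thread (`SerDeUtil$AutoBatchedPickler`) is still iterating the Mongo cursor. The searchable detail therefore lives in the chained cause, not in the headline "Task failed while writing rows" message. A minimal Python sketch of that wrapping pattern — all class and function names here are illustrative stand-ins, not the real Spark or MongoDB driver APIs:

```python
class SparkException(Exception):
    """Stand-in for org.apache.spark.SparkException."""

class MongoQueryException(Exception):
    """Stand-in for the query-failure exception raised by the Mongo driver."""

def write_rows(cursor):
    """Mimics the writer thread: iterate the source cursor and wrap any
    driver failure in a generic task-level exception."""
    try:
        for _row in cursor:
            pass  # serialize/write each row
    except MongoQueryException as cause:
        # The useful error is attached as the chained cause.
        raise SparkException("Task failed while writing rows") from cause

def failing_cursor():
    """Mimics a cursor whose getMore round-trip fails mid-iteration."""
    yield {"_id": 1}
    raise MongoQueryException("query failure during getMore")

root = None
try:
    write_rows(failing_cursor())
except SparkException as exc:
    root = exc.__cause__  # this is the message worth searching for
```

When debugging, scroll past the top-level message to the `Caused by:` section of the trace (here, the Mongo frames) and search on that instead.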

Users with the same issue

2 times, 3 months ago
Unknown user, once, 1 year ago
Unknown user, once, 2 years ago

Know the solutions? Share your knowledge to help other developers debug faster.