org.apache.spark.SparkException: Exception thrown in awaitResult

Stack Overflow | michelle | 4 months ago
Similar reports

1. Spark Job fails due to java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage
   Stack Overflow | 3 months ago | Mnemosyne
   java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007

2. unable to set SPARK_MASTER_HOST to public ip in AWS while running on a stand alone mode?
   Stack Overflow | 3 months ago | user1870400
   org.apache.spark.SparkException: Exception thrown in awaitResult

3. JavaSparkStreaming | Exception thrown in awaitResult
   Stack Overflow | 1 month ago | srikanth
   org.apache.spark.SparkException: Exception thrown in awaitResult

4. Google App Engine - InvalidClassException when logging in
   Stack Overflow | 4 years ago | ice13ill
   java.lang.RuntimeException: java.io.InvalidClassException: ro.expert.evt.shared.entities.ObjectModel; local class incompatible: stream classdesc serialVersionUID = -2824144882306533912, local class serialVersionUID = 6500787607817458947
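
The recurring element in these reports is a java.io.InvalidClassException on one of Spark's own RPC classes, with the serialVersionUID recorded in the stream disagreeing with the one computed for the local class. That pattern usually means the submitting application and the standalone master/workers are running different Spark builds. As a first check, a minimal sketch in Java (the class and application names below are placeholders) prints the Spark version the driver classpath actually resolves to, so it can be compared with the version the master reports, for example in its web UI:

    import org.apache.spark.sql.SparkSession;

    // Prints the Spark version bundled on the application classpath so it can be
    // compared against the version of the standalone master and workers.
    public class VersionCheck {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("version-check")   // placeholder application name
                    .master("local[*]")         // run locally; we only need the local jars' version
                    .getOrCreate();
            System.out.println("Driver-side Spark version: " + spark.version());
            spark.stop();
        }
    }

If the two versions differ, Spark's RPC message classes on the two sides will generally carry different serialVersionUID values, which is exactly the incompatibility reported above.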

    Root Cause Analysis

    1. java.lang.RuntimeException

      java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local class serialVersionUID = 18257903091306170

      at java.io.ObjectStreamClass.initNonProxy()
    2. Java RT
      ObjectInputStream.readObject
      1. java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
      2. java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
      3. java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
      4. java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
      5. java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
      6. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
      7. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
      8. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
      9. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
      10. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
      11. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
      12. java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
      12 frames
    3. Spark
      JavaSerializerInstance.deserialize
      1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
      2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
      2 frames
    4. org.apache.spark
      NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply
      1. org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
      1 frame
    5. Scala
      DynamicVariable.withValue
      1. scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
      1 frame
    6. org.apache.spark
      NettyRpcEnv$$anonfun$deserialize$1.apply
      1. org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
      2. org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
      2 frames
    7. Scala
      DynamicVariable.withValue
      1. scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
      1 frame
    8. org.apache.spark
      NettyRpcHandler.receive
      1. org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
      2. org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
      3. org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
      3 frames
    9. Spark
      TransportChannelHandler.channelRead0
      1. org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
      2. org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
      3. org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
      4. org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
      4 frames
    10. Netty
      AbstractChannelHandlerContext.invokeChannelRead
      1. io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
      2. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      3. io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
      4. io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
      5. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      6. io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
      7. io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
      8. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      8 frames
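
Frame group 1 is the generic Java serialization check: ObjectStreamClass.initNonProxy rejects a class descriptor whose recorded serialVersionUID differs from that of the class on the receiver's classpath. A self-contained sketch of that failure mode (plain Java, no Spark; the Payload class and the byte-patching trick are illustrative only, standing in for two sides built from different versions of the same class):

    import java.io.*;
    import java.nio.charset.StandardCharsets;
    import java.util.Arrays;

    public class SerialVersionDemo {
        // A serializable payload with an explicit serialVersionUID.
        static class Payload implements Serializable {
            private static final long serialVersionUID = 1L;
            final String value;
            Payload(String value) { this.value = value; }
        }

        public static void main(String[] args) throws Exception {
            // Serialize the payload as the sending side would.
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
                out.writeObject(new Payload("hello"));
            }
            byte[] data = bos.toByteArray();

            // In the stream, the 8 bytes after the class name hold the writer's
            // serialVersionUID. Flipping one of them simulates a receiver whose
            // local class was compiled from a different version of the code.
            byte[] name = Payload.class.getName().getBytes(StandardCharsets.UTF_8);
            int at = indexOf(data, name);
            data[at + name.length] ^= (byte) 0xFF;

            try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
                in.readObject();
            } catch (InvalidClassException e) {
                // Prints "... local class incompatible: stream classdesc serialVersionUID = ...,
                // local class serialVersionUID = 1", the same message as in the root cause above.
                System.out.println(e.getMessage());
            }
        }

        // First occurrence of `pattern` in `data`.
        private static int indexOf(byte[] data, byte[] pattern) {
            for (int i = 0; i <= data.length - pattern.length; i++) {
                if (Arrays.equals(Arrays.copyOfRange(data, i, i + pattern.length), pattern)) return i;
            }
            throw new IllegalStateException("class name not found in stream");
        }
    }

In the Spark trace the mismatched serialVersionUID values come from different Spark jars on the client and on the cluster rather than from patched bytes, but the exception surfaces at the same point, while NettyRpcEnv deserializes the incoming RPC message.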