java.lang.ClassCastException

No Samebug tips are available for this exception yet. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.
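One possible short tip: the message "java.lang.Long cannot be cast to java.lang.Integer" means a value was boxed as a `Long` and then cast (directly, or via unboxing) to `Integer`; the JVM never converts between the two wrapper types even when the value fits. A minimal sketch reproducing that failure mode, and the safe conversion through `Number`:

```java
// Minimal reproduction of the failure mode in the message above: a value
// boxed as java.lang.Long cannot be cast to java.lang.Integer, even when
// it fits. Converting through Number.intValue() avoids the cast entirely.
public class CastDemo {
    public static void main(String[] args) {
        Object boxed = Long.valueOf(7L); // e.g. an id that arrived boxed as a Long

        boolean threw = false;
        try {
            int bad = (Integer) boxed;   // throws ClassCastException
            System.out.println(bad);
        } catch (ClassCastException e) {
            threw = true;
        }
        System.out.println("cast threw: " + threw);

        int ok = ((Number) boxed).intValue(); // safe: widen to Number first
        System.out.println("converted: " + ok);
    }
}
```

In application code the fix is usually to convert with `((Number) x).intValue()` (or to stop producing a `Long` upstream) rather than to cast.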

  • Dataflow issue
    via GitHub by jtkoskel
  • Error for importvcf
    via GitHub by seuchoi
  • Running this command:
    {code}
    sc.parallelize([(('a', 'b'), 'c')]).groupByKey().partitionBy(20).cache().lookup(('a', 'b'))
    {code}
    gives the following error:
    {noformat}
    15/09/16 14:22:23 INFO SparkContext: Starting job: runJob at PythonRDD.scala:361
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/local/Cellar/apache-spark/1.5.0/libexec/python/pyspark/rdd.py", line 2199, in lookup
        return self.ctx.runJob(values, lambda x: x, [self.partitioner(key)])
      File "/usr/local/Cellar/apache-spark/1.5.0/libexec/python/pyspark/context.py", line 916, in runJob
        port = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, partitions)
      File "/usr/local/Cellar/apache-spark/1.5.0/libexec/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
      File "/usr/local/Cellar/apache-spark/1.5.0/libexec/python/pyspark/sql/utils.py", line 36, in deco
        return f(*a, **kw)
      File "/usr/local/Cellar/apache-spark/1.5.0/libexec/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
    : java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
        at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$submitJob$1.apply(DAGScheduler.scala:530)
        at scala.collection.Iterator$class.find(Iterator.scala:780)
        at scala.collection.AbstractIterator.find(Iterator.scala:1157)
        at scala.collection.IterableLike$class.find(IterableLike.scala:79)
        at scala.collection.AbstractIterable.find(Iterable.scala:54)
        at org.apache.spark.scheduler.DAGScheduler.submitJob(DAGScheduler.scala:530)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:558)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1813)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1826)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1839)
        at org.apache.spark.api.python.PythonRDD$.runJob(PythonRDD.scala:361)
        at org.apache.spark.api.python.PythonRDD.runJob(PythonRDD.scala)
        at sun.reflect.GeneratedMethodAccessor49.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
        at py4j.Gateway.invoke(Gateway.java:259)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:207)
        at java.lang.Thread.run(Thread.java:745)
    {noformat}
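The PySpark trace above fails inside `scala.runtime.BoxesRunTime.unboxToInt`, which suggests the JVM side unboxes each requested partition id with a hard `Integer` cast; if the id crossed py4j boxed as a `java.lang.Long` (plausibly because the Python partitioner returned a `long`), that cast throws. A hedged sketch of the mechanism, where `unboxToInt` is a simplified stand-in for the Scala runtime method, not the real implementation:

```java
import java.util.List;

public class UnboxDemo {
    // Simplified stand-in for scala.runtime.BoxesRunTime.unboxToInt:
    // a hard cast to Integer, then auto-unboxing to int.
    static int unboxToInt(Object o) {
        return (Integer) o;   // ClassCastException when o is a Long
    }

    public static void main(String[] args) {
        // 3 mimics a normal partition id; 7L mimics one that arrived boxed as a Long
        List<Object> partitionIds = List.of(3, 7L);
        for (Object id : partitionIds) {
            try {
                System.out.println("partition " + unboxToInt(id));
            } catch (ClassCastException e) {
                System.out.println("failed on " + id + ": Long is not Integer");
            }
        }
    }
}
```

On the Python side, a workaround consistent with this reading is to make sure the partitioner returns a plain `int` (e.g. wrap its result in `int(...)`) so py4j sends a `java.lang.Integer`.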
    by Thouis Jones
  • Spark RDD to Dataframe with schema specifying
    via Stack Overflow by morsik
  • Re: Trouble with cache() and parquet
    by Michael Armbrust
    • java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
        at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)
        at org.apache.spark.rdd.OrderedRDD$$anonfun$calculateKeyRanges$1.apply(OrderedRDD.scala:143)
        at org.apache.spark.rdd.OrderedRDD$$anonfun$calculateKeyRanges$1.apply(OrderedRDD.scala:142)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at org.apache.spark.rdd.OrderedRDD$.calculateKeyRanges(OrderedRDD.scala:142)
        at org.apache.spark.rdd.OrderedRDD$.apply(OrderedRDD.scala:117)
        at org.broadinstitute.hail.RichPairRDD$.toOrderedRDD$extension(Utils.scala:482)
        at org.broadinstitute.hail.io.vcf.LoadVCF$.apply(LoadVCF.scala:267)
        at org.broadinstitute.hail.driver.ImportVCF$.run(ImportVCF.scala:85)
        at org.broadinstitute.hail.driver.ImportVCF$.run(ImportVCF.scala:31)
        at org.broadinstitute.hail.driver.Command.runCommand(Command.scala:239)
        at org.broadinstitute.hail.driver.Main$.runCommand(Main.scala:120)
        at org.broadinstitute.hail.driver.Main$$anonfun$runCommands$1$$anonfun$1.apply(Main.scala:144)
        at org.broadinstitute.hail.driver.Main$$anonfun$runCommands$1$$anonfun$1.apply(Main.scala:144)
        at org.broadinstitute.hail.Utils$.time(Utils.scala:1282)
        at org.broadinstitute.hail.driver.Main$$anonfun$runCommands$1.apply(Main.scala:143)
        at org.broadinstitute.hail.driver.Main$$anonfun$runCommands$1.apply(Main.scala:137)
        at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
        at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
        at scala.collection.mutable.ArrayOps$ofRef.foldLeft(ArrayOps.scala:108)
        at org.broadinstitute.hail.driver.Main$.runCommands(Main.scala:137)
        at org.broadinstitute.hail.driver.Main$.main(Main.scala:286)
        at org.broadinstitute.hail.driver.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Users with the same issue

    Handemelindo: 1 time
    Unknown visitor: 1 time
    bandoca: 2 times
    tyson925: 9 times
    poroszd: 1 time
    4 more bugmates