java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.MutableAny cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableInt

Apache's JIRA Issue Tracker | Michael Armbrust | 2 years ago
  1. 0

    From the user list. It looks like dates are not implemented correctly in in-memory caching. We should also check the JDBC datasource support for date.
    {code}
    Stack trace of an exception being reported since upgrade to 1.3.0:
    java.lang.ClassCastException: java.sql.Date cannot be cast to java.lang.Integer
        at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:105) ~[scala-library-2.11.6.jar:na]
        at org.apache.spark.sql.catalyst.expressions.GenericRow.getInt(rows.scala:83) ~[spark-catalyst_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.IntColumnStats.gatherStats(ColumnStats.scala:191) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.NullableColumnBuilder$class.appendFrom(NullableColumnBuilder.scala:56) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.NativeColumnBuilder.org$apache$spark$sql$columnar$compression$CompressibleColumnBuilder$$super$appendFrom(ColumnBuilder.scala:87) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.compression.CompressibleColumnBuilder$class.appendFrom(CompressibleColumnBuilder.scala:78) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.NativeColumnBuilder.appendFrom(ColumnBuilder.scala:87) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at org.apache.spark.sql.columnar.InMemoryRelation$$anonfun$3$$anon$1.next(InMemoryColumnarTableScan.scala:135) ~[spark-sql_2.11-1.3.0.jar:1.3.0]
        at
    {code}

    Apache's JIRA Issue Tracker | 2 years ago | Michael Armbrust
    java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.MutableAny cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableInt
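    The failure above comes down to Spark's columnar cache treating a DATE column as an INT column: the row stores a boxed java.sql.Date, but the int column-stats path unboxes it as an Integer. A minimal plain-Java sketch of the same mistake (the Object[] row here is illustrative, not Spark's actual GenericRow internals):

    ```java
    import java.sql.Date;

    public class DateUnboxDemo {
        public static void main(String[] args) {
            // A generic row stores every value as Object, like GenericRow does.
            Object[] row = { Date.valueOf("2015-03-13") };

            try {
                // An INT column builder assumes ordinal 0 holds a boxed Integer,
                // so this cast fails at runtime with a ClassCastException.
                int value = (Integer) row[0];
                System.out.println(value);
            } catch (ClassCastException e) {
                System.out.println(e.getMessage());
            }
        }
    }
    ```

    The exact exception message varies by JDK version, but it always reports that java.sql.Date cannot be cast to java.lang.Integer, matching the trace in the issue description.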
  2. 0

    Help:basic RMI question

    Google Groups | 2 decades ago | Paolo De Lutiis
    java.lang.ClassCastException: serverPackage.ServerClass_Stub at clientPackage.myApplet.init(myApplet.java:line#) at sun.applet.AppletPanel.run(AppletPanel.java:273) at java.lang.Thread.
  3. 0

    Shared hosting Bungeecord problems.

    GitHub | 3 years ago | Arksenu
    java.lang.ClassCastException: bka cannot be cast to fs All my servers are set to onlinemode: false Here is my config.yml groups: arksenu: - admin disabled_commands: - find player_limit: -1 stats: 347d1d62-6fb6-4869-bd43-7d0745de8e3c permissions: default: - bungeecord.command.server - bungeecord.command.list admin: - bungeecord.command.ip - bungeecord.command.alert - bungeecord.command.end - bungeecord.command.reload listeners: - max_players: 18 fallback_server: hub host: 0.0.0.0:35289 bind_local_address: true ping_passthrough: false tab_list: GLOBAL_PING default_server: hub forced_hosts: pvp.md-5.net: hub tab_size: 60 force_default_server: true motd: ’Network’ query_enabled: false query_port: 25565 timeout: 30000 connection_throttle: 4000 servers: hub: address: 108.170.8.146:35289 restricted: false motd: test UvGames: address: 66.85.165.170:26200 restricted: false motd: test UvPrison: address: 66.85.128.90:25928 restricted: false motd: test ip_forward: false online_mode: true And here is my console message -> UpstreamBridge has disconnected disconnected with: Exception Connecting:RuntimeException : Server is online mode! @ net.md_5.bungee.ServerConnector:188
  6. 0

    SWTBot/Troubleshooting - Eclipsepedia

    eclipse.org | 1 year ago
    java.lang.ClassCastException: org.apache.tools.ant.taskdefs.optional.junit.XMLJUnitResultFormatter at org.eclipse.swtbot.eclipse.junit4.headless.EclipseTestRunner.run(EclipseTestRunner.java:331) at org.eclipse.swtbot.eclipse.junit4.headless.EclipseTestRunner.run(EclipseTestRunner.java:208) at org.eclipse.swtbot.eclipse.junit4.headless.UITestApplication.runTests(UITestApplication.java:115) at org.eclipse.ui.internal.testing.WorkbenchTestable$1.run(WorkbenchTestable.java:68)


    Root Cause Analysis

    1. java.lang.ClassCastException

      org.apache.spark.sql.catalyst.expressions.MutableAny cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableInt

      at org.apache.spark.sql.catalyst.expressions.SpecificMutableRow.getInt()
    2. Spark Project Catalyst
      SpecificMutableRow.getInt
      1. org.apache.spark.sql.catalyst.expressions.SpecificMutableRow.getInt(SpecificMutableRow.scala:248)[spark-catalyst_2.11-1.3.0.jar:1.3.0]
      1 frame
    3. Spark Project SQL
      InMemoryRelation$$anonfun$3$$anon$1.next
      1. org.apache.spark.sql.columnar.IntColumnStats.gatherStats(ColumnStats.scala:191)[spark-sql_2.11-1.3.0.jar:1.3.0]
      2. org.apache.spark.sql.columnar.NullableColumnBuilder$class.appendFrom(NullableColumnBuilder.scala:56)[spark-sql_2.11-1.3.0.jar:1.3.0]
      3. org.apache.spark.sql.columnar.NativeColumnBuilder.org$apache$spark$sql$columnar$compression$CompressibleColumnBuilder$$super$appendFrom(ColumnBuilder.scala:87)[spark-sql_2.11-1.3.0.jar:1.3.0]
      4. org.apache.spark.sql.columnar.compression.CompressibleColumnBuilder$class.appendFrom(CompressibleColumnBuilder.scala:78)[spark-sql_2.11-1.3.0.jar:1.3.0]
      5. org.apache.spark.sql.columnar.NativeColumnBuilder.appendFrom(ColumnBuilder.scala:87)[spark-sql_2.11-1.3.0.jar:1.3.0]
      6. org.apache.spark.sql.columnar.InMemoryRelation$$anonfun$3$$anon$1.next(InMemoryColumnarTableScan.scala:135)[spark-sql_2.11-1.3.0.jar:1.3.0]
      7. org.apache.spark.sql.columnar.InMemoryRelation$$anonfun$3$$anon$1.next(InMemoryColumnarTableScan.scala:111)[spark-sql_2.11-1.3.0.jar:1.3.0]
      7 frames
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:249)[spark-core_2.11-1.3.0.jar:1.3.0]
      2. org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:172)[spark-core_2.11-1.3.0.jar:1.3.0]
      3. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:79)[spark-core_2.11-1.3.0.jar:1.3.0]
      4. org.apache.spark.rdd.RDD.iterator(RDD.scala:242)[spark-core_2.11-1.3.0.jar:1.3.0]
      5. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)[spark-core_2.11-1.3.0.jar:1.3.0]
      6. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)[spark-core_2.11-1.3.0.jar:1.3.0]
      7. org.apache.spark.rdd.RDD.iterator(RDD.scala:244)[spark-core_2.11-1.3.0.jar:1.3.0]
      8. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)[spark-core_2.11-1.3.0.jar:1.3.0]
      9. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)[spark-core_2.11-1.3.0.jar:1.3.0]
      10. org.apache.spark.rdd.RDD.iterator(RDD.scala:244)[spark-core_2.11-1.3.0.jar:1.3.0]
      11. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)[spark-core_2.11-1.3.0.jar:1.3.0]
      12. org.apache.spark.scheduler.Task.run(Task.scala:64)[spark-core_2.11-1.3.0.jar:1.3.0]
      13. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)[spark-core_2.11-1.3.0.jar:1.3.0]
      13 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[na:1.8.0_11]
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[na:1.8.0_11]
      3. java.lang.Thread.run(Thread.java:745)[na:1.8.0_11]
      3 frames
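    Since the root cause is an in-memory column builder reading a DATE value through getInt, one application-side workaround while on an affected version is to store dates in a primitive representation before caching, for example as days since the epoch. A hedged plain-Java sketch of that conversion (the helper name is made up for illustration and is not part of Spark's API):

    ```java
    import java.sql.Date;

    public class DateToDays {
        // Illustrative helper: represent a java.sql.Date as an int
        // (days since 1970-01-01), which an INT column can store safely.
        static int toEpochDays(Date d) {
            return (int) d.toLocalDate().toEpochDay();
        }

        public static void main(String[] args) {
            int days = toEpochDays(Date.valueOf("1970-01-11"));
            System.out.println(days); // 10: ten days after the epoch
        }
    }
    ```

    The reverse mapping is LocalDate.ofEpochDay(days); internally, Spark's later fixes for this issue took a similar approach of representing dates as integer day counts in the columnar cache.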