java.lang.RuntimeException: error reading Scala signature of org.apache.spark.ml.classification.NaiveBayesModel: assertion failed: class MLWriter

Stack Overflow | Kaushal | 5 months ago
  1.

    Spark is throwing scala.reflect.internal.MissingRequirementError when writing a Naive Bayes model

    Stack Overflow | 5 months ago | Kaushal
    java.lang.RuntimeException: error reading Scala signature of org.apache.spark.ml.classification.NaiveBayesModel: assertion failed: class MLWriter
  2.

    Exception on using VectorAssembler in apache spark ml

    Stack Overflow | 7 months ago | hbabbar
    java.lang.RuntimeException: error reading Scala signature of org.apache.spark.mllib.linalg.Vector: assertion failed: unsafe symbol SparseVector (child of package linalg) in runtime reflection universe
  3.

    [SI-7639] java.lang.ArrayIndexOutOfBoundsException - java.lang.RuntimeException: error reading Scala signature of com.greenfossil.sqlview.TableSupport: null - Scala

    scala-lang.org | 1 year ago
    java.lang.RuntimeException: error reading Scala signature of com.greenfossil.sqlview.TableSupport: null
  4.

    The source compiled successfully, but when I run the test case via scala-test I get the stack trace below. However, if I comment out one of the methods with a long parameter list, the application runs fine. Here's the method signature:

    def update[A, B, C, D, E, F, G, H, I, J, K, L, M, N, O, P, Q, R, S, T](ca: Schema#Table#Column[A], cb: Schema#Table#Column[B], cc: Schema#Table#Column[C], cd: Schema#Table#Column[D], ce: Schema#Table#Column[E], cf: Schema#Table#Column[F], cg: Schema#Table#Column[G], ch: Schema#Table#Column[H], ci: Schema#Table#Column[I], cj: Schema#Table#Column[J], ck: Schema#Table#Column[K], cl: Schema#Table#Column[L], cm: Schema#Table#Column[M], cn: Schema#Table#Column[N], co: Schema#Table#Column[O], cp: Schema#Table#Column[P], cq: Schema#Table#Column[Q], cr: Schema#Table#Column[R], cs: Schema#Table#Column[S], ct: Schema#Table#Column[T])(va: A, vb: B, vc: C, vd: D, ve: E, vf: F, vg: G, vh: H, vi: I, vj: J, vk: K, vl: L, vm: M, vn: N, vo: O, vp: P, vq: Q, vr: R, vs: S, vt: T)(where: String = null)(implicit session: DBSession) = { /*body removed*/ }

    Stack trace:

    java.lang.ArrayIndexOutOfBoundsException
    [info] DDLSuite:
    [info] - Generate Schema !!! IGNORED !!!
    [info] - Create a Schema file !!! IGNORED !!!
    [info] - Create Table statement !!! IGNORED !!!
    [info] - sort tables according to dependencies !!! IGNORED !!!
    [info] - delete, create schema *** FAILED ***
    [info]   java.lang.RuntimeException: error reading Scala signature of com.greenfossil.sqlview.TableSupport: null
    [info]   at scala.reflect.internal.pickling.UnPickler.unpickle(UnPickler.scala:45)
    [info]   at scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:578)
    [info]   at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:31)
    [info]   at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1229)
    [info]   at scala.reflect.internal.Types$TypeRef.baseTypeSeqImpl(Types.scala:2436)
    [info]   at scala.reflect.internal.Types$class.defineBaseTypeSeqOfTypeRef(Types.scala:2562)
    [info]   at scala.reflect.runtime.JavaUniverse.scala$reflect$runtime$SynchronizedTypes$$super$defineBaseTypeSeqOfTypeRef(JavaUniverse.scala:12)
    [info]   at scala.reflect.runtime.SynchronizedTypes$class.defineBaseTypeSeqOfTypeRef(SynchronizedTypes.scala:104)
    [info]   at scala.reflect.runtime.JavaUniverse.defineBaseTypeSeqOfTypeRef(JavaUniverse.scala:12)
    [info]   at scala.reflect.internal.Types$TypeRef.baseTypeSeq(Types.scala:2443)
    [info]   ...
    [error] Failed: : Total 5, Failed 1, Errors 0, Passed 0, Skipped 4
    [error] Failed tests:
    [error]   com.greenfossil.sqlview.DDLSuite
    [trace] Stack trace suppressed: run last test:test-only for the full output.
    [error] (test:test-only) Tests unsuccessful
    [error] Total time: 96 s, completed Jul 5, 2013 9:02:41 PM

    > last test:test-only
    [debug] Running Test com.greenfossil.sqlview.DDLSuite : subclass(false, org.scalatest.Suite) with arguments
    [error] Failed: : Total 5, Failed 1, Errors 0, Passed 0, Skipped 4
    [error] Failed tests:
    [error]   com.greenfossil.sqlview.DDLSuite
    java.lang.RuntimeException: Tests unsuccessful
      at scala.sys.package$.error(package.scala:27)
      at scala.Predef$.error(Predef.scala:123)
      at sbt.Tests$.showResults(Tests.scala:192)
      at sbt.Defaults$$anonfun$inputTests$2$$anonfun$apply$18$$anonfun$apply$19.apply(Defaults.scala:373)
      at sbt.Defaults$$anonfun$inputTests$2$$anonfun$apply$18$$anonfun$apply$19.apply(Defaults.scala:373)
      at scala.Function1$$anonfun$compose$1.apply(Function1.scala:49)
      at scala.Function1$$anonfun$compose$1.apply(Function1.scala:49)
      at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:41)
      at sbt.std.Transform$$anon$5.work(System.scala:71)
      at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
      at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
      at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
      at sbt.Execute.work(Execute.scala:238)
      at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
      at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
      at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
      at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
      at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
      at java.util.concurrent.FutureTask.run(FutureTask.java:166)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
      at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
      at java.util.concurrent.FutureTask.run(FutureTask.java:166)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
      at java.lang.Thread.run(Thread.java:722)
    [error] (test:test-only) Tests unsuccessful

    Scala JIRA | 3 years ago | Cheong Chung Onn
    java.lang.RuntimeException: error reading Scala signature of com.greenfossil.sqlview.TableSupport: null
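    The SI-7639 report above fails while unpickling the Scala signature of a method with 20 type parameters and 20 matching column/value pairs. A common way to sidestep very long signatures is to collapse each column/value pair into one typed argument and pass a variable number of them. This is only a hedged sketch of that refactoring: Column, Assignment, and the SQL rendering below are hypothetical stand-ins, not the com.greenfossil.sqlview API.

    ```scala
    // Hypothetical stand-ins for Schema#Table#Column and its values.
    final case class Column[A](name: String)
    // An Assignment couples a column with a value of the matching type,
    // so each pair is still checked at compile time.
    final case class Assignment[A](col: Column[A], value: A)

    // One varargs parameter replaces the 20 (ca..ct)(va..vt) parameters,
    // keeping the pickled signature small.
    def update(assignments: Assignment[_]*)(where: String = null): String = {
      val sets = assignments.map(a => s"${a.col.name} = ${a.value}").mkString(", ")
      val cond = Option(where).map(" WHERE " + _).getOrElse("")
      s"UPDATE t SET $sets$cond"
    }

    val sql = update(Assignment(Column[Int]("age"), 42),
                     Assignment(Column[String]("name"), "Ada"))("id = 1")
    // sql: UPDATE t SET age = 42, name = Ada WHERE id = 1
    ```

    The trade-off is losing the fixed arity (nothing stops passing 21 assignments), but the per-pair type link between column and value is preserved.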


    Root Cause Analysis

    1. java.lang.RuntimeException

      error reading Scala signature of org.apache.spark.ml.classification.NaiveBayesModel: assertion failed: class MLWriter

      at scala.reflect.internal.pickling.UnPickler.unpickle()
    2. Scala
      Mirrors$RootsBase.staticClass
      1. scala.reflect.internal.pickling.UnPickler.unpickle(UnPickler.scala:45)
      2. scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:565)
      3. scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:32)
      4. scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
      5. scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:43)
      6. scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
      7. scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72)
      8. scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)
      9. scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)
      9 frames
    3. Spark Project ML Library
      NaiveBayesModel$NaiveBayesModelWriter$$typecreator1$1.apply
      1. org.apache.spark.ml.classification.NaiveBayesModel$NaiveBayesModelWriter$$typecreator1$1.apply(NaiveBayes.scala:264)
      1 frame
    4. Scala
      TypeTags$WeakTypeTagImpl.tpe
      1. scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231)
      2. scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231)
      2 frames
    5. Spark Project Catalyst
      ScalaReflection$.schemaFor
      1. org.apache.spark.sql.catalyst.ScalaReflection$class.localTypeOf(ScalaReflection.scala:642)
      2. org.apache.spark.sql.catalyst.ScalaReflection$.localTypeOf(ScalaReflection.scala:30)
      3. org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:630)
      4. org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30)
      4 frames
    6. Spark Project SQL
      SQLContext.createDataFrame
      1. org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:430)
      1 frame
    7. Spark Project ML Library
      NaiveBayesModel$NaiveBayesModelWriter.saveImpl
      1. org.apache.spark.ml.classification.NaiveBayesModel$NaiveBayesModelWriter.saveImpl(NaiveBayes.scala:264)
      1 frame
    8. org.apache.spark
      MLWritable$class.save
      1. org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:90)
      2. org.apache.spark.ml.util.MLWritable$class.save(ReadWrite.scala:130)
      2 frames
    9. Spark Project ML Library
      NaiveBayesModel.save
      1. org.apache.spark.ml.classification.NaiveBayesModel.save(NaiveBayes.scala:130)
      1 frame
    10. $line61
      $read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1$$anonfun$apply$mcVI$sp$1.apply
      1. $line61.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1$$anonfun$apply$mcVI$sp$1.apply$mcV$sp(<console>:65)
      2. $line61.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1$$anonfun$apply$mcVI$sp$1.apply(<console>:63)
      3. $line61.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1$$anonfun$apply$mcVI$sp$1.apply(<console>:63)
      3 frames
    11. Scala
      Future$PromiseCompletingRunnable.run
      1. scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
      2. scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
      2 frames
    12. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
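
    Frames 10 and 11 show NaiveBayesModel.save being invoked from inside a scala.concurrent.Future spawned in the spark-shell ($line61). Runtime reflection in Scala 2.10 was not thread-safe, so concurrent unpickling of class signatures could fail with exactly this kind of assertion error. A hedged sketch of the usual workaround is to call save from the driver thread, or to serialize the reflective calls behind a shared lock; the save function and lock below are stand-ins, not Spark's API.

    ```scala
    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration.Duration

    // Stand-in lock guarding the (historically non-thread-safe)
    // runtime-reflection machinery.
    val reflectionLock = new Object

    // Hypothetical stand-in for a save that uses runtime reflection
    // internally, as NaiveBayesModelWriter.saveImpl does via createDataFrame.
    def save(model: String, path: String): String =
      reflectionLock.synchronized {
        s"saved $model to $path"
      }

    // Firing saves from several Futures at once is the failing pattern;
    // funneling them through one lock avoids concurrent unpickling.
    val results = (1 to 4).map(i => Future(save("nbModel", s"/tmp/model-$i")))
    val done = results.map(f => Await.result(f, Duration.Inf))
    // done.head == "saved nbModel to /tmp/model-1"
    ```

    Later Scala versions hardened runtime reflection considerably, so upgrading Scala/Spark is the more durable fix than locking.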