java.lang.NoClassDefFoundError: Could not initialize class $line10.$read$

Apache's JIRA Issue Tracker | Svend Vanderveken | 3 years ago
  1.

    Execution of a SQL query against HDFS systematically throws a class-not-found exception on the slave nodes. (This was originally reported on the user list: http://apache-spark-user-list.1001560.n3.nabble.com/spark1-0-1-spark-sql-error-java-lang-NoClassDefFoundError-Could-not-initialize-class-line11-read-tc10135.html)

    Sample code (run from spark-shell):

    {code}
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.createSchemaRDD

    case class Car(timestamp: Long, objectid: String, isGreen: Boolean)

    // I get the same error when pointing to the folder "hdfs://vm28:8020/test/cardata"
    val data = sc.textFile("hdfs://vm28:8020/test/cardata/part-00000")
    val cars = data.map(_.split(",")).map(ar => Car(ar(0).toLong, ar(1), ar(2).toBoolean))
    cars.registerAsTable("mcars")

    val allgreens = sqlContext.sql("SELECT objectid from mcars where isGreen = true")
    allgreens.collect.take(10).foreach(println)
    {code}

    Stack trace on the slave nodes:

    {code}
    I0716 13:01:16.215158 13631 exec.cpp:131] Version: 0.19.0
    I0716 13:01:16.219285 13656 exec.cpp:205] Executor registered on slave 20140714-142853-485682442-5050-25487-2
    14/07/16 13:01:16 INFO MesosExecutorBackend: Registered with Mesos as executor ID 20140714-142853-485682442-5050-25487-2
    14/07/16 13:01:16 INFO SecurityManager: Changing view acls to: mesos,mnubohadoop
    14/07/16 13:01:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mesos, mnubohadoop)
    14/07/16 13:01:17 INFO Slf4jLogger: Slf4jLogger started
    14/07/16 13:01:17 INFO Remoting: Starting remoting
    14/07/16 13:01:17 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@vm23:38230]
    14/07/16 13:01:17 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@vm23:38230]
    14/07/16 13:01:17 INFO SparkEnv: Connecting to MapOutputTracker: akka.tcp://spark@vm28:41632/user/MapOutputTracker
    14/07/16 13:01:17 INFO SparkEnv: Connecting to BlockManagerMaster: akka.tcp://spark@vm28:41632/user/BlockManagerMaster
    14/07/16 13:01:17 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20140716130117-8ea0
    14/07/16 13:01:17 INFO MemoryStore: MemoryStore started with capacity 294.9 MB.
    14/07/16 13:01:17 INFO ConnectionManager: Bound socket to port 44501 with id = ConnectionManagerId(vm23-hulk-priv.mtl.mnubo.com,44501)
    14/07/16 13:01:17 INFO BlockManagerMaster: Trying to register BlockManager
    14/07/16 13:01:17 INFO BlockManagerMaster: Registered BlockManager
    14/07/16 13:01:17 INFO HttpFileServer: HTTP File server directory is /tmp/spark-ccf6f36c-2541-4a25-8fe4-bb4ba00ee633
    14/07/16 13:01:17 INFO HttpServer: Starting HTTP Server
    14/07/16 13:01:18 INFO Executor: Using REPL class URI: http://vm28:33973
    14/07/16 13:01:18 INFO Executor: Running task ID 2
    14/07/16 13:01:18 INFO HttpBroadcast: Started reading broadcast variable 0
    14/07/16 13:01:18 INFO MemoryStore: ensureFreeSpace(125590) called with curMem=0, maxMem=309225062
    14/07/16 13:01:18 INFO MemoryStore: Block broadcast_0 stored as values to memory (estimated size 122.6 KB, free 294.8 MB)
    14/07/16 13:01:18 INFO HttpBroadcast: Reading broadcast variable 0 took 0.294602722 s
    14/07/16 13:01:19 INFO HadoopRDD: Input split: hdfs://vm28:8020/test/cardata/part-00000:23960450+23960451
    I0716 13:01:19.905113 13657 exec.cpp:378] Executor asked to shutdown
    14/07/16 13:01:20 ERROR Executor: Exception in task ID 2
    java.lang.NoClassDefFoundError: $line11/$read$
        at $line12.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:19)
        at $line12.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:19)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
        at scala.collection.Iterator$$anon$1.head(Iterator.scala:840)
        at org.apache.spark.sql.execution.ExistingRdd$$anonfun$productToRowRdd$1.apply(basicOperators.scala:181)
        at org.apache.spark.sql.execution.ExistingRdd$$anonfun$productToRowRdd$1.apply(basicOperators.scala:176)
        at org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
        at org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:111)
        at org.apache.spark.scheduler.Task.run(Task.scala:51)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:183)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
    Caused by: java.lang.ClassNotFoundException: $line11.$read$
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:65)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 27 more
    Caused by: java.lang.ClassNotFoundException: $line11.$read$
        at java.lang.ClassLoader.findClass(Unknown Source)
        at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:60)
        ... 29 more
    {code}

    Note that running a simple map+reduce job on the same HDFS files with the same installation works fine:

    {code}
    // this works
    val data = sc.textFile("hdfs://vm28:8020/test/cardata/")
    val lineLengths = data.map(s => s.length)
    val totalLength = lineLengths.reduce((a, b) => a + b)
    {code}

    The HDFS files contain just plain CSV data:

    {code}
    $ hdfs dfs -tail /test/cardata/part-00000
    14/07/16 13:18:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    1396396560000,2ea211cc-ea01-435a-a190-98a6dd5ccd0a,false,Ivory,chrysler,New Caledonia,1970,0.0,0.0,0.0,0.0,38.24645296229051,99.41880649743675,26.619177092584696
    1396396620000,2ea211cc-ea01-435a-a190-98a6dd5ccd0a,false,Ivory,chrysler,New Caledonia,1970,1.3637951832478066,0.5913309707002152,56.6895043678199,96.54451566032114,100.76632815433682,92.29189473832957,7.009760456230157
    1396396680000,2ea211cc-ea01-435a-a190-98a6dd5ccd0a,false,Ivory,chrysler,New Caledonia,1970,-3.405565593143888,0.8104753585926928,41.677424397834905,36.57019235002255,8.974008103729105,92.94054149986701,11.673872282136195
    1396396740000,2ea211cc-ea01-435a-a190-98a6dd5ccd0a,false,Ivory,chrysler,New Caledonia,1970,2.6548062807597854,0.6180832371072019,40.88058181777176,24.47455760837969,37.42027121601756,93.97373842452362,16.48937328407166
    {code}

    spark-env.sh looks like this:

    {code}
    export SPARK_LOCAL_IP=vm28
    export MESOS_NATIVE_LIBRARY=/usr/local/etc/mesos-0.19.0/build/src/.libs/libmesos.so
    export SPARK_EXECUTOR_URI=hdfs://vm28:8020/apps/spark/spark-1.0.1-2.3.0-mr1-cdh5.0.2-hive.tgz
    {code}

    Apache's JIRA Issue Tracker | 3 years ago | Svend Vanderveken
    java.lang.NoClassDefFoundError: Could not initialize class $line10.$read$
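
    The failing query only exercises a trivial per-line mapping, which can be sanity-checked without a cluster; the class-loading failure, not the parsing, is the problem. A minimal sketch in plain Scala (no Spark; `parseCar` is a helper name introduced here for illustration, not from the report):

```scala
// The per-line parsing from the reported job, reproduced outside the REPL.
// Field positions follow the sample CSV data: timestamp, objectid, isGreen, ...
case class Car(timestamp: Long, objectid: String, isGreen: Boolean)

def parseCar(line: String): Car = {
  val ar = line.split(",")
  Car(ar(0).toLong, ar(1), ar(2).toBoolean)
}

// First three fields of a sample line from the report's data set.
val sample = "1396396560000,2ea211cc-ea01-435a-a190-98a6dd5ccd0a,false,Ivory,chrysler"
println(parseCar(sample))
```

    That this mapping works locally but fails on executors is consistent with the root cause below: the `Car` case class compiled by the REPL never becomes loadable on the slave nodes.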
  3.

    Problem with Tomcat - unable to start webapp [Archive] - Java-Monitor Forum

    java-monitor.com | 1 year ago
    java.lang.NoClassDefFoundError: javax/faces/context/ExternalContext
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadListenerImpl.contextDestroyed(TrinidadListenerImpl.java:39)
        at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:3882)
        at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4523)
        at org.apache.catalina.core.StandardContext.start(StandardContext.java:4387)
        at org.apache.catalina.manager.ManagerServlet.start(ManagerServlet.java:1247)
        at org.apache.catalina.manager.HTMLManagerServlet.start(HTMLManagerServlet.java:604)
        at org.apache.catalina.manager.HTMLManagerServlet.doGet(HTMLManagerServlet.java:129)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:525)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:286)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:845)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
  5.

    issues in spring petclinic application

    Stack Overflow | 4 years ago | user2216702
    java.lang.NoClassDefFoundError: org/springframework/asm/ClassVisitor
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.<init>(AbstractAutowireCapableBeanFactory.java:121) ~[spring-beans-3.1.4.RELEASE.jar:3.1.4.RELEASE]
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.<init>(AbstractAutowireCapableBeanFactory.java:168) ~[spring-beans-3.1.4.RELEASE.jar:3.1.4.RELEASE]
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.<init>(DefaultListableBeanFactory.java:163) ~[spring-beans-3.1.4.RELEASE.jar:3.1.4.RELEASE]
        at org.springframework.context.support.AbstractRefreshableApplicationContext.createBeanFactory(AbstractRefreshableApplicationContext.java:194) ~[spring-context-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:127) ~[spring-context-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:537) ~[spring-context-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:451) ~[spring-context-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:389) ~[spring-web-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:294) ~[spring-web-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:112) [spring-web-3.2.2.RELEASE.jar:3.2.2.RELEASE]
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4791) [tomcat-embed-core-7.0.30.jar:7.0.30]
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5285) [tomcat-embed-core-7.0.30.jar:7.0.30]
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) [tomcat-embed-core-7.0.30.jar:7.0.30]
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559) [tomcat-embed-core-7.0.30.jar:7.0.30]
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549) [tomcat-embed-core-7.0.30.jar:7.0.30]
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) [na:1.6.0_20]
        at java.util.concurrent.FutureTask.run(FutureTask.java:138) [na:1.6.0_20]
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) [na:1.6.0_20]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) [na:1.6.0_20]
  6.

    Grails : production server only : spring.ReloadAwareAutowireCapableBeanFactory

    Stack Overflow | 3 years ago | Snite
    java.lang.NoClassDefFoundError: org/springframework/orm/jpa/EntityManagerFactoryUtils


    Root Cause Analysis

    1. java.lang.NoClassDefFoundError

      Could not initialize class $line10.$read$

      at $line12.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply()
    2. $line12
      $read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply
      1. $line12.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:19)
      2. $line12.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:19)
      2 frames
    3. Scala
      Iterator$$anon$1.head
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
      3. scala.collection.Iterator$$anon$1.head(Iterator.scala:840)
      3 frames
    4. Spark Project SQL
      ExistingRdd$$anonfun$productToRowRdd$1.apply
      1. org.apache.spark.sql.execution.ExistingRdd$$anonfun$productToRowRdd$1.apply(basicOperators.scala:181)
      2. org.apache.spark.sql.execution.ExistingRdd$$anonfun$productToRowRdd$1.apply(basicOperators.scala:176)
      2 frames
    5. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
      2. org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
      3. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
      4. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
      5. org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
      6. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
      7. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
      8. org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
      9. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
      10. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
      11. org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
      12. org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
      13. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
      14. org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
      15. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:111)
      16. org.apache.spark.scheduler.Task.run(Task.scala:51)
      17. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:183)
      17 frames
    6. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
      3. java.lang.Thread.run(Unknown Source)
      3 frames
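
    The two exception types in the trace play different roles. `NoClassDefFoundError` at the top is a linkage error: the JVM holds a compiled reference to the REPL-generated class `$line10.$read$` but cannot resolve or initialize it at run time. Its cause, further down, is a `ClassNotFoundException` from a class loader (here Spark's `ExecutorClassLoader`) that cannot locate the bytecode. The loader-side failure can be demonstrated in plain Scala, using the class name from the trace (nothing Spark-specific is involved):

```scala
// Looking up a class that is not on the classpath, as the executor's
// class loader does when the REPL class server is unreachable.
val result =
  try {
    Class.forName("$line10.$read$")  // not on any local classpath
    "loaded"
  } catch {
    case _: ClassNotFoundException => "ClassNotFoundException"
  }
println(result)
```

    On the executors the lookup goes through `ExecutorClassLoader`, which fetches REPL-generated classes over HTTP from the driver's class server (the "Using REPL class URI" line in the log); when that fetch fails, the `ClassNotFoundException` surfaces as the `NoClassDefFoundError` above.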