java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/JobConf
	at co.cask.cdap.internal.app.runtime.spark.ScalaSparkFacade.<init>(ScalaSparkFacade.java:40) ~[co.cask.cdap.cdap-app-fabric-3.2.0.jar:na]
	at co.cask.cdap.internal.app.runtime.spark.SparkProgramWrapper.setupSparkContext(SparkProgramWrapper.java:76) ~[co.cask.cdap.cdap-app-fabric-3.2.0.jar:na]
	at co.cask.cdap.internal.app.runtime.spark.SparkProgramWrapper.run(SparkProgramWrapper.java:58) ~[co.cask.cdap.cdap-app-fabric-3.2.0.jar:na]
	at co.cask.cdap.internal.app.runtime.spark.SparkProgramWrapper.main(SparkProgramWrapper.java:40) ~[co.cask.cdap.cdap-app-fabric-3.2.0.jar:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
	at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

Google Groups | Unknown author | 1 year ago

NoClassDefFoundError: org/apache/hadoop/mapred/JobConf - CDAP 3.2.0 on CDH 5.4 using PageRank Spark


    Root Cause Analysis

    1. java.lang.NoClassDefFoundError

      org/apache/hadoop/mapred/JobConf

      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1()
    2. Spark
      SparkSubmit.main
      1. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
      2. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
      3. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
      4. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      4 frames
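
The trace above is the classic signature of a missing runtime dependency: CDAP's ScalaSparkFacade was compiled against org.apache.hadoop.mapred.JobConf, but that class was not visible to the JVM that SparkSubmit launched, so the JVM throws NoClassDefFoundError the moment the facade is constructed. As a minimal diagnostic sketch (not from the original thread; the class name JobConfVisibilityCheck is illustrative), the following standalone Java program can be run on the launching node to check whether JobConf is reachable and which jar it resolves from:

public class JobConfVisibilityCheck {
    public static void main(String[] args) {
        // Fully qualified name of the class the stack trace reports as missing.
        String name = "org.apache.hadoop.mapred.JobConf";
        try {
            Class<?> clazz = Class.forName(name);
            // A class loaded from a jar exposes the jar location via its CodeSource.
            java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
            System.out.println(name + " loaded from "
                    + (src != null ? src.getLocation() : "the bootstrap classpath"));
        } catch (ClassNotFoundException e) {
            // Same condition that surfaces as NoClassDefFoundError in the trace above:
            // the class is simply not on this JVM's classpath.
            System.out.println(name + " is NOT on the classpath");
        }
    }
}

If the class resolves when the program is run with the node's usual Hadoop jars (for example, with the output of the hadoop classpath command appended to the classpath) but the CDAP Spark program still fails, the gap is most likely in the classpath assembled for the Spark driver rather than in the CDH 5.4 installation itself.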