java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.conf.Configuration
    at HadoopWriterLib.HadoopWriter.OpenFileSystem(HadoopWriter.java:22)
    at HadoopWriterLib.HadoopWriter.<init>(HadoopWriter.java:16)
    at HadoopServletTest.doGet(HadoopServletTest.java:35)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)

hadoop-general | Owen O'Malley | 6 years ago
  1.

    Re: Problem write on HDFS

    hadoop-general | 6 years ago | Owen O'Malley
    java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.conf.Configuration
        at HadoopWriterLib.HadoopWriter.OpenFileSystem(HadoopWriter.java:22)
        at HadoopWriterLib.HadoopWriter.<init>(HadoopWriter.java:16)
        at HadoopServletTest.doGet(HadoopServletTest.java:35)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)

    note: The full stack trace of the root cause is available in the Apache Tomcat/7.0.6 logs.

    Could the error be a problem on my Ubuntu server?

    thanks

    On Mon, Jan 24, 2011 at 1:01 PM, Alessandro Binhara <binhara@gmail.com> wrote:
    > I tried
    >     java -classpath hadoop-core-0.20.1.jar -jar HahoopHdfsHello.jar
    > and got the same error. I will try to build a servlet and run it on Tomcat.
    > I have tried many ways to configure the classpath; all of them fail.
    >
    > thanks
    >
    > On Mon, Jan 24, 2011 at 12:54 PM, Harsh J <qwertymaniac@gmail.com> wrote:
    >> The issue would definitely lie with your CLASSPATH.
    >>
    >> Ideally, while beginning development using Hadoop 0.20, it is better to use
    >> the `hadoop jar` command to launch jars of any kind that require Hadoop
    >> libraries, be it MapReduce or not. The command will ensure that all the
    >> classpath requirements for the Hadoop-side libraries are satisfied, so you
    >> don't have to worry.
    >>
    >> Anyhow, try launching it this way:
    >>     $ java -classpath hadoop-0.20.2-core.jar -jar HadoopHdfsHello.jar
    >> This should run just fine.
    >>
    >> On Mon, Jan 24, 2011 at 5:06 PM, Alessandro Binhara <binhara@gmail.com> wrote:
    >>> Hello. I solved the problem in the jar: I put hadoop-core-0.20.2.jar in the
    >>> same directory as the jar and configured the classpath with
    >>>     export CLASSPATH=.:$JAVA_HOME
    >>> but I get this error in the shell:
    >>>     root:~# java -jar HahoopHdfsHello.jar
    >>>     Exception in thread "main" java.lang.NoClassDefFoundError:
    >>>         org/apache/hadoop/conf/Configuration
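
    As the follow-up in the thread shows, the suggested `java -classpath ... -jar ...` invocation still fails; this is expected, because the `-jar` option makes the JVM ignore both `-classpath` and the CLASSPATH environment variable. The Hadoop jars have to be supplied via `hadoop jar`, via the application jar's manifest Class-Path, or by naming the main class explicitly with `-cp`. For illustration only, a minimal sketch of the kind of HDFS hello-world program the thread is about; the class name, file system URI, and jar names here are assumptions, not taken from the original code:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FSDataOutputStream;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        // Illustrative stand-in for the HadoopHdfsHello program discussed in the thread.
        // Run with the Hadoop jars on the classpath, for example:
        //     hadoop jar hdfs-hello.jar HdfsHelloSketch
        // or: java -cp "hadoop-core-0.20.2.jar:." HdfsHelloSketch
        //     (plus Hadoop's own dependency jars, such as commons-logging)
        public class HdfsHelloSketch {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();               // the class the JVM fails to load above
                conf.set("fs.default.name", "hdfs://localhost:9000");   // assumption: local pseudo-distributed HDFS
                FileSystem fs = FileSystem.get(conf);
                FSDataOutputStream out = fs.create(new Path("/tmp/hello.txt"));
                out.writeUTF("hello from HDFS");
                out.close();
                fs.close();
            }
        }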
  2.

    Problems with Guice in Eclipse: Class Loading Exceptions

    Google Groups | 8 years ago | Anne Martens
    > java.lang.NoClassDefFoundError: com/google/inject/internal/collect/Lists
    >     at com.google.inject.multibindings.Multibinder$RealMultibinder.initialize(Multibinder.java:254)
    >     at org.opt4j.start.Opt4JModule.multi(Opt4JModule.java:128)
    >
    > This was after I fixed a bug in my own code, which might be a reason why you
    > did not see it. I uploaded a fixed version of my code to the url named above
    > (http://sdqweb.ipd.uka.de/temp/org.opt4j-wrapper-plugin.zip).
    >
    > Is it possible that the guice-multibindings jar also needs a patch? Or is it a
    > problem that multibindings resides in its own jar? For the latter, it might be
    > an option to join the two into one jar.
    >
    > Kind regards,
    > Anne

    Note this is actually a different issue, unrelated to the previous problem. The
    issue here is that the multibindings extension uses classes from the internal
    Guice package, which (rightly) isn't exported.

    See http://code.google.com/p/google-guice/issues/detail?id=311

    If you've got them inside the same Bundle-ClassPath (as you had before) then it
    shouldn't matter, because the internal package is only hidden from other
    bundles / plug-ins; jars on the same Bundle-ClassPath merge together just like
    in a classic Java application. However, if you're using them as separate
    bundles then you'd need to patch the multibindings jar to attach as a fragment
    to the Guice bundle, as suggested in Issue 311.

    I'll take a look at your updated code to see what's changed... Aha, found the
    problem: the packaging of Google Collections classes inside Guice changed after
    the 20090205 snapshot, so when you use the old multibindings jar with the new
    Guice jar there's a package mismatch. In the 20090205 snapshot it was
    "com.google.inject.internal.common.Lists"; now it is
    "com.google.inject.internal.Lists". If you build Guice multibindings from trunk
    (or use the jar I'm going to send you) then everything should work.

    On 3 Mrz., 18:51, Anne Martens <annema...@googlemail.com> wrote:
    > Dear Stuart,
    > thanks a lot for your fast reply! Yes, I'm absolutely ok with using a patched
    > version; I'm glad that there is a readily available solution. I will try it
    > and then let you know.
    >
    > Kind regards
    > from a very happy Anne

    On 3 Mrz., 17:34, Stuart McCulloch <mcc...@gmail.com> wrote:
    > 2009/3/3 Anne Martens <annema...@googlemail.com>:
    > > Dear Stuart,
    > > I have uploaded the code to
    > > http://sdqweb.ipd.uka.de/temp/org.opt4j-wrapper-plugin.zip. You also find a
    > > .project file for Eclipse there. Currently, the plugin needs
    > > org.eclipse.core.runtime for the plugin activator Opt4JPluginActivator, but
    > > you can delete the class and the dependency if you do not use Eclipse. To get
    > > the error, please call
    > > de.uka.ipd.sdq.dsexplore.opt4j.start.Opt4JStarter.startOpt4J()
    >
    > Hi Anne,
    >
    > I tracked the exception down to a binding for a system type (java.util.Random)
    > in the OPT4J codebase that requires a constructor proxy. Guice checks with the
    > BytecodeGen utility class to decide whether it needs a bridge classloader, and
    > it decides it doesn't for a system type, as mentioned in the Guice wiki:
    > http://code.google.com/p/google-guice/wiki/ClassLoading
    >
    > Unfortunately, the classloader then returned by BytecodeGen is not the Guice
    > classloader but the system classloader. Of course this does not have access to
    > the internal Guice AOP classes, and the injector blows up. I tried changing
    > the OPT4J code to avoid the constructor proxy, but I couldn't get round it
    > without making major changes to the code, so you'll need to use a patched
    > version of Guice (at least until this is fixed in trunk).
    >
    > A quick fix would be to remove the following check from BytecodeGen.java:
    >
    >     if (delegate == getSystemClassLoaderOrNull()) {
    >         return delegate;
    >     }
    >
    > which would then enable bridging for system types. Another option would be to
    > return the Guice classloader at this point instead of the system classloader:
    >
    >     if (delegate == getSystemClassLoaderOrNull()) {
    >         return GUICE_CLASS_LOADER;
    >     }
    >
    > Either of these would solve your problem, but they are not complete, because
    > the Guice classloader won't have access to types in the system classloader
    > that don't reside in the "java" namespace (such as javax.* / org.omg.*),
    > unless of course you enabled OSGi bootdelegation for these packages. I've
    > coded up a solution that doesn't suffer from this and raised a Guice issue:
    >
    > http://code.google.com/p/google-guice/issues/detail?id=343
    >
    > You can download a patched version of Guice here:
    >
    > http://code.google.com/p/peaberry/source/browse/trunk/lib/build/guice...
    >
    > which avoids the AOP exception. Not sure if this patch will make it into
    > Guice 2, because it is in a critical piece of code and I don't want to put
    > any undue pressure on Jesse, who's already done a lot of testing with the
    > current code. Are you ok using the patched build until this is fixed in an
    > official release?
    >
    > > For configuration details, I also uploaded
    > > http://sdqweb.ipd.uka.de/temp/eclipse-config.txt, which is a copy of the
    > > configuration details Eclipse provides in Help -> About Eclipse Platform.
    > > I currently use version 3.4.1, build id M20080911-1700.
    > >
    > > Thanks a lot! If you need more information, please let me know.
    > >
    > > Kind regards,
    > > Anne
    > >
    > > On 2 Mrz., 15:22, Stuart McCulloch <mcc...@gmail.com> wrote:
    > > > 2009/3/2 Anne Martens <annema...@googlemail.com>:
    > > > > Dear Stuart,
    > > > > thanks a lot for your help. Indeed, I only need Guice within a single
    > > > > plugin/bundle. That's why I now put the three guice jars (guice-snapshot,
    > > > > guice-multibindings-snapshot and aopalliance) back into the "referenced
    > > > > libraries" of my opt4j wrapper plugin. I also tried to move the code
    > > > > starting Opt4J to a new plugin activator class that is activated when
    > > > > OSGi loads the bundle. Still, I get the same error (with a different
    > > > > stack trace, of course):
    > > > >
    > > > > com.google.inject.internal.ComputationException:
    > > > > com.google.inject.internal.ComputationException:
    > > > > com.google.inject.internal.ComputationException:
    > > > > com.google.inject.internal.ComputationException:
    > > > > com.google.inject.internal.cglib.core.CodeGenerationException:
    > > > > java.lang.reflect.InvocationTargetException-->null
    > > > >     at com.google.inject.internal.MapMaker$StrategyImpl.compute(MapMaker.java:538)
    > > > >     at com.google.inject.internal.MapMaker$StrategyImpl.compute(MapMaker.java:404)
    > > > >     at com.google.inject.internal.CustomConcurrentHashMap$ComputingImpl.get(CustomConcurrentHashMap.java:2031)
    > > > >     at com.google.inject.internal.FailableCache.get(FailableCache.java:46)
    > > > >     at com.google.inject.InjectorImpl$LateBoundConstructor.bind(InjectorImpl.java:457)
    > > > >     at com.google.inject.ClassBindingImpl.initialize(ClassBindingImpl.java:52)
    > > > >     at com.google.inject.InjectorImpl.initializeBinding(InjectorImpl.java:347)
    > > > >     at com.google.inject.InjectorImpl.createJustInTimeBinding(InjectorImpl.java:639)
    > > > >     at com.google.inject.InjectorImpl.createJustInTimeBindingRecursive(InjectorImpl.java:584)
    > > > >     at com.google.inject.InjectorImpl.getJustInTimeBinding(InjectorImpl.java:179)
    > > > >     at com.google.inject.InjectorImpl.getBindingOrThrow(InjectorImpl.java:139)
    > > > >     at com.google.inject.InjectorImpl.getInternalFactory(InjectorImpl.java:645)
    > > > >     at com.google.inject.FactoryProxy.notify(FactoryProxy.java:48)
    > > > >     at com.google.inject.BindingProcessor.runCreationListeners(BindingProcessor.java:215)
    > > > >     at com.google.inject.InjectorBuilder.initializeStatically(InjectorBuilder.java:131)
    > > > >     at com.google.inject.InjectorBuilder.build(InjectorBuilder.java:105)
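
    For readers unfamiliar with the extension involved, here is a minimal, plain-Java sketch (illustrative names, no OSGi) of the Multibinder usage that exercises the code path in the stack trace above. The point from the thread is that guice.jar and guice-multibindings.jar must come from the same build, because the extension reaches into Guice's internal (non-exported) package, whose layout changed between snapshots:

        import java.util.Set;

        import com.google.inject.AbstractModule;
        import com.google.inject.Guice;
        import com.google.inject.Injector;
        import com.google.inject.Key;
        import com.google.inject.TypeLiteral;
        import com.google.inject.multibindings.Multibinder;

        public class MultibinderSketch {
            public interface Plugin {}
            public static class FirstPlugin implements Plugin {}
            public static class SecondPlugin implements Plugin {}

            public static void main(String[] args) {
                Injector injector = Guice.createInjector(new AbstractModule() {
                    @Override protected void configure() {
                        // The Set binding created here is initialized inside
                        // Multibinder$RealMultibinder.initialize() (the frame in the trace above),
                        // which uses Guice-internal collection classes.
                        Multibinder<Plugin> plugins = Multibinder.newSetBinder(binder(), Plugin.class);
                        plugins.addBinding().to(FirstPlugin.class);
                        plugins.addBinding().to(SecondPlugin.class);
                    }
                });
                Set<Plugin> plugins = injector.getInstance(Key.get(new TypeLiteral<Set<Plugin>>() {}));
                System.out.println("bound " + plugins.size() + " plugins");
            }
        }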
  3.

    Re: Runtime problem in 8 beta1?

    apache.org | 1 year ago
    java.lang.NoClassDefFoundError: scala/Tuple2$mcJJ$sp
        at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:77)
        at com.example.Config.createConsumerConfig(Config.java:40)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:160)

    Just in case it's useful, here is my full list of dependencies for my starter project:

        <dependencies>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-core</artifactId>
                <version>3.2.4.RELEASE</version>
            </dependency>
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-context</artifactId>
                <version>3.2.4.RELEASE</version>
            </dependency>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka_2.9.2</artifactId>
                <version>0.8.0-beta1</version>
            </dependency>
            <dependency>
                <groupId>javax.inject</groupId>
                <artifactId>javax.inject</artifactId>
                <version>1</version>
            </dependency>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
                <version>2.8.0</version>
            </dependency>
        </dependencies>

    On 8/26/13 9:33 PM, "Joe Stein" <cryptcom@gmail.com> wrote:
    > Scala 2.10 support is slated for the 0.8.1 release, after a 0.8.0 release.
    > Here is the patch you can apply if you need it:
    > https://issues.apache.org/jira/browse/KAFKA-717
    >
    > Joe Stein
    > Founder, Principal Consultant
    > Big Data Open Source Security LLC
    > http://www.stealth.ly
    > Twitter: @allthingshadoop

    On Aug 27, 2013, at 12:16 AM, David Williams <dwilliams@truecar.com> wrote:
    > Hi Jay,
    >
    > To which jars are you referring? Does Scala have a Maven coordinate? As it
    > stands I am compiling with:
    >
    >     <dependency>
    >         <groupId>org.scala-lang</groupId>
    >         <artifactId>scala-library</artifactId>
    >         <version>2.10.2</version>
    >     </dependency>
    >     <dependency>
    >         <groupId>org.scala-lang</groupId>
    >         <artifactId>scala-reflect</artifactId>
    >         <version>2.10.2</version>
    >     </dependency>

    On 8/26/13 8:35 PM, "Jay Kreps" <jay.kreps@gmail.com> wrote:
    > Nothing complex here, you just don't have the Scala library on your
    > classpath. It works just like any jar: if there is a dependency on classes
    > in the jar, it needs to be on the classpath.

    On Mon, Aug 26, 2013 at 7:10 PM, David Williams <dwilliams@truecar.com> wrote:
    > Hi all,
    >
    > First let me say I have detailed the description of the issue in a
    > Stack Overflow ticket here:
    > http://stackoverflow.com/questions/18455480/kafka-quickstart-java-lang-noclassdeffounderror-scala-scalaobject
    >
    > I am trying to build a small prototype for a project in which I want to use
    > Kafka. I am following this example:
    > https://cwiki.apache.org/confluence/display/KAFKA/Consumer+Group+Example
    >
    > However, when I compile and run the code I get this runtime exception.
    > What should I do to get a working prototype / example?
    >
    >     Exception in thread "main" java.lang.reflect.InvocationTargetException
    >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    >         at java.lang.reflect.Method.invoke(Method.java:606)
    >         at com.simontuffs.onejar.Boot.run(Boot.java:340)
    >         at com.simontuffs.onejar.Boot.main(Boot.java:166)
    >     Caused by: java.lang.NoClassDefFoundError: scala/ScalaObject
    >         at java.lang.ClassLoader.defineClass1(Native Method)
    >         at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
    >         at com.simontuffs.onejar.JarClassLoader.defineClass(JarClassLoader.java:803)
    >         at com.simontuffs.onejar.JarClassLoader.findClass(JarClassLoader.java:710)
    >         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
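
    As an illustration of the direction the thread points in (not code from the thread): the ConsumerConfig constructor in the trace is Scala code, and the artifact suffix of kafka_2.9.2 indicates it was built against Scala 2.9.2, so a matching scala-library jar needs to be on the runtime classpath rather than the 2.8.0 or 2.10.2 versions listed above (or the KAFKA-717 patch applied for Scala 2.10). With that in place, a createConsumerConfig along these lines should get past the error; class names and property values below are placeholders:

        import java.util.Properties;

        import kafka.consumer.ConsumerConfig;
        import kafka.javaapi.consumer.ConsumerConnector;

        // Hypothetical stand-in for the com.example.Config.createConsumerConfig()
        // frame in the trace above.
        public class ConsumerConfigSketch {
            static ConsumerConfig createConsumerConfig(String zookeeper, String groupId) {
                Properties props = new Properties();
                props.put("zookeeper.connect", zookeeper);        // e.g. "localhost:2181"
                props.put("group.id", groupId);
                props.put("zookeeper.session.timeout.ms", "400");
                props.put("zookeeper.sync.time.ms", "200");
                props.put("auto.commit.interval.ms", "1000");
                // new ConsumerConfig(...) executes Scala code; this is the constructor that
                // fails with NoClassDefFoundError when scala-library is missing or mismatched.
                return new ConsumerConfig(props);
            }

            public static void main(String[] args) {
                ConsumerConnector connector = kafka.consumer.Consumer.createJavaConsumerConnector(
                        createConsumerConfig("localhost:2181", "test-group"));
                connector.shutdown();
            }
        }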

    Root Cause Analysis

    1. java.lang.NoClassDefFoundError

      Could not initialize class org.apache.hadoop.conf.Configuration

      at HadooHdfsHello.main(HadooHdfsHello.java:18)
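
    For completeness, the "Could not initialize class" form of NoClassDefFoundError (seen in the Tomcat trace above) is the JVM's follow-up error: the class was found, but its static initializer failed on an earlier attempt, which is reported at that point as ExceptionInInitializerError earlier in the logs; every later reference to the class then gets this shorter message. A minimal, Hadoop-free sketch of that behavior:

        // Demonstrates how "Could not initialize class X" arises: the first reference
        // to a class whose static initializer throws produces ExceptionInInitializerError,
        // and every later reference fails with
        // java.lang.NoClassDefFoundError: Could not initialize class X.
        public class InitFailureSketch {
            static class Unlucky {
                static final int VALUE = compute();
                static int compute() { throw new IllegalStateException("static init failed"); }
            }

            public static void main(String[] args) {
                for (int attempt = 1; attempt <= 2; attempt++) {
                    try {
                        System.out.println(Unlucky.VALUE);
                    } catch (Throwable t) {
                        // attempt 1: java.lang.ExceptionInInitializerError
                        // attempt 2: java.lang.NoClassDefFoundError: Could not initialize class ...$Unlucky
                        System.out.println("attempt " + attempt + ": " + t);
                    }
                }
            }
        }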