java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:68)
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:101)

Google Groups | Fredrik | 11 months ago
  1. Failed to detect a valid hadoop home directory

     Google Groups | 11 months ago | Fredrik

    Root Cause Analysis

    1. java.lang.UnsatisfiedLinkError

      org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z

      at org.apache.hadoop.fs.Hdfs.<init>()
    2. Hadoop
      Hdfs.<init>
      1. org.apache.hadoop.fs.Hdfs.<init>(Hdfs.java:91)
      1 frame
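
This error usually means the JVM either never loaded libhadoop.so or loaded a build that was compiled without OpenSSL support, so the native buildSupportsOpenssl()Z symbol cannot be resolved when the AES-CTR crypto codec is instantiated. A minimal diagnostic sketch is shown below; the class name NativeOpensslCheck is hypothetical, and it assumes hadoop-common (here 2.6.0-cdh5.5.2) is on the classpath and -Djava.library.path points at the native libraries.

import org.apache.hadoop.crypto.OpensslCipher;
import org.apache.hadoop.util.NativeCodeLoader;

// Hypothetical helper class; run on the node that produced the error.
public class NativeOpensslCheck {

    public static void main(String[] args) {
        // True only if libhadoop.so was found on java.library.path and loaded.
        System.out.println("libhadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());

        try {
            // Native method; throws UnsatisfiedLinkError (as in the trace above)
            // when libhadoop is missing or was built without OpenSSL bindings.
            System.out.println("built with openssl: " + NativeCodeLoader.buildSupportsOpenssl());
        } catch (UnsatisfiedLinkError e) {
            System.out.println("built with openssl: unknown (" + e.getMessage() + ")");
        }

        // Null when the OpenSSL-backed cipher is usable, otherwise a reason string.
        System.out.println("OpensslCipher failure reason: " + OpensslCipher.getLoadingFailureReason());
    }
}

Running `hadoop checknative -a` on the same node gives an equivalent report from the command line; if it shows openssl: false, the usual fixes are installing the matching libcrypto, rebuilding libhadoop with OpenSSL support, or pointing java.library.path at the distribution-provided native libraries.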