java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2016-05-03 06:14:07,887 DEBUG org.apache.hadoop.util.NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2016-05-03 06:14:07,887 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-03 06:14:07,898 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2016-05-03 06:14:07,904 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-05-03 06:14:07,909 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
	at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:68)
	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:101)
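The WARN line means the JVM could not find libhadoop.so in any of the directories listed under java.library.path, so NativeCodeLoader fell back to the pure-Java implementations. A minimal standalone sketch of the same check (the library name "hadoop" is taken from the log; everything else is illustrative, not Hadoop's actual code):

```java
public class NativeLibCheck {
    public static void main(String[] args) {
        // The JVM searches these directories for libhadoop.so (or hadoop.dll).
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        try {
            // NativeCodeLoader attempts essentially this load at class-init time.
            System.loadLibrary("hadoop");
            System.out.println("native hadoop library loaded");
        } catch (UnsatisfiedLinkError e) {
            // Mirrors the log's "no hadoop in java.library.path" warning.
            System.out.println("falling back to builtin-java classes: " + e.getMessage());
        }
    }
}
```

On a CDH node the usual fix is to make sure the directory containing libhadoop.so is on java.library.path (e.g. via -Djava.library.path=... in the JVM options) rather than relying on the system default shown in the DEBUG line.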

Google Groups | Fredrik | 10 months ago
  1. Failed to detect a valid hadoop home directory

     Google Groups | 10 months ago | Fredrik
     java.lang.UnsatisfiedLinkError: no hadoop in java.library.path (full log and stack trace as above)

    Root Cause Analysis

    1. java.lang.UnsatisfiedLinkError

      no hadoop in java.library.path (full log and stack trace as above)

      at org.apache.hadoop.fs.Hdfs.<init>()
    2. Hadoop
      Hdfs.<init>
      1. org.apache.hadoop.fs.Hdfs.<init>(Hdfs.java:91)
      1 frame
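The ()Z suffix in the second UnsatisfiedLinkError is the JNI signature of buildSupportsOpenssl (no arguments, boolean return): the method is declared native, but its implementation lives in libhadoop.so, which never loaded. Any call to a native method whose library is missing fails the same way. A self-contained illustration (the class and method names here are made up to mirror the log, not Hadoop's actual classes):

```java
public class MissingNativeDemo {
    // Declared native, but no library providing an implementation is ever loaded,
    // so any call throws UnsatisfiedLinkError -- just like buildSupportsOpenssl()Z
    // after the earlier System.loadLibrary("hadoop") failure.
    static native boolean buildSupportsOpenssl();

    public static void main(String[] args) {
        try {
            buildSupportsOpenssl();
        } catch (UnsatisfiedLinkError e) {
            System.out.println("UnsatisfiedLinkError: " + e.getMessage());
        }
    }
}
```

This is why the root cause in the log is really the first error (no hadoop in java.library.path); the OpenSSL cipher failure is just a downstream symptom of the same missing native library.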