java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2016-05-03 06:14:07,887 DEBUG org.apache.hadoop.util.NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2016-05-03 06:14:07,887 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-03 06:14:07,898 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2016-05-03 06:14:07,904 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-05-03 06:14:07,909 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84) ~[hadoop-common-2.6.0-cdh5.5.2.jar:na]
    at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_77]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_77]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_77]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_77]
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:68)
    at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:101)

Google Groups | Fredrik | 12 months ago

    Failed to detect a valid hadoop home directory


    Root Cause Analysis

    1. java.lang.UnsatisfiedLinkError

      no hadoop in java.library.path

      at org.apache.hadoop.fs.Hdfs.<init>()
    2. Hadoop
      Hdfs.<init>
      1. org.apache.hadoop.fs.Hdfs.<init>(Hdfs.java:91)
      1 frame
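
The root cause above comes down to the JVM not finding libhadoop.so anywhere on java.library.path, which is also why the OpenSSL binding (buildSupportsOpenssl) cannot be loaded. A minimal sketch of a check, assuming hadoop-common (here 2.6.0-cdh5.5.2) is on the classpath; the class name and the native-library path in the comment are only examples, not something from the original post:

    import org.apache.hadoop.util.NativeCodeLoader;

    // Example invocation (adjust -Djava.library.path to the directory that actually contains libhadoop.so):
    //   java -cp .:$(hadoop classpath) -Djava.library.path=/opt/cloudera/parcels/CDH/lib/hadoop/lib/native NativeCheck
    public class NativeCheck {
        public static void main(String[] args) {
            // Print the search path the JVM uses for native libraries.
            System.out.println("java.library.path = " + System.getProperty("java.library.path"));
            // false here corresponds to the "Unable to load native-hadoop library" warning in the log above.
            System.out.println("native hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
        }
    }

On a node with the Hadoop client installed, running hadoop checknative -a reports the same information (including whether the OpenSSL binding was found) from the command line.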