org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on landsat-pds: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain: Unable to load AWS credentials from any provider in the chain

Apache's JIRA Issue Tracker | Steve Loughran | 6 months ago
  1. If an S3 bucket is public, anyone should be able to read from it. However, you cannot create an s3a client bonded to a public bucket unless you have some credentials; the {{doesBucketExist()}} check rejects the call. (A workaround sketch follows this list.)

     Apache's JIRA Issue Tracker | 6 months ago | Steve Loughran
     org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on landsat-pds: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain: Unable to load AWS credentials from any provider in the chain
  2. Access public available Amazon S3 file from Apache Spark

     Stack Overflow | 1 year ago | pkozlov
     com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
  3. GitHub comment 187#167916928

     GitHub | 11 months ago | mwacc
     com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
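
The JIRA entry above (item 1) describes the underlying problem: S3AFileSystem.initialize() calls doesBucketExist(), which walks the AWS credential provider chain even when the target bucket is world-readable. Below is a minimal sketch of the usual workaround, assuming a hadoop-aws build that ships the fs.s3a.aws.credentials.provider option and org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider (2.8 and later); the scene_list.gz object name is only an illustrative public path on landsat-pds, not something taken from the report.

    import java.net.URI

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object AnonymousS3ARead {
      def main(args: Array[String]): Unit = {
        val conf = new Configuration()
        // Bypass the AWS credential chain: the anonymous provider lets
        // doesBucketExist() run unauthenticated against a world-readable bucket.
        conf.set("fs.s3a.aws.credentials.provider",
          "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider")
        val fs = FileSystem.get(new URI("s3a://landsat-pds/"), conf)
        // scene_list.gz is used here only as an example public object.
        val status = fs.getFileStatus(new Path("s3a://landsat-pds/scene_list.gz"))
        println(s"scene_list.gz size = ${status.getLen} bytes")
        fs.close()
      }
    }

With the anonymous provider, no credential lookup happens at all, so the chain error above disappears; a bucket that does require authentication should then fail with an access-denied error instead.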

Root Cause Analysis

  1. com.amazonaws.AmazonClientException

    Unable to load AWS credentials from any provider in the chain

    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials()
  2. AWS SDK for Java - Core
    AWSCredentialsProviderChain.getCredentials
    1. com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:117)
    1 frame
  3. AWS Java SDK for Amazon S3
    AmazonS3Client.doesBucketExist
    1. com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3779)
    2. com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1107)
    3. com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1070)
    3 frames
  4. Apache Hadoop Amazon Web Services support
    S3AFileSystem.initialize
    1. org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:288)
    2. org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:267)
    2 frames
  5. Hadoop
    FileSystem.get
    1. org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2793)
    2. org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:101)
    3. org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
    4. org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
    5. org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    5 frames
  6. org.apache.spark
    S3ALineCount.main
    1. org.apache.spark.cloud.s3.examples.S3ALineCount$.innerMain(S3ALineCount.scala:75)
    2. org.apache.spark.cloud.s3.examples.S3ALineCount$.main(S3ALineCount.scala:50)
    3. org.apache.spark.cloud.s3.examples.S3ALineCount.main(S3ALineCount.scala)
    3 frames
  7. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:498)
    4 frames
  8. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    5 frames