com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: 923C5D9E75E44C06), S3 Extended Request ID: HDwje6k+ANEeDsM6aJ8+D5gUmNAMguOk2BvZ8PH3g9z0gpH+IuwT7N19oQOnIr5CIx7Vqb/uThE=

Apache's JIRA Issue Tracker | Steve Loughran | 11 months ago
  1. S3A doesn't auth with S3 frankfurt

     S3A doesn't auth with S3 Frankfurt. This installation only supports the v4 signing API. There are JVM options that should enable v4 signing, but even those do not appear to be enough on their own: the s3a client must also be allowed to change the endpoint it authenticates against from the generic "AWS S3" endpoint to a Frankfurt-specific one.

     Apache's JIRA Issue Tracker | 11 months ago | Steve Loughran
     com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: 923C5D9E75E44C06), S3 Extended Request ID: HDwje6k+ANEeDsM6aJ8+D5gUmNAMguOk2BvZ8PH3g9z0gpH+IuwT7N19oQOnIr5CIx7Vqb/uThE=
  2. Cannot access S3 bucket with Hadoop

     Stack Overflow | 2 months ago | razvan
     com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: 1FA2318A386330C0, AWS Error Code: null, AWS Error Message: Bad Request, S3 Extended Request ID: 1S7Eq6s9YxUb9bPwyHP73clJvD619LZ2o0jE8VklMAA9jrKXPbvT7CG6nh0zeuluGrzybiPbgRQ=
  3. Amazon s3a returns 400 Bad Request with Spark

     Stack Overflow | 1 year ago | crak
     com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: 9D8E8002H3BBDDC7, AWS Error Code: null, AWS Error Message: Bad Request, S3 Extended Request ID: Qme5E3KAr/KX0djiq9poGXPJkmr0vuXAduZujwGlvaAl+oc6vlUpq7LIh70IF3LNgoewjP+HnXA=
  4. AmazonS3Exception Bad Request: distcp from frankfurt s3 to emr hdfs failing

     Stack Overflow | 1 year ago | Carsten Blank
     com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: 4A77158C1BD71C29), S3 Extended Request ID: LU41MspxqVnHqyaMreTvggRG480Wb9d+TBx1MAo5v/g9yz07mmPizcZVOtRMQ+GElXs8vl/WZXA=
  5. GitHub comment 17#251596459

     GitHub | 8 months ago | AmitTwingo
     org.apache.kafka.connect.errors.ConnectException: java.lang.reflect.InvocationTargetException
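The Frankfurt-related reports above (items 1, 3, and 4) describe the same resolution: point the S3A client at the region-specific endpoint so the AWS SDK signs requests with Signature Version 4, which eu-central-1 requires. A minimal core-site.xml sketch, assuming the bucket lives in the Frankfurt region:

```xml
<!-- core-site.xml fragment: make S3A authenticate against the
     Frankfurt endpoint, which only accepts V4-signed requests. -->
<configuration>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>s3.eu-central-1.amazonaws.com</value>
  </property>
</configuration>
```

Older AWS SDK versions may additionally need the JVM switch the first report alludes to, e.g. passing `-Dcom.amazonaws.services.s3.enableV4=true` via `HADOOP_OPTS`.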

Root Cause Analysis

com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: 923C5D9E75E44C06), S3 Extended Request ID: HDwje6k+ANEeDsM6aJ8+D5gUmNAMguOk2BvZ8PH3g9z0gpH+IuwT7N19oQOnIr5CIx7Vqb/uThE=
	at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1182)
	at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:770)
	at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:489)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:310)
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3785)
	at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1107)
	at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1070)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:307)
	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:284)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2793)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:101)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
	at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:315)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:373)