java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).

Stack Overflow | ashic | 8 months ago
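As the exception message says, the legacy s3:// (S3 block filesystem) connector expects credentials either embedded in the s3 URL or supplied via the fs.s3.awsAccessKeyId and fs.s3.awsSecretAccessKey properties. One way to pass those properties through at submit time is a sketch like the following, using Spark's spark.hadoop.* configuration pass-through; the master URL, class name, bucket, and key values are placeholders, not taken from the question:

```shell
# Sketch: forward the fs.s3.* credential properties to Hadoop via
# spark-submit's spark.hadoop.* prefix. All values below are
# placeholders -- substitute your own master, class, bucket, and keys.
spark-submit \
  --deploy-mode cluster \
  --master spark://master:7077 \
  --conf spark.hadoop.fs.s3.awsAccessKeyId=YOUR_ACCESS_KEY_ID \
  --conf spark.hadoop.fs.s3.awsSecretAccessKey=YOUR_SECRET_ACCESS_KEY \
  --class com.example.Main \
  s3://my-bucket/app.jar
```

One caveat: in standalone cluster mode the application jar is fetched by the worker's DriverRunner before the user application runs, so configuration set only by the submitting process may not reach that fetch; defining the same two properties in core-site.xml on each worker is the more reliable route.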
Related discussions:

  1. Spark submit cluster mode from s3 — Stack Overflow | 8 months ago | ashic
     java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).

  2. Apache Spark User List — Problem reading from S3 in standalone application — nabble.com | 1 year ago
     (same java.lang.IllegalArgumentException as above)

  3. AWS Developer Forums: Problems with old S3 buckets — amazon.com | 1 month ago
     java.lang.IllegalArgumentException: Invalid hostname in URI s3://<MY_BUCKET_NAME>/logfile-20110815.gz /tmp/logfile-20110815.gz

Root Cause Analysis

java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
	at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:66)
	at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:82)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
	at com.sun.proxy.$Proxy5.initialize(Unknown Source)
	at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:77)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
	at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1686)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:598)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:395)
	at org.apache.spark.deploy.worker.DriverRunner.org$apache$spark$deploy$worker$DriverRunner$$downloadUserJar(DriverRunner.scala:150)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:79)
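The failing frames sit inside the worker's DriverRunner: the exception is raised while the worker downloads the application jar, before any user code runs. The other option the message itself names is embedding the credentials as the "username or password" of the s3 URL. A sketch of that variant, with placeholder credentials, master, class, and bucket names:

```shell
# Alternative sketch: credentials embedded directly in the s3 URL.
# ACCESS_KEY_ID / SECRET_ACCESS_KEY are placeholders; a secret
# containing "/" or "+" must be URL-encoded first, and note that the
# credentials will be visible in logs and process listings.
spark-submit \
  --deploy-mode cluster \
  --master spark://master:7077 \
  --class com.example.Main \
  "s3://ACCESS_KEY_ID:SECRET_ACCESS_KEY@my-bucket/app.jar"
```

Because of the leakage risk, setting fs.s3.awsAccessKeyId and fs.s3.awsSecretAccessKey in each worker's core-site.xml is generally preferable to URL-embedded keys.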