org.apache.hadoop.security.AccessControlException: Permission denied: s3n://test/logs

Stack Overflow | 4 months ago | dreamer
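
An AccessControlException from the s3n connector typically means the request reached S3 but was rejected (HTTP 403): either no AWS credentials were configured for the s3n scheme, or the key pair lacks permission on the bucket. A minimal sketch of the usual first fix, assuming Spark 1.x with a SQLContext (as the trace below suggests); the object name and both key values are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object S3nCredentialCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("S3nCredentialCheck"))

        // The s3n connector reads its credentials from the Hadoop configuration;
        // both values are placeholders for a real IAM key pair.
        sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "AKIA...")
        sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "...")

        // A glob forces a directory listing (globStatus -> listStatus), which is
        // the exact call that raises the 403 in the root cause analysis below.
        val sqlContext = new SQLContext(sc)
        sqlContext.read.json("s3n://test/logs/*").printSchema()
        sc.stop()
      }
    }

Similar reports: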
  1. Spark S3 access denied when using regex

     Stack Overflow | 4 months ago | dreamer
     org.apache.hadoop.security.AccessControlException: Permission denied: s3n://test/logs
  2. Have trouble running run_tests.sh

     GitHub | 1 year ago | kchew534
     org.apache.hadoop.security.AccessControlException: Permission denied: s3n://test-bucket/secor_dev/backup/test
  3. oozie distcp s3-to-s3 copy fails with invalid arguments: org.jets3t.service.impl.rest.HttpException

     Stack Overflow | 1 month ago | Himateja Madala
     org.apache.hadoop.security.AccessControlException: Permission denied: s3n://XXX/XXX/XXX
  4. Amazon s3a returns 400 Bad Request with the spark-redshift library

     Stack Overflow | 2 weeks ago | Amit Valse
     java.io.IOException: s3n://bucket-name : 400 : Bad Request, raised while loading Redshift data through the spark-redshift library; both the Redshift cluster and the S3 bucket are in the Mumbai region (see the sketch after this list). Full error stack: 2017-01-13 13:14:22 WARN TaskSetManager:66 - Lost task 0.0 in stage 0.0 (TID 0, master): java.io.IOException: s3n://bucket-name : 400 : Bad Request
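
The last entry is a different failure from the 403s above: Mumbai (ap-south-1) is one of the newer AWS regions that accepts only Signature Version 4, and the s3n/JetS3t connector can only sign with the older scheme, so its requests come back 400 Bad Request regardless of credentials. The usual workaround is the s3a connector pinned to the bucket's regional endpoint. A sketch under those assumptions (Hadoop 2.7+ with hadoop-aws and a compatible aws-java-sdk on the classpath; the object name, path, and keys are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object S3aSigV4Check {
      def main(args: Array[String]): Unit = {
        // Older aws-java-sdk versions need SigV4 forced explicitly; this
        // system property is harmless on versions that already default to V4.
        System.setProperty("com.amazonaws.services.s3.enableV4", "true")

        val sc = new SparkContext(new SparkConf().setAppName("S3aSigV4Check"))
        sc.hadoopConfiguration.set("fs.s3a.access.key", "AKIA...")  // placeholder
        sc.hadoopConfiguration.set("fs.s3a.secret.key", "...")      // placeholder
        // Requests to a V4-only region sent through the global endpoint fail
        // with 400, so pin the regional endpoint explicitly.
        sc.hadoopConfiguration.set("fs.s3a.endpoint", "s3.ap-south-1.amazonaws.com")

        // Note the scheme: s3a, not the s3n shown in the error message.
        val sqlContext = new SQLContext(sc)
        sqlContext.read.json("s3a://bucket-name/path/to/data").printSchema()
        sc.stop()
      }
    }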


    Root Cause Analysis

    1. org.jets3t.service.impl.rest.HttpException

      No message provided

      at org.jets3t.service.impl.rest.httpclient.RestStorageService.performRequest()
    2. JetS3t
      StorageService.getObjectDetails
      1. org.jets3t.service.impl.rest.httpclient.RestStorageService.performRequest(RestStorageService.java:423)
      2. org.jets3t.service.impl.rest.httpclient.RestStorageService.performRequest(RestStorageService.java:277)
      3. org.jets3t.service.impl.rest.httpclient.RestStorageService.performRestHead(RestStorageService.java:1038)
      4. org.jets3t.service.impl.rest.httpclient.RestStorageService.getObjectImpl(RestStorageService.java:2250)
      5. org.jets3t.service.impl.rest.httpclient.RestStorageService.getObjectDetailsImpl(RestStorageService.java:2179)
      6. org.jets3t.service.StorageService.getObjectDetails(StorageService.java:1120)
      7. org.jets3t.service.StorageService.getObjectDetails(StorageService.java:575)
      7 frames
    3. Hadoop
      Jets3tNativeFileSystemStore.retrieveMetadata
      1. org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:174)
      1 frame
    4. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:497)
      4 frames
    5. Hadoop
      FileSystem.globStatus
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
      3. org.apache.hadoop.fs.s3native.$Proxy42.retrieveMetadata(Unknown Source)
      4. org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:530)
      5. org.apache.hadoop.fs.Globber.listStatus(Globber.java:69)
      6. org.apache.hadoop.fs.Globber.glob(Globber.java:217)
      7. org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1674)
      7 frames
    6. Hadoop
      FileInputFormat.getSplits
      1. org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:259)
      2. org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
      3. org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
      3 frames
    7. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:203)
      2. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
      3 frames
    8. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    9. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
      4 frames
    10. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    11. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:242)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:240)
      4 frames
    12. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    13. Spark
      RDD.treeAggregate
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
      2. org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1136)
      3. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      4. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      5. org.apache.spark.rdd.RDD.withScope(RDD.scala:323)
      6. org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1134)
      6 frames
    14. org.apache.spark
      JSONRelation$$anonfun$4.apply
      1. org.apache.spark.sql.execution.datasources.json.InferSchema$.infer(InferSchema.scala:65)
      2. org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:114)
      3. org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:109)
      3 frames
    15. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    16. org.apache.spark
      JSONRelation.dataSchema
      1. org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema$lzycompute(JSONRelation.scala:109)
      2. org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema(JSONRelation.scala:108)
      2 frames
    17. Spark Project SQL
      HadoopFsRelation.schema
      1. org.apache.spark.sql.sources.HadoopFsRelation.schema$lzycompute(interfaces.scala:636)
      2. org.apache.spark.sql.sources.HadoopFsRelation.schema(interfaces.scala:635)
      2 frames
    18. org.apache.spark
      LogicalRelation.<init>
      1. org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
      1 frame
    19. Spark Project SQL
      DataFrameReader.json
      1. org.apache.spark.sql.SQLContext.baseRelationToDataFrame(SQLContext.scala:442)
      2. org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:288)
      2 frames
    20. com.test
      LogParser.main
      1. com.test.LogParser$.main(LogParser.scala:294)
      2. com.test.LogParser.main(LogParser.scala)
      2 frames
    21. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:497)
      4 frames
    22. Spark Project YARN Stable API
      ApplicationMaster$$anon$2.run
      1. org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
      1 frame