Recommended solutions based on your search

Solutions on the web

via spark-reviews by darose, 2 years ago
via github.com by Unknown author, 1 year ago
via cloudera.com by Unknown author, 2 years ago
via Stack Overflow by Samuel Alexander, 2 years ago

All four sources report the same java.lang.VerifyError ("Bad type on operand stack") in org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore; the full exception and stack trace are reproduced below.
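Judging from the frames in the trace (NativeS3FileSystem, HadoopRDD, ResultTask), the error was raised while a Spark task read input over the s3n:// scheme. The following is a minimal sketch of the kind of job that exercises this code path; the app name, bucket, and key are placeholders, not taken from the original report:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only (assumed, not the original job): reading over s3n:// makes Spark
    // call NativeS3FileSystem.initialize, which constructs Jets3tNativeFileSystemStore --
    // the frame where the VerifyError below is thrown.
    object S3nReadSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("s3n-read-sketch"))
        // Placeholder bucket/key; s3n credentials are normally supplied via
        // fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey in the Hadoop configuration.
        val lines = sc.textFile("s3n://example-bucket/path/to/input.txt")
        println(lines.count())   // forces the HadoopRDD to be computed, opening the S3 stream
        sc.stop()
      }
    }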
java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    org/apache/hadoop/fs/s3native/Jets3tNativeFileSystemStore.initialize(Ljava/net/URI;Lorg/apache/hadoop/conf/Configuration;)V @38: invokespecial
  Reason:
    Type 'org/jets3t/service/security/AWSCredentials' (current frame, stack[3]) is not assignable to 'org/jets3t/service/security/ProviderCredentials'
  Current Frame:
    bci: @38
    flags: { }
    locals: { 'org/apache/hadoop/fs/s3native/Jets3tNativeFileSystemStore', 'java/net/URI', 'org/apache/hadoop/conf/Configuration', 'org/apache/hadoop/fs/s3/S3Credentials', 'org/jets3t/service/security/AWSCredentials' }
    stack: { 'org/apache/hadoop/fs/s3native/Jets3tNativeFileSystemStore', uninitialized 32, uninitialized 32, 'org/jets3t/service/security/AWSCredentials' }
  Bytecode:
    0000000: bb00 0259 b700 034e 2d2b 2cb6 0004 bb00
    0000010: 0559 2db6 0006 2db6 0007 b700 083a 042a
    0000020: bb00 0959 1904 b700 0ab5 000b a700 0b3a
    0000030: 042a 1904 b700 0d2a 2c12 0e03 b600 0fb5
    0000040: 0010 2a2c 1211 1400 12b6 0014 1400 15b8
    0000050: 0017 b500 182a 2c12 1914 0015 b600 1414
    0000060: 0015 b800 17b5 001a 2abb 001b 592b b600
    0000070: 1cb7 001d b500 1eb1                    
  Exception Handler Table:
    bci [14, 44] => handler: 47
  Stackmap Table:
    full_frame(@47,{Object[#176],Object[#177],Object[#178],Object[#179]},{Object[#180]})
    same_frame(@55)
	at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:280)
	at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:270)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
	at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:107)
	at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
	at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:156)
	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:149)
	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:64)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
	at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:34)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:34)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:34)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:109)
	at org.apache.spark.scheduler.Task.run(Task.scala:53)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:213)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:42)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:41)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:41)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
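The Reason line above, 'AWSCredentials' not assignable to 'ProviderCredentials', typically indicates that the jets3t jar on the executor classpath is a different version from the one Hadoop's s3native module was compiled against. One common remedy, sketched below under the assumption of an sbt build, is to pin jets3t explicitly to the version your Hadoop distribution expects; 0.9.0 is shown only as an example, so verify the exact version against your Hadoop build:

    // build.sbt (sketch): force a single jets3t version onto the classpath.
    // "0.9.0" is an assumed example; check which jets3t your Hadoop version was built against.
    libraryDependencies += "net.java.dev.jets3t" % "jets3t" % "0.9.0"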