java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.

spark-user | Vipul Pandey | 3 years ago
  1.

    Re: Using ProtoBuf 2.5 for messages with Spark Streaming

    spark-user | 3 years ago | Vipul Pandey
    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
  2.

    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.

    GitHub | 10 months ago | padmanaik
    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
  3.

    Get exception when accessing cdh4 from shell - java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses. at com.google.protobuf.GeneratedMessage.getUnknownFields - most likely due to protobuf-java-2.5.0.jar being on the main classpath now.

    Full stack trace:
    {code}
    trisberg@carbon:~/Test$ ./spring-xd-1.0.0.BUILD-SNAPSHOT/shell/bin/xd-shell --hadoopDistro cdh4
    16:55:22,680 WARN main conf.Configuration:824 - fs.default.name is deprecated. Instead, use fs.defaultFS
    [Spring XD ASCII art banner]
    eXtreme Data 1.0.0.BUILD-SNAPSHOT | Admin Server Target: http://localhost:9393
    Welcome to the Spring XD shell. For assistance hit TAB or type "help".
    xd:>hadoop config fs --namenode hdfs://cdh4:8020
    xd:>hadoop fs ls /
    Hadoop configuration changed, re-initializing shell...
    16:55:28,853 WARN Spring Shell util.NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    -ls: Fatal internal error
    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
        at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto.getSerializedSize(ClientNamenodeProtocolProtos.java:30108)
        at com.google.protobuf.AbstractMessageLite.toByteString(AbstractMessageLite.java:49)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.constructRpcRequest(ProtobufRpcEngine.java:149)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:193)
        at com.sun.proxy.$Proxy43.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at com.sun.proxy.$Proxy43.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:629)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:819)
        at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1646)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1592)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1567)
        at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:271)
        at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:224)
        at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:207)
        at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
        at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:254)
        at org.springframework.xd.shell.hadoop.FsShellCommands.run(FsShellCommands.java:412)
        at org.springframework.xd.shell.hadoop.FsShellCommands.runCommand(FsShellCommands.java:407)
        at org.springframework.xd.shell.hadoop.FsShellCommands.ls(FsShellCommands.java:110)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.springframework.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:196)
        at org.springframework.shell.core.SimpleExecutionStrategy.invoke(SimpleExecutionStrategy.java:64)
        at org.springframework.shell.core.SimpleExecutionStrategy.execute(SimpleExecutionStrategy.java:48)
        at org.springframework.shell.core.AbstractShell.executeCommand(AbstractShell.java:127)
        at org.springframework.shell.core.JLineShell.promptLoop(JLineShell.java:530)
        at org.springframework.shell.core.JLineShell.run(JLineShell.java:178)
        at java.lang.Thread.run(Thread.java:744)
    {code}

    Spring JIRA | 3 years ago | Thomas Risberg
    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
  4.

    Flume NG introduction and hands-on configuration - leejun2005's personal page - OSChina community

    oschina.net | 3 months ago
    java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
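All of the reports above trace back to the same cause: two incompatible protobuf-java versions (2.4.x vs 2.5.0) competing on the classpath, so generated message classes and the `GeneratedMessage` base class come from mismatched releases. The conflict can be confirmed by asking the JVM which jar actually provides the class. The following is an illustrative sketch, not part of Hadoop, Spark, or Spring XD; the `WhichJar` class and its `locate` helper are invented for this example, while `com.google.protobuf.GeneratedMessage` is the real class from the stack trace.

```java
import java.security.CodeSource;

public class WhichJar {
    /** Return where a class was loaded from, or a note if it wasn't found. */
    public static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap/JDK classes report a null code source.
            return src == null ? "bootstrap/JDK" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // On an affected classpath this prints the jar that actually wins,
        // e.g. a path ending in protobuf-java-2.5.0.jar, exposing the mismatch.
        System.out.println(locate("com.google.protobuf.GeneratedMessage"));
    }
}
```

Running this inside the failing process (or a `static` block of the failing class) shows which protobuf jar shadows the other; the fix is then to exclude or align the conflicting dependency.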


    Root Cause Analysis

    1. java.lang.UnsupportedOperationException

      This is supposed to be overridden by subclasses.

      at com.google.protobuf.GeneratedMessage.getUnknownFields()
    2. Protocol Buffer Java API
      GeneratedMessage.getUnknownFields
      1. com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
      1 frame
    3. Apache Hadoop HDFS
      ClientNamenodeProtocolProtos$GetFileInfoRequestProto.getSerializedSize
      1. org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto.getSerializedSize(ClientNamenodeProtocolProtos.java:30042)
      1 frame
    4. Protocol Buffer Java API
      AbstractMessageLite.toByteString
      1. com.google.protobuf.AbstractMessageLite.toByteString(AbstractMessageLite.java:49)
      1 frame
    5. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.constructRpcRequest(ProtobufRpcEngine.java:149)
      2. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:193)
      2 frames
    6. Unknown
      $Proxy14.getFileInfo
      1. $Proxy14.getFileInfo(Unknown Source)
      1 frame
    7. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      4. java.lang.reflect.Method.invoke(Method.java:597)
      4 frames
    8. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
      2 frames
    9. Unknown
      $Proxy14.getFileInfo
      1. $Proxy14.getFileInfo(Unknown Source)
      1 frame
    10. Apache Hadoop HDFS
      DistributedFileSystem.getFileStatus
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
      2. org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
      3. org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:805)
      3 frames
    11. Hadoop
      FileSystem.globStatus
      1. org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1670)
      2. org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1616)
      2 frames
    12. Hadoop
      FileInputFormat.getSplits
      1. org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:174)
      2. org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
      2 frames
    13. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:140)
      2. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
      3 frames
    14. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    15. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
      2. org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
      4 frames
    16. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    17. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
      2. org.apache.spark.rdd.FlatMappedRDD.getPartitions(FlatMappedRDD.scala:30)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
      4 frames
    18. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    19. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
      2. org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
      4 frames
    20. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    21. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
      2. org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
      4 frames
    22. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    23. Spark
      PairRDDFunctions.reduceByKey
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
      2. org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:58)
      3. org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:354)
      3 frames
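The first frame of the analysis, `GeneratedMessage.getUnknownFields()` throwing instead of returning, is the signature of this version mismatch: the protobuf runtime's base class expects every generated message to override the method, and code generated by an older protoc does not. A toy sketch of that override contract (not protobuf's actual source; `GeneratedBase` and both subclasses are invented for illustration):

```java
// Toy reproduction of the contract behind frame 1 above. GeneratedBase
// stands in for com.google.protobuf.GeneratedMessage.
abstract class GeneratedBase {
    // Default implementation mirrors the newer base class: it throws
    // unless the generated subclass provides an override.
    public Object getUnknownFields() {
        throw new UnsupportedOperationException(
                "This is supposed to be overridden by subclasses.");
    }
}

// What a matching code generator produces: the override is present.
class NewStyleMessage extends GeneratedBase {
    @Override
    public Object getUnknownFields() {
        return "unknown-fields";
    }
}

// What an older generator produces: no override, so any call falls
// through to the throwing default, exactly as in the stack trace.
class OldStyleMessage extends GeneratedBase {
}
```

Serializing a message built from old-style generated code against a new-style runtime hits the throwing default the moment `getSerializedSize()` consults the unknown fields, which is why the failure surfaces on the first RPC rather than at class-load time.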