org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: Error in configuring object

Cask Community Issue Tracker | Shankar Selvam | 1 year ago
  1.

    [CDAP-2991] Explore doesn't work when it launches map-reduce job - Cask Community Issue Tracker

    cask.co | 11 months ago
    org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: Error in configuring object
  2.

    Explore has an issue after the recent updates to twill-snapshot and dependency changes in CDAP.

    {noformat}
    2015-07-10 19:37:35,384 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:431)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
        ... 9 more
    Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
        ... 14 more
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
        ... 17 more
    Caused by: java.lang.RuntimeException: Map operator initialization failed
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:154)
        ... 22 more
    Caused by: java.lang.NoSuchMethodError: org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.addListener(Lcom/google/common/util/concurrent/Service$Listener;Ljava/util/concurrent/Executor;)V
        at org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:403)
        at org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:392)
        at org.apache.twill.internal.zookeeper.DefaultZKClientService.<init>(DefaultZKClientService.java:98)
        at org.apache.twill.zookeeper.ZKClientService$Builder.build(ZKClientService.java:101)
        at co.cask.cdap.common.guice.ZKClientModule.provideZKClientService(ZKClientModule.java:53)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104)
        at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
        at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
        at com.google.inject.Scopes$1$1.get(Scopes.java:65)
        at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
        at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
        at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
        at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
        at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
        at co.cask.cdap.hive.context.ContextManager.createContext(ContextManager.java:149)
        at co.cask.cdap.hive.context.ContextManager.getContext(ContextManager.java:85)
        at co.cask.cdap.hive.stream.StreamSerDe.initialize(StreamSerDe.java:88)
        at org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:307)
        at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:353)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:123)
        ... 22 more
    {noformat}

    In org.apache.twill.internal.zookeeper.DefaultZKClientService we use com.google.common.util.concurrent.AbstractService#addListener, which isn't present in older Guava versions. The explore-query container's classpath has job.jar before the jar files in the container directory, and since job.jar bundles guava-11, that copy is loaded and we get the NoSuchMethodError. However, the Guava version was not updated between twill-0.5.0 and twill-0.6.0-SNAPSHOT; the new changes just started exposing the issue. (A small classpath probe sketch follows this entry.)

    Cask Community Issue Tracker | 1 year ago | Shankar Selvam
    org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: Error in configuring object
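
    The classpath-ordering explanation above can be checked directly inside the failing container. The probe below is a hypothetical sketch, not part of CDAP, Twill, or Hadoop (the class name GuavaClasspathCheck and its messages are invented for illustration); it uses only standard JDK reflection to print which jar supplies Guava's Service class and whether the addListener(Service.Listener, Executor) overload mentioned in the issue is actually visible.

    {code}
    import java.lang.reflect.Method;
    import java.security.CodeSource;
    import java.util.concurrent.Executor;

    // Hypothetical diagnostic -- run it with the same classpath ordering as the
    // explore-query container (job.jar first) to see which Guava wins.
    public class GuavaClasspathCheck {
      public static void main(String[] args) throws Exception {
        // Which jar supplies Guava's Service type to this JVM?
        Class<?> service = Class.forName("com.google.common.util.concurrent.Service");
        CodeSource src = service.getProtectionDomain().getCodeSource();
        System.out.println("Guava Service loaded from: "
            + (src == null ? "<unknown location>" : src.getLocation()));

        // Does that Guava expose the addListener(Service.Listener, Executor)
        // overload the issue says Twill relies on? guava-11 does not.
        try {
          Class<?> listener = Class.forName("com.google.common.util.concurrent.Service$Listener");
          Method m = service.getMethod("addListener", listener, Executor.class);
          System.out.println("addListener(Service.Listener, Executor) is available: " + m);
        } catch (ClassNotFoundException e) {
          System.out.println("Service.Listener not found -> an older Guava (e.g. guava-11) is first on the classpath");
        } catch (NoSuchMethodException e) {
          System.out.println("addListener(Service.Listener, Executor) is missing -> an older Guava is first on the classpath");
        }
      }
    }
    {code}

    Under the assumptions of the issue above, running this with job.jar ahead of the container jars should report the overload as missing, while running it with a newer Guava first should find it; it is a diagnostic sketch only, not a fix.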
  3.

    NoSuchMethodError when running on Hadoop but not when run locally

    Stack Overflow | 1 year ago | tom
    java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
  4.

    Hi, I'm trying out connecting to Cassandra and reading / writing data. I'm able to connect (e.g. create an RDD pointing to a Cassandra table), but when I retrieve the data it fails. I've created a fat jar using this in my sbt:

    {code}
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-hive" % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0-M1"
    )

    // META-INF discarding
    mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
      {
        case PathList("META-INF", xs @ _*) => MergeStrategy.discard
        case x => MergeStrategy.first
      }
    }
    {code}

    When I launch a spark-shell session like this, I am able to connect to a table and count rows:

    {code}
    /opt/spark/current/bin/spark-shell --master local[2] --conf "spark.cassandra.connection.host=[cassandra-host]" --conf "spark.cassandra.auth.username=[my username]" --conf "spark.cassandra.auth.password=[my pwd]" --jars fat-jar-assembly-1.0.jar
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
          /_/

    Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
    Type in expressions to have them evaluated.
    Type :help for more information.
    Spark context available as sc.
    SQL context available as sqlContext.

    scala> import com.datastax.spark.connector._
    import com.datastax.spark.connector._

    scala> val personRDD = sc.cassandraTable("test", "person");
    personRDD: com.datastax.spark.connector.rdd.CassandraTableScanRDD[com.datastax.spark.connector.CassandraRow] = CassandraTableScanRDD[0] at RDD at CassandraRDD.scala:15

    scala> println(personRDD.count)
    16/04/15 12:43:41 WARN ReplicationStrategy$NetworkTopologyStrategy: Error while computing token map for keyspace test with datacenter ***: could not achieve replication factor 2 (found 0 replicas only), check your keyspace replication settings.
    2
    {code}

    When I launch it without the --master local[2], then it doesn't work:

    {code}
    /opt/spark/current/bin/spark-shell --conf "spark.cassandra.connection.host=[cassandra-host]" --conf "spark.cassandra.auth.username=[my username]" --conf "spark.cassandra.auth.password=[my pwd]" --jars fat-jar-assembly-1.0.jar
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
          /_/

    Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
    Type in expressions to have them evaluated.
    Type :help for more information.
    spark.driver.cores is set but does not apply in client mode.
    Spark context available as sc.
    SQL context available as sqlContext.

    scala> import com.datastax.spark.connector._
    import com.datastax.spark.connector._

    scala> val message = sc.cassandraTable("test", "person");
    message: com.datastax.spark.connector.rdd.CassandraTableScanRDD[com.datastax.spark.connector.CassandraRow] = CassandraTableScanRDD[0] at RDD at CassandraRDD.scala:15

    scala> println(message.count)
    16/04/14 14:16:04 WARN ReplicationStrategy$NetworkTopologyStrategy: Error while computing token map for keyspace test with datacenter ****: could not achieve replication factor 2 (found 0 replicas only), check your keyspace replication settings.
    [Stage 0:> (0 + 2) / 2]16/04/14 14:16:09 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, [spark-node-on-yarn]): java.io.IOException: Failed to open native connection to Cassandra at {**.**.246, **.**.10}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:218)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
        at com.datastax.driver.core.Connection.initAsync(Connection.java:177)
        at com.datastax.driver.core.Connection$Factory.open(Connection.java:731)
        at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:251)
        at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:199)
        at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:77)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1414)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:393)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
        ... 14 more
    16/04/14 14:16:21 WARN TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4, [spark-node-on-yarn]): java.io.IOException: Failed to open native connection to Cassandra at {**.**.246, **.**.10}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:218)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
        at com.datastax.driver.core.Connection.initAsync(Connection.java:177)
        at com.datastax.driver.core.Connection$Factory.open(Connection.java:731)
        at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:251)
        at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:199)
        at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:77)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1414)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:393)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
        ... 14 more
    {code}

    But that "hack" (--master local[2]) makes Spark run only on the local node. It isn't distributed anymore, so it won't work with any real data. (A classpath probe sketch follows this entry.)

    DataStax JIRA | 8 months ago | Ben Teeuwen
    java.io.IOException: Failed to open native connection to Cassandra at {**.**.246, **.**.10}:9042
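
    The driver-works-but-executors-fail pattern above is consistent with more than one copy of Guava being visible on the YARN executors, with an older copy ahead of the fat jar's. The probe below is a hypothetical sketch, not part of Spark or the Cassandra connector (the class name GuavaCopiesProbe and its output are invented); it uses only ClassLoader.getResources from the JDK to list every copy of Guava's Futures class in the order this JVM sees them.

    {code}
    import java.net.URL;
    import java.util.Enumeration;

    // Hypothetical diagnostic: enumerate every jar that contains Guava's Futures
    // class, in the order the class loader consults them (the first one typically
    // wins, so that is the copy whose Futures.withFallback must exist).
    public class GuavaCopiesProbe {
      public static void main(String[] args) throws Exception {
        Enumeration<URL> copies = GuavaCopiesProbe.class.getClassLoader()
            .getResources("com/google/common/util/concurrent/Futures.class");
        int n = 0;
        while (copies.hasMoreElements()) {
          System.out.println(++n + ". " + copies.nextElement());
        }
        if (n == 0) {
          System.out.println("No Guava Futures class found on the classpath");
        }
      }
    }
    {code}

    If a Hadoop- or Spark-provided Guava shows up on the executors before the assembly's copy, the NoSuchMethodError on Futures.withFallback is the expected symptom; common (but not guaranteed) remedies for this class of conflict are relocating/shading Guava inside the assembly or Spark's spark.executor.userClassPathFirst setting.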


    Root Cause Analysis

    1. java.lang.NoSuchMethodError

      org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.addListener(Lcom/google/common/util/concurrent/Service$Listener;Ljava/util/concurrent/Executor;)V

      at org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>()
    2. org.apache.twill
      ZKClientService$Builder.build
      1. org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:403)
      2. org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:392)
      3. org.apache.twill.internal.zookeeper.DefaultZKClientService.<init>(DefaultZKClientService.java:98)
      4. org.apache.twill.zookeeper.ZKClientService$Builder.build(ZKClientService.java:101)
      4 frames
    3. co.cask.cdap
      ZKClientModule.provideZKClientService
      1. co.cask.cdap.common.guice.ZKClientModule.provideZKClientService(ZKClientModule.java:53)
      1 frame
    4. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    5. Google Guice - Core Library
      InjectorImpl.getInstance
      1. com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104)
      2. com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
      3. com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
      4. com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
      5. com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
      6. com.google.inject.Scopes$1$1.get(Scopes.java:65)
      7. com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
      8. com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
      9. com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
      10. com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
      11. com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
      11 frames
    6. co.cask.cdap
      StreamSerDe.initialize
      1. co.cask.cdap.hive.context.ContextManager.createContext(ContextManager.java:149)
      2. co.cask.cdap.hive.context.ContextManager.getContext(ContextManager.java:85)
      3. co.cask.cdap.hive.stream.StreamSerDe.initialize(StreamSerDe.java:88)
      3 frames
    7. Hive Query Language
      ExecMapper.configure
      1. org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:307)
      2. org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:353)
      3. org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:123)
      3 frames
    8. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    9. Hadoop
      ReflectionUtils.newInstance
      1. org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
      2. org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
      3. org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
      3 frames
    10. Hadoop
      MapRunner.configure
      1. org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
      1 frame
    11. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    12. Hadoop
      ReflectionUtils.newInstance
      1. org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
      2. org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
      3. org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
      3 frames
    13. Hadoop
      YarnChild$2.run
      1. org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:431)
      2. org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
      3. org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
      3 frames
    14. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    15. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566)
      1 frame
    16. Hadoop
      YarnChild.main
      1. org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
      1 frame