hadoop.mapred.TaskAttemptListenerImpl

Diagnostics report from attempt_1433489536106_0003_m_000010_0: Error: cascading.pipe.OperatorException: [com.snowplowanalytics....][com.twitter.scalding.RichPipe.each(RichPipe.scala:471)] operator Each failed executing operation
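
The Each operator named in the message is what Scalding's RichPipe builds for a map/flatMap call, so the failure means user code running inside such a function threw. As a rough sketch of a job with that shape (the class name, fields, and logic here are illustrative, not the failing Snowplow job):

    import com.twitter.scalding._

    // Hypothetical job shape: a flatMap on a pipe compiles down to a
    // Cascading Each operator. An exception thrown inside the function
    // body surfaces as cascading.pipe.OperatorException with the message
    // "operator Each failed executing operation".
    class EnrichJobSketch(args: Args) extends Job(args) {
      TextLine(args("input"))
        .flatMap('line -> 'json) { line: String =>
          // User code runs here; anything thrown here is wrapped by Cascading.
          if (line == null) Nil else List(line.trim)
        }
        .write(Tsv(args("output")))
    }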


Solutions on the web (3)

  • via GitHub by neekipatel
  • via Unknown by Neeki Patel

Stack trace

hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1433489536106_0003_m_000010_0: Error: cascading.pipe.OperatorException: [com.snowplowanalytics....][com.twitter.scalding.RichPipe.each(RichPipe.scala:471)] operator Each failed executing operation
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:107)
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:39)
    at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:80)
    at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:145)
    at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:133)
    at com.twitter.scalding.FlatMapFunction$$anonfun$operate$2.apply(Operations.scala:48)
    at com.twitter.scalding.FlatMapFunction$$anonfun$operate$2.apply(Operations.scala:46)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at com.twitter.scalding.FlatMapFunction.operate(Operations.scala:46)
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:99)
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:39)
    at cascading.flow.stream.SourceStage.map(SourceStage.java:102)
    at cascading.flow.stream.SourceStage.run(SourceStage.java:58)
    at cascading.flow.hadoop.FlowMapper.run(FlowMapper.java:130)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:170)
Caused by: java.lang.NullPointerException
    at com.snowplowanalytics.snowplow.enrich.common.utils.JsonUtils$.stripInstanceEtc(JsonUtils.scala:240)
    at com.snowplowanalytics.snowplow.enrich.common.utils.JsonUtils$.extractJson(JsonUtils.scala:204)
    at com.snowplowanalytics.snowplow.enrich.common.utils.JsonUtils$.validateAndReformatJson(JsonUtils.scala:189)
    at com.snowplowanalytics.snowplow.enrich.common.utils.JsonUtils$$anonfun$1.apply(JsonUtils.scala:59)
    at com.snowplowanalytics.snowplow.enrich.common.utils.JsonUtils$$anonfun$1.apply(JsonUtils.scala:58)
    at com.snowplowanalytics.snowplow.enrich.common.enrichments.EnrichmentManager$$anonfun$4.apply(EnrichmentManager.scala:103)
    at com.snowplowanalytics.snowplow.enrich.common.enrichments.EnrichmentManager$$anonfun$4.apply(EnrichmentManager.scala:103)
    at com.snowplowanalytics.snowplow.enrich.common.utils.MapTransformer$$anonfun$1.apply(MapTransformer.scala:158)
    at com.snowplowanalytics.snowplow.enrich.common.utils.MapTransformer$$anonfun$1.apply(MapTransformer.scala:155)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:224)
    at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:403)
    at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:403)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
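
The root cause at the bottom of the trace is a NullPointerException in JsonUtils.stripInstanceEtc, reached via validateAndReformatJson: a null arrived where a JSON string was expected. A minimal sketch of the defensive pattern that prevents this, wrapping the possibly-null field in an Option before any string handling (SafeJson and its method are hypothetical, not the actual Snowplow source):

    object SafeJson {
      // Option(raw) turns a null reference into None, so no string
      // method is ever invoked on a null value.
      def validateAndReformat(field: String, raw: String): Either[String, String] =
        Option(raw) match {
          case None       => Left(s"Field [$field] is null, expected a JSON string")
          case Some(json) => Right(json.trim) // placeholder for real validation
        }
    }

    // Usage: a null value yields a Left instead of an NPE, e.g.
    // SafeJson.validateAndReformat("ue_properties", null)
    //   => Left("Field [ue_properties] is null, expected a JSON string")

Returning Either keeps a bad row as a reportable failure instead of killing the whole map task with an OperatorException.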
