java.lang.Exception: com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]

Google Groups | Tausif Shaikh | 8 months ago
  1. Error 500 java.lang.NullPointerException
     Google Groups | 8 months ago | Tausif Shaikh
     java.lang.Exception: com.metamx.common.RE: Failure on the same row shown above.

  2. Loading CSV File: Unparseable timestamp found
     Google Groups | 1 month ago | Unknown author
     com.metamx.common.RE: Failure on row[2017-01-03,"S2SSOFTSYS","S2SSOFTSYS","pub_Sites","320x50_mobile","","Direct/Tag Connection","Display",1.000000,0.000000,0.000000,0.000000,0.000000]

  3. Configure druid to parse json files with nested structures - failing
     Google Groups | 9 months ago | Scott Kinney
     java.lang.Exception: com.metamx.common.RE: Failure on row[{"arr": [{"data": [{"f": 60, "i": [-1, -1, -1], "delta_t": 1, "kw": [68.948, 79.242, 67.05], "t": "2015-07-28T15:19:18.769", "v": [-1, -1, -1], "orig_t": "2015-07-28T15:19:18.769"}], "id": "pgx.hq.stem-8e-71-6b.vmonitor.site.telemetry"}], "ver": "1.0"}]

Root Cause Analysis

com.metamx.common.parsers.ParseException: Unparseable timestamp found!
    at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:72) [druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.data.input.impl.StringInputRowParser.parseMap(StringInputRowParser.java:136) [druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131) [druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:98) [druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:69) [druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:283) [druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) [hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340) [hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243) [hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_91]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_91]
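
The failing row's CREATED_AT value, "2016-06-07 08:34:33", is a space-separated timestamp, and the root cause above is Druid's parser reporting "Unparseable timestamp found!". Below is a minimal sketch in plain java.time (not Druid's own parser) of why such a value fails an ISO-8601-style parse, which is what Druid's default "auto" timestamp handling expects, and how an explicit pattern matching the data succeeds. The timestampSpec remedy in the comments is an inference from this trace, not something confirmed in the thread.

    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;
    import java.time.format.DateTimeParseException;

    public class TimestampCheck {
        public static void main(String[] args) {
            // CREATED_AT value copied from the failing row above.
            String createdAt = "2016-06-07 08:34:33";

            // An ISO-8601 local date-time parser expects "2016-06-07T08:34:33"
            // (with a 'T' separator), so the space-separated value is rejected,
            // analogous to the "Unparseable timestamp found!" seen in the trace.
            try {
                LocalDateTime.parse(createdAt, DateTimeFormatter.ISO_LOCAL_DATE_TIME);
            } catch (DateTimeParseException e) {
                System.out.println("ISO-style parse fails: " + e.getMessage());
            }

            // An explicit pattern that matches the data parses cleanly. The
            // analogous (assumed) fix on the Druid side is a timestampSpec of
            // {"column": "CREATED_AT", "format": "yyyy-MM-dd HH:mm:ss"}
            // instead of relying on the default "auto" format.
            DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
            System.out.println("Explicit pattern parses to: " + LocalDateTime.parse(createdAt, fmt));
        }
    }

It is also worth confirming which column the ingestion spec's timestampSpec actually names before changing the format, since pointing it at a column that is null or absent in some rows can surface as the same parse exception.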