java.lang.Exception

com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]
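The stack trace at the bottom of this page shows the real failure: `com.metamx.common.parsers.ParseException: Unparseable timestamp found!`. The `CREATED_AT` value in the failing row, `"2016-06-07 08:34:33"`, separates date and time with a space, which strict ISO 8601 parsing (roughly what Druid's default `auto` timestamp format expects) rejects. A quick illustration of the format mismatch in Python — this is not Druid's actual parser (Druid uses Joda-Time), just a sketch:

```python
from datetime import datetime

row_ts = "2016-06-07 08:34:33"  # CREATED_AT from the failing row

# A strict ISO 8601 pattern ('T' between date and time) rejects the value:
try:
    datetime.strptime(row_ts, "%Y-%m-%dT%H:%M:%S")
    parsed_as_iso = True
except ValueError:
    parsed_as_iso = False

# An explicit pattern that matches the data parses it fine:
parsed = datetime.strptime(row_ts, "%Y-%m-%d %H:%M:%S")
print(parsed_as_iso, parsed)
```

The same mismatch is what makes Druid's mapper throw for every row in this batch: the data is fine, but the declared (or defaulted) timestamp format does not match it.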


Stack trace

java.lang.Exception: com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)[hadoop-mapreduce-client-common-2.3.0.jar:?]
Caused by: com.metamx.common.RE: Failure on row[{"HTTP_USER_AGENT": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36", "PORTFOLIO_ID": null, "NAME": "no_op", "POPUP_ID": 5, "REMOTE_ADDR": "122.15.120.178", "COUNTRY": "IN", "CREATED_AT": "2016-06-07 08:34:33", "FRAMEWORK_ID": null, "DOMAIN_NAME": "unknown", "TEMPLATE_ID": null, "TOKEN": null, "BUCKET_ID": null, "EMAIL": null, "ID": 1}]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:88)[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:283)[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)[?:1.8.0_91]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)[?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[?:1.8.0_91]
    at java.lang.Thread.run(Thread.java:745)[?:1.8.0_91]
Caused by: com.metamx.common.parsers.ParseException: Unparseable timestamp found!
    at io.druid.data.input.impl.MapInputRowParser.parse(MapInputRowParser.java:72)[druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.data.input.impl.StringInputRowParser.parseMap(StringInputRowParser.java:136)[druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.data.input.impl.StringInputRowParser.parse(StringInputRowParser.java:131)[druid-api-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.HadoopDruidIndexerMapper.parseInputRow(HadoopDruidIndexerMapper.java:98)[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.HadoopDruidIndexerMapper.map(HadoopDruidIndexerMapper.java:69)[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at io.druid.indexer.DetermineHashedPartitionsJob$DetermineCardinalityMapper.run(DetermineHashedPartitionsJob.java:283)[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)[hadoop-mapreduce-client-core-2.3.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)[hadoop-mapreduce-client-common-2.3.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)[?:1.8.0_91]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)[?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[?:1.8.0_91]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[?:1.8.0_91]
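Given the root cause in the innermost `Caused by` (`Unparseable timestamp found!`), the usual fix is to declare the actual timestamp format in the ingestion spec's `timestampSpec`. A minimal sketch of the relevant `parseSpec` fragment — the column name `CREATED_AT` comes from the failing row; the surrounding keys follow the Druid 0.9.x JSON parse-spec layout, and the dimension list here is purely illustrative:

```json
{
  "parseSpec": {
    "format": "json",
    "timestampSpec": {
      "column": "CREATED_AT",
      "format": "yyyy-MM-dd HH:mm:ss"
    },
    "dimensionsSpec": {
      "dimensions": ["COUNTRY", "DOMAIN_NAME", "REMOTE_ADDR"]
    }
  }
}
```

`yyyy-MM-dd HH:mm:ss` is a Joda-Time pattern matching `2016-06-07 08:34:33`; with an explicit format in place, the mapper should parse this row instead of failing. Also double-check that the `column` name matches the field in the input data exactly — pointing `timestampSpec` at a missing or differently named column produces the same error.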
