org.elasticsearch.index.mapper.MapperParsingException

  • Hello,

    When I send a JSON log message like the one below through SQS:

        {
          "Application": "APPFRAME",
          "Closed": true,
          "Committed": true,
          "Date": "2013-12-16T14:24:00.0562958+02:00",
          ..........
        }

    Logstash writes errors to the logstash.log file in an infinite loop. It reports either:

        {:timestamp=>"2013-12-13T12:00:06.788000+0000", :message=>"Failed to index an event, will retry",
        :exception=>org.elasticsearch.action.UnavailableShardsException: [logstash-2013.12.13][1] [2] shardIt, [0] active : Timeout waiting for [1m],
        request: index {[logstash-2013.12.13][elasticsearch][bVY-SJYXTQeB7Gc1ZrxrIA],
        source[{"@source":"file://ip-10-10-100-128//vol/elasticsearch/log/logstash.log","@tags":[],"@fields":{},
        "@timestamp":"2013-12-13T11:59:06.728Z","@source_host":"ip-10-10-100-128",
        "@source_path":"//vol/elasticsearch/log/logstash.log",....

    or:

        [2013-12-16 11:43:21,009][DEBUG][action.index ] [Kaluu] [logstash-2013.12.16][3], node[-fVBxnAPRq2g9DJzyYssAQ], [P], s[STARTED]: Failed to execute [index {[logsta$
        org.elasticsearch.index.mapper.MapperParsingException: Failed to parse [@fields.Application]
            at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:320)
            at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:587)
            at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:459)
            at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:507)
            at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:449)
            at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:486)
            at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:430)
            at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:297)
            at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:211)
            at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperation$
            at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java$
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
            at java.lang.Thread.run(Thread.java:744)
        Caused by: java.lang.NumberFormatException: For input string: "XXXXXXXXX"
            at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
            at java.lang.Long.parseLong(Long.java:441)
            at java.lang.Long.parseLong(Long.java:483)
            at org.elasticsearch.common.xcontent.support.AbstractXContentParser.longValue(AbstractXContentParser.java:72)
            at org.elasticsearch.index.mapper.core.LongFieldMapper.innerParseCreateField(LongFieldMapper.java:284)
            at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:182)
            at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:307)
            ... 13 more

    The log file can grow to gigabytes in size if I don't terminate Logstash. When I change the Application field to an integer, everything works. Sending the same message via Redis also works fine. (A minimal reproduction of the mapping conflict behind this error is sketched after the stack trace below.)

    Thanks,
    by serdar özay
  • Issue in filtering
    via GitHub by mlecoq
  • Kibana removing data from _source
    via GitHub by garthk
  • How to remove only the nested field in an event?
    by Ehtesh Choudhury
    • org.elasticsearch.index.mapper.MapperParsingException: Failed to parse [@fields.Application]
          at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:320)
          at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:587)
          at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:459)
          at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:507)
          at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:449)
          at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:486)
          at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:430)
          at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:297)
          at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:211)
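
    The inner NumberFormatException thrown from LongFieldMapper in this trace points at an Elasticsearch dynamic-mapping conflict: the first event indexed into the day's logstash index evidently carried a numeric value in @fields.Application, so the field was mapped as long, and every later event with a string value such as "APPFRAME" fails to parse. That also explains why the reporter's workaround of sending the field as an integer works. A minimal reproduction sketch against a local node; the index name maptest and type name event are hypothetical:

        # First document: a numeric value makes dynamic mapping type the field as long.
        curl -XPUT 'http://localhost:9200/maptest/event/1' -d '{"Application": 42}'

        # Second document: a string in the same field now fails with
        # MapperParsingException caused by java.lang.NumberFormatException.
        curl -XPUT 'http://localhost:9200/maptest/event/2' -d '{"Application": "APPFRAME"}'

        # Inspect the type that dynamic mapping chose for the field.
        curl -XGET 'http://localhost:9200/maptest/_mapping?pretty'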

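    One way out, sketched under the assumption of a 0.90-era Elasticsearch with daily logstash-* indices: install an index template that pins @fields.Application to string, so that newly created daily indices stop dynamically mapping it as long (already-created indices keep their mapping and would need to be dropped or reindexed). The template name logstash_application_string is illustrative:

        curl -XPUT 'http://localhost:9200/_template/logstash_application_string' -d '{
          "template": "logstash-*",
          "mappings": {
            "_default_": {
              "properties": {
                "@fields": {
                  "properties": {
                    "Application": { "type": "string" }
                  }
                }
              }
            }
          }
        }'

    Alternatively, the conflict can be avoided at the source by coercing the field to a string inside Logstash itself, e.g. with the mutate filter's convert option, before the event ever reaches Elasticsearch.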