java.lang.RuntimeException

There are no Samebug tips for this exception yet. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • GitHub comment 832#242316167 (via GitHub, by anandk89)
  • mongo-hadoop hive insert error (by Josh Morris)
  • Can't integrate Elasticsearch with Hive (by Atul Paldhikar)
  • RE: Can't integrate Elasticsearch with Hive (author unknown)
  • GitHub comment 41#86869182 (via GitHub, by ManikandanV)
  • I'm working with a prospective customer (Equinix) who encountered an issue with the latest Hadoop connector when running MongoDB 2.8 against the latest Hive distribution. The issue appears to be specific to Hive code that generates a malformed job.xml, but it seems to be triggered by our connector when data is inserted into the Hive table and written back to MongoDB (the reverse direction does not appear to cause a problem). It looks similar to the issue described in the forums here: https://groups.google.com/forum/#!topic/mongodb-user/lKbha0SzMP8

    I have included the customer's repro steps and a detailed error log below. I'm also working on getting connected with our Hortonworks partners for their assistance; nonetheless, I think this is something we should track, since the issue seems to be triggered by basic functionality. Let me know if you would like direct contact with the Equinix engineers to discuss the issue.

    REPRO STEPS as described by the client (insert data via eqx_eg_accounts_view to test this):

        CREATE EXTERNAL TABLE accounts_external (
            account_id STRING,
            account_name STRING
        )
        STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
        WITH SERDEPROPERTIES('mongo.columns.mapping'='{"account_id":"account_id", "account_name":"account_name"}')
        TBLPROPERTIES('mongo.uri'='mongodb://sv2lxgsed01:27017/data-terminal.accounts');

        INSERT OVERWRITE TABLE accounts_external
        SELECT a.account_id, a.account_name
        FROM eqx_eg_accounts_view a
        WHERE a.account_id = 10000;

    If we insert data into the Mongo collection ACCOUNTS and SELECT it in Hive, we can see the data; but inserting into this external table from Hive as above throws an error:

    Error during job, obtaining debugging information... 
    FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

    Select account_id, account_name from accounts_external;

    hive> insert into table individuals values(123,"test1",30);
    Query ID = hive_20141215235858_01f9d535-307e-4413-ad46-b432b6d3316f
    Total jobs = 1
    Launching Job 1 out of 1
    Number of reduce tasks is set to 0 since there's no reduce operator
    Starting Job = job_1418635895251_0095, Tracking URL = http://sv2lxgsed02.corp.equinix.com:8088/proxy/application_1418635895251_0095/
    Kill Command = /usr/hdp/2.2.0.0-2041/hadoop/bin/hadoop job -kill job_1418635895251_0095
    Hadoop job information for Stage-0: number of mappers: 0; number of reducers: 0
    2014-12-15 23:58:48,006 Stage-0 map = 0%, reduce = 0%
    Ended Job = job_1418635895251_0095 with errors
    Error during job, obtaining debugging information...
    FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    MapReduce Jobs Launched:
    Stage-Stage-0: HDFS Read: 0 HDFS Write: 0 FAIL
    Total MapReduce CPU Time Spent: 0 msec

    Error Log:

    Log Type: stderr
    Log Upload Time: 15-Dec-2014 23:58:48
    Log Length: 325
    [Fatal Error] job.xml:832:51: Character reference "&#0" is an invalid XML character.
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
    Log Type: stdout
    Log Upload Time: 15-Dec-2014 23:58:48
    Log Length: 0

    Log Type: syslog
    Log Upload Time: 15-Dec-2014 23:58:48
    Log Length: 2868
    2014-12-15 23:58:44,933 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1418635895251_0095_000001
    2014-12-15 23:58:45,283 FATAL [main] org.apache.hadoop.conf.Configuration: error parsing conf job.xml
    org.xml.sax.SAXParseException; systemId: file:///opt/hadoop/yarn/local/usercache/hive/appcache/application_1418635895251_0095/container_1418635895251_0095_01_000001/job.xml; lineNumber: 832; columnNumber: 51; Character reference "&#0" is an invalid XML character.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2354)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2423)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2376)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2283)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1110)
        at org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:51)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1421)
    2014-12-15 23:58:45,285 FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
    java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:///opt/hadoop/yarn/local/usercache/hive/appcache/application_1418635895251_0095/container_1418635895251_0095_01_000001/job.xml; lineNumber: 832; columnNumber: 51; Character reference "&#0" is an invalid XML character.
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2519)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2376)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2283)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1110)
        at org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:51)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1421)
    Caused by: org.xml.sax.SAXParseException; systemId: file:///opt/hadoop/yarn/local/usercache/hive/appcache/application_1418635895251_0095/container_1418635895251_0095_01_000001/job.xml; lineNumber: 832; columnNumber: 51; Character reference "&#0" is an invalid XML character.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2354)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2423)
        ... 5 more
    2014-12-15 23:58:45,288 INFO [main] org.apache.hadoop.util.ExitUtil: Exiting with status 1
    (by Dylan Tong)
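The root failure above is an XML 1.0 constraint: the NUL character is not a legal XML character, not even as the numeric reference &#0;, so a job.xml property value that serializes a \u0000 cannot be parsed at all. A minimal sketch reproducing the same Xerces error outside Hadoop (the class name and the field.delim property here are illustrative, not taken from the actual job.xml):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class NulCharRefDemo {
    // Parse a job.xml-style document whose <value> holds the character
    // reference &#0; and return the parser's error message, or null if
    // parsing unexpectedly succeeds.
    static String tryParse() {
        String xml = "<?xml version=\"1.0\"?><configuration>"
                + "<property><name>field.delim</name><value>&#0;</value></property>"
                + "</configuration>";
        try {
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            return null;
        } catch (Exception e) {
            // Xerces reports: Character reference "&#0" is an invalid XML character.
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryParse());
    }
}
```

This suggests the trigger is some Hadoop/Hive property whose value contains a control character (Hive's default field delimiter is \u0001, and some serde defaults involve \u0000); inspecting line 832 of the generated job.xml would confirm which property is responsible in this case.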
  • Bulk dataload:

        hadoop jar /opt/cloudera/parcels/CLABS_PHOENIX-4.3.0-1.clabs_phoenix1.0.0.p0.78/lib/phoenix/phoenix-4.3.0-clabs-phoenix-1.0.0-client.jar \
            org.apache.phoenix.mapreduce.CsvBulkLoadTool \
            --table test_phoenix_import \
            --zookeeper n1,n2,n3 \
            --delimiter \001 \
            --input /user/wym/test_phoenix_import

    The path '/user/wym/test_phoenix_import' is a directory containing Hive data, so the default delimiter is \001.

    Exception:

        15/09/08 15:22:17 INFO zookeeper.ClientCnxn: EventThread shut down
        15/09/08 15:22:17 INFO zookeeper.ZooKeeper: Session: 0x34f1c9be8ba5667 closed
        Exception in thread "main" java.lang.IllegalArgumentException: Illegal delimiter character: 001
            at org.apache.phoenix.mapreduce.CsvBulkLoadTool.configureOptions(CsvBulkLoadTool.java:327)
            at org.apache.phoenix.mapreduce.CsvBulkLoadTool.loadData(CsvBulkLoadTool.java:201)
            at org.apache.phoenix.mapreduce.CsvBulkLoadTool.run(CsvBulkLoadTool.java:186)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
            at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:97)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
        15/09/08 15:22:18 INFO client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x34f1c9be8ba5665
        15/09/08 15:22:18 INFO zookeeper.ClientCnxn: EventThread shut down
        15/09/08 15:22:18 INFO zookeeper.ZooKeeper: Session: 0x34f1c9be8ba5665 closed
    (by Yuming Wang)
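A plausible explanation for the "Illegal delimiter character: 001" failure (an assumption, not confirmed in the thread): the shell delivered the four literal characters backslash, 0, 0, 1 to the JVM, while CsvBulkLoadTool requires a single-character delimiter. In bash, ANSI-C quoting produces the actual Ctrl-A byte:

```shell
# Ordinary quoting passes four characters: \ 0 0 1
printf '%s' '\001'  | wc -c
# ANSI-C quoting ($'...') expands \001 to the single Ctrl-A byte
printf '%s' $'\001' | wc -c
```

If that is the cause, invoking the tool with `--delimiter $'\001'` (bash-specific quoting, untested against this Phoenix build) should pass a one-character delimiter.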
    • java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:///mnt/resource/yarnm/usercache/root/appcache/application_1472110818465_0003/container_1472110818465_0003_01_000001/job.xml; lineNumber: 885; columnNumber: 51; Character reference "&#0" is an invalid XML character.
          at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2656)
          at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2513)
          at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
          at org.apache.hadoop.conf.Configuration.get(Configuration.java:1233)
          at org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:51)
          at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1465)
