Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via Apache's JIRA Issue Tracker by GitHub Import, 1 year ago
via Stack Overflow by Unknown author, 2 years ago
setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
via Google Groups by Unknown author, 1 year ago
This parser does not support specification "null" version "null"
via Apache's JIRA Issue Tracker by Markus Jelsma, 1 year ago
This parser does not support specification "null" version "null"
java.lang.UnsupportedOperationException: This parser does not support specification "null" version "null"
    at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:590)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1143)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1119)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1063)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:416)
    at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:521)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:140)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:205)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:226)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:33)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:82)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:210)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:100)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
    at net.hydromatic.optiq.impl.spark.SparkHandlerImpl.<init>(SparkHandlerImpl.java:48)
    at net.hydromatic.optiq.impl.spark.SparkHandlerImpl.instance(SparkHandlerImpl.java:79)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at net.hydromatic.optiq.jdbc.OptiqPrepare$Dummy.createHandler(OptiqPrepare.java:124)
    at net.hydromatic.optiq.jdbc.OptiqPrepare$Dummy.getSparkHandler(OptiqPrepare.java:114)
    at net.hydromatic.optiq.jdbc.OptiqConnectionImpl$ContextImpl.spark(OptiqConnectionImpl.java:395)
    at net.hydromatic.optiq.prepare.OptiqPrepareImpl.createPlanner(OptiqPrepareImpl.java:192)
    at net.hydromatic.optiq.prepare.OptiqPrepareImpl$1.apply(OptiqPrepareImpl.java:140)
    at net.hydromatic.optiq.prepare.OptiqPrepareImpl$1.apply(OptiqPrepareImpl.java:138)
    at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepare_(OptiqPrepareImpl.java:234)
    at net.hydromatic.optiq.prepare.OptiqPrepareImpl.prepareSql(OptiqPrepareImpl.java:208)
    at net.hydromatic.optiq.jdbc.OptiqConnectionImpl.parseQuery(OptiqConnectionImpl.java:188)
    at net.hydromatic.optiq.jdbc.MetaImpl.prepare(MetaImpl.java:603)
    at net.hydromatic.avatica.AvaticaStatement.execute(AvaticaStatement.java:68)
    at sqlline.Commands.execute(Commands.java:822)
    at sqlline.Commands.sql(Commands.java:732)
    at sqlline.SqlLine.dispatch(SqlLine.java:808)
    at sqlline.SqlLine.begin(SqlLine.java:681)
    at sqlline.SqlLine.start(SqlLine.java:398)
    at sqlline.SqlLine.main(SqlLine.java:292)
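
The top frames show Hadoop's Configuration.loadResource() calling DocumentBuilderFactory.setXIncludeAware(true). The default implementation of that method in the abstract JAXP base class is what throws this UnsupportedOperationException, and the "null" / "null" in the message comes from missing Specification-Title/Specification-Version entries in the offending jar's manifest. This usually means an old xercesImpl jar on the classpath is shadowing the JDK's built-in parser (note the org.apache.xerces.jaxp.DocumentBuilderFactoryImpl mentioned in the Stack Overflow result above). Below is a minimal diagnostic sketch, not part of the original report, using only standard JAXP calls (the class name XIncludeCheck is made up); it prints which factory implementation the JVM resolves and reproduces the failing call.

import javax.xml.parsers.DocumentBuilderFactory;

public class XIncludeCheck {
    public static void main(String[] args) {
        // Optional workaround (assumes an Oracle/OpenJDK runtime): force the JDK's
        // bundled Xerces instead of a stale xercesImpl jar found on the classpath.
        // System.setProperty("javax.xml.parsers.DocumentBuilderFactory",
        //         "com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl");

        // Same lookup Hadoop performs inside Configuration.loadResource().
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        System.out.println("Resolved factory: " + factory.getClass().getName());

        try {
            // The call that fails at Configuration.java:1143 in the trace above.
            factory.setXIncludeAware(true);
            System.out.println("setXIncludeAware is supported by this implementation");
        } catch (UnsupportedOperationException e) {
            // Prints the same "This parser does not support specification ..." message
            // when the resolved factory does not override setXIncludeAware.
            System.out.println("setXIncludeAware not supported: " + e.getMessage());
        }
    }
}

If the resolved factory turns out to be org.apache.xerces.jaxp.DocumentBuilderFactoryImpl from an old Xerces build, the usual remedies are removing or excluding that jar, upgrading it to a JAXP 1.3+ release, or (as in the commented line) setting the javax.xml.parsers.DocumentBuilderFactory system property so the JDK's bundled implementation is used instead.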