java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:549)
    at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:422)
    at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:403)
    at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:359)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)

cloudera.com | 8 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Datanode shut down when running Hive - Cloudera Community

     cloudera.com | 2 years ago (same stack trace as above)
  2. Solved: Datanode shut down when running Hive - Cloudera Community

     cloudera.com | 8 months ago (same stack trace as above)
  3. Grails 2.0.0-console errors

     GitHub | 5 years ago | bronoman
    java.lang.RuntimeException: It looks like you are missing some calls to the r:layoutResources tag. After rendering your page the following have not been rendered: [defer]
  4. Cannot create blade or bladeset while the app server is running

     GitHub | 3 years ago | thecapdan
     java.lang.RuntimeException: java.nio.file.NoSuchFileException: C:\Users\danielo\Desktop\TESTING\BladeRunnerJS-v0.8-131-gc0618d0-DEV\BladeRunnerJS\apps\ted\bap1-bladeset\src\@appns\@bladeset
  5. Porting project JBoss AS 6 to JBoss AS 7

     Stack Overflow | 5 years ago | Rounak
     java.lang.RuntimeException: com.sun.faces.config.ConfigurationException: CONFIGURATION FAILED! Class org.jboss.as.web.deployment.jsf.JandexAnnotationProvider is not an instance of com.sun.faces.spi.AnnotationProvider
         at com.sun.faces.config.ConfigureListener.contextInitialized(ConfigureListener.java:292) [jsf-impl.jar:]
         at org.apache.catalina.core.StandardContext.contextListenerStart(StandardContext.java:3368) [jbossweb-7.0.1.Final.jar:7.0.2.Final]
         at org.apache.catalina.core.StandardContext.start(StandardContext.java:3821) [jbossweb-7.0.1.Final.jar:7.0.2.Final]
         at org.jboss.as.web.deployment.WebDeploymentService.start(WebDeploymentService.java:70) [jboss-as-web-7.0.2.Final.jar:7.0.2.Final]
         at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1824)
         at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1759)
         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) [:1.7.0]
         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) [:1.7.0]


Root Cause Analysis

  1. java.lang.RuntimeException

    java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space (the exception message embeds the DirectoryScanner stack trace shown at the top of this page)

    at java.util.concurrent.ThreadPoolExecutor.runWorker()
  2. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
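
The trace points to the DataNode JVM running out of heap while DirectoryScanner.getDiskReport builds its in-memory report of on-disk block files, which grows with the number of blocks stored on the node. The usual mitigation discussed for this class of failure is to give the DataNode process more heap and, if needed, to run the directory scanner less often. The snippet below is a minimal sketch of that approach, assuming a plain Apache Hadoop install configured through hadoop-env.sh and hdfs-site.xml; the 4g heap and the interval are illustrative placeholders, not values taken from the threads above, and on a Cloudera Manager cluster the heap would instead be set through the DataNode Java heap size configuration in CM.

    # hadoop-env.sh -- give the DataNode JVM more heap (4g is a placeholder; size it to the node's block count)
    export HADOOP_DATANODE_OPTS="-Xmx4g -Xms4g $HADOOP_DATANODE_OPTS"

    <!-- hdfs-site.xml -- optionally run the directory scanner less frequently (value in seconds; 21600 = 6h is the default) -->
    <property>
      <name>dfs.datanode.directoryscan.interval</name>
      <value>21600</value>
    </property>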