java.lang.OutOfMemoryError


  • We've seen a catalogd crash triggered by loading the metadata for a table with about 20K partitions and 77 columns that has incremental stats. It looks like the serialized message is over 2GB, which is the maximum Java array size. Ideally we should catch this exception and fail the query that needs this table's metadata with an appropriate message.
    {code}
    I1107 06:47:56.641507 30252 jni-util.cc:177] java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:2271)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
        at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:145)
        at org.apache.thrift.protocol.TBinaryProtocol.writeString(TBinaryProtocol.java:187)
        at com.cloudera.impala.thrift.THdfsPartition$THdfsPartitionStandardScheme.write(THdfsPartition.java:1831)
        at com.cloudera.impala.thrift.THdfsPartition$THdfsPartitionStandardScheme.write(THdfsPartition.java:1543)
        at com.cloudera.impala.thrift.THdfsPartition.write(THdfsPartition.java:1389)
        at com.cloudera.impala.thrift.THdfsTable$THdfsTableStandardScheme.write(THdfsTable.java:1123)
        at com.cloudera.impala.thrift.THdfsTable$THdfsTableStandardScheme.write(THdfsTable.java:969)
        at com.cloudera.impala.thrift.THdfsTable.write(THdfsTable.java:848)
        at com.cloudera.impala.thrift.TTable$TTableStandardScheme.write(TTable.java:1628)
        at com.cloudera.impala.thrift.TTable$TTableStandardScheme.write(TTable.java:1395)
        at com.cloudera.impala.thrift.TTable.write(TTable.java:1209)
        at com.cloudera.impala.thrift.TCatalogObject$TCatalogObjectStandardScheme.write(TCatalogObject.java:1241)
        at com.cloudera.impala.thrift.TCatalogObject$TCatalogObjectStandardScheme.write(TCatalogObject.java:1098)
        at com.cloudera.impala.thrift.TCatalogObject.write(TCatalogObject.java:938)
        at com.cloudera.impala.thrift.TGetAllCatalogObjectsResponse$TGetAllCatalogObjectsResponseStandardScheme.write(TGetAllCatalogObjectsResponse.java:487)
        at com.cloudera.impala.thrift.TGetAllCatalogObjectsResponse$TGetAllCatalogObjectsResponseStandardScheme.write(TGetAllCatalogObjectsResponse.java:421)
        at com.cloudera.impala.thrift.TGetAllCatalogObjectsResponse.write(TGetAllCatalogObjectsResponse.java:365)
        at org.apache.thrift.TSerializer.serialize(TSerializer.java:79)
        at com.cloudera.impala.service.JniCatalog.getCatalogObjects(JniCatalog.java:110)
    {code}
    You can identify this issue by looking at the metastore database. Here is how to see the size of the incremental stats for table id 12345. The table with the 624 MB of incremental stats shown below is the one that led to the catalogd crash above.
    {code}
    postgres=# select pg_size_pretty(sum(length("PARTITION_PARAMS"."PARAM_KEY") + length("PARTITION_PARAMS"."PARAM_VALUE")))
               from "PARTITIONS", "PARTITION_PARAMS"
               where "PARTITIONS"."TBL_ID"=12345
                 and "PARTITIONS"."PART_ID" = "PARTITION_PARAMS"."PART_ID"
                 and "PARTITION_PARAMS"."PARAM_KEY" LIKE 'impala_intermediate%';
     pg_size_pretty
    ----------------
     624 MB
    (1 row)
    {code}
    by Silvius Rus
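    A possible mitigation for the catalogd report above, sketched under the assumption that the serialization goes through org.apache.thrift.TSerializer as the stack trace shows: catch the OutOfMemoryError raised when the buffered message would exceed the maximum Java array size and convert it into a TException with a clear message, so only the affected query fails instead of the whole catalogd process. The class and method names (SafeThriftSerializer, serializeOrFail) are illustrative, not Impala's actual code; catching an Error is normally discouraged, but this particular one is thrown by the array allocation before the heap is exhausted.
    {code}
    import org.apache.thrift.TBase;
    import org.apache.thrift.TException;
    import org.apache.thrift.TSerializer;
    import org.apache.thrift.protocol.TBinaryProtocol;

    // Illustrative sketch only -- not the Impala implementation.
    public final class SafeThriftSerializer {

        /** Serializes a Thrift object, converting the array-size OOM into a TException. */
        public static byte[] serializeOrFail(TBase<?, ?> obj, String objectName) throws TException {
            TSerializer serializer = new TSerializer(new TBinaryProtocol.Factory());
            try {
                return serializer.serialize(obj);
            } catch (OutOfMemoryError e) {
                // ByteArrayOutputStream cannot grow past the maximum Java array size
                // (just under Integer.MAX_VALUE bytes), so a >2GB message lands here.
                throw new TException("Serialized metadata for " + objectName
                        + " exceeds the 2GB Java array limit; consider dropping its incremental stats", e);
            }
        }
    }
    {code}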
  • An OutOfMemoryError occurs when responding with large file data through ShallowEtagHeaderFilter. We configured Spring MVC as follows:
    {code}
    <!-- Enables the Spring MVC @Controller programming model -->
    <mvc:annotation-driven/>
    <mvc:default-servlet-handler />
    <!-- Handles HTTP GET requests for /resources/** by efficiently serving up static resources in the ${webappRoot}/resources directory -->
    <!-- mvc:resources mapping="/resources/**" location="/resources/" /-->
    <mvc:resources mapping="/data/**" location="file:${config.uploadDir}\" />
    {code}
    For example, if a client requests the resource http://localhost/data/content/a.zip (1 GB), the following stack trace is produced. Everything is normal when the ETag filter configuration is commented out. I also attached Spring's web.xml and Apache Tomcat's web.xml.
    {code}
    java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Unknown Source)
        at java.io.ByteArrayOutputStream.grow(Unknown Source)
        at java.io.ByteArrayOutputStream.ensureCapacity(Unknown Source)
        at java.io.ByteArrayOutputStream.write(Unknown Source)
        at org.springframework.web.filter.ShallowEtagHeaderFilter$ShallowEtagResponseWrapper$ResponseServletOutputStream.write(ShallowEtagHeaderFilter.java:245)
        at org.springframework.util.StreamUtils.copy(StreamUtils.java:125)
        at org.springframework.util.FileCopyUtils.copy(FileCopyUtils.java:109)
        at org.springframework.web.servlet.resource.ResourceHttpRequestHandler.writeContent(ResourceHttpRequestHandler.java:244)
        at org.springframework.web.servlet.resource.ResourceHttpRequestHandler.handleRequest(ResourceHttpRequestHandler.java:145)
        at org.springframework.web.servlet.mvc.HttpRequestHandlerAdapter.handle(HttpRequestHandlerAdapter.java:49)
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:925)
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:856)
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:936)
        at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:827)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
        at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:812)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.springframework.web.filter.ShallowEtagHeaderFilter.doFilterInternal(ShallowEtagHeaderFilter.java:73)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:343)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355)
        at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:97)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355)
        at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:100)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:355)
        at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:78)
    {code}
    by ChangMin Jeon
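    For the ShallowEtagHeaderFilter report above: the filter wraps the response and buffers the entire body in a ByteArrayOutputStream so it can compute the ETag, which requires a byte array at least as large as the file being served. One workaround, sketched under the assumption that the large downloads all live under the /data/** mapping, is to subclass the filter and skip it for those paths; the subclass name SelectiveEtagHeaderFilter is made up for this sketch.
    {code}
    import javax.servlet.http.HttpServletRequest;
    import org.springframework.web.filter.ShallowEtagHeaderFilter;

    // Sketch: exclude large static downloads from ETag buffering.
    public class SelectiveEtagHeaderFilter extends ShallowEtagHeaderFilter {

        @Override
        protected boolean shouldNotFilter(HttpServletRequest request) {
            // Skip response buffering for /data/**, where multi-gigabyte files are served;
            // ETags are only worth computing for small, cacheable responses.
            String path = request.getRequestURI().substring(request.getContextPath().length());
            return path.startsWith("/data/");
        }
    }
    {code}
    Registering this subclass in web.xml in place of ShallowEtagHeaderFilter, or simply narrowing the original filter's URL mapping so it no longer covers /data/**, avoids buffering the 1 GB response.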
  • FULL PRODUCT VERSION:
    java version "1.8.0_25"
    Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
    Java HotSpot(TM) 64-Bit Server VM (build 25.25-b02, mixed mode)
    java version "1.7.0_72"
    Java(TM) SE Runtime Environment (build 1.7.0_72-b14)
    Java HotSpot(TM) 64-Bit Server VM (build 24.72-b04, mixed mode)
    ADDITIONAL OS VERSION INFORMATION:
    Darwin braeburn 12.6.0 Darwin Kernel Version 12.6.0: Wed Dec 17 19:11:40 PST 2014; root:xnu-2050.48.15~1/RELEASE_X86_64 x86_64
    EXTRA RELEVANT SYSTEM CONFIGURATION:
    Does not matter.
    A DESCRIPTION OF THE PROBLEM:
    The application crashes with:
    {code}
    java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:2271)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:122)
    {code}
    This happens because ByteArrayOutputStream.grow() calls Arrays.copyOf with Integer.MAX_VALUE, which does not work. The newCapacity should be smaller; e.g. Integer.MAX_VALUE - 2 seems to work.
    STEPS TO FOLLOW TO REPRODUCE THE PROBLEM:
    Create the following JUnit test and run it with -Xmx4069m:
    {code}
    @Test
    public void testByteArrayOutputStream() throws IOException {
        byte[] buf = new byte[0x20000000];
        ByteArrayOutputStream out = new ByteArrayOutputStream(0x3fffffff);
        out.write(buf);
        out.write(buf);
    }
    {code}
    EXPECTED VERSUS ACTUAL BEHAVIOR:
    EXPECTED - No crash.
    ACTUAL -
    {code}
    java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:3236)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
        at java.io.OutputStream.write(OutputStream.java:75)
        at ArrayTest.testByteArrayOutputStream(ArrayTest.java:21)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.junit.runner.JUnitCore.run(JUnitCore.java:157)
        at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:74)
        at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:211)
        at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:67)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
    {code}
    ERROR MESSAGES/STACK TRACES THAT OCCUR:
    Same stack trace as under ACTUAL above.
    REPRODUCIBILITY:
    This bug can be reproduced occasionally.
    ---------- BEGIN SOURCE ----------
    {code}
    import org.junit.Test;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.Arrays;

    public class ArrayTest {
        @Test
        public void testByteArrayOutputStream() throws IOException {
            byte[] buf = new byte[0x20000000];
            ByteArrayOutputStream out = new ByteArrayOutputStream(0x3fffffff);
            out.write(buf);
            out.write(buf);
        }
    }
    {code}
    ---------- END SOURCE ----------
    by Webbug Group
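    The report above is essentially about how ByteArrayOutputStream grows its buffer: doubling the capacity can overshoot the largest array the VM will actually allocate, which sits a few bytes below Integer.MAX_VALUE because of array header overhead. Below is a sketch of the clamping the reporter is asking for; the class and field names are illustrative, and the MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8 idea is the same one several JDK classes use for this purpose.
    {code}
    import java.util.Arrays;

    // Sketch of clamped capacity growth (illustrative, not the JDK source).
    final class GrowableByteBuffer {
        // Largest array size that is broadly safe to request on common VMs.
        private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;

        private byte[] buf = new byte[32];
        private int count;

        void write(byte[] src) {
            ensureCapacity(count + src.length);
            System.arraycopy(src, 0, buf, count, src.length);
            count += src.length;
        }

        private void ensureCapacity(int minCapacity) {
            if (minCapacity <= buf.length) {
                return;
            }
            int newCapacity = buf.length << 1;      // try doubling first
            if (newCapacity - minCapacity < 0) {
                newCapacity = minCapacity;          // doubling is not enough
            }
            if (newCapacity - MAX_ARRAY_SIZE > 0) {
                if (minCapacity < 0) {              // int overflow
                    throw new OutOfMemoryError();
                }
                // Clamp: only exceed MAX_ARRAY_SIZE if the caller really needs it.
                newCapacity = (minCapacity > MAX_ARRAY_SIZE) ? Integer.MAX_VALUE : MAX_ARRAY_SIZE;
            }
            buf = Arrays.copyOf(buf, newCapacity);
        }
    }
    {code}
    Even with the clamp, a request that genuinely needs more than MAX_ARRAY_SIZE bytes still fails, but the doubling path no longer asks the VM for an array size it can never grant.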
