Recommended solutions based on your search

Samebug tips

  1. Spark was running out of memory. (via Unknown author)

  2. Spark ran out of memory. (via Unknown author; see the configuration sketch after this list)

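Both tips point at the same root cause: the Spark executors did not have enough memory for the job, so one common remedy is to raise the memory allocation. The following is a minimal sketch only, assuming a Java Spark job; the application name, master URL, and memory values are illustrative and should be tuned to your cluster (in practice these settings are usually passed to spark-submit rather than hard-coded).

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MemoryConfigSketch {
        public static void main(String[] args) {
            // Illustrative values only -- tune to your workload and cluster size.
            SparkConf conf = new SparkConf()
                    .setAppName("example-job")            // hypothetical application name
                    .setMaster("local[*]")                // local master for testing; omit when using spark-submit
                    .set("spark.executor.memory", "4g")   // heap available to each executor
                    .set("spark.driver.memory", "4g")     // driver heap; only effective if set before the driver JVM starts
                    .set("spark.memory.fraction", "0.6"); // share of heap used for execution and storage

            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                // ... job logic would go here ...
            }
        }
    }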
Solutions on the web

via Google Groups by Saravanan Tirugnanum, 1 year ago
java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(…)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(…)
    at java.util.concurrent.CountDownLatch.await(…)
    at org.apache.kafka.clients.producer.internals.ProduceRequestResult.await(…)
    at org.apache.kafka.clients.producer.internals.RecordAccumulator.awaitFlushCompletion(…)
    at org.apache.kafka.clients.producer.KafkaProducer.flush(…)
    at org.apache.kafka.connect.util.KafkaBasedLog.readToEnd(…)
    at …
    at …
    at …
    at org.apache.kafka.connect.runtime.WorkerSourceTask$WorkerSourceTaskThread.execute(…)
    at …
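This trace means the worker thread was interrupted (often because the task or worker was being stopped) while KafkaBasedLog.readToEnd was blocked in KafkaProducer.flush, waiting for outstanding records to be acknowledged. In application code the same situation surfaces as the unchecked org.apache.kafka.common.errors.InterruptException wrapping the InterruptedException. The sketch below is only an illustration of handling that case around a standalone producer; the broker address, topic name, and serializers are assumptions, not taken from the report above.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.errors.InterruptException;

    public class FlushInterruptSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("example-topic", "key", "value")); // hypothetical topic
                try {
                    // flush() blocks until all buffered records are acknowledged; if the
                    // calling thread is interrupted while waiting, the client throws the
                    // unchecked InterruptException.
                    producer.flush();
                } catch (InterruptException e) {
                    // Restore the interrupt flag so the caller (e.g. a framework that is
                    // shutting the thread down) can still observe the interruption.
                    Thread.currentThread().interrupt();
                }
            }
        }
    }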