org.apache.kafka.common.errors.InterruptException: Flush interrupted.

Recommended solutions based on your search

Samebug tips

via activeintelligence.org by Unknown author

Spark was running out of memory.
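The tip names the cause but not the remedy. Assuming it refers to Spark's standard memory settings, a hedged sketch of raising them might look like the following; the sizes and the app name are placeholders, not recommendations:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MemoryConfigExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("kafka-ingest")
                // Per-executor heap; the usual first knob when executors run out of memory.
                .set("spark.executor.memory", "4g")
                // Extra room for off-heap allocations (network buffers, etc.).
                .set("spark.executor.memoryOverhead", "512m");
        // Note: the driver heap must be sized before its JVM starts,
        // e.g. spark-submit --driver-memory 2g; setting it here has no effect in client mode.
        // The master URL is expected to come from spark-submit as well.
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            // ... job logic ...
        } finally {
            sc.stop();
        }
    }
}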

Solutions on the web

via Google Groups by Saravanan Tirugnanum, 1 year ago
org.apache.kafka.common.errors.InterruptException: Flush interrupted.
at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
at org.apache.kafka.clients.producer.internals.ProduceRequestResult.await(ProduceRequestResult.java:57)
at org.apache.kafka.clients.producer.internals.RecordAccumulator.awaitFlushCompletion(RecordAccumulator.java:422)
at org.apache.kafka.connect.util.KafkaBasedLog.readToEnd(KafkaBasedLog.java:192)
at org.apache.kafka.connect.storage.KafkaOffsetBackingStore.get(KafkaOffsetBackingStore.java:112)
at org.apache.kafka.connect.storage.OffsetStorageReaderImpl.offsets(OffsetStorageReaderImpl.java:78)
at com.walmart.ei.oms.transformation.OMSSourceTask.start(OMSSourceTask.java:80)
at org.apache.kafka.connect.runtime.WorkerSourceTask$WorkerSourceTaskThread.execute(WorkerSourceTask.java:341)
at org.apache.kafka.connect.util.ShutdownableThread.run(ShutdownableThread.java:82)
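The trace shows a custom source task (OMSSourceTask.start) asking Connect's OffsetStorageReader for previously committed offsets. That lookup reads the offsets topic to its end (KafkaBasedLog.readToEnd), which waits on a producer flush; the wait was interrupted, typically because the worker is stopping or rebalancing the task, and the producer surfaces this as InterruptException: Flush interrupted. A minimal sketch of a task doing the same lookup, with the interruption handled explicitly, is below; the class name, the partition key "source", and its value are illustrative, not the original Walmart code:

import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.apache.kafka.connect.errors.ConnectException;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

public class ExampleSourceTask extends SourceTask {

    private Map<String, Object> lastOffset;

    @Override
    public String version() {
        return "0.0.1";
    }

    @Override
    public void start(Map<String, String> props) {
        // This is the offset lookup seen at OffsetStorageReaderImpl.offsets(...) in the
        // stack trace: it reads the Connect offsets topic to its end, which internally
        // waits for a producer flush (RecordAccumulator.awaitFlushCompletion). If the
        // worker interrupts the task thread while this wait is in progress, the flush
        // throws org.apache.kafka.common.errors.InterruptException.
        Map<String, Object> partition = Collections.singletonMap("source", "oms-orders");
        try {
            lastOffset = context.offsetStorageReader().offset(partition);
        } catch (org.apache.kafka.common.errors.InterruptException e) {
            // Restore the interrupt flag and fail fast so the worker can stop the task cleanly.
            Thread.currentThread().interrupt();
            throw new ConnectException("Interrupted while reading stored offsets", e);
        }
    }

    @Override
    public List<SourceRecord> poll() throws InterruptedException {
        // A real implementation would read from the external system here,
        // resuming from lastOffset when it is non-null.
        return Collections.emptyList();
    }

    @Override
    public void stop() {
        // Nothing to clean up in this sketch.
    }
}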

