java.io.IOException: Failed to cache: Unable to request space from worker


Solutions on the web

via Google Groups by Krishnaprasad, 1 year ago
Failed to cache: Unable to request space from worker
via Google Groups by Zaicheng Wang, 10 months ago
Failed to cache: Not enough space left on worker ip-10-10-48-40.ec2.internal/10.10.48.40:29998 to store blockId 3808428037. Please consult http://www.alluxio.org/docs/1.3/en/Debugging-Guide.html for common solutions to address this problem.
via Google Groups by Tim B, 1 year ago
Failed to cache: Unable to request space from worker
via Google Groups by test520, 1 year ago
Failed to cache: alluxio.exception.BlockAlreadyExistsException: Temp blockId 16,777,216 is not available, because it is already committed
via Google Groups by Amran Chen, 11 months ago
Failed to cache: alluxio.exception.BlockAlreadyExistsException: Temp blockId 33,554,432 is not available, because it is already committed
via Google Groups by Kaiming Wan, 10 months ago
Failed to cache: /home/alluxio/ramdisk/alluxioworker/.tmp_blocks/678/5bbebd62959576a6-c000000 (Permission denied)
java.io.IOException: Failed to cache: Unable to request space from worker
at alluxio.client.block.LocalBlockOutStream.requestSpace(LocalBlockOutStream.java:137)
at alluxio.client.block.LocalBlockOutStream.flush(LocalBlockOutStream.java:114)
at alluxio.client.block.BufferedBlockOutStream.write(BufferedBlockOutStream.java:104)
at alluxio.client.file.FileOutStream.write(FileOutStream.java:284)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.writeObject(TextOutputFormat.java:83)
at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.write(TextOutputFormat.java:98)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at com.flytxt.bigdata.mr.WordCount$IntSumReducer.reduce(WordCount.java:101)
at com.flytxt.bigdata.mr.WordCount$IntSumReducer.reduce(WordCount.java:90)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
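The trace shows a MapReduce reducer writing its output through Alluxio's FileOutStream; the write fails when LocalBlockOutStream.requestSpace cannot allocate block space on the local worker. A common remediation is to enlarge the worker's storage or to use a write type that falls back to the under store instead of requiring in-memory caching. The sketch below is a minimal alluxio-site.properties fragment, assuming Alluxio 1.x (the version suggested by the Debugging-Guide URL above); property names and values should be verified against the documentation for your actual version, and the sizes are placeholders:

```properties
# Enlarge the worker's memory tier so large blocks can be cached.
# 16GB here is an illustrative value, not a recommendation.
alluxio.worker.memory.size=16GB

# Let writes go through to the under store rather than requiring
# every block to fit in worker storage (MUST_CACHE fails hard when
# the worker cannot grant space; CACHE_THROUGH also persists data).
alluxio.user.file.writetype.default=CACHE_THROUGH
```

If worker storage is simply full of previously cached data, freeing it can also help, e.g. `bin/alluxio fs free /some/path` evicts cached copies of that path from Alluxio storage without touching the under store.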
