
Recommended solutions based on your search

Solutions on the web

via Google Groups by Unknown author, 1 year ago
Could not setMode for UFS file hdfs://hstore/user/hduser/.staging/job_1470024122519_4394/job.split
via Google Groups by Unknown author, 1 year ago
Could not setMode for UFS file hdfs://hstore/user/hduser/.staging/job_1470024122519_5219/job.split
alluxio.exception.AccessControlException: Could not setMode for UFS file hdfs://hstore/user/hduser/.staging/job_1470024122519_4394/job.split
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at alluxio.exception.AlluxioException.fromThrift(AlluxioException.java:99)
    at alluxio.AbstractClient.retryRPC(AbstractClient.java:326)
    at alluxio.client.file.FileSystemMasterClient.setAttribute(FileSystemMasterClient.java:299)
    at alluxio.client.file.BaseFileSystem.setAttribute(BaseFileSystem.java:298)
    at alluxio.hadoop.AbstractFileSystem.setPermission(AbstractFileSystem.java:353)
    at alluxio.hadoop.FileSystem.setPermission(FileSystem.java:25)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:578)
    at org.apache.hadoop.mapreduce.split.JobSplitWriter.createFile(JobSplitWriter.java:101)
    at org.apache.hadoop.mapreduce.split.JobSplitWriter.createSplitFiles(JobSplitWriter.java:77)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:603)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
    at com.antfact.laundry.shuqi.job.IOReadTest.run(IOReadTest.java:72)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at com.antfact.laundry.shuqi.job.IOReadTest.main(IOReadTest.java:38)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
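Reading the trace bottom-up: Hadoop's `FileSystem.create` writes the `job.split` file during job submission and then calls `setPermission` on it; Alluxio's Hadoop adapter forwards that as a `setAttribute` (setMode) RPC, which the HDFS under-filesystem rejects, and the failure comes back as `AccessControlException`. A minimal stdlib-only sketch of the same failure mode follows, setting a POSIX mode on a file the process may not own. This uses only `java.nio.file`, not the Alluxio or Hadoop APIs, so it is an analogy to the failing chmod, not a reproduction of the RPC path:

```java
import java.io.IOException;
import java.nio.file.AccessDeniedException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class SetModeDemo {
    public static void main(String[] args) throws IOException {
        // Scratch file standing in for the job.split written at submission time.
        Path file = Files.createTempFile("job", ".split");
        try {
            // chmod 644 -- analogous to the setMode/setPermission call in the trace.
            Set<PosixFilePermission> mode = PosixFilePermissions.fromString("rw-r--r--");
            Files.setPosixFilePermissions(file, mode);
            System.out.println("mode set to "
                + PosixFilePermissions.toString(Files.getPosixFilePermissions(file)));
        } catch (AccessDeniedException | UnsupportedOperationException e) {
            // A caller without ownership of the file (or a store that does not
            // support POSIX modes) fails here, much as the Alluxio client
            // surfaces "Could not setMode for UFS file ...".
            System.out.println("could not set mode: " + e);
        } finally {
            Files.deleteIfExists(file);
        }
    }
}
```

In the reported case the fix is usually on the permissions side: the submitting user must own (or have write access to) the `.staging` directory on the under-filesystem so the chmod succeeds.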