Solutions on the web

via sqoop-user by Vishwakarma, Chhaya, 7 months ago
Output directory hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC already exists
via sqoop-user by Kunal Gaikwad, 1 year ago
Output directory hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC already exists
via sqoop-user by Kunal Gaikwad, 7 months ago
Output directory hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC already exists
via sqoop-user by Rahul Dhuvad, 1 year ago
Output directory hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC already exists
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
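
The trace shows the job failing before any mappers run: FileOutputFormat.checkOutputSpecs rejects the submission because the import's target directory already exists in HDFS, typically left over from a previous run. The usual fix is to remove (or rename) that directory before re-running the import. Below is a minimal sketch, not taken from the thread, that clears the directory with the Hadoop FileSystem API before resubmitting; the URI and path are copied from the trace and would need to be adjusted to your cluster.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CleanImportTargetDir {
        public static void main(String[] args) throws Exception {
            // Output directory from the stack trace; change to match your environment.
            String target = "hdfs://sandbox.hortonworks.com:8020/user/hdfs/Product_DailyRevenueDetails_HPOC";

            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(target), conf);
            Path outputDir = new Path(target);

            // checkOutputSpecs throws FileAlreadyExistsException if this path exists,
            // so delete it recursively before the Sqoop import submits its MapReduce job.
            if (fs.exists(outputDir)) {
                fs.delete(outputDir, true);
            }
        }
    }

The same cleanup can be done from the shell with hadoop fs -rm -r on the target directory, and recent Sqoop 1.x releases accept --delete-target-dir on sqoop import so the directory is removed automatically before each run; pointing --target-dir at a fresh path also avoids the collision.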