java.io.IOException

There are no Samebug tips available for this exception yet. Do you have an idea of how to solve this issue? A short tip would help the users who saw this issue last week; one possible workaround is sketched after the stack trace below.

  • spark2 scala throwing an error / exception
    via Stack Overflow by user2543622
  • Spark Error at the time of Quit
    via Stack Overflow by ankitbeohar90
  • Hive Context not working (spark 1.6.2)
    via Stack Overflow by aswa09
  • ERROR ShutDownHookManager
    via Stack Overflow by Arish
  • Spark failed to delete temp directory
    via Stack Overflow by arpit bhatnagar
    java.io.IOException: Failed to delete: C:\Users\a-vs\AppData\Local\Temp\2\spark-e66a8443-a4a5-47df-807f-03667a23889c
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:986)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:64)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:61)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:61)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:215)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:187)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:187)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:187)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1857)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:187)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:187)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:187)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:187)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:177)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
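    One possible tip (a commonly reported workaround, not an official fix): the exception is thrown by Spark's shutdown hook when it cannot delete its temporary scratch directory on Windows, usually because the JVM or another process still holds a lock on files inside it. The job itself has normally already finished, so the message is mostly noise at shutdown. The sketch below is a minimal example, assuming a local Spark 2.x session on Windows and a writable C:/tmp/spark directory (both assumptions); it points Spark's scratch space at a short path via the standard spark.local.dir setting.

        import org.apache.spark.sql.SparkSession

        object TempDirExample {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .appName("TempDirExample")   // hypothetical app name
              .master("local[*]")
              // Point Spark's scratch space at a short, writable path so the
              // shutdown hook is more likely to delete it cleanly on Windows.
              .config("spark.local.dir", "C:/tmp/spark")  // assumed path, adjust for your machine
              .getOrCreate()

            // ... run the actual job here ...

            spark.stop()
          }
        }

    If the error still appears at shutdown, another workaround often mentioned on Stack Overflow is to silence the logger in log4j.properties with log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF; note that this only hides the message, it does not make the delete succeed.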

    Users with the same issue

    • muffinmannen (104 times)
    • Unknown visitor (2 times)
    • Unknown visitor (1 time)
    • johnxfly (2 times)
    • tyson925 (5 times)