java.io.IOException: Failed to create local dir


I was trying to run this line:
val fraud = sc.textFile("hdfs://sandbox-hdp.hortonworks.com:8020/tmp/fraud.csv")
but I kept getting the error below (even though the same line worked in the Spark shell!):

java.io.IOException: Failed to create local dir in /tmp/blockmgr-c40d2915-3861-4bbe-8e1c-5eca677c552e/0e.
  at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
  at org.apache.spark.storage.DiskStore.remove(DiskStore.scala:135)
  at org.apache.spark.storage.BlockManager.removeBlockInternal(BlockManager.scala:1457)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:991)
  at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
  at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:792)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1350)
  at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:122)
  at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:88)
  at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
  at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:56)
  at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1488)
  at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1037)
  at org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply(SparkContext.scala:1029)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
  at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1029)
  at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
  at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
  at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
  at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
  ... 48 elided
1 REPLY

Super Guru

@Khouloud Landari,

Did you check whether there is enough space in the /tmp folder on all the nodes (workers + master)? Also check the permissions on that folder.
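
For example, something like the sketch below (just an illustration) can be pasted into the Spark shell to see which local directory Spark is using on the driver and whether it looks writable with free space; /tmp is the default for spark.local.dir, which is where the blockmgr-* directories are created. The workers still need to be checked separately (e.g. with df -h and ls -ld /tmp on each node).

import java.io.File

// Which directory Spark uses for block manager / shuffle files
// (falls back to the default /tmp if spark.local.dir is not set).
val localDir = sc.getConf.get("spark.local.dir", "/tmp")
val dir = new File(localDir)

println(s"Local dir        : $localDir")
println(s"Exists           : ${dir.exists}")
println(s"Writable         : ${dir.canWrite}")
println(s"Usable space (MB): ${dir.getUsableSpace / (1024 * 1024)}")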


-Aditya