11-10-2023 12:41 AM
Hello, I'm new to Hadoop and just want to know: can I use HDFS instead of local storage in my Spark driver?

For example, I'm submitting a task through Livy with "kind": "pyspark" and a "code" field containing some operations that should create a new file. When I run it in YARN cluster mode, I find that the new file gets created in the node's local storage, at a path like /tmp/hadoop-username/nm-local-dir/usercache/root/appcache......

Is there any way to set an HDFS path instead of a local one? I want to save my Spark results (the newly created file) in HDFS.

When I set spark.local.dir or yarn.nodemanager.local-dirs = hdfs:///temp, the Livy session just doesn't start. Mounting HDFS with fuse-dfs doesn't seem like the best way either. Or should I use my own fileApp.jar that runs on each node in each session?
Labels:
- Apache Spark
- Apache YARN
- HDFS