Created on 07-26-2018 08:37 PM - edited 08-17-2019 06:52 AM
The default location for the Spark event log history is hdfs:///spark-history (Spark) and hdfs:///spark2-history/ (Spark2).
Changing this location helps when debugging Spark History Server page-load issues, or when the directory has accumulated a huge number of event log files and you want to archive them by switching to a new active location.
The following steps change this default location (the examples use Spark2).
1. Create a new directory on HDFS, for example:
$ hdfs dfs -mkdir /spark2-history_new
$ hdfs dfs -chown spark:hadoop /spark2-history_new
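Depending on how your jobs write event logs, the new directory may also need open write permissions, the way the stock /spark2-history directory typically has in HDP. The permission bits below are an assumption about your environment, so align them with your security policy:
$ hdfs dfs -chmod 777 /spark2-history_new
$ hdfs dfs -ls / | grep spark2-history_new   # verify owner, group, and permissions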
2. Log in to Ambari ==> Spark2 ==> Configs.
3. Update the event log parameters with the new path "hdfs:///spark2-history_new/", as sketched below.
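The exact parameter names depend on your Spark version and stack definition, but for Spark2 the event log location is normally controlled by spark.eventLog.dir (where applications write their logs) and spark.history.fs.logDirectory (where the History Server reads them). A minimal sketch of the values, assuming the directory created in step 1:
spark.eventLog.enabled=true
spark.eventLog.dir=hdfs:///spark2-history_new/
spark.history.fs.logDirectory=hdfs:///spark2-history_new/
Both directory properties should point at the same path so the History Server picks up the logs that new applications write.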
4. Save the configuration.
5. Restart the Spark2 service (and any other components Ambari flags for restart) so the new configuration takes effect. If you prefer to script the restart, see the sketch below.
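The restart can also be triggered through the Ambari REST API. The host names, cluster name, and credentials below are placeholders, and the request body is a sketch of the standard RESTART command rather than a copy of your environment:
$ curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
    -d '{"RequestInfo": {"command": "RESTART", "context": "Restart Spark2 History Server"},
         "Requests/resource_filters": [{"service_name": "SPARK2", "component_name": "SPARK2_JOBHISTORYSERVER", "hosts": "historyserver.example.com"}]}' \
    http://ambari-server.example.com:8080/api/v1/clusters/MyCluster/requests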
6. Run a Spark job; its event log file will now be saved in the new location, and the application will show up in the Spark History Server UI. A quick verification sketch follows this list.
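As a quick check, the commands below submit the stock SparkPi example and then list the new directory. The examples jar path and YARN master are assumptions based on a typical HDP Spark2 client install, so adjust them for your cluster:
$ spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn --deploy-mode client \
    /usr/hdp/current/spark2-client/examples/jars/spark-examples*.jar 100
$ hdfs dfs -ls /spark2-history_new/   # the finished application's event log should appear here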