Archives of Support Questions (Read Only)

This is an archived, read-only board kept for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

Spark2 History Server Can't Start Using HDP 2.5

New Member

Hi, I am having trouble starting the Spark2 History Server in Ambari. Below is the standard error output.

stderr: /var/lib/ambari-agent/data/errors-3723.txt

2017-12-01 11:26:34,759 - Found multiple matches for stack version, cannot identify the correct one from: 2.5.3.0-37, 2.6.3.0-235
2017-12-01 11:26:34,759 - Cannot copy spark2 tarball to HDFS because stack version could be be determined.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py", line 103, in <module>
    JobHistoryServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py", line 56, in start
    spark_service('jobhistoryserver', upgrade_type=upgrade_type, action='start')
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/spark_service.py", line 65, in spark_service
    make_tarfile(tmp_archive_file, source_dir)
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/spark_service.py", line 38, in make_tarfile
    os.remove(output_filename)
TypeError: coercing to Unicode: need string or buffer, NoneType found
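The traceback bottoms out in make_tarfile: because two stack versions matched, the tarball path was never resolved, so os.remove() received None and raised the TypeError. A minimal Python sketch of that failure chain (the function names here are illustrative, not Ambari's actual code):

```python
import os

def find_stack_version(installed_versions):
    # Illustrative stand-in for Ambari's lookup: it can only pick a
    # version when exactly one candidate matches; with several candidates
    # it logs "Found multiple matches for stack version" and gives up.
    if len(installed_versions) == 1:
        return installed_versions[0]
    return None

def tarball_path(stack_version):
    # With no resolved stack version, no tarball path can be built.
    if stack_version is None:
        return None
    return "/tmp/spark2-%s.tar.gz" % stack_version

version = find_stack_version(["2.5.3.0-37", "2.6.3.0-235"])
archive = tarball_path(version)
try:
    # os.remove(None) raises TypeError; the exact message differs between
    # Python 2 ("coercing to Unicode: need string or buffer, NoneType
    # found", as in the log above) and Python 3.
    os.remove(archive)
except TypeError as e:
    print("TypeError:", e)
```

Removing the extra version directory (as in the accepted solution below) leaves a single candidate, so the lookup succeeds and the tarball path is no longer None.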
1 ACCEPTED SOLUTION

Super Guru

@Ashikin,

1) Try moving the directory /usr/hdp/2.6.3.0-235 to some other location, so that only the 2.5.3.0-37 folder remains in /usr/hdp.

2) Then run hdp-select set all 2.5.3.0-37

3) Now run hdp-select versions. It should return only 2.5.3.0-37.

4) Restart the Spark2 History Server.

If it still fails, try the manual steps for creating the tar file which I mentioned above.

For the failure you mentioned while running the manual steps, you can try creating these folders manually:

hdfs dfs -mkdir /hdp/apps/2.5.3.0-37/
hdfs dfs -mkdir /hdp/apps/2.5.3.0-37/spark2
hdfs dfs -chown -R hdfs:hdfs /hdp/apps/2.5.3.0-37
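These paths follow the pattern /hdp/apps/&lt;stack_version&gt;/spark2, so if your cluster is on a different stack version the commands can be generated for it. A small sketch (the helper name is mine, not part of any Ambari API):

```python
def spark2_upload_commands(stack_version, base="/hdp/apps"):
    """Build the hdfs dfs commands for a given HDP stack version."""
    root = "%s/%s" % (base, stack_version)
    return [
        "hdfs dfs -mkdir %s/" % root,
        "hdfs dfs -mkdir %s/spark2" % root,
        "hdfs dfs -chown -R hdfs:hdfs %s" % root,
    ]

for cmd in spark2_upload_commands("2.5.3.0-37"):
    print(cmd)
```

Running this with "2.5.3.0-37" reproduces the three commands shown above.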


10 REPLIES

New Member

Hi Aditya, I have a problem: the Zeppelin UI can't find my Spark2 interpreter. The error states 'Prefix not found'.