
Zeppelin service is not starting anymore due to a missing file in the trash folder

Hello,

I cannot start the Zeppelin service from the Ambari launcher due to the following error:

Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 282, in _run_command
    result_dict = json.loads(out)
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/decoder.py", line 335, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/decoder.py", line 353, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

The above exception was the cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/ZEPPELIN/package/scripts/master.py", line 681, in <module>
    Master().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/ZEPPELIN/package/scripts/master.py", line 250, in start
    self.create_zeppelin_dir(params)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/ZEPPELIN/package/scripts/master.py", line 65, in create_zeppelin_dir
    recursive_chmod=True
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 672, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 669, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 368, in action_delayed
    self.action_delayed_for_nameservice(None, action_name, main_resource)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 400, in action_delayed_for_nameservice
    self._set_owner(self.target_status)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 563, in _set_owner
    self.util.run_command(path, 'SETOWNER', method='PUT', owner=owner, group=group, assertable_result=False)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 209, in run_command
    return self._run_command(*args, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 290, in _run_command
    raise WebHDFSCallException(err_msg, result_dict)
resource_management.libraries.providers.hdfs_resource.WebHDFSCallException: Execution of 'curl -sS -L -w '%{http_code}' -X PUT -d '' -H 'Content-Length: 0' 'http://foodscience-labroratory.com:50070/webhdfs/v1/user/zeppelin/.Trash/Current/user/hadoop/test/foodmeal_extracted.csv?op=SETOWNER&owner=zeppelin&group=&user.name=hdfs'' returned status_code=400.

How can I fix this? Where is this Zeppelin trash folder located in HDFS? The 400 error suggests the file in the trash has already been deleted, so why is Zeppelin still trying to access it? Is there a console command to empty the Zeppelin trash folder manually?


Thanks in advance 😉

7 REPLIES

Mentor

@Andreas Kühnert

You can locate the trash in HDFS. As the root user, switch to the hdfs user:

# su - hdfs
$ hdfs dfs -ls -R /user/zeppelin/.Trash

All the deleted files should be there, unless you deleted them using the -skipTrash option.

Here is how to recover files from trash
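For example, to restore a file from the trash back to its original location, or to empty the trash entirely, something like the following should work (the paths are taken from your error log; adjust them for your cluster):

```shell
# Run as the hdfs superuser
su - hdfs

# Restore a deleted file from the trash back to its original path
hdfs dfs -mv /user/zeppelin/.Trash/Current/user/hadoop/test/foodmeal_extracted.csv \
             /user/hadoop/test/foodmeal_extracted.csv

# Or permanently remove trash checkpoints older than the retention interval
hdfs dfs -expunge

# Or force-delete the whole trash folder immediately, bypassing the trash itself
hdfs dfs -rm -r -skipTrash /user/zeppelin/.Trash
```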

HTH

I have already deleted the contents of the trash folder with hdfs dfs -expunge.

But the same error still appears:

(same traceback as above, now failing on a different trash path:)
resource_management.libraries.providers.hdfs_resource.WebHDFSCallException: Execution of 'curl -sS -L -w '%{http_code}' -X PUT -d '' -H 'Content-Length: 0' 'http://foodscience-labroratory.com:50070/webhdfs/v1/user/zeppelin/.Trash/190114023342/user/hadoop/test/foodmeal_extracted.csv?op=SETOWNER&owner=zeppelin&group=&user.name=hdfs'' returned status_code=400.

Maybe there is something like a caching folder?

New Contributor

@Andreas Kühnert

From the error log, it seems that Zeppelin is trying to create some directories during startup.

You probably need to check the permissions on the /user/zeppelin/ directory.

It has drwxr-xr-x

Can I delete the .Trash folder of Zeppelin?

Anyway, the problem has been solved. The permissions on the .Trash folder were wrong.
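For anyone hitting the same issue: the fix amounted to correcting ownership/permissions on the trash path. A rough sketch of what that looks like (the exact owner, group, and mode may differ on your cluster, so verify before applying):

```shell
su - hdfs

# Give the zeppelin user ownership of its own trash folder
hdfs dfs -chown -R zeppelin:hdfs /user/zeppelin/.Trash

# Trash contents should normally be accessible only to the owning user
hdfs dfs -chmod -R 700 /user/zeppelin/.Trash
```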

Mentor

@Andreas Kühnert

Can you try recreating the file foodmeal_extracted.csv in HDFS at the same path Zeppelin is looking for, i.e. /user/zeppelin/.Trash/190114023342/user/hadoop/test/foodmeal_extracted.csv, and see if you can trick it? 🙂
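If you want to try that, a minimal way to recreate a zero-byte placeholder at that path would be (this only creates an empty file; whether it satisfies the SETOWNER call is the thing to test):

```shell
su - hdfs

# Recreate the directory structure and an empty placeholder file
hdfs dfs -mkdir -p /user/zeppelin/.Trash/190114023342/user/hadoop/test
hdfs dfs -touchz /user/zeppelin/.Trash/190114023342/user/hadoop/test/foodmeal_extracted.csv
```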