Member since: 04-18-2018
Posts: 18
Kudos Received: 0
Solutions: 0
02-16-2020
05:47 PM
Thanks, I tried this and it worked out fine!
01-30-2020
05:43 PM
Hi,
Can someone help me with importing pandas and NumPy in Livy2?
I'm using Zeppelin with the Livy2 interpreter. In a notebook I run the following:
%pyspark
import pandas as py
and I see the error:
"No module found"
If I run the same code with the pyspark interpreter it works fine, with no issues. I'd appreciate some help with this.
Thanks,
Sambasivam.
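One thing worth checking: the Livy interpreter executes code on the cluster, not on the Zeppelin host, so pandas and NumPy must be importable by the Python on every worker node. A minimal sketch for checking one host (`PYTHON_BIN` is an assumed variable; point it at the interpreter your Spark executors actually use):

```shell
#!/bin/sh
# Hedged sketch: check whether pandas/numpy import cleanly in the Python
# that the Spark executors will use. Run this on each cluster node.
PYTHON_BIN="${PYTHON_BIN:-python3}"   # assumption: executors use this binary

check_module() {
    # Exit status 0 if the module imports cleanly, non-zero otherwise
    "$PYTHON_BIN" -c "import $1" 2>/dev/null
}

for mod in pandas numpy; do
    if check_module "$mod"; then
        echo "$mod: OK"
    else
        echo "$mod: MISSING (e.g. pip install $mod on this node)"
    fi
done
```

If a module is missing on any node, installing it only on the Zeppelin host will not help the Livy session, which is a common reason the same code works under the pyspark interpreter but not under Livy.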
Labels:
- Apache Zeppelin
01-27-2019
04:35 PM
I changed the amb_ranger_admin password along with the admin password in the Ranger UI, updated the Ranger admin credentials in Ambari as well, and then it started working. Thanks!
01-26-2019
09:02 PM
Hi, we are in the process of upgrading the HDP stack from 2.6.3 to 3.1.0 and hit the following error during the "Verify Ambari and Ranger Password Synchronization" check:

Reason: Credentials for user 'amb_ranger_admin' in Ambari do not match Ranger. Failed on: RANGER

We changed the password in Ranger and then in Ambari as well, but the synchronization still fails. Can anyone help me with this? Thanks!
10-30-2018
02:05 PM
Hi, I want to disable the Hive shell for users and grant access at the AD group level. The following works at the user level:

if [ "$SERVICE" = "cli" ] && [ "$USER" != "samba" ]; then
    echo "Sorry! We have disabled the Hive shell. Please contact an admin."
    exit 1
fi

This works well for user-level access, but I want to grant access at the AD group level instead. I tried checking groups instead of the user, but that didn't work. Can someone help me out with this?
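One way to extend the user-level check above to an AD group is to test group membership via `id -Gn`. A hedged sketch: `hive_cli_users` is a placeholder group name, and this assumes your AD groups are resolvable on the host (e.g. through SSSD):

```shell
#!/bin/sh
# Hedged sketch: allow the Hive CLI only for members of an allowed group.
# "hive_cli_users" is a hypothetical name; substitute your real AD group.
ALLOWED_GROUP="hive_cli_users"

user_in_group() {
    # id -Gn prints the user's group names; grep -qw matches a whole word
    id -Gn "$1" 2>/dev/null | grep -qw "$2"
}

if [ "$SERVICE" = "cli" ] && ! user_in_group "$USER" "$ALLOWED_GROUP"; then
    echo "Sorry! We have disabled the Hive shell. Please contact an admin."
    exit 1
fi
```

Note that `id -Gn` only sees groups the OS can resolve, so if the AD integration doesn't expose the group on the Hive gateway host, the check will deny everyone.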
Labels:
- Apache Hive
09-27-2018
10:40 PM
Hi, we are currently using HDP 2.6.x and planning to upgrade to HDP 3.0.0, but the Tez view is not available in HDP 3.0.0. Where do I go to check the status of jobs executed in the LLAP queue? As I understand it, YARN runs just one daemon for LLAP, and in 2.6.x the job details were viewed in the Tez view. Where do I view them in HDP 3.0.0?
Labels:
- Apache Hive
- Apache Tez
08-09-2018
07:03 PM
I'm unable to start the NameNode; it fails on every attempt. I see the following error in the stack trace:

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 361, in <module>
NameNode().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 99, in start
upgrade_suspended=params.upgrade_suspended, env=env)
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py", line 175, in namenode
create_log_dir=True
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 276, in service
Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/2.6.3.0-235/hadoop/sbin/hadoop-daemon.sh --config /usr/hdp/2.6.3.0-235/hadoop/conf start namenode'' returned 1. starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-<URL>.out

Current HDFS component status:
- Active NameNode: Started (3 alerts); ZKFailoverController: Started (no alerts)
- Standby NameNode: Stopped (3 alerts); ZKFailoverController: Started (no alerts)
- DataNodes: 2/2 Started (2 live / 0 dead / 0 decommissioning)
- JournalNodes: 3/3 live

Can someone help me out with this? Thanks!
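The Ambari traceback only reports that hadoop-daemon.sh exited non-zero; the actual root cause will be in the NameNode log itself. A small helper to pull recent errors (a sketch; the log path follows the HDP default visible in the trace and may differ on your host):

```shell
#!/bin/sh
# Hedged sketch: show the last ERROR/FATAL/Exception lines from a log file,
# since the Ambari output above only reports a non-zero exit code.
last_errors() {
    # $1 = path to the log file
    grep -iE "error|fatal|exception" "$1" | tail -n 20
}

# Assumed HDP default location; adjust the hostname suffix for your node:
LOG_FILE="/var/log/hadoop/hdfs/hadoop-hdfs-namenode-$(hostname).log"
if [ -f "$LOG_FILE" ]; then
    last_errors "$LOG_FILE"
fi
```

Common causes surfaced this way include a full disk for the NameNode metadata directory, a bad fsimage/edits state, or a port already in use.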
07-23-2018
04:13 PM
@Jonathan Sneep Hi, thanks a lot. It works now. Thanks once again.
07-23-2018
03:40 PM
@Jonathan Sneep Hi, can you tell me which parameters need to be updated in interpreter.json? Should I add entries under both the interpreter settings and the interpreter group?
07-23-2018
01:59 PM
@Jonathan Sneep Thanks for your response. I added the service and followed these steps: installed Python's get-pip.py, ran pip install py4j, and added python to zeppelin.interpreter.group.order and zeppelin.interpreters in the advanced zeppelin-config. But I haven't downloaded Zeppelin, copied the interpreter/python directory to HDP, or changed anything in interpreter.json. Where should I download the interpreter/python directory from? Currently I only see the following in the interpreter folder (/usr/hdp/current/zeppelin-server/interpreter):

drwxr-xr-x 2 zeppelin zeppelin   136 Jun 27 13:13 angular
drwxr-xr-x 3 zeppelin zeppelin    20 Jun 27 13:13 lib
drwxr-xr-x 2 zeppelin zeppelin 12288 Jun 27 13:13 jdbc
drwxr-xr-x 2 zeppelin zeppelin  4096 Jun 27 13:13 livy
drwxr-xr-x 2 zeppelin zeppelin  4096 Jun 27 13:13 md
drwxr-xr-x 2 zeppelin zeppelin   191 Jun 27 13:13 sh
drwxr-xr-x 3 zeppelin zeppelin    66 Jun 27 13:13 spark

Thanks!
07-20-2018
01:16 PM
Hi, I'm trying to add the Python interpreter in Zeppelin. I checked that Python is available and that the path is set (export PATH=$PATH:/usr/bin/python). I have already installed Python's get-pip.py and ran pip install py4j. I have also added python to zeppelin.interpreter.group.order and zeppelin.interpreters in the advanced zeppelin-config. After all this I'm still not able to see the Python interpreter in the notebook.
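For reference, the two properties mentioned above might look roughly like this in Advanced zeppelin-config (a hedged sketch: the exact interpreter class list depends on your Zeppelin build, and the python entry only takes effect if the interpreter's jars are actually present under the interpreter/ directory):

```
zeppelin.interpreter.group.order=spark,livy,jdbc,md,sh,angular,python
zeppelin.interpreters=org.apache.zeppelin.spark.SparkInterpreter,org.apache.zeppelin.python.PythonInterpreter
```

Registering the class name alone is not enough; without a matching interpreter/python directory on disk, the interpreter will not appear in the notebook UI.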
07-16-2018
05:25 PM
@sindhu Thanks for your response. I was looking for a way to add a node and designate it as an edge node, so that users can start using that node to access Hive and other services.
07-16-2018
03:05 PM
Is there a document describing how to add an additional edge node to an existing cluster?
Labels:
- Apache Hadoop
06-01-2018
07:38 PM
Hive Server Interactive starts up as the hive user, so the hive user needs access to the Ambari agent tmp directory. Grant permission with: chmod 777 /var/lib/ambari-agent/tmp
05-16-2018
09:53 PM
The service.xml needs to be changed to the default one under /var/lib/knox-server/data-xxx/services.