Member since
02-02-2021
116
Posts
2
Kudos Received
5
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
| 747 | 08-13-2021 09:44 AM
| 3710 | 04-27-2021 04:23 PM
| 1389 | 04-26-2021 10:47 AM
| 924 | 03-29-2021 06:01 PM
| 2758 | 03-17-2021 04:53 PM
05-13-2021
10:10 AM
Hi experts,
I successfully installed Hive via Ambari. Recently I tried to add another Hive Metastore instance so the service would be HA.
Now Ambari is complaining about an error on one of the Hive Metastores:
Metastore on test02.domain.com failed (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/alerts/alert_hive_metastore.py", line 200, in execute
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of 'export HIVE_CONF_DIR='/etc/hive/conf.server' ; hive --hiveconf hive.metastore.uris=thrift://test02.domain.com:9083 --hiveconf hive.metastore.client.connect.retry.delay=1 --hiveconf hive.metastore.failure.retries=1 --hiveconf hive.metastore.connect.retries=1 --hiveconf hive.metastore.client.socket.timeout=14 --hiveconf hive.execution.engine=mr -e 'show databases;'' returned 4. Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
)
I believe Hive itself is working. Any help on how to resolve this Ambari alert error is much appreciated. Thanks,
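The alert runs the hive CLI, and the hive launcher in turn needs hadoop on the PATH or HADOOP_HOME/HADOOP_PREFIX set, exactly as the error says. A minimal check-and-workaround sketch, assuming a Bigtop-style layout under /usr/lib/hadoop (that path is an assumption, adjust to your install):
# Check whether hadoop is resolvable in the environment the alert command runs in
which hadoop
echo "$HADOOP_HOME"
# If neither is set, one workaround is exporting them system-wide
# (/usr/lib/hadoop is an assumed Bigtop location, not confirmed by the alert output)
cat <<'EOF' > /etc/profile.d/hadoop_home.sh
export HADOOP_HOME=/usr/lib/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
EOF
Note that the Ambari agent may not source login profiles, so if the alert still fails after this, the variables may need to be set through hive-env in Ambari instead.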
Labels:
- Apache Ambari
- Apache Hadoop
- Apache Hive
05-04-2021
10:55 AM
Hi experts,
I am trying to deploy a new cluster via Ambari. However, after I deployed the basic components (ZooKeeper, HDFS, YARN, and MapReduce), Ambari is showing the following warning:
The following components are reporting unexpected versions:
test1.test.com
ZOOKEEPER_SERVER: UNKNOWN
DATANODE: UNKNOWN
TEZ_CLIENT: UNKNOWN
ZOOKEEPER_CLIENT: UNKNOWN
HDFS_CLIENT: UNKNOWN
Any help is much appreciated.
Thanks,
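One way to see what version string Ambari has actually recorded for a component is the REST API; a sketch, assuming default admin credentials, port 8080, and placeholder cluster/host/component names (all assumptions to adjust):
# Show the state and version Ambari has recorded for one host component
curl -u admin:admin "http://ambari-server:8080/api/v1/clusters/MYCLUSTER/hosts/test1.test.com/host_components/DATANODE?fields=HostRoles/state,HostRoles/version"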
04-27-2021
04:23 PM
1 Kudo
Here is an update. I was finally able to use Ambari to get Hive installed and the services started using the Apache Bigtop repo. I was also able to connect via the Hive CLI as well as Beeline (HiveServer2) and run a simple "show databases;", which completed successfully. After creating the following symlinks, HiveServer2 was finally able to start:
[root@test ~]# ll /usr/bgtp/current/
total 32
lrwxrwxrwx 1 root root 13 Apr 23 19:38 hive-client -> /usr/lib/hive
lrwxrwxrwx 1 root root 13 Apr 23 19:37 hive-metastore -> /usr/lib/hive
lrwxrwxrwx 1 root root 13 Apr 27 16:28 hive-server2 -> /usr/lib/hive
lrwxrwxrwx 1 root root 22 Apr 27 16:29 hive-webhcat -> /usr/lib/hive-hcatalog
I did not find any documentation on how to install Hive with the Apache Bigtop packages using Ambari; there was only some documentation on installing Hive from the Bigtop packages on the command line. If anyone finds documentation on how to install the different components of the Apache Bigtop package through Ambari, please let me know. Thanks,
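For anyone hitting the same issue, the symlinks in that listing can be recreated with something like the following (paths taken directly from the listing above; /usr/bgtp/current is specific to this Bigtop-on-Ambari layout):
# Recreate the component symlinks HiveServer2 startup was looking for
mkdir -p /usr/bgtp/current
ln -s /usr/lib/hive /usr/bgtp/current/hive-client
ln -s /usr/lib/hive /usr/bgtp/current/hive-metastore
ln -s /usr/lib/hive /usr/bgtp/current/hive-server2
ln -s /usr/lib/hive-hcatalog /usr/bgtp/current/hive-webhcat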
04-27-2021
02:23 PM
UPDATE: I tried to reinstall Hive again and commented out those packages as I had mentioned earlier in this thread. I then noticed the error at the bottom of the Ambari output:
resource_management.core.exceptions.Fail: Configuration parameter 'hiveserver2-site' was not found in configurations dictionary!
So I ran the command:
cp /var/lib/ambari-server/resources/stacks/BigInsights/4.2.5/services/HIVE/configuration/hiveserver2-site.xml /var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/configuration
restarted Ambari, and now I no longer get any errors in Ambari. Per Ambari, it is able to start all the Hive components successfully; however, HiveServer2 seems to crash shortly after Ambari starts it. All the other Hive components remain running.
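To find out why HiveServer2 dies after Ambari reports a successful start, checking the process and its log is the usual next step (the log location below is an assumption based on common defaults; check HIVE_LOG_DIR in hive-env for the actual directory):
# Is the HiveServer2 JVM still running a minute or two after the Ambari start?
ps -ef | grep -i [h]iveserver2
# Inspect the most recent HiveServer2 log for the crash stack trace (path is an assumed default)
tail -n 200 /var/log/hive/hiveserver2.log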
04-27-2021
11:22 AM
When I modify the file "/var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/metainfo.xml" to bypass the above errors, I am able to successfully install the different Hive components (Metastore, HiveServer2, and WebHCat). I am also able to start the Metastore successfully. I commented out the following lines in "/var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/metainfo.xml" to bypass the error:
<osSpecifics>
  <osSpecific>
    <osFamily>any</osFamily>
    <packages>
      <package>
        <name>hive</name>
      </package>
      <package>
        <name>hive-hcatalog</name>
      </package>
      <!--
      <package>
        <name>webhcat-tar-hive</name>
      </package>
      <package>
        <name>webhcat-tar-pig</name>
      </package>
      <package>
        <name>mysql-connector-java</name>
        <skipUpgrade>true</skipUpgrade>
        <condition>should_install_mysql_connector</condition>
      </package>
      -->
    </packages>
  </osSpecific>
  <osSpecific>
    <osFamily>amazon2015,redhat6,suse11,suse12</osFamily>
    <packages>
      <package>
        <name>mysql</name>
        <skipUpgrade>true</skipUpgrade>
      </package>
    </packages>
  </osSpecific>
However, I get the following errors when trying to start HiveServer2 and WebHCat:
2021-04-27 13:18:33,875 - WARNING. Cannot copy pig tarball because file does not exist: /usr/bgtp/1.0/pig/pig.tar.gz . It is possible that this component is not installed on this host.
2021-04-27 13:18:33,877 - WARNING. Cannot copy hive tarball because file does not exist: /usr/bgtp/1.0/hive/hive.tar.gz . It is possible that this component is not installed on this host.
2021-04-27 13:18:33,878 - WARNING. Cannot copy sqoop tarball because file does not exist: /usr/bgtp/1.0/sqoop/sqoop.tar.gz . It is possible that this component is not installed on this host.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 161, in <module>
HiveServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 77, in start
self.configure(env) # FOR SECURITY
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 51, in configure
hive(name='hiveserver2')
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 269, in hive
mode=0600)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 66, in action_create
encoding = self.resource.encoding
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
content = self._get_content()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
return content()
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 52, in __call__
return self.get_content()
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 144, in get_content
rendered = self.template.render(self.context)
File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 891, in render
return self.environment.handle_exception(exc_info, True)
File "<template>", line 2, in top-level template code
File "/usr/lib/python2.6/site-packages/ambari_jinja2/filters.py", line 176, in do_dictsort
return sorted(value.items(), key=sort_func)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'hiveserver2-site' was not found in configurations dictionary!
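The hive.py script renders its config templates from the configuration types Ambari holds for the cluster, so the Fail above suggests no 'hiveserver2-site' configuration exists yet. A quick way to check, sketched against the Ambari REST API with placeholder credentials and cluster name:
# List hiveserver2-site configuration versions known to the cluster (credentials and cluster name are placeholders)
curl -u admin:admin "http://ambari-server:8080/api/v1/clusters/MYCLUSTER/configurations?type=hiveserver2-site"
An empty items list would line up with the fix described in the 04-27-2021 02:23 PM update above, where a hiveserver2-site.xml definition is copied into the common-services HIVE configuration directory.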
04-27-2021
10:05 AM
@vidanimegh This is where I got the repos: https://bigtop.apache.org/download.html#releases Also, I have not found any documentation on how to deploy a Hadoop cluster from the Apache Bigtop package using Ambari. All I found was that the newest version adds an mpack, which makes it easier to deploy through Ambari. I have installed the mpack, but it only covers basic components such as HDFS, YARN, MapReduce, and ZooKeeper. I was able to add other components such as Tez or Sqoop successfully, but I am having issues with Hive.
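For reference, registering a management pack with Ambari generally looks like this; the mpack file name below is a placeholder, not the actual Bigtop artifact name:
# Install the management pack on the Ambari server host, then restart Ambari so it picks up the new stack definitions
ambari-server install-mpack --mpack=/tmp/bigtop-ambari-mpack.tar.gz --verbose
ambari-server restart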
04-27-2021
07:42 AM
@vidanimegh Thanks for the response. However, I was able to install other Hadoop components such as ZooKeeper, HDFS, YARN, and MapReduce using the Bigtop mpack via Ambari. Also, earlier Ambari was running "yum install hcatalog" and could not find that package, so I modified the file "/var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/metainfo.xml" so that it runs "yum install hive-hcatalog" instead. If I modify that file again to ignore these packages, I can successfully install the Hive Metastore and access the Hive database; however, I am having issues starting HiveServer2 and WebHCat, as they seem to be looking for those files.
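The package-name change described above boils down to editing the <package> entry in that metainfo.xml, roughly like this (a sketch of the relevant fragment only, based on the description above):
<packages>
  <!-- original entry asked yum for "hcatalog", which the Bigtop repo does not provide -->
  <!--
  <package>
    <name>hcatalog</name>
  </package>
  -->
  <!-- replacement points Ambari at the package name the Bigtop repo actually ships -->
  <package>
    <name>hive-hcatalog</name>
  </package>
</packages>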
04-27-2021
06:25 AM
@vidanimegh Thanks for the response. Please find the info below...
[root@test ~]# yum list webhcat-tar-hive*
Loaded plugins: fastestmirror, rhnplugin
This system is receiving updates from RHN Classic or Red Hat Satellite.
Loading mirror speeds from cached hostfile
* base: bay.uchicago.edu
* epel: dl.fedoraproject.org
* extras: mirror.mobap.edu
* updates: ftp.ussg.iu.edu
Error: No matching Packages to list
[root@test ~]# cat /etc/yum.repos.d/bigtop.repo
[bigtop]
name=Bigtop
enabled=1
gpgcheck=1
baseurl=http://repos.bigtop.apache.org/releases/1.5.0/centos/7/$basearch
gpgkey=https://dist.apache.org/repos/dist/release/bigtop/KEYS
[root@test ~]# cat /etc/yum.repos.d/BGTP.repo
[BGTP-1.0]
name=BGTP-1.0
baseurl=http://repos.bigtop.apache.org/releases/1.5.0/centos/7/x86_64/
path=/
enabled=1
gpgcheck=0
[root@test ~]#
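To double-check which Hive-related packages the Bigtop repo actually provides (and confirm that webhcat-tar-hive simply is not among them), a query limited to that repo can help; the repo id below matches the bigtop.repo shown above:
# List every hive* package available from the bigtop repo only
yum --disablerepo="*" --enablerepo="bigtop" list available "hive*"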
04-26-2021
05:24 PM
Hi experts, I am trying to install Hive using Ambari from the Bigtop repo I downloaded. Here is the error:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 79, in <module>
HCatClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 35, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 821, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 53, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 264, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install webhcat-tar-hive' returned 1. Error: Nothing to do
Any help is very much appreciated. Thanks,
04-26-2021
10:47 AM
OK, so I finally figured out how to have Ambari run the command "/usr/bin/yum -d 0 -e 0 -y install hive-hcatalog". I had to modify the file /var/lib/ambari-server/resources/common-services/HIVE/0.12.0.2.0/metainfo.xml under the <osSpecifics> tag. After restarting Ambari, it ran the correct command and Hive was installed.
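As a sanity check after restarting Ambari and re-running the install step, you can confirm the Bigtop package actually landed on the host (a generic check, not specific to this stack):
# Verify the hive-hcatalog RPM is now installed on the agent host
rpm -qa | grep -i hive-hcatalog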