Member since: 10-18-2017
Posts: 16
Kudos Received: 0
Solutions: 0
06-15-2018
08:12 AM
I was able to migrate data from Hive to SAP HANA via Spark. Thank you anyway.
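For anyone who finds this thread later, here is a minimal sketch of what such a Spark-based copy can look like, assuming the SAP HANA JDBC driver (ngdbc.jar) is made available to Spark (for example via spark-submit --jars) and that the host, port, table names, and credentials below are placeholders:

# Minimal PySpark sketch: copy a Hive table into SAP HANA over JDBC.
# All connection values are placeholders; adjust for your environment.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-to-hana")
         .enableHiveSupport()        # read Hive tables through the metastore
         .getOrCreate())

# Read the source Hive table
df = spark.table("default.my_hive_table")            # placeholder source table

# Write it to SAP HANA over JDBC
(df.write
   .format("jdbc")
   .option("url", "jdbc:sap://hana-host:30015")      # placeholder host/port
   .option("driver", "com.sap.db.jdbc.Driver")
   .option("dbtable", "MY_SCHEMA.MY_HANA_TABLE")     # placeholder target table
   .option("user", "HANA_USER")
   .option("password", "********")
   .mode("append")
   .save())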
06-13-2018
01:04 AM
I would like to migrate and load data from Hive tables into SAP HANA tables. However, I cannot find any information on the web about going from Hive to SAP HANA. If anyone has already done this, or has useful information, could you share it? FYI, I have already connected from SAP HANA to Hive with the Hadoop ODBC Adapter; now I would like to know how to do it in the reverse direction. Regards, Hiroshi Shidara
Labels:
- Apache Hive
05-23-2018
12:39 AM
@emaxwell Thank you very much for replying. Understood. I will prepare CentOS 7.4 and try the installation again. Regards,
05-22-2018
09:18 AM
I have tried to install the Spark client with Ambari. However, I ran into the errors below.

File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install spark2_2_6_4_0_91' returned 1.
Error: Package: glibc-headers-2.17-196.el7.x86_64 (centos74)
       Requires: glibc = 2.17-196.el7
       Installed: glibc-2.17-222.el7.x86_64 (installed)
           glibc = 2.17-222.el7
       Available: glibc-2.17-196.el7.x86_64 (centos74)
           glibc = 2.17-196.el7
Error: Package: glibc-devel-2.17-196.el7.x86_64 (centos74)
       Requires: glibc = 2.17-196.el7
       Installed: glibc-2.17-222.el7.x86_64 (installed)
           glibc = 2.17-222.el7
       Available: glibc-2.17-196.el7.x86_64 (centos74)
           glibc = 2.17-196.el7
Error: Package: pciutils-3.5.1-2.el7.x86_64 (centos74)
       Requires: pciutils-libs = 3.5.1-2.el7
       Installed: pciutils-libs-3.5.1-3.el7.x86_64 (installed)
           pciutils-libs = 3.5.1-3.el7
       Available: pciutils-libs-3.5.1-2.el7.x86_64 (centos74)
           pciutils-libs = 3.5.1-2.el7
Error: Package: libdb-devel-5.3.21-20.el7.x86_64 (centos74)
       Requires: libdb(x86-64) = 5.3.21-20.el7
       Installed: libdb-5.3.21-24.el7.x86_64 (installed)
           libdb(x86-64) = 5.3.21-24.el7
       Available: libdb-5.3.21-20.el7.x86_64 (centos74)
           libdb(x86-64) = 5.3.21-20.el7

I thought the errors were caused by version differences in glibc, pciutils-libs, and libdb, so I tried to downgrade those packages, but that was not possible. If you have any information, please share it. Regards,
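The conflicts above all follow the same pattern: the repository labelled centos74 offers CentOS 7.4 package builds (glibc 2.17-196.el7, pciutils-libs 3.5.1-2.el7, libdb 5.3.21-20.el7), while the host appears to already have the newer CentOS 7.5 builds installed. A quick way to confirm that mismatch before deciding whether to rebuild on CentOS 7.4 is to compare the OS release with the installed packages; a small diagnostic sketch (not part of Ambari; the package names are taken from the errors above):

# Hypothetical diagnostic: show the OS release, the installed builds of the
# conflicting packages, and the enabled yum repositories.
import subprocess

def run(cmd):
    """Run a shell command and return its output as text."""
    return subprocess.check_output(cmd, shell=True).decode().strip()

print(run("cat /etc/redhat-release"))          # OS release the host is actually running

for pkg in ("glibc", "pciutils-libs", "libdb"):
    print(run("rpm -q %s" % pkg))              # installed build of each conflicting package

print(run("yum repolist enabled"))             # the 'centos74' repository should appear here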
Labels:
- Apache Ambari
- Apache Spark
05-22-2018
01:41 AM
@Jay Kumar SenSharma Thank you for replying. I have solved this problem. The cause was that I had not installed Java, snappy, and some other packages. Thank you again for your reply. Regards,
05-21-2018
10:07 AM
I could not install some components. What is this error? If you have the answer, please share it. Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 32, in hook
setup_java()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 216, in setup_java
raise Fail(format("Unable to access {java_exec}. Confirm you have copied jdk to this host."))
resource_management.core.exceptions.Fail: Unable to access /usr/lib/jvm/java-1.8.0-openjdk/bin/java. Confirm you have copied jdk to this host.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3955.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3955.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
BeforeInstallHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
self.save_component_version_to_structured_out(self.command_name)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
stack_select_package_name = stack_select.get_package_name()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
supported_packages = get_supported_packages()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
stdout: /var/lib/ambari-agent/data/output-3955.txt
2018-05-21 09:44:33,525 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-21 09:44:33,530 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-21 09:44:33,531 - Group['livy'] {}
2018-05-21 09:44:33,533 - Adding group Group['livy']
2018-05-21 09:44:33,557 - Group['spark'] {}
2018-05-21 09:44:33,557 - Adding group Group['spark']
2018-05-21 09:44:33,573 - Group['hdfs'] {}
2018-05-21 09:44:33,573 - Adding group Group['hdfs']
2018-05-21 09:44:33,589 - Group['zeppelin'] {}
2018-05-21 09:44:33,589 - Adding group Group['zeppelin']
2018-05-21 09:44:33,604 - Group['hadoop'] {}
2018-05-21 09:44:33,605 - Adding group Group['hadoop']
2018-05-21 09:44:33,620 - Group['users'] {}
2018-05-21 09:44:33,621 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,621 - Adding user User['hive']
2018-05-21 09:44:33,655 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,655 - Adding user User['zookeeper']
2018-05-21 09:44:33,682 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,682 - Adding user User['ams']
2018-05-21 09:44:33,708 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-21 09:44:33,708 - Adding user User['tez']
2018-05-21 09:44:33,736 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-05-21 09:44:33,737 - Adding user User['zeppelin']
2018-05-21 09:44:33,764 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,764 - Adding user User['livy']
2018-05-21 09:44:33,790 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,790 - Adding user User['spark']
2018-05-21 09:44:33,818 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-21 09:44:33,818 - Adding user User['ambari-qa']
2018-05-21 09:44:33,846 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,846 - Adding user User['kafka']
2018-05-21 09:44:33,873 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-21 09:44:33,874 - Adding user User['hdfs']
2018-05-21 09:44:33,901 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,901 - Adding user User['sqoop']
2018-05-21 09:44:33,928 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,929 - Adding user User['yarn']
2018-05-21 09:44:33,956 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,956 - Adding user User['mapred']
2018-05-21 09:44:33,984 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,984 - Adding user User['hbase']
2018-05-21 09:44:34,011 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:34,011 - Adding user User['hcat']
2018-05-21 09:44:34,036 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,040 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2018-05-21 09:44:34,040 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2018-05-21 09:44:34,041 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-05-21 09:44:34,045 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-05-21 09:44:34,046 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-05-21 09:44:34,046 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
2018-05-21 09:44:34,046 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2018-05-21 09:44:34,047 - Changing permission for /tmp/hbase-hbase from 755 to 775
2018-05-21 09:44:34,047 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,048 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,049 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-05-21 09:44:34,056 - call returned (0, '1014')
2018-05-21 09:44:34,057 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-05-21 09:44:34,061 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2018-05-21 09:44:34,062 - Group['hdfs'] {}
2018-05-21 09:44:34,062 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-05-21 09:44:34,062 - FS Type:
2018-05-21 09:44:34,063 - Directory['/etc/hadoop'] {'mode': 0755}
2018-05-21 09:44:34,063 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2018-05-21 09:44:34,063 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-05-21 09:44:34,063 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2018-05-21 09:44:34,064 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2018-05-21 09:44:34,064 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2018-05-21 09:44:34,064 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3955.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3955.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
2018-05-21 09:44:34,082 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
Regards,
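The two failures buried in that output are that the before-ANY hook could not find /usr/lib/jvm/java-1.8.0-openjdk/bin/java on the host, and that /usr/bin/hdp-select could not be queried for supported packages. Before retrying the installation, it may help to verify both prerequisites on the affected host; a small check script (the paths are taken from the log above, nothing here is Ambari-specific):

# Quick pre-flight check for the two paths reported in the log above.
import os
import subprocess

checks = {
    "JDK binary": "/usr/lib/jvm/java-1.8.0-openjdk/bin/java",  # reported by the before-ANY hook
    "hdp-select": "/usr/bin/hdp-select",                       # reported by the before-INSTALL hook
}

for name, path in checks.items():
    if os.path.isfile(path) and os.access(path, os.X_OK):
        print("%s OK: %s" % (name, path))
    else:
        print("%s MISSING or not executable: %s" % (name, path))

# If the JDK binary is present, print its version as a sanity check
java = checks["JDK binary"]
if os.path.isfile(java):
    subprocess.call([java, "-version"])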
Tags:
- error
- Installation
11-01-2017
07:30 AM
@vperiasamy Thank you for the reply. I resolved the problem. A Ranger setting caused that error: Kerberos seems to query by host name, but the Ranger "External address" setting was set to an IP address. I changed the IP address to the host name in that Ranger setting. In the Ranger HA environment, the kerberized cluster now goes through the load balancer (Ranger HA).
10-26-2017
12:29 AM
I think that error is related to the YARN ResourceManager, and that the policies are invalid.
10-26-2017
12:27 AM
@vperiasamy Sorry, my English is not very good. In a Ranger Admin HA environment, I am trying to enable Kerberos. Enabling Kerberos fails at the "Start and Test Service" step of the wizard. The log is attached to the previous question.
10-25-2017
05:07 AM
I cannot refer to policies in a Ranger Admin HA and kerberized environment. First, I verified Ranger Admin HA. Second, I tried to enable Kerberos. The "Start and Test Service" step of the setup wizard failed, so I forcibly clicked "Complete". As a result, the ResourceManager and the policies are invalid. Please refer to the attachments. Regards,
Labels:
- Apache Ranger
10-25-2017
04:16 AM
@Jay SenSharma Thank you for the information and the document. I will refer to it.
10-25-2017
04:14 AM
@Robert Levas @Jay Sensharma Thank you for the information. We will try that approach.
10-24-2017
10:51 AM
I want to use high availability for Kerberos. If anyone has a method, please share the information. Regards,
Labels:
- Kerberos