HDP3.0 upgrade, Stuck in 'Move Hive Tables'.. /etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected...

New Contributor

Dear All,

I am in the process of upgrading from HDP 2.6 to HDP 3.0. The upgrade is now stuck at 80%, during the 'Move Hive Tables' task. Please find the logs below:

2019-02-04 20:12:46,631 - Action afix 'pre_actionexecute' not present
2019-02-04 20:12:46,632 - Task. Type: EXECUTE, Script: scripts/post_upgrade.py - Function: move_tables
2019-02-04 20:12:46,795 - Action afix 'pre_move_tables' not present
2019-02-04 20:12:46,811 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-02-04 20:12:46,832 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
hive-server2 - 3.0.1.0-187
2019-02-04 20:12:46,849 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2019-02-04 20:12:46,850 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187, Upgrade Direction=upgrade -> 3.0.1.0-187
2019-02-04 20:12:46,864 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://mydomain-ipaddress186.corp.mydomain.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2019-02-04 20:12:46,888 - Not downloading the file from http://mydomain-ipaddress186.corp.mydomain.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Feb 04, 2019 8:12:47 PM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hive123
2019-02-04 20:12:47,364 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2019-02-04 20:12:47,371 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpXtKxfn 2>/tmp/tmpXtGHAn''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,420 - call returned (0, '')
2019-02-04 20:12:47,420 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":16544,"group":"hdfs","length":0,"modificationTime":1532607674651,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,421 - HdfsResource['/warehouse/tablespace/external/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2019-02-04 20:12:47,421 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/warehouse/tablespace/external/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRR36mQ 2>/tmp/tmpzV0iz3''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,466 - call returned (0, '')
2019-02-04 20:12:47,466 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":109,"fileId":2001714,"group":"hadoop","length":0,"modificationTime":1549310578051,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,467 - Skipping the operation for not managed DFS directory /warehouse/tablespace/external/hive since immutable_paths contains it.
2019-02-04 20:12:47,467 - HdfsResource['/warehouse/tablespace/managed/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0700}
2019-02-04 20:12:47,468 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/warehouse/tablespace/managed/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpgyPHg6 2>/tmp/tmp47sejB''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,513 - call returned (0, '')
2019-02-04 20:12:47,513 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":109,"fileId":2001716,"group":"hadoop","length":0,"modificationTime":1549310578036,"owner":"hive","pathSuffix":"","permission":"700","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,513 - Skipping the operation for not managed DFS directory /warehouse/tablespace/managed/hive since immutable_paths contains it.
2019-02-04 20:12:47,514 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hdfs getconf -confKey dfs.namenode.acls.enabled 1>/tmp/tmp11ZZm6 2>/tmp/tmprWKuLC''] {'quiet': False}
2019-02-04 20:12:48,092 - call returned (0, '')
2019-02-04 20:12:48,093 - get_user_call_output returned (0, u'true', u'')
2019-02-04 20:12:48,093 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hdfs getconf -confKey dfs.namenode.posix.acl.inheritance.enabled 1>/tmp/tmpvtqs28 2>/tmp/tmpfzETLf''] {'quiet': False}
2019-02-04 20:12:48,658 - call returned (0, '')
2019-02-04 20:12:48,658 - get_user_call_output returned (0, u'true', u'')
2019-02-04 20:12:48,659 - Execute['hdfs dfs -setfacl -m default:user:hive:rwx /warehouse/tablespace/external/hive'] {'user': 'hdfs'}
2019-02-04 20:12:50,176 - Execute['hdfs dfs -setfacl -m default:user:hive:rwx /warehouse/tablespace/managed/hive'] {'user': 'hdfs'}
2019-02-04 20:12:51,692 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2019-02-04 20:12:51,693 - Execute['/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse'] {'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'user': 'hdfs'}
/etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/post_upgrade.py", line 52, in <module>
    HivePostUpgrade().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/post_upgrade.py", line 49, in move_tables
    user = params.hdfs_user)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse' returned 255. /etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Command failed after 1 tries

Please help.

Regards,

Giridharan C

2 REPLIES

Re: HDP3.0 upgrade, Stuck in 'Move Hive Tables'.. /etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected...

Contributor

@Giridharan C - This is a bug in HDP 3.0 that was fixed in the HDP 3.0.1 / Ambari 2.7.1 release.

To work around the problem, please try the following:

Open /etc/hive/conf/hive-env.sh on the Hive Metastore host and change the following line

if [ $HADOOP_OPTS != *-Dhive.log.file* ]; then 

TO

if [[ "$HADOOP_OPTS" != *-Dhive.log.file* ]]; then

and retry the failed task.
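
For background on why this fixes the error (my explanation, not from the original thread): the single-bracket [ test word-splits the unquoted $HADOOP_OPTS, so when the variable is empty it expands to nothing and [ is left with "!=" where an operand should be, which is exactly the "unary operator expected" message in the log above. [ also only compares strings literally, so it could never perform the intended wildcard match anyway. The bash [[ ]] form does not word-split the quoted variable and treats the right-hand side as a glob pattern. A minimal sketch you can run in any bash shell to reproduce both behaviors:

HADOOP_OPTS=""
# Fails: the empty, unquoted variable expands to nothing, leaving '['
# with "!=" as its first operand -> "[: !=: unary operator expected"
if [ $HADOOP_OPTS != *-Dhive.log.file* ]; then echo "hive.log.file not set"; fi

# Works: [[ ]] keeps "$HADOOP_OPTS" as a single word even when empty,
# and matches the unquoted right-hand side as a glob pattern
if [[ "$HADOOP_OPTS" != *-Dhive.log.file* ]]; then echo "hive.log.file not set"; fi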

Re: HDP3.0 upgrade, Stuck in 'Move Hive Tables'.. /etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected...

New Contributor
@vsharma

- After changing the above statement in hive-env.sh, I reran the upgrade and am getting the error below:

resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service  strictmanagedmigration --hiveconf hive.strict.managed.tables=true  -m automatic  --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse' returned 255. ====================
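
Since Ambari only surfaces the exit code 255 here, one way to see the underlying Hive error is to rerun the same migration command by hand as the hdfs user and capture the full output (a sketch based on the command and JAVA_HOME shown in the upgrade log above; the output file path is arbitrary):

# Rerun the failing migration step directly, saving all output for review
sudo -u hdfs env JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  /usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf \
  --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true \
  -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse \
  2>&1 | tee /tmp/strictmanagedmigration_manual.log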
