Member since: 04-16-2018
Posts: 17
Kudos Received: 1
Solutions: 0
06-12-2019
04:15 PM
In my case, tez.am.container.reuse.enabled = true. I ran a simple SELECT * statement as the readonly user, but the Tez session is still in the RUNNING state for that user:

application_1560323362594_1913 HIVE-9c2f473e-a685-4233-a522-394933d2905d TEZ readonly default RUNNING UNDEFINED 0% http://mydomain:37441/ui/

Can anyone tell me what the issue is, and how to configure Tez so it uses resources correctly? Please don't just share a document, I am tired of those. Please post a practical solution. Thanks...
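One hedged note on the behavior: with HiveServer2 Tez sessions, tez.am.container.reuse.enabled keeps containers warm inside a session, and the session's ApplicationMaster stays RUNNING until an idle timeout fires, so seeing the application RUNNING after a finished query is expected rather than a leak. A minimal sketch of the levers involved, assuming HDP defaults; the timeout values below are illustrative, not recommendations:

# tez-site (via Ambari): how long an idle session AM waits for the next DAG
#   tez.session.am.dag.submit.timeout.secs=300
# hive-site (via Ambari): close idle HiveServer2 client sessions (milliseconds)
#   hive.server2.idle.session.timeout=1800000
# To reclaim the resources immediately, kill the lingering application by hand:
yarn application -kill application_1560323362594_1913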
04-05-2019
02:45 PM
Is there anyone available to discuss this question, or am I posting it in the wrong place? Hello...?
03-29-2019
10:41 AM
Hello all, please find the resource utilization of my cluster below. Am I using my hardware resources properly? Please guide me.
Labels:
- Apache YARN
- Cloudera Manager
02-26-2019
02:12 PM
Hello @subhash parise, I am still facing an issue connecting to the Hive warehouse. Please find the error below:

>>> from pyspark_llap import HiveWarehouseSession
>>> hive = HiveWarehouseSession.session(spark).build()
>>> hive.showDatabases().show(100)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tmp/spark-ac3ac8d8-5b24-4339-9ef6-e6eed46932cf/userFiles-29c83c49-13b6-4e28-b902-5a6cfbdf7ada/pyspark_hwc-1.0.0.3.0.0.0-1634.zip/pyspark_llap/sql/session.py", line 127, in showDatabases
  File "/usr/local/lib/python3.7/site-packages/pyspark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/local/lib/python3.7/site-packages/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/usr/local/lib/python3.7/site-packages/pyspark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o36.showDatabases.
: java.lang.RuntimeException: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [anonymous] does not have [USE] privilege on [default]
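A quick way to isolate the problem is to hit HiveServer2 with the same identity directly, outside Spark, and see whether Ranger still reports [anonymous]. A minimal sketch; the host, port, and user are placeholders for your cluster:

beeline -u "jdbc:hive2://hs2-host:10000/default;user=hive" -e "show databases;"

If this works but the HWC call still fails as [anonymous], the user is not being passed through the connector's JDBC URL; see the launch sketch under the original post below.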
02-26-2019
01:39 PM
After restarting ambari-agent, it is working fine.
02-22-2019
10:32 PM
I am also facing the same issue on HDP 3. Has it been resolved?
02-20-2019
09:42 PM
I recently upgraded to HDP 3. When I apply stale configs, the restart fails on one of the master servers (the one running the NameNode); applying stale configs works fine on all other nodes. Whatever services are running on this host fail. I have even restarted the Ambari server and the Ambari agent, but no luck. Please help. Thanks in advance.
02-20-2019
12:12 PM
1 Kudo
Environment: HDP 3.0.1, Ranger enabled (HDFS, YARN, Hive, HBase), Hive 3.1, Spark 2.3.1, non-Kerberized cluster, AWS EC2 instances (authentication using an SSH private key).

I followed the URL below and added the custom configuration to connect to Hive from Spark:
https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/integrating-hive/content/hive_configure_a_spark_hive_connection.html

We started pyspark with the LLAP jar and zip file:

pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.0.0.0-1634.zip

Then I created a HiveWarehouseSession:

from pyspark_llap import HiveWarehouseSession
hive = HiveWarehouseSession.session(spark).build()

After creating the session, I ran the commands below to check the HiveServer2 connection:

hive.showDatabases()
hive.describeTable("emp")

and got the error below (this trace is from describeTable):
Traceback (most recent call last):
File "", line 1, in
File "/tmp/spark-b1374fa6-2eb8-4363-bbf6-4199925667fc/userFiles-874da928-a8b6-47bd-a127-7ca65d5ff096/pyspark_hwc-1.0.0.3.0.0.0-1634.zip/pyspark_llap/sql/session.py", line 135, in describeTable
File "/usr/hdp/3.0.0.0-1634/spark2/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
File "/usr/hdp/3.0.0.0-1634/spark2/python/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/usr/hdp/3.0.0.0-1634/spark2/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o70.describeTable.
: java.lang.RuntimeException: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [anonymous] does not have [USE] privilege on [test]
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.execute(HiveWarehouseSessionImpl.java:70)
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.describeTable(HiveWarehouseSessionImpl.java:123)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
Caused by: shadehive.org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [anonymous] does not have [USE] privilege on [test]
at shadehive.org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:300)
at shadehive.org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:286)
at shadehive.org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:324)
at shadehive.org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:265)
at shadehive.org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:101)
at org.apache.commons.dbcp2.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:198)
at org.apache.commons.dbcp2.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:198)
at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.useDatabase(HS2JDBCWrapper.scala:215)
at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.executeStmt(HS2JDBCWrapper.scala:163)
at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.executeStmt(HS2JDBCWrapper.scala)
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$1(HiveWarehouseSessionImpl.java:50)
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.execute(HiveWarehouseSessionImpl.java:67)
... 12 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [anonymous] does not have [USE] privilege on [test]
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:199)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:260)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:247)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:541)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:527)
at sun.reflect.GeneratedMethodAccessor58.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy66.executeStatementAsync(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:312)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:562)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [anonymous] does not have [USE] privilege on [test]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:483)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:1306)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:1070)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:697)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1830)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1777)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1772)
at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
... 26 more

I have cross-checked all the user privileges and resource access in Ranger, and everything looks good. Please check the above issue and help us resolve it. Thanks in advance.

Regards,
Giridharan C
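Given the non-Kerberized setup, the [anonymous] in the Ranger error suggests the connector's JDBC URL carries no identity, so HiveServer2 evaluates Ranger policies for the anonymous user rather than the real one. A minimal sketch of a launch with a user embedded in the URL; hs2-host, metastore-host, and the credentials are placeholders, not values from this cluster:

pyspark \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar \
  --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.0.0.0-1634.zip \
  --conf "spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://hs2-host:10000/default;user=hive;password=hive" \
  --conf "spark.datasource.hive.warehouse.metastoreUri=thrift://metastore-host:9083"

Alternatively, a Ranger policy granting the anonymous user USE on the database would unblock the call, at the cost of losing per-user auditing.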
02-12-2019
01:12 PM
@subhash parise - Thanks for the answer. As mentioned in the URL https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/integrating-hive/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html I added the custom spark2-defaults properties and reran the jobs. The job succeeded, but I still don't see any of the newly created databases when I run show databases from hive-cli.
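For reference, the custom spark2-defaults properties that doc describes look roughly like the sketch below; every host name and the staging path are placeholders, and the real values have to come from your own hive-site and HiveServer2:

spark.sql.hive.hiveserver2.jdbc.url               jdbc:hive2://hs2-host:10000/
spark.datasource.hive.warehouse.metastoreUri      thrift://metastore-host:9083
spark.datasource.hive.warehouse.load.staging.dir  /tmp
spark.hadoop.hive.zookeeper.quorum                zk1:2181,zk2:2181,zk3:2181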
02-12-2019
11:24 AM
On HDP 3, I am ingesting a database and copying files using PySpark. After the job completes, the Hive DB is not listed in hive-cli. Checking the properties in Ambari, I learned there is a Spark warehouse directory where all the tables get created, and it is not listed under the Hive databases:

spark.sql.warehouse.dir=/apps/spark/warehouse

So I changed the property above to point at the Hive warehouse:

spark.sql.warehouse.dir=/warehouse/tablespace/managed/hive

After changing the Spark warehouse property I reran the Spark job and tried show databases from hive-cli again; no luck. When I check the HDFS location /warehouse/tablespace/managed/hive I can see all the DB names created as .db directories. How do I resolve this issue? Please help.
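The likely explanation, offered as a hedged note: in HDP 3, Spark and Hive maintain separate catalogs, so databases and tables created through plain spark.sql() or saveAsTable() land in the Spark catalog no matter where spark.sql.warehouse.dir points, and hive-cli only lists the Hive catalog. For objects to be visible from Hive, writes have to go through the Hive Warehouse Connector. A minimal sketch inside a pyspark shell started with the HWC jar and zip (as in the posts above); the database and table names are examples:

from pyspark_llap import HiveWarehouseSession
hive = HiveWarehouseSession.session(spark).build()
hive.createDatabase("ingest_db", True)  # True = IF NOT EXISTS; lands in the Hive catalog
df = spark.range(10)
df.write.format("com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector") \
    .option("table", "ingest_db.demo") \
    .save()

Afterwards, show databases from hive-cli or beeline should list ingest_db.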
02-07-2019
02:48 PM
@vsharma - After changing that statement in hive-env.sh, I reran the upgrade and got the error below:

resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse' returned 255.
====================
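To surface the underlying failure, the same migration command (verbatim from the error above) can be rerun by hand as the hdfs user, with the JAVA_HOME the upgrade used (taken from the log in the post below), so the full console output is visible:

sudo -u hdfs env JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  /usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf \
  --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true \
  -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse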
02-05-2019
12:21 PM
Dear all, I am in the process of upgrading from HDP 2.6 to 3.0. I am now stuck at 80%, during the "Move Hive Tables" step. Please find the logs below:

2019-02-04 20:12:46,631 - Action afix 'pre_actionexecute' not present
2019-02-04 20:12:46,632 - Task. Type: EXECUTE, Script: scripts/post_upgrade.py - Function: move_tables
2019-02-04 20:12:46,795 - Action afix 'pre_move_tables' not present
2019-02-04 20:12:46,811 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-02-04 20:12:46,832 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
hive-server2 - 3.0.1.0-187
2019-02-04 20:12:46,849 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2019-02-04 20:12:46,850 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187, Upgrade Direction=upgrade -> 3.0.1.0-187
2019-02-04 20:12:46,864 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://mydomain-ipaddress186.corp.mydomain.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2019-02-04 20:12:46,888 - Not downloading the file from http://mydomain-ipaddress186.corp.mydomain.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Feb 04, 2019 8:12:47 PM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hive123
2019-02-04 20:12:47,364 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2019-02-04 20:12:47,371 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpXtKxfn 2>/tmp/tmpXtGHAn''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,420 - call returned (0, '')
2019-02-04 20:12:47,420 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":16544,"group":"hdfs","length":0,"modificationTime":1532607674651,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,421 - HdfsResource['/warehouse/tablespace/external/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2019-02-04 20:12:47,421 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/warehouse/tablespace/external/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRR36mQ 2>/tmp/tmpzV0iz3''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,466 - call returned (0, '')
2019-02-04 20:12:47,466 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":109,"fileId":2001714,"group":"hadoop","length":0,"modificationTime":1549310578051,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,467 - Skipping the operation for not managed DFS directory /warehouse/tablespace/external/hive since immutable_paths contains it.
2019-02-04 20:12:47,467 - HdfsResource['/warehouse/tablespace/managed/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0700}
2019-02-04 20:12:47,468 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://mydomain-ipaddress138.corp.mydomain.com:50070/webhdfs/v1/warehouse/tablespace/managed/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpgyPHg6 2>/tmp/tmp47sejB''] {'logoutput': None, 'quiet': False}
2019-02-04 20:12:47,513 - call returned (0, '')
2019-02-04 20:12:47,513 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":109,"fileId":2001716,"group":"hadoop","length":0,"modificationTime":1549310578036,"owner":"hive","pathSuffix":"","permission":"700","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-02-04 20:12:47,513 - Skipping the operation for not managed DFS directory /warehouse/tablespace/managed/hive since immutable_paths contains it.
2019-02-04 20:12:47,514 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hdfs getconf -confKey dfs.namenode.acls.enabled 1>/tmp/tmp11ZZm6 2>/tmp/tmprWKuLC''] {'quiet': False}
2019-02-04 20:12:48,092 - call returned (0, '')
2019-02-04 20:12:48,093 - get_user_call_output returned (0, u'true', u'')
2019-02-04 20:12:48,093 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hdfs getconf -confKey dfs.namenode.posix.acl.inheritance.enabled 1>/tmp/tmpvtqs28 2>/tmp/tmpfzETLf''] {'quiet': False}
2019-02-04 20:12:48,658 - call returned (0, '')
2019-02-04 20:12:48,658 - get_user_call_output returned (0, u'true', u'')
2019-02-04 20:12:48,659 - Execute['hdfs dfs -setfacl -m default:user:hive:rwx /warehouse/tablespace/external/hive'] {'user': 'hdfs'}
2019-02-04 20:12:50,176 - Execute['hdfs dfs -setfacl -m default:user:hive:rwx /warehouse/tablespace/managed/hive'] {'user': 'hdfs'}
2019-02-04 20:12:51,692 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mydomain-ipaddress138.corp.mydomain.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2019-02-04 20:12:51,693 - Execute['/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse'] {'environment': {'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112'}, 'user': 'hdfs'}
/etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scrmydomaints/post_upgrade.py", line 52, in <module>
HivePostUpgrade().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/scrmydomaint/scrmydomaint.py", line 353, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scrmydomaints/post_upgrade.py", line 49, in move_tables
user = params.hdfs_user)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/3.0.1.0-187/hive/bin/hive --config /etc/hive/conf --service strictmanagedmigration --hiveconf hive.strict.managed.tables=true -m automatic --modifyManagedTables --oldWarehouseRoot /apps/hive/warehouse' returned 255. /etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Command failed after 1 tries

Please help.

Regards,
Giridharan C
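The recurring "/etc/hive/conf/hive-env.sh: line 46: [: !=: unary operator expected" points at an unquoted variable inside a shell test in hive-env.sh: when the variable expands to nothing, the test collapses to [ != "" ] and the bracket built-in fails. A minimal sketch of the usual repair; the variable name is an assumption, since line 46 itself isn't shown here:

# Before (breaks when the variable is empty; variable name is hypothetical):
#   if [ $HIVE_AUX_JARS_PATH != "" ]; then
# After (double quotes keep the test well-formed even when empty):
if [ "${HIVE_AUX_JARS_PATH}" != "" ]; then
  # ... original body of the conditional, unchanged ...
fi

Since hive-env.sh is generated by Ambari, the fix belongs in the hive-env template (Hive > Configs > Advanced hive-env) so it survives restarts; the strictmanagedmigration step can then be retried.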
Labels:
- Hortonworks Data Platform (HDP)