<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Fresh Hive Service Installation Failing To Start in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/296572#M218294</link>
    <description>&lt;P&gt;Hi Team,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Ambari: 2.7.3&lt;/P&gt;
&lt;P&gt;HDP: 3.0&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am trying to install the Hive service against an existing PostgreSQL database, but it is failing with the ERROR and exception below.&lt;/P&gt;
&lt;P&gt;Can anyone advise me on where I went wrong?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Created user 'hive' with password 'admin'&lt;/P&gt;
&lt;P&gt;Created database 'hive'&lt;/P&gt;
&lt;P&gt;GRANT ALL PRIVILEGES ON DATABASE hive TO hive;&lt;/P&gt;
&lt;P&gt;Downloaded the JDBC driver jar and ran setup:&lt;/P&gt;
&lt;P&gt;ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar&lt;/P&gt;
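&lt;P&gt;For reference, the database preparation steps above can be sketched as a single psql session (a sketch only; it uses the user, password, and database names stated above and assumes superuser access, e.g. via sudo -u postgres psql):&lt;/P&gt;

```sql
-- Sketch of the metastore database preparation described above
-- (names and password taken from the post, not verified against the cluster).
CREATE USER hive WITH PASSWORD 'admin';          -- user 'hive', password 'admin'
CREATE DATABASE hive OWNER hive;                 -- database 'hive'
GRANT ALL PRIVILEGES ON DATABASE hive TO hive;   -- the grant from above
```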
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Below is the ERROR message:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Traceback (most recent call last):&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 211, in &amp;lt;module&amp;gt;&lt;BR /&gt;HiveMetastore().execute()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute&lt;BR /&gt;method(env)&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 61, in start&lt;BR /&gt;create_metastore_schema() # execute without config lock&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 378, in create_metastore_schema&lt;BR /&gt;user = params.hive_user&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__&lt;BR /&gt;self.env.run()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run&lt;BR /&gt;self.run_action(resource, action)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action&lt;BR /&gt;provider_action()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run&lt;BR /&gt;returns=self.resource.returns)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner&lt;BR /&gt;result = function(command, **kwargs)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call&lt;BR /&gt;tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper&lt;BR /&gt;result = _call(command, **kwargs_copy)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call&lt;BR /&gt;raise ExecutionFailed(err_msg, code, out, err)&lt;BR 
/&gt;resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType postgres -userName hive -passWord [PROTECTED] -verbose' returned 1. SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.1175-1/hive2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.1175-1/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank" rel="noopener"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]&lt;BR /&gt;Metastore connection URL: jdbc:postgresql://mastern1.bms.com:5432/hive&lt;BR /&gt;Metastore Connection Driver : org.postgresql.Driver&lt;BR /&gt;Metastore connection User: hive&lt;BR /&gt;org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.&lt;BR /&gt;Underlying cause: org.postgresql.util.PSQLException : FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;SQL Error code: 0&lt;BR /&gt;org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:80)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.getConnectionToMetastore(HiveSchemaTool.java:133)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.testConnectionToMetastore(HiveSchemaTool.java:187)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:291)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:277)&lt;BR /&gt;at 
org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:526)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.run(RunJar.java:233)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:148)&lt;BR /&gt;Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:525)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:146)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:197)&lt;BR /&gt;at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)&lt;BR /&gt;at org.postgresql.jdbc.PgConnection.&amp;lt;init&amp;gt;(PgConnection.java:211)&lt;BR /&gt;at org.postgresql.Driver.makeConnection(Driver.java:459)&lt;BR /&gt;at org.postgresql.Driver.connect(Driver.java:261)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:664)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:247)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:76)&lt;BR /&gt;... 
11 more&lt;BR /&gt;Suppressed: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:525)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:146)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:206)&lt;BR /&gt;... 18 more&lt;BR /&gt;*** schemaTool failed ***&lt;BR /&gt;stdout: /var/lib/ambari-agent/data/output-150.txt&lt;BR /&gt;2020-05-25 23:30:45,946 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:46,079 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:47,387 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:47,452 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:47,456 - Group['hdfs'] {}&lt;BR /&gt;2020-05-25 23:30:47,459 - Group['hadoop'] {}&lt;BR /&gt;2020-05-25 23:30:47,460 - Group['users'] {}&lt;BR /&gt;2020-05-25 23:30:47,533 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,536 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,540 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,542 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,544 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,572 - 
User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,574 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,576 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,579 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,581 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2020-05-25 23:30:47,632 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}&lt;BR /&gt;2020-05-25 23:30:47,742 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if&lt;BR /&gt;2020-05-25 23:30:47,743 - Group['hdfs'] {}&lt;BR /&gt;2020-05-25 23:30:47,744 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}&lt;BR /&gt;2020-05-25 23:30:47,746 - FS Type: HDFS&lt;BR /&gt;2020-05-25 23:30:47,747 - Directory['/etc/hadoop'] {'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:47,890 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:47,891 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}&lt;BR /&gt;2020-05-25 23:30:48,009 - Execute[('setenforce', '0')] {'not_if': '(! 
which getenforce ) || (which getenforce &amp;amp;&amp;amp; getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}&lt;BR /&gt;2020-05-25 23:30:48,149 - Skipping Execute[('setenforce', '0')] due to not_if&lt;BR /&gt;2020-05-25 23:30:48,150 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,153 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,153 - Changing owner for /var/run/hadoop from 1004 to root&lt;BR /&gt;2020-05-25 23:30:48,154 - Changing group for /var/run/hadoop from 1002 to root&lt;BR /&gt;2020-05-25 23:30:48,154 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,155 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,187 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}&lt;BR /&gt;2020-05-25 23:30:48,191 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}&lt;BR /&gt;2020-05-25 23:30:48,259 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:48,335 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:48,336 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:48,339 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:48,367 - 
File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:48,396 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:48,423 - Skipping unlimited key JCE policy check and setup since the Java VM is not managed by Ambari&lt;BR /&gt;2020-05-25 23:30:49,053 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:49,098 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}&lt;BR /&gt;2020-05-25 23:30:49,171 - call returned (0, 'hive-server2 - 2.6.5.1175-1')&lt;BR /&gt;2020-05-25 23:30:49,173 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:49,305 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('&lt;A href="http://mastern1.bms.com:8080/resources/CredentialUtil.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/CredentialUtil.jar&lt;/A&gt;'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:49,307 - Not downloading the file from &lt;A href="http://mastern1.bms.com:8080/resources/CredentialUtil.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/CredentialUtil.jar&lt;/A&gt;, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists&lt;BR /&gt;2020-05-25 23:30:51,924 - Directory['/etc/hive'] {'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:51,924 - Directories to fill with configs: [u'/usr/hdp/current/hive-metastore/conf', u'/usr/hdp/current/hive-metastore/conf/conf.server']&lt;BR /&gt;2020-05-25 23:30:51,925 - Directory['/etc/hive/2.6.5.1175-1/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}&lt;BR /&gt;2020-05-25 
23:30:51,926 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.5.1175-1/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:51,959 - Generating config: /etc/hive/2.6.5.1175-1/0/mapred-site.xml&lt;BR /&gt;2020-05-25 23:30:51,960 - File['/etc/hive/2.6.5.1175-1/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,086 - File['/etc/hive/2.6.5.1175-1/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,086 - File['/etc/hive/2.6.5.1175-1/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,090 - File['/etc/hive/2.6.5.1175-1/0/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,096 - File['/etc/hive/2.6.5.1175-1/0/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,098 - File['/etc/hive/2.6.5.1175-1/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,099 - Directory['/etc/hive/2.6.5.1175-1/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}&lt;BR /&gt;2020-05-25 23:30:52,100 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.5.1175-1/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:52,126 - Generating config: /etc/hive/2.6.5.1175-1/0/conf.server/mapred-site.xml&lt;BR /&gt;2020-05-25 23:30:52,127 - File['/etc/hive/2.6.5.1175-1/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR 
/&gt;2020-05-25 23:30:52,289 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,289 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,293 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,304 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,305 - File['/etc/hive/2.6.5.1175-1/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,306 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}&lt;BR /&gt;2020-05-25 23:30:52,307 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] because contents don't match&lt;BR /&gt;2020-05-25 23:30:52,308 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:52,326 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml&lt;BR /&gt;2020-05-25 23:30:52,328 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,698 - 
File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,700 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}&lt;BR /&gt;2020-05-25 23:30:52,710 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,711 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('&lt;A href="http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar&lt;/A&gt;'), 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,711 - Not downloading the file from &lt;A href="http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar&lt;/A&gt;, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists&lt;BR /&gt;2020-05-25 23:30:52,712 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,713 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,714 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,715 - XmlConfig['hivemetastore-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {u'hive.metastore.metrics.enabled': u'true', u'hive.server2.metrics.enabled': u'true', u'hive.service.metrics.hadoop2.component': u'hivemetastore', 
u'hive.service.metrics.reporter': u'HADOOP2'}}&lt;BR /&gt;2020-05-25 23:30:52,750 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml&lt;BR /&gt;2020-05-25 23:30:52,757 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,792 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties'] {'content': Template('hadoop-metrics2-hivemetastore.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,793 - File['/var/lib/ambari-agent/tmp/start_metastore_script'] {'content': StaticFile('startMetastore.sh'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:52,796 - Directory['/tmp/hive'] {'owner': 'hive', 'create_parents': True, 'mode': 0777}&lt;BR /&gt;2020-05-25 23:30:52,797 - Execute['export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType postgres -userName hive -passWord [PROTECTED] -verbose'] {'not_if': u"ambari-sudo.sh su hive -l -s /bin/bash -c 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -info -dbType postgres -userName hive -passWord [PROTECTED] -verbose'", 'user': 'hive'}&lt;/P&gt;
&lt;P&gt;Command failed after 1 tries&lt;/P&gt;
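&lt;P&gt;The "FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off" line above means PostgreSQL's host-based authentication is rejecting the connection from the metastore host. A hedged sketch of the kind of line that typically has to be added to pg_hba.conf (the address 192.168.0.109 is taken from the error itself; the /32 mask and md5 method are assumptions, not confirmed settings):&lt;/P&gt;

```
# pg_hba.conf sketch: allow user 'hive' to reach database 'hive' from the metastore host.
# TYPE  DATABASE  USER  ADDRESS            METHOD
host    hive      hive  192.168.0.109/32   md5
```

&lt;P&gt;PostgreSQL reads pg_hba.conf only at startup or reload, so the server must be reloaded after editing it, and listen_addresses in postgresql.conf must also permit remote connections.&lt;/P&gt;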
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks in advance!&lt;/P&gt;</description>
    <pubDate>Tue, 26 May 2020 06:29:44 GMT</pubDate>
    <dc:creator>shrikant_bm</dc:creator>
    <dc:date>2020-05-26T06:29:44Z</dc:date>
    <item>
      <title>Fresh Hive Service Installation Failing To Start</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/296572#M218294</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Ambari: 2.7.3&lt;/P&gt;
&lt;P&gt;HDP: 3.0&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am trying to install the Hive service against an existing PostgreSQL database, but it is failing with the ERROR and exception below.&lt;/P&gt;
&lt;P&gt;Can anyone advise me on where I went wrong?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Created user 'hive' with password 'admin'&lt;/P&gt;
&lt;P&gt;Created database 'hive'&lt;/P&gt;
&lt;P&gt;GRANT ALL PRIVILEGES ON DATABASE hive TO hive;&lt;/P&gt;
&lt;P&gt;Downloaded the JDBC driver jar and ran setup:&lt;/P&gt;
&lt;P&gt;ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Below is the ERROR message:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Traceback (most recent call last):&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 211, in &amp;lt;module&amp;gt;&lt;BR /&gt;HiveMetastore().execute()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute&lt;BR /&gt;method(env)&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 61, in start&lt;BR /&gt;create_metastore_schema() # execute without config lock&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 378, in create_metastore_schema&lt;BR /&gt;user = params.hive_user&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__&lt;BR /&gt;self.env.run()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run&lt;BR /&gt;self.run_action(resource, action)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action&lt;BR /&gt;provider_action()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run&lt;BR /&gt;returns=self.resource.returns)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner&lt;BR /&gt;result = function(command, **kwargs)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call&lt;BR /&gt;tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper&lt;BR /&gt;result = _call(command, **kwargs_copy)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call&lt;BR /&gt;raise ExecutionFailed(err_msg, code, out, err)&lt;BR 
/&gt;resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType postgres -userName hive -passWord [PROTECTED] -verbose' returned 1. SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.1175-1/hive2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.1175-1/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank" rel="noopener"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]&lt;BR /&gt;Metastore connection URL: jdbc:postgresql://mastern1.bms.com:5432/hive&lt;BR /&gt;Metastore Connection Driver : org.postgresql.Driver&lt;BR /&gt;Metastore connection User: hive&lt;BR /&gt;org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.&lt;BR /&gt;Underlying cause: org.postgresql.util.PSQLException : FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;SQL Error code: 0&lt;BR /&gt;org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:80)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.getConnectionToMetastore(HiveSchemaTool.java:133)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.testConnectionToMetastore(HiveSchemaTool.java:187)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:291)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:277)&lt;BR /&gt;at 
org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:526)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.run(RunJar.java:233)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:148)&lt;BR /&gt;Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:525)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:146)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:197)&lt;BR /&gt;at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)&lt;BR /&gt;at org.postgresql.jdbc.PgConnection.&amp;lt;init&amp;gt;(PgConnection.java:211)&lt;BR /&gt;at org.postgresql.Driver.makeConnection(Driver.java:459)&lt;BR /&gt;at org.postgresql.Driver.connect(Driver.java:261)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:664)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:247)&lt;BR /&gt;at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:76)&lt;BR /&gt;... 
11 more&lt;BR /&gt;Suppressed: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive", SSL off&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:525)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:146)&lt;BR /&gt;at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:206)&lt;BR /&gt;... 18 more&lt;BR /&gt;*** schemaTool failed ***&lt;BR /&gt;stdout: /var/lib/ambari-agent/data/output-150.txt&lt;BR /&gt;2020-05-25 23:30:45,946 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:46,079 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:47,387 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:47,452 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:47,456 - Group['hdfs'] {}&lt;BR /&gt;2020-05-25 23:30:47,459 - Group['hadoop'] {}&lt;BR /&gt;2020-05-25 23:30:47,460 - Group['users'] {}&lt;BR /&gt;2020-05-25 23:30:47,533 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,536 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,540 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,542 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,544 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,572 - 
User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,574 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,576 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,579 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2020-05-25 23:30:47,581 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2020-05-25 23:30:47,632 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}&lt;BR /&gt;2020-05-25 23:30:47,742 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if&lt;BR /&gt;2020-05-25 23:30:47,743 - Group['hdfs'] {}&lt;BR /&gt;2020-05-25 23:30:47,744 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}&lt;BR /&gt;2020-05-25 23:30:47,746 - FS Type: HDFS&lt;BR /&gt;2020-05-25 23:30:47,747 - Directory['/etc/hadoop'] {'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:47,890 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:47,891 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}&lt;BR /&gt;2020-05-25 23:30:48,009 - Execute[('setenforce', '0')] {'not_if': '(! 
which getenforce ) || (which getenforce &amp;amp;&amp;amp; getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}&lt;BR /&gt;2020-05-25 23:30:48,149 - Skipping Execute[('setenforce', '0')] due to not_if&lt;BR /&gt;2020-05-25 23:30:48,150 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,153 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,153 - Changing owner for /var/run/hadoop from 1004 to root&lt;BR /&gt;2020-05-25 23:30:48,154 - Changing group for /var/run/hadoop from 1002 to root&lt;BR /&gt;2020-05-25 23:30:48,154 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,155 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:48,187 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}&lt;BR /&gt;2020-05-25 23:30:48,191 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}&lt;BR /&gt;2020-05-25 23:30:48,259 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:48,335 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:48,336 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:48,339 - File['/usr/hdp/2.6.5.1175-1/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2020-05-25 23:30:48,367 - 
File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:48,396 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:48,423 - Skipping unlimited key JCE policy check and setup since the Java VM is not managed by Ambari&lt;BR /&gt;2020-05-25 23:30:49,053 - Using hadoop conf dir: /usr/hdp/2.6.5.1175-1/hadoop/conf&lt;BR /&gt;2020-05-25 23:30:49,098 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}&lt;BR /&gt;2020-05-25 23:30:49,171 - call returned (0, 'hive-server2 - 2.6.5.1175-1')&lt;BR /&gt;2020-05-25 23:30:49,173 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.5.1175-1 -&amp;gt; 2.6.5.1175-1&lt;BR /&gt;2020-05-25 23:30:49,305 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('&lt;A href="http://mastern1.bms.com:8080/resources/CredentialUtil.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/CredentialUtil.jar&lt;/A&gt;'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:49,307 - Not downloading the file from &lt;A href="http://mastern1.bms.com:8080/resources/CredentialUtil.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/CredentialUtil.jar&lt;/A&gt;, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists&lt;BR /&gt;2020-05-25 23:30:51,924 - Directory['/etc/hive'] {'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:51,924 - Directories to fill with configs: [u'/usr/hdp/current/hive-metastore/conf', u'/usr/hdp/current/hive-metastore/conf/conf.server']&lt;BR /&gt;2020-05-25 23:30:51,925 - Directory['/etc/hive/2.6.5.1175-1/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}&lt;BR /&gt;2020-05-25 
23:30:51,926 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.5.1175-1/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:51,959 - Generating config: /etc/hive/2.6.5.1175-1/0/mapred-site.xml&lt;BR /&gt;2020-05-25 23:30:51,960 - File['/etc/hive/2.6.5.1175-1/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,086 - File['/etc/hive/2.6.5.1175-1/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,086 - File['/etc/hive/2.6.5.1175-1/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,090 - File['/etc/hive/2.6.5.1175-1/0/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,096 - File['/etc/hive/2.6.5.1175-1/0/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,098 - File['/etc/hive/2.6.5.1175-1/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,099 - Directory['/etc/hive/2.6.5.1175-1/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}&lt;BR /&gt;2020-05-25 23:30:52,100 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.5.1175-1/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:52,126 - Generating config: /etc/hive/2.6.5.1175-1/0/conf.server/mapred-site.xml&lt;BR /&gt;2020-05-25 23:30:52,127 - File['/etc/hive/2.6.5.1175-1/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR 
/&gt;2020-05-25 23:30:52,289 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,289 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,293 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,304 - File['/etc/hive/2.6.5.1175-1/0/conf.server/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,305 - File['/etc/hive/2.6.5.1175-1/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,306 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}&lt;BR /&gt;2020-05-25 23:30:52,307 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] because contents don't match&lt;BR /&gt;2020-05-25 23:30:52,308 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}&lt;BR /&gt;2020-05-25 23:30:52,326 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml&lt;BR /&gt;2020-05-25 23:30:52,328 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,698 - 
File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,700 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}&lt;BR /&gt;2020-05-25 23:30:52,710 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,711 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('&lt;A href="http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar&lt;/A&gt;'), 'mode': 0644}&lt;BR /&gt;2020-05-25 23:30:52,711 - Not downloading the file from &lt;A href="http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar" target="_blank" rel="noopener"&gt;http://mastern1.bms.com:8080/resources/DBConnectionVerification.jar&lt;/A&gt;, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists&lt;BR /&gt;2020-05-25 23:30:52,712 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,713 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,714 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}&lt;BR /&gt;2020-05-25 23:30:52,715 - XmlConfig['hivemetastore-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {u'hive.metastore.metrics.enabled': u'true', u'hive.server2.metrics.enabled': u'true', u'hive.service.metrics.hadoop2.component': u'hivemetastore', 
u'hive.service.metrics.reporter': u'HADOOP2'}}&lt;BR /&gt;2020-05-25 23:30:52,750 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml&lt;BR /&gt;2020-05-25 23:30:52,757 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}&lt;BR /&gt;2020-05-25 23:30:52,792 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties'] {'content': Template('hadoop-metrics2-hivemetastore.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}&lt;BR /&gt;2020-05-25 23:30:52,793 - File['/var/lib/ambari-agent/tmp/start_metastore_script'] {'content': StaticFile('startMetastore.sh'), 'mode': 0755}&lt;BR /&gt;2020-05-25 23:30:52,796 - Directory['/tmp/hive'] {'owner': 'hive', 'create_parents': True, 'mode': 0777}&lt;BR /&gt;2020-05-25 23:30:52,797 - Execute['export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType postgres -userName hive -passWord [PROTECTED] -verbose'] {'not_if': u"ambari-sudo.sh su hive -l -s /bin/bash -c 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -info -dbType postgres -userName hive -passWord [PROTECTED] -verbose'", 'user': 'hive'}&lt;/P&gt;
&lt;P&gt;Command failed after 1 tries&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks in advance!!&lt;/P&gt;</description>
      <pubDate>Tue, 26 May 2020 06:29:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/296572#M218294</guid>
      <dc:creator>shrikant_bm</dc:creator>
      <dc:date>2020-05-26T06:29:44Z</dc:date>
    </item>
    <item>
      <title>Re: Fresh Hive Service Installation Failing To Start</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/296622#M218324</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/59432"&gt;@shrikant_bm&lt;/a&gt;&amp;nbsp;You need to complete required setup for hive database and user in Postgres:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;SPAN&gt;FATAL: no pg_hba.conf entry for host "192.168.0.109", user "hive", database "hive",&lt;/SPAN&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;Once you make the required changes and restart Postgres, your error will go away.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;More about this here:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.cloudera.com/t5/Support-Questions/HDP-Ambari-installation-throws-quot-org-postgresql-util/m-p/277977" target="_blank"&gt;https://community.cloudera.com/t5/Support-Questions/HDP-Ambari-installation-throws-quot-org-postgresql-util/m-p/277977&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Or search our community for other post about same issue:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.cloudera.com/t5/forums/searchpage/tab/message?advanced=false&amp;amp;allow_punctuation=false&amp;amp;q=pg_hba.conf" target="_blank"&gt;https://community.cloudera.com/t5/forums/searchpage/tab/message?advanced=false&amp;amp;allow_punctuation=false&amp;amp;q=pg_hba.conf&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic please comment here or feel free to private message me. If you have new questions related to your Use Case please create separate topic and feel free to tag me in your post. &amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Steven&amp;nbsp;@ DFHZ&lt;/P&gt;</description>
      <pubDate>Tue, 26 May 2020 12:31:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/296622#M218324</guid>
      <dc:creator>stevenmatison</dc:creator>
      <dc:date>2020-05-26T12:31:06Z</dc:date>
    </item>
    <item>
      <title>Re: Fresh Hive Service Installation Failing To Start</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/297105#M218597</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/60150"&gt;@stevenmatison&lt;/a&gt;&amp;nbsp;: Thanks for the update!!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This issue got fixed by adding an new line in pg_hba.conf file&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;host&amp;nbsp; all&amp;nbsp; hive&amp;nbsp; 192.168.0.109/32&amp;nbsp; trust&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Restart postgresql&lt;/P&gt;</description>
      <pubDate>Tue, 02 Jun 2020 19:04:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Fresh-Hive-Service-Installation-Failing-To-Start/m-p/297105#M218597</guid>
      <dc:creator>shrikant_bm</dc:creator>
      <dc:date>2020-06-02T19:04:39Z</dc:date>
    </item>
  </channel>
</rss>

