
Cannot connect to hive

Explorer

I'm running the HDP 2.5 Sandbox in Docker on a Mac. I get the following error when trying to use the Hive View in Ambari. Rebooting did not help. Any ideas?

        Service 'hiveserver' check failed:
java.lang.Exception: Cannot connect to hive

java.lang.Exception: Cannot connect to hive
	at org.apache.ambari.view.hive2.actor.message.job.ExecutionFailed.<init>(ExecutionFailed.java:28)
	at org.apache.ambari.view.hive2.actor.JdbcConnector.notifyConnectFailure(JdbcConnector.java:385)
	at org.apache.ambari.view.hive2.actor.JdbcConnector.connect(JdbcConnector.java:422)
	at org.apache.ambari.view.hive2.actor.JdbcConnector.handleNonLifecycleMessage(JdbcConnector.java:179)
	at org.apache.ambari.view.hive2.actor.JdbcConnector.handleMessage(JdbcConnector.java:171)
	at org.apache.ambari.view.hive2.actor.HiveActor.onReceive(HiveActor.java:38)
	at akka.actor.UntypedActor$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
	at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
	at akka.actor.ActorCell.invoke(ActorCell.scala:487)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
	at akka.dispatch.Mailbox.run(Mailbox.scala:220)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
1 ACCEPTED SOLUTION

New Contributor

@Joav Bally The Hive server stops because the HiveServer2 heap size is set too high.

  1. Go to the Hive service
  2. Click Configs and scroll to the bottom
  3. In the Storage section, change the HiveServer2 Heap Size (hover over the field and click the refresh icon to apply the recommended value)
  4. Save the configuration and restart the Hive service

This should solve the issue.
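If you prefer the command line, the same change can be sketched with Ambari's bundled configs.sh helper. This is only a sketch: the cluster name "Sandbox", the admin/admin credentials, and the mapping of the "HiveServer2 Heap Size" field to hive.heapsize in hive-env are HDP sandbox assumptions, so check them against your own Configs tab first.

```shell
# Sketch: apply the heap-size change from the shell instead of the UI.
# "Sandbox", admin/admin, and hive-env/hive.heapsize are sandbox-default
# assumptions -- verify against your cluster before running.
AMBARI_HOST=localhost
CLUSTER=Sandbox
NEW_HEAP_MB=512          # a modest value for a single-node sandbox

CONFIGS_SH=/var/lib/ambari-server/resources/scripts/configs.sh
CMD="$CONFIGS_SH -u admin -p admin set $AMBARI_HOST $CLUSTER hive-env hive.heapsize $NEW_HEAP_MB"

if [ -x "$CONFIGS_SH" ]; then
  # Applies the config change; Hive must still be restarted afterwards.
  $CMD
else
  echo "Run this on the Ambari host: $CMD"
fi
```

As with the UI route, the new heap size only takes effect after the Hive service is restarted.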


10 REPLIES

Expert Contributor

@Joav Bally

Make sure hiveserver2 is up and running. Did the other checks pass successfully?
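A quick way to confirm whether HiveServer2 is actually up is to probe its Thrift port. A minimal sketch, assuming the default port 10000 (hive.server2.thrift.port) and that you run it inside the sandbox VM/container:

```shell
# Sketch: probe the HiveServer2 Thrift port. Port 10000 and host
# "localhost" are sandbox-default assumptions -- adjust as needed.
HS2_HOST=localhost
HS2_PORT=10000

if command -v nc >/dev/null 2>&1 && nc -z -w 5 "$HS2_HOST" "$HS2_PORT" 2>/dev/null; then
  STATUS=up
else
  STATUS=down
fi
echo "HiveServer2 port ${HS2_PORT} on ${HS2_HOST}: ${STATUS}"
```

If the port reports down right after Ambari shows the service as started, the process is crashing on startup and the hiveserver2.log will say why.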

Explorer

Yes, the other checks passed successfully. I can't get hiveserver2 to run. I've started it several times now; it shows OK but then goes red again.

Expert Contributor

That's basically it: the View cannot access Hive because hiveserver2 is down. Please copy and paste the output of hiveserver2 when you try to start it. The hiveserver2.log file would also be useful for seeing the underlying problem.

Explorer

Here is what I have (log.txt attached).

Expert Contributor

@Joav Bally

Could you ssh into the machine and retrieve /var/log/hive/hiveserver2.log from the time you tried to start it?

It would be more informative to go straight to the HiveServer2 log files instead of what Ambari shows.
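For example, a minimal sketch of pulling the interesting lines out of that log once you are inside the sandbox (the path is the HDP default; the grep pattern is just a starting point):

```shell
# Sketch: after ssh/docker-exec into the sandbox, scan the tail of the
# HiveServer2 log for errors. The path is the HDP default -- adjust if
# hive-env points the log directory elsewhere.
LOG=/var/log/hive/hiveserver2.log

if [ -r "$LOG" ]; then
  # An OutOfMemoryError here points at a heap-size problem.
  tail -n 500 "$LOG" | grep -iE 'error|exception|outofmemory' | tail -n 20
else
  echo "No readable log at $LOG -- check the Hive log dir in hive-env"
fi
```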

Explorer

That's what I have; I hope this is what you were looking for:

stderr: /var/lib/ambari-agent/data/errors-357.txt

None

stdout: /var/lib/ambari-agent/data/output-357.txt

2016-09-22 20:34:48,695 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:48,695 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:48,696 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:48,797 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:48,797 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:48,906 - checked_call returned (0, '')
2016-09-22 20:34:48,908 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:48,908 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:49,188 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:49,193 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:49,194 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:49,345 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:49,346 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:49,553 - checked_call returned (0, '')
2016-09-22 20:34:49,554 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:49,554 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:49,560 - Group['hadoop'] {}
2016-09-22 20:34:49,562 - Group['users'] {}
2016-09-22 20:34:49,562 - Group['zeppelin'] {}
2016-09-22 20:34:49,566 - Group['knox'] {}
2016-09-22 20:34:49,567 - Group['ranger'] {}
2016-09-22 20:34:49,567 - Group['spark'] {}
2016-09-22 20:34:49,568 - Group['livy'] {}
2016-09-22 20:34:49,570 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,581 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,595 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,596 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,597 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,606 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,610 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,611 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2016-09-22 20:34:49,612 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,612 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,613 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,614 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,615 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,615 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,617 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,620 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,621 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,622 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,623 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,625 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,627 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,628 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,629 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,630 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-22 20:34:49,632 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-09-22 20:34:49,715 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-09-22 20:34:49,715 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:49,716 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-22 20:34:49,719 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-09-22 20:34:49,815 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-09-22 20:34:49,815 - Group['hdfs'] {}
2016-09-22 20:34:49,816 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2016-09-22 20:34:49,816 - FS Type: 
2016-09-22 20:34:49,817 - Directory['/etc/hadoop'] {'mode': 0755}
2016-09-22 20:34:49,837 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,838 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-09-22 20:34:49,862 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-09-22 20:34:49,946 - Skipping Execute[('setenforce', '0')] due to not_if
2016-09-22 20:34:49,947 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:49,950 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2016-09-22 20:34:49,950 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2016-09-22 20:34:49,959 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-09-22 20:34:49,962 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-09-22 20:34:49,963 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:49,975 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,976 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-09-22 20:34:49,979 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,984 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-09-22 20:34:50,056 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2016-09-22 20:34:50,495 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:50,495 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:50,496 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:50,590 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:50,591 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:50,696 - checked_call returned (0, '')
2016-09-22 20:34:50,697 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:50,697 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:50,702 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2016-09-22 20:34:50,798 - call returned (0, 'hive-server2 - 2.5.0.0-1245')
2016-09-22 20:34:50,799 - Stack Feature Version Info: stack_version=2.5, version=2.5.0.0-1245, current_cluster_version=2.5.0.0-1245 -> 2.5.0.0-1245
2016-09-22 20:34:50,814 - HdfsResource['/user/hcat'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hcat', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:50,817 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpI_7GC8 2>/tmp/tmpn0lkho''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:50,981 - call returned (0, '')
2016-09-22 20:34:50,982 - Called copy_to_hdfs tarball: mapreduce
2016-09-22 20:34:50,982 - Default version is 2.5.0.0-1245
2016-09-22 20:34:50,982 - Source file: /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz
2016-09-22 20:34:50,982 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:50,983 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpKwBYNm 2>/tmp/tmp0aNFSA''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,121 - call returned (0, '')
2016-09-22 20:34:51,123 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,126 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp67ntEe 2>/tmp/tmpB6VQ56''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,364 - call returned (0, '')
2016-09-22 20:34:51,365 - DFS file /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz is identical to /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz, skipping the copying
2016-09-22 20:34:51,365 - Will attempt to copy mapreduce tarball from /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz.
2016-09-22 20:34:51,365 - Called copy_to_hdfs tarball: tez
2016-09-22 20:34:51,365 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,365 - Source file: /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz
2016-09-22 20:34:51,365 - HdfsResource['/hdp/apps/2.5.0.0-1245/tez'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,366 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/tez?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpcg2mVj 2>/tmp/tmpwYRv71''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,508 - call returned (0, '')
2016-09-22 20:34:51,512 - HdfsResource['/hdp/apps/2.5.0.0-1245/tez/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,514 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/tez/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRFcGmK 2>/tmp/tmpKrLY8H''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,643 - call returned (0, '')
2016-09-22 20:34:51,645 - DFS file /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz is identical to /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz, skipping the copying
2016-09-22 20:34:51,645 - Will attempt to copy tez tarball from /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz.
2016-09-22 20:34:51,646 - Called copy_to_hdfs tarball: pig
2016-09-22 20:34:51,646 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,646 - Source file: /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz
2016-09-22 20:34:51,646 - HdfsResource['/hdp/apps/2.5.0.0-1245/pig'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,647 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/pig?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpYE0UtS 2>/tmp/tmpFIV_Au''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,797 - call returned (0, '')
2016-09-22 20:34:51,801 - HdfsResource['/hdp/apps/2.5.0.0-1245/pig/pig.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/pig/pig.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,803 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/pig/pig.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpkapHOv 2>/tmp/tmpCJVOT3''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,951 - call returned (0, '')
2016-09-22 20:34:51,952 - DFS file /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz is identical to /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz, skipping the copying
2016-09-22 20:34:51,953 - Will attempt to copy pig tarball from /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz.
2016-09-22 20:34:51,953 - Called copy_to_hdfs tarball: hive
2016-09-22 20:34:51,953 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,953 - Source file: /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz
2016-09-22 20:34:51,953 - HdfsResource['/hdp/apps/2.5.0.0-1245/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,954 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpt9Czrz 2>/tmp/tmpXpmA10''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,098 - call returned (0, '')
2016-09-22 20:34:52,100 - HdfsResource['/hdp/apps/2.5.0.0-1245/hive/hive.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hive/hive.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,101 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/hive/hive.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpIklHh5 2>/tmp/tmp7TmFWB''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,226 - call returned (0, '')
2016-09-22 20:34:52,228 - DFS file /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz is identical to /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz, skipping the copying
2016-09-22 20:34:52,229 - Will attempt to copy hive tarball from /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz.
2016-09-22 20:34:52,230 - Called copy_to_hdfs tarball: sqoop
2016-09-22 20:34:52,230 - Default version is 2.5.0.0-1245
2016-09-22 20:34:52,230 - Source file: /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz
2016-09-22 20:34:52,231 - HdfsResource['/hdp/apps/2.5.0.0-1245/sqoop'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:52,232 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/sqoop?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRAP_QA 2>/tmp/tmp_8jqW9''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,368 - call returned (0, '')
2016-09-22 20:34:52,370 - HdfsResource['/hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,371 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpIH2gWB 2>/tmp/tmpwcrDz3''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,511 - call returned (0, '')
2016-09-22 20:34:52,513 - DFS file /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz is identical to /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz, skipping the copying
2016-09-22 20:34:52,514 - Will attempt to copy sqoop tarball from /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz.
2016-09-22 20:34:52,514 - Called copy_to_hdfs tarball: hadoop_streaming
2016-09-22 20:34:52,515 - Default version is 2.5.0.0-1245
2016-09-22 20:34:52,515 - Source file: /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar
2016-09-22 20:34:52,515 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:52,516 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp6JnXdt 2>/tmp/tmpaqgOJU''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,646 - call returned (0, '')
2016-09-22 20:34:52,649 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,650 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp4iplqc 2>/tmp/tmpE5XUQz''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,794 - call returned (0, '')
2016-09-22 20:34:52,797 - DFS file /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar is identical to /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar, skipping the copying
2016-09-22 20:34:52,797 - Will attempt to copy hadoop_streaming tarball from /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar to DFS at /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar.
2016-09-22 20:34:52,798 - HdfsResource['/apps/hive/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0777}
2016-09-22 20:34:52,799 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/apps/hive/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpTyMYcl 2>/tmp/tmpXf1Iqg''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,934 - call returned (0, '')
2016-09-22 20:34:52,935 - Skipping the operation for not managed DFS directory /apps/hive/warehouse since immutable_paths contains it.
2016-09-22 20:34:52,936 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:52,938 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpco40Q8 2>/tmp/tmpRE1mEg''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,074 - call returned (0, '')
2016-09-22 20:34:53,075 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
2016-09-22 20:34:53,076 - Directory['/etc/hive'] {'mode': 0755}
2016-09-22 20:34:53,076 - Directories to fill with configs: ['/usr/hdp/current/hive-server2/conf', '/usr/hdp/current/hive-server2/conf/conf.server']
2016-09-22 20:34:53,077 - Directory['/usr/hdp/current/hive-server2/conf'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
2016-09-22 20:34:53,080 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,098 - Generating config: /usr/hdp/current/hive-server2/conf/mapred-site.xml
2016-09-22 20:34:53,099 - File['/usr/hdp/current/hive-server2/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,187 - File['/usr/hdp/current/hive-server2/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,189 - File['/usr/hdp/current/hive-server2/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,189 - File['/usr/hdp/current/hive-server2/conf/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,190 - File['/usr/hdp/current/hive-server2/conf/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,190 - Directory['/usr/hdp/current/hive-server2/conf/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
2016-09-22 20:34:53,190 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,206 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/mapred-site.xml
2016-09-22 20:34:53,206 - File['/usr/hdp/current/hive-server2/conf/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,275 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,276 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,276 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,277 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,279 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {'hidden': {'javax.jdo.option.ConnectionPassword': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}, 'javax.jdo.option.ConnectionPassword': {'hidden': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,287 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/hive-site.xml
2016-09-22 20:34:53,288 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,470 - Generating Atlas Hook config file /usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties
2016-09-22 20:34:53,471 - PropertiesFile['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2016-09-22 20:34:53,476 - Generating properties file: /usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties
2016-09-22 20:34:53,476 - File['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,497 - Writing File['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] because contents don't match
2016-09-22 20:34:53,498 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,508 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/hiveserver2-site.xml
2016-09-22 20:34:53,508 - File['/usr/hdp/current/hive-server2/conf/conf.server/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,523 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,523 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2016-09-22 20:34:53,526 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2016-09-22 20:34:53,527 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://sandbox.hortonworks.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2016-09-22 20:34:53,527 - Not downloading the file from http://sandbox.hortonworks.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2016-09-22 20:34:53,534 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
2016-09-22 20:34:53,541 - File['/usr/hdp/current/hive-server2/conf/conf.server/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,542 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,543 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,543 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,544 - Hive: Setup ranger: command retry not enabled thus skipping if ranger admin is down !
2016-09-22 20:34:53,546 - HdfsResource['/ranger/audit'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'recursive_chmod': True, 'owner': 'hdfs', 'group': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:53,548 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/ranger/audit?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpssmhHh 2>/tmp/tmpactw3P''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,686 - call returned (0, '')
2016-09-22 20:34:53,688 - HdfsResource['/ranger/audit/hiveServer2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'recursive_chmod': True, 'owner': 'hive', 'group': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0700}
2016-09-22 20:34:53,690 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/ranger/audit/hiveServer2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpeulSAi 2>/tmp/tmplvahZM''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,825 - call returned (0, '')
2016-09-22 20:34:53,826 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
2016-09-22 20:34:53,828 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2016-09-22 20:34:53,954 - call returned (0, 'hive-server2 - 2.5.0.0-1245')
2016-09-22 20:34:53,954 - RangeradminV2: Skip ranger admin if it's down !
2016-09-22 20:34:54,477 - amb_ranger_admin user already exists.
2016-09-22 20:34:54,886 - Hive Repository Sandbox_hive exist
2016-09-22 20:34:54,887 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-security.xml'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:54,888 - Writing File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-security.xml'] because contents don't match
2016-09-22 20:34:54,889 - Directory['/etc/ranger/Sandbox_hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:54,889 - Directory['/etc/ranger/Sandbox_hive/policycache'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:54,893 - File['/etc/ranger/Sandbox_hive/policycache/hiveServer2_Sandbox_hive.json'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:54,894 - XmlConfig['ranger-hive-audit.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,918 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-audit.xml
2016-09-22 20:34:54,919 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-audit.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,939 - XmlConfig['ranger-hive-security.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,954 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-security.xml
2016-09-22 20:34:54,954 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-security.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,965 - XmlConfig['ranger-policymgr-ssl.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,975 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-policymgr-ssl.xml
2016-09-22 20:34:54,975 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-policymgr-ssl.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,986 - Execute[('/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_hive/cred.jceks', '-k', 'sslKeyStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslKeyStore created successfully!
2016-09-22 20:34:59,241 - Execute[('/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_hive/cred.jceks', '-k', 'sslTrustStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslTrustStore created successfully!
2016-09-22 20:35:01,494 - File['/etc/ranger/Sandbox_hive/cred.jceks'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2016-09-22 20:35:01,496 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /var/run/hive/hive-server.pid 1>/tmp/tmp2KqY0J 2>/tmp/tmpW8i5WY''] {'quiet': False}
2016-09-22 20:35:01,699 - call returned (0, '')
2016-09-22 20:35:01,701 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'hive --config /usr/hdp/current/hive-server2/conf/conf.server --service metatool -listFSRoot' 2>/dev/null | grep hdfs:// | cut -f1,2,3 -d '/' | grep -v 'hdfs://sandbox.hortonworks.com:8020' | head -1'] {}
2016-09-22 20:35:02,105 - call returned (0, '')
2016-09-22 20:35:02,105 - Execute['/var/lib/ambari-agent/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.err /var/run/hive/hive-server.pid /usr/hdp/current/hive-server2/conf/conf.server /var/log/hive'] {'environment': {'HIVE_BIN': 'hive', 'JAVA_HOME': '/usr/lib/jvm/java', 'HADOOP_HOME': '/usr/hdp/current/hadoop-client'}, 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 8296 >/dev/null 2>&1', 'user': 'hive', 'path': ['/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-server2/bin:/usr/hdp/current/hadoop-client/bin']}
2016-09-22 20:35:02,351 - Execute['/usr/lib/jvm/java/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2/lib/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true' root [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
2016-09-22 20:35:04,020 - Execute['/usr/lib/jvm/java/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2-hive2/lib/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true' root [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}

Command completed successfully!

avatar
Explorer

That's what I have; I hope this is what you were looking for:

stderr: /var/lib/ambari-agent/data/errors-357.txt

None

stdout: /var/lib/ambari-agent/data/output-357.txt

2016-09-22 20:34:48,695 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:48,695 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:48,696 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:48,797 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:48,797 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:48,906 - checked_call returned (0, '')
2016-09-22 20:34:48,908 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:48,908 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:49,188 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:49,193 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:49,194 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:49,345 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:49,346 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:49,553 - checked_call returned (0, '')
2016-09-22 20:34:49,554 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:49,554 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:49,560 - Group['hadoop'] {}
2016-09-22 20:34:49,562 - Group['users'] {}
2016-09-22 20:34:49,562 - Group['zeppelin'] {}
2016-09-22 20:34:49,566 - Group['knox'] {}
2016-09-22 20:34:49,567 - Group['ranger'] {}
2016-09-22 20:34:49,567 - Group['spark'] {}
2016-09-22 20:34:49,568 - Group['livy'] {}
2016-09-22 20:34:49,570 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,581 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,595 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,596 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,597 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,606 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,610 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,611 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2016-09-22 20:34:49,612 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,612 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,613 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,614 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,615 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,615 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,617 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,620 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,621 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,622 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-09-22 20:34:49,623 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,625 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,627 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,628 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,629 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-09-22 20:34:49,630 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-22 20:34:49,632 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-09-22 20:34:49,715 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-09-22 20:34:49,715 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:49,716 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-22 20:34:49,719 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-09-22 20:34:49,815 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-09-22 20:34:49,815 - Group['hdfs'] {}
2016-09-22 20:34:49,816 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2016-09-22 20:34:49,816 - FS Type: 
2016-09-22 20:34:49,817 - Directory['/etc/hadoop'] {'mode': 0755}
2016-09-22 20:34:49,837 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,838 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-09-22 20:34:49,862 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-09-22 20:34:49,946 - Skipping Execute[('setenforce', '0')] due to not_if
2016-09-22 20:34:49,947 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:49,950 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2016-09-22 20:34:49,950 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2016-09-22 20:34:49,959 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-09-22 20:34:49,962 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-09-22 20:34:49,963 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:49,975 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,976 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-09-22 20:34:49,979 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2016-09-22 20:34:49,984 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-09-22 20:34:50,056 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2016-09-22 20:34:50,495 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-09-22 20:34:50,495 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-22 20:34:50,496 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-22 20:34:50,590 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-09-22 20:34:50,591 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-22 20:34:50,696 - checked_call returned (0, '')
2016-09-22 20:34:50,697 - Ensuring that hadoop has the correct symlink structure
2016-09-22 20:34:50,697 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-22 20:34:50,702 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2016-09-22 20:34:50,798 - call returned (0, 'hive-server2 - 2.5.0.0-1245')
2016-09-22 20:34:50,799 - Stack Feature Version Info: stack_version=2.5, version=2.5.0.0-1245, current_cluster_version=2.5.0.0-1245 -> 2.5.0.0-1245
2016-09-22 20:34:50,814 - HdfsResource['/user/hcat'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hcat', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:50,817 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpI_7GC8 2>/tmp/tmpn0lkho''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:50,981 - call returned (0, '')
2016-09-22 20:34:50,982 - Called copy_to_hdfs tarball: mapreduce
2016-09-22 20:34:50,982 - Default version is 2.5.0.0-1245
2016-09-22 20:34:50,982 - Source file: /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz
2016-09-22 20:34:50,982 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:50,983 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpKwBYNm 2>/tmp/tmp0aNFSA''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,121 - call returned (0, '')
2016-09-22 20:34:51,123 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,126 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp67ntEe 2>/tmp/tmpB6VQ56''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,364 - call returned (0, '')
2016-09-22 20:34:51,365 - DFS file /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz is identical to /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz, skipping the copying
2016-09-22 20:34:51,365 - Will attempt to copy mapreduce tarball from /usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz.
2016-09-22 20:34:51,365 - Called copy_to_hdfs tarball: tez
2016-09-22 20:34:51,365 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,365 - Source file: /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz
2016-09-22 20:34:51,365 - HdfsResource['/hdp/apps/2.5.0.0-1245/tez'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,366 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/tez?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpcg2mVj 2>/tmp/tmpwYRv71''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,508 - call returned (0, '')
2016-09-22 20:34:51,512 - HdfsResource['/hdp/apps/2.5.0.0-1245/tez/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,514 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/tez/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRFcGmK 2>/tmp/tmpKrLY8H''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,643 - call returned (0, '')
2016-09-22 20:34:51,645 - DFS file /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz is identical to /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz, skipping the copying
2016-09-22 20:34:51,645 - Will attempt to copy tez tarball from /usr/hdp/2.5.0.0-1245/tez/lib/tez.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/tez/tez.tar.gz.
2016-09-22 20:34:51,646 - Called copy_to_hdfs tarball: pig
2016-09-22 20:34:51,646 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,646 - Source file: /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz
2016-09-22 20:34:51,646 - HdfsResource['/hdp/apps/2.5.0.0-1245/pig'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,647 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/pig?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpYE0UtS 2>/tmp/tmpFIV_Au''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,797 - call returned (0, '')
2016-09-22 20:34:51,801 - HdfsResource['/hdp/apps/2.5.0.0-1245/pig/pig.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/pig/pig.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:51,803 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/pig/pig.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpkapHOv 2>/tmp/tmpCJVOT3''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:51,951 - call returned (0, '')
2016-09-22 20:34:51,952 - DFS file /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz is identical to /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz, skipping the copying
2016-09-22 20:34:51,953 - Will attempt to copy pig tarball from /usr/hdp/2.5.0.0-1245/pig/pig.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/pig/pig.tar.gz.
2016-09-22 20:34:51,953 - Called copy_to_hdfs tarball: hive
2016-09-22 20:34:51,953 - Default version is 2.5.0.0-1245
2016-09-22 20:34:51,953 - Source file: /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz
2016-09-22 20:34:51,953 - HdfsResource['/hdp/apps/2.5.0.0-1245/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:51,954 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpt9Czrz 2>/tmp/tmpXpmA10''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,098 - call returned (0, '')
2016-09-22 20:34:52,100 - HdfsResource['/hdp/apps/2.5.0.0-1245/hive/hive.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hive/hive.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,101 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/hive/hive.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpIklHh5 2>/tmp/tmp7TmFWB''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,226 - call returned (0, '')
2016-09-22 20:34:52,228 - DFS file /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz is identical to /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz, skipping the copying
2016-09-22 20:34:52,229 - Will attempt to copy hive tarball from /usr/hdp/2.5.0.0-1245/hive/hive.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/hive/hive.tar.gz.
2016-09-22 20:34:52,230 - Called copy_to_hdfs tarball: sqoop
2016-09-22 20:34:52,230 - Default version is 2.5.0.0-1245
2016-09-22 20:34:52,230 - Source file: /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz
2016-09-22 20:34:52,231 - HdfsResource['/hdp/apps/2.5.0.0-1245/sqoop'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:52,232 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/sqoop?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRAP_QA 2>/tmp/tmp_8jqW9''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,368 - call returned (0, '')
2016-09-22 20:34:52,370 - HdfsResource['/hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,371 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpIH2gWB 2>/tmp/tmpwcrDz3''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,511 - call returned (0, '')
2016-09-22 20:34:52,513 - DFS file /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz is identical to /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz, skipping the copying
2016-09-22 20:34:52,514 - Will attempt to copy sqoop tarball from /usr/hdp/2.5.0.0-1245/sqoop/sqoop.tar.gz to DFS at /hdp/apps/2.5.0.0-1245/sqoop/sqoop.tar.gz.
2016-09-22 20:34:52,514 - Called copy_to_hdfs tarball: hadoop_streaming
2016-09-22 20:34:52,515 - Default version is 2.5.0.0-1245
2016-09-22 20:34:52,515 - Source file: /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar , Dest file in HDFS: /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar
2016-09-22 20:34:52,515 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0555}
2016-09-22 20:34:52,516 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp6JnXdt 2>/tmp/tmpaqgOJU''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,646 - call returned (0, '')
2016-09-22 20:34:52,649 - HdfsResource['/hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar', 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0444}
2016-09-22 20:34:52,650 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp4iplqc 2>/tmp/tmpE5XUQz''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,794 - call returned (0, '')
2016-09-22 20:34:52,797 - DFS file /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar is identical to /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar, skipping the copying
2016-09-22 20:34:52,797 - Will attempt to copy hadoop_streaming tarball from /usr/hdp/2.5.0.0-1245/hadoop-mapreduce/hadoop-streaming.jar to DFS at /hdp/apps/2.5.0.0-1245/mapreduce/hadoop-streaming.jar.
2016-09-22 20:34:52,798 - HdfsResource['/apps/hive/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0777}
2016-09-22 20:34:52,799 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/apps/hive/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpTyMYcl 2>/tmp/tmpXf1Iqg''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:52,934 - call returned (0, '')
2016-09-22 20:34:52,935 - Skipping the operation for not managed DFS directory /apps/hive/warehouse since immutable_paths contains it.
2016-09-22 20:34:52,936 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:52,938 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpco40Q8 2>/tmp/tmpRE1mEg''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,074 - call returned (0, '')
2016-09-22 20:34:53,075 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
2016-09-22 20:34:53,076 - Directory['/etc/hive'] {'mode': 0755}
2016-09-22 20:34:53,076 - Directories to fill with configs: ['/usr/hdp/current/hive-server2/conf', '/usr/hdp/current/hive-server2/conf/conf.server']
2016-09-22 20:34:53,077 - Directory['/usr/hdp/current/hive-server2/conf'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
2016-09-22 20:34:53,080 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,098 - Generating config: /usr/hdp/current/hive-server2/conf/mapred-site.xml
2016-09-22 20:34:53,099 - File['/usr/hdp/current/hive-server2/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,187 - File['/usr/hdp/current/hive-server2/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,189 - File['/usr/hdp/current/hive-server2/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,189 - File['/usr/hdp/current/hive-server2/conf/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,190 - File['/usr/hdp/current/hive-server2/conf/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,190 - Directory['/usr/hdp/current/hive-server2/conf/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
2016-09-22 20:34:53,190 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,206 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/mapred-site.xml
2016-09-22 20:34:53,206 - File['/usr/hdp/current/hive-server2/conf/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,275 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,276 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,276 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,277 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,279 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {'hidden': {'javax.jdo.option.ConnectionPassword': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}, 'javax.jdo.option.ConnectionPassword': {'hidden': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,287 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/hive-site.xml
2016-09-22 20:34:53,288 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,470 - Generating Atlas Hook config file /usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties
2016-09-22 20:34:53,471 - PropertiesFile['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2016-09-22 20:34:53,476 - Generating properties file: /usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties
2016-09-22 20:34:53,476 - File['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:53,497 - Writing File['/usr/hdp/current/hive-server2/conf/conf.server/atlas-application.properties'] because contents don't match
2016-09-22 20:34:53,498 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:53,508 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/hiveserver2-site.xml
2016-09-22 20:34:53,508 - File['/usr/hdp/current/hive-server2/conf/conf.server/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2016-09-22 20:34:53,523 - File['/usr/hdp/current/hive-server2/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,523 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2016-09-22 20:34:53,526 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2016-09-22 20:34:53,527 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://sandbox.hortonworks.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2016-09-22 20:34:53,527 - Not downloading the file from http://sandbox.hortonworks.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2016-09-22 20:34:53,534 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
2016-09-22 20:34:53,541 - File['/usr/hdp/current/hive-server2/conf/conf.server/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop'}
2016-09-22 20:34:53,542 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,543 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,543 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2016-09-22 20:34:53,544 - Hive: Setup ranger: command retry not enabled thus skipping if ranger admin is down !
2016-09-22 20:34:53,546 - HdfsResource['/ranger/audit'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'recursive_chmod': True, 'owner': 'hdfs', 'group': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
2016-09-22 20:34:53,548 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/ranger/audit?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpssmhHh 2>/tmp/tmpactw3P''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,686 - call returned (0, '')
2016-09-22 20:34:53,688 - HdfsResource['/ranger/audit/hiveServer2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'recursive_chmod': True, 'owner': 'hive', 'group': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0700}
2016-09-22 20:34:53,690 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://sandbox.hortonworks.com:50070/webhdfs/v1/ranger/audit/hiveServer2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpeulSAi 2>/tmp/tmplvahZM''] {'logoutput': None, 'quiet': False}
2016-09-22 20:34:53,825 - call returned (0, '')
2016-09-22 20:34:53,826 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://sandbox.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
2016-09-22 20:34:53,828 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2016-09-22 20:34:53,954 - call returned (0, 'hive-server2 - 2.5.0.0-1245')
2016-09-22 20:34:53,954 - RangeradminV2: Skip ranger admin if it's down !
2016-09-22 20:34:54,477 - amb_ranger_admin user already exists.
2016-09-22 20:34:54,886 - Hive Repository Sandbox_hive exist
2016-09-22 20:34:54,887 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-security.xml'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:54,888 - Writing File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-security.xml'] because contents don't match
2016-09-22 20:34:54,889 - Directory['/etc/ranger/Sandbox_hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:54,889 - Directory['/etc/ranger/Sandbox_hive/policycache'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-09-22 20:34:54,893 - File['/etc/ranger/Sandbox_hive/policycache/hiveServer2_Sandbox_hive.json'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2016-09-22 20:34:54,894 - XmlConfig['ranger-hive-audit.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,918 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-audit.xml
2016-09-22 20:34:54,919 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-audit.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,939 - XmlConfig['ranger-hive-security.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,954 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-security.xml
2016-09-22 20:34:54,954 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-hive-security.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,965 - XmlConfig['ranger-policymgr-ssl.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/conf.server', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2016-09-22 20:34:54,975 - Generating config: /usr/hdp/current/hive-server2/conf/conf.server/ranger-policymgr-ssl.xml
2016-09-22 20:34:54,975 - File['/usr/hdp/current/hive-server2/conf/conf.server/ranger-policymgr-ssl.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2016-09-22 20:34:54,986 - Execute[('/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_hive/cred.jceks', '-k', 'sslKeyStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslKeyStore created successfully!
2016-09-22 20:34:59,241 - Execute[('/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.0.0-1245/ranger-hive-plugin/install/lib/*', '-f', '/etc/ranger/Sandbox_hive/cred.jceks', '-k', 'sslTrustStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/lib/jvm/java'}, 'sudo': True}
Using Java:/usr/lib/jvm/java/bin/java
Alias sslTrustStore created successfully!
2016-09-22 20:35:01,494 - File['/etc/ranger/Sandbox_hive/cred.jceks'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2016-09-22 20:35:01,496 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /var/run/hive/hive-server.pid 1>/tmp/tmp2KqY0J 2>/tmp/tmpW8i5WY''] {'quiet': False}
2016-09-22 20:35:01,699 - call returned (0, '')
2016-09-22 20:35:01,701 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'hive --config /usr/hdp/current/hive-server2/conf/conf.server --service metatool -listFSRoot' 2>/dev/null | grep hdfs:// | cut -f1,2,3 -d '/' | grep -v 'hdfs://sandbox.hortonworks.com:8020' | head -1'] {}
2016-09-22 20:35:02,105 - call returned (0, '')
2016-09-22 20:35:02,105 - Execute['/var/lib/ambari-agent/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.err /var/run/hive/hive-server.pid /usr/hdp/current/hive-server2/conf/conf.server /var/log/hive'] {'environment': {'HIVE_BIN': 'hive', 'JAVA_HOME': '/usr/lib/jvm/java', 'HADOOP_HOME': '/usr/hdp/current/hadoop-client'}, 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 8296 >/dev/null 2>&1', 'user': 'hive', 'path': ['/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-server2/bin:/usr/hdp/current/hadoop-client/bin']}
2016-09-22 20:35:02,351 - Execute['/usr/lib/jvm/java/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2/lib/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true' root [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
2016-09-22 20:35:04,020 - Execute['/usr/lib/jvm/java/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2-hive2/lib/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true' root [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
 

Command completed successfully!

New Contributor
  1. Click the System DSN tab.
  2. Click the Add button.
  3. Select the appropriate driver for your Hadoop distribution.
  4. Complete the information in the DSN Setup dialog box.
  5. Click the Test button. The test results display either "TESTS COMPLETED SUCCESSFULLY" or "TEST COMPLETED WITH ERROR."
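On a Linux client using unixODBC, the DSN created in the steps above maps to an `odbc.ini` entry roughly like the following (a sketch only; the DSN name, driver path, and settings are assumptions, not taken from this thread):

```ini
[HortonworksHiveDSN]
# Driver path is an assumption -- adjust to where your Hive ODBC driver is installed
Driver=/usr/lib/hive/lib/native/Linux-amd64-64/libhortonworkshiveodbc64.so
Host=sandbox.hortonworks.com
Port=10000
HiveServerType=2
AuthMech=2
UID=hive
```

With unixODBC installed, `isql -v HortonworksHiveDSN` gives the same pass/fail check from the command line as the Test button does in the DSN Setup dialog.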

New Contributor

@Joav Bally the Hive server stops because the HiveServer2 heap size is set too high for the sandbox.

  1. Go to the Hive service in Ambari.
  2. Click Configs and scroll to the bottom.
  3. In the Storage section, change the HiveServer2 Heap Size. (Click the refresh icon that appears on hover to set the recommended value.)
  4. Save the configuration and restart the Hive service.

This will solve the issue.
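If you want a ballpark figure before clicking the recommended value, one rough rule of thumb for a memory-constrained sandbox is to keep the HiveServer2 heap at about a quarter of the VM's memory, with a 512 MB floor. This is only a sketch, not Ambari's own stack-advisor formula:

```shell
#!/bin/sh
# Rough heap-size heuristic for a memory-constrained sandbox.
# NOT Ambari's recommendation logic -- just a conservative sketch.
recommend_heap_mb() {
  total_mb=$1
  heap=$(( total_mb / 4 ))         # quarter of total memory
  [ "$heap" -lt 512 ] && heap=512  # never go below 512 MB
  echo "$heap"
}

recommend_heap_mb 8192   # an 8 GB Docker VM -> prints 2048
```

After lowering the heap and restarting, `grep -i OutOfMemoryError /var/log/hive/hive-server2.*` is a quick way to confirm the crashes have stopped (the log paths are taken from the startup output above).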