Member since: 01-03-2018
Posts: 89
Kudos Received: 3
Solutions: 0
09-13-2018
10:36 PM
@Shu We have headers coming in mixed case (some lower, some upper). As we convert CSV to JSON, we first convert the CSV schema into an Avro schema, and Avro is not accepting the upper-case column names. Kindly advise how to convert upper case to lower case. Thanks in advance.
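One pragmatic workaround is to normalize the header row before schema inference. Below is a minimal standalone Python sketch of that idea (the file names are placeholders, and this is a preprocessing step rather than a NiFi processor, though the same transform could be run from an ExecuteScript step):

# Illustrative sketch: lower-case only the first (header) row of a CSV
# before it is handed to schema inference. File names are placeholders.
import csv

def lowercase_header(src_path, dst_path):
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)                        # first row = column names
        writer.writerow([name.lower() for name in header])
        writer.writerows(reader)                     # data rows pass through unchanged

lowercase_header("input.csv", "input_lowercase.csv")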
Labels: Apache NiFi
09-13-2018
10:25 PM
Many thanks, it worked.
09-13-2018
09:14 PM
@Shu Followed and implemented this, but I am getting the error at the same place you show in your snapshot. Kindly advise how to fix it. UpdateAttribute_Header UpdateAttribute_Data
09-13-2018
06:38 PM
@Shu Thanks, the merge worked, but the header came out as the last row. How can I prioritize the header FlowFile so that it comes first, as the data header (column names)?
09-13-2018
04:23 PM
@Shu @Matt Burgess Hi, we have two files coming from different locations: 1) the first contains the header (column names only); 2) the second contains the data, in the same column sequence. Aim: merge both into one output file, with the header on top (in the first row) and the data from the second row onwards. Looking forward. Cheers
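Outside NiFi, the target output is simple to state: write the header file first, then append the data file. Here is a minimal Python sketch under that assumption (file paths are placeholders); within the flow itself, the equivalent requirement is ensuring the header FlowFile is first in the merge order:

# Hedged illustration (outside NiFi): concatenate a header-only file and a
# data file so the header lands on the first row. Paths are placeholders.
def merge_header_first(header_path, data_path, out_path):
    with open(out_path, "w") as out:
        with open(header_path) as header:
            out.write(header.read().rstrip("\n") + "\n")   # header becomes row 1
        with open(data_path) as data:
            out.write(data.read())                          # data from row 2 onwards

merge_header_first("header.csv", "data.csv", "merged.csv")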
Labels: Apache NiFi
08-21-2018
04:35 PM
Issue resolved for me. In HDP 3.0, please use PutHive3Streaming, PutHive3QL, and SelectHive3QL. Cheers.
08-21-2018
04:34 PM
Issue resolved. In HDP 3.0, please use PutHive3Streaming, PutHive3QL, and SelectHive3QL. Cheers.
08-21-2018
04:33 PM
In HDP 3.0, please use PutHive3Streaming, PutHive3QL, and SelectHive3QL. Cheers.
07-17-2018
03:13 PM
HiveServer2 Interactive Service is still not running for me. I have done everything I can think of. Can anyone step in and help me and the gentlemen above solve this irritating issue? I am getting the following error:

stderr: /var/lib/ambari-agent/data/errors-249.txt
/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py:537: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
Logger.info(e.message)
2018-07-17 10:09:06,078 - LLAP app 'llap0' deployment unsuccessful.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 612, in <module>
HiveServerInteractive().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 119, in start
raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

stdout: /var/lib/ambari-agent/data/output-249.txt
2018-07-17 10:01:11,976 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:11,986 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,255 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:12,258 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,260 - Group['livy'] {}
2018-07-17 10:01:12,262 - Group['spark'] {}
2018-07-17 10:01:12,262 - Group['hdfs'] {}
2018-07-17 10:01:12,263 - Group['hadoop'] {}
2018-07-17 10:01:12,263 - Group['users'] {}
2018-07-17 10:01:12,264 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,265 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,266 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,267 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,269 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,270 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-07-17 10:01:12,271 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,272 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,274 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,275 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-07-17 10:01:12,276 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,277 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-07-17 10:01:12,279 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,280 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,281 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,283 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,283 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-07-17 10:01:12,293 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-07-17 10:01:12,328 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-07-17 10:01:12,328 - Group['hdfs'] {}
2018-07-17 10:01:12,329 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-07-17 10:01:12,330 - FS Type:
2018-07-17 10:01:12,330 - Directory['/etc/hadoop'] {'mode': 0755}
2018-07-17 10:01:12,368 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,377 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-07-17 10:01:12,411 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-07-17 10:01:12,453 - Skipping Execute[('setenforce', '0')] due to not_if
2018-07-17 10:01:12,453 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-07-17 10:01:12,456 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-07-17 10:01:12,456 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-07-17 10:01:12,464 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2018-07-17 10:01:12,477 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2018-07-17 10:01:12,493 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:12,527 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,528 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-07-17 10:01:12,529 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,544 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:12,566 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-07-17 10:01:12,958 - MariaDB RedHat Support: false
2018-07-17 10:01:12,962 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,966 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-07-17 10:01:12,990 - call returned (0, 'hive-server2 - 2.6.3.0-235')
2018-07-17 10:01:12,991 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:12,998 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://hdp-1-nn.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-07-17 10:01:12,999 - Not downloading the file from http://hdp-1-nn.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-07-17 10:01:13,000 - checked_call[('/usr/java/jdk/bin/java', '-cp', '/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', 'jceks://file/var/lib/ambari-agent/cred/conf/hive_server_interactive/hive-site.jceks')] {}
2018-07-17 10:01:14,969 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nJul 17, 2018 10:01:14 AM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nwelcome1')
2018-07-17 10:01:14,982 - HdfsResource['/apps/hive/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0777}
2018-07-17 10:01:14,985 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/apps/hive/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpDNliSV 2>/tmp/tmp0ylQls''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,015 - call returned (0, '')
2018-07-17 10:01:15,015 - Skipping the operation for not managed DFS directory /apps/hive/warehouse since immutable_paths contains it.
2018-07-17 10:01:15,016 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2018-07-17 10:01:15,017 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp_uwRiQ 2>/tmp/tmpPTqVUr''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,048 - call returned (0, '')
2018-07-17 10:01:15,050 - Called copy_to_hdfs tarball: tez_hive2
2018-07-17 10:01:15,050 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:15,050 - Tarball version was calcuated as 2.6.3.0-235. Use Command Version: True
2018-07-17 10:01:15,050 - Source file: /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz
2018-07-17 10:01:15,051 - HdfsResource['/hdp/apps/2.6.3.0-235/tez_hive2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-07-17 10:01:15,052 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/hdp/apps/2.6.3.0-235/tez_hive2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpQSzSgR 2>/tmp/tmp4LPDgH''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,083 - call returned (0, '')
2018-07-17 10:01:15,084 - HdfsResource['/hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-07-17 10:01:15,085 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpce83lX 2>/tmp/tmpbTyAnn''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,112 - call returned (0, '')
2018-07-17 10:01:15,113 - DFS file /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz is identical to /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz, skipping the copying
2018-07-17 10:01:15,113 - Will attempt to copy tez_hive2 tarball from /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz to DFS at /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz.
2018-07-17 10:01:15,114 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2018-07-17 10:01:15,114 - Directory['/etc/hive2'] {'mode': 0755}
2018-07-17 10:01:15,114 - Directories to fill with configs: ['/usr/hdp/current/hive-server2-hive2/conf', '/usr/hdp/current/hive-server2-hive2/conf/conf.server']
2018-07-17 10:01:15,116 - Directory['/etc/hive2/2.6.3.0-235/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-07-17 10:01:15,117 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.3.0-235/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,134 - Generating config: /etc/hive2/2.6.3.0-235/0/mapred-site.xml
2018-07-17 10:01:15,134 - File['/etc/hive2/2.6.3.0-235/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,199 - Writing File['/etc/hive2/2.6.3.0-235/0/mapred-site.xml'] because contents don't match
2018-07-17 10:01:15,200 - File['/etc/hive2/2.6.3.0-235/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,201 - File['/etc/hive2/2.6.3.0-235/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,201 - File['/etc/hive2/2.6.3.0-235/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,202 - Directory['/etc/hive2/2.6.3.0-235/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
2018-07-17 10:01:15,203 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.3.0-235/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,215 - Generating config: /etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml
2018-07-17 10:01:15,215 - File['/etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,271 - Writing File['/etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml'] because contents don't match
2018-07-17 10:01:15,272 - File['/etc/hive2/2.6.3.0-235/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,273 - File['/etc/hive2/2.6.3.0-235/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,273 - File['/etc/hive2/2.6.3.0-235/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,274 - Converted 'hive.llap.io.memory.size' value from '3072 MB' to '3221225472 Bytes' before writing it to config file.
2018-07-17 10:01:15,274 - Skipping setup for Atlas Hook, as it is disabled/ not supported.
2018-07-17 10:01:15,274 - No change done to Hive2/hive-site.xml 'hive.exec.post.hooks' value.
2018-07-17 10:01:15,274 - Retrieved 'tez/tez-site' for merging with 'tez_hive2/tez-interactive-site'.
2018-07-17 10:01:15,274 - XmlConfig['tez-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/tez_hive2/conf', 'mode': 0664, 'configuration_attributes': {'final': {'tez.runtime.shuffle.ssl.enable': 'true'}}, 'owner': 'tez', 'configurations': ...}
2018-07-17 10:01:15,287 - Generating config: /etc/tez_hive2/conf/tez-site.xml
2018-07-17 10:01:15,287 - File['/etc/tez_hive2/conf/tez-site.xml'] {'owner': 'tez', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,413 - Retrieved 'hiveserver2-site' for merging with 'hiveserver2-interactive-site'.
2018-07-17 10:01:15,414 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,426 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hive-site.xml
2018-07-17 10:01:15,426 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,685 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.reporter': 'HADOOP2', 'hive.metastore.metrics.enabled': 'true', 'hive.security.authorization.enabled': 'false', 'hive.service.metrics.hadoop2.component': 'hiveserver2', 'hive.async.log.enabled': 'false'}}
2018-07-17 10:01:15,697 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml
2018-07-17 10:01:15,697 - File['/usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,707 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,713 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,717 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,722 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,725 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,728 - File['/usr/hdp/current/hive-server2-hive2/conf/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,735 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,741 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,746 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,748 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_server_interactive/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-07-17 10:01:15,749 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] because contents don't match
2018-07-17 10:01:15,750 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,761 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml
2018-07-17 10:01:15,761 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:16,028 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.reporter': 'HADOOP2', 'hive.metastore.metrics.enabled': 'true', 'hive.security.authorization.enabled': 'false', 'hive.service.metrics.hadoop2.component': 'hiveserver2', 'hive.async.log.enabled': 'false'}}
2018-07-17 10:01:16,039 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml
2018-07-17 10:01:16,039 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:16,049 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,055 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,059 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,063 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,066 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,069 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,076 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,081 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,087 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,088 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-07-17 10:01:16,090 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-07-17 10:01:16,091 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://hdp-1-nn.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-07-17 10:01:16,092 - Not downloading the file from http://hdp-1-nn.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-07-17 10:01:16,094 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_interactive_script'] {'content': Template('startHiveserver2Interactive.sh.j2'), 'mode': 0755}
2018-07-17 10:01:16,095 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,096 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,096 - Directory['/var/lib/hive2'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,099 - Determining previous run 'LLAP package' folder(s) to be deleted ....
2018-07-17 10:01:16,099 - Previous run 'LLAP package' folder(s) to be deleted = ['llap-slider2018-07-17_02-41-57']
2018-07-17 10:01:16,099 - Directory['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_02-41-57'] {'action': ['delete'], 'ignore_failures': True}
2018-07-17 10:01:16,100 - Removing directory Directory['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_02-41-57'] and all its content
2018-07-17 10:01:16,125 - Starting LLAP
2018-07-17 10:01:16,125 - Setting slider_placement : 0, as llap_daemon_container_size : 9035 > 0.5 * YARN NodeManager Memory(18065)
2018-07-17 10:01:16,129 - LLAP start command: /usr/hdp/current/hive-server2-hive2/bin/hive --service llap --slider-am-container-mb 1024 --size 9035m --cache 3072m --xmx 2457m --loglevel INFO --output /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16 --slider-placement 0 --skiphadoopversion --skiphbasecp --instances 1 --logger query-routing --args " -XX:+AlwaysPreTouch -Xss512k -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:InitiatingHeapOccupancyPercent=40 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=200 -XX:MetaspaceSize=1024m"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]
WARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/java/jdk],process jre=[/usr/java/jdk/jre]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
Prepared /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh for running LLAP on Slider
2018-07-17 10:01:39,078 - Run file path: /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh
2018-07-17 10:01:39,079 - Execute['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh'] {'logoutput': True, 'user': 'hive'}
2018-07-17 10:01:43,244 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:43,264 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:43,484 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:43,637 [main] ERROR main.ServiceLauncher - Unknown application instance : llap0
(definition not found at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/app_config.json
2018-07-17 10:01:43,639 [main] INFO util.ExitUtil - Exiting with status 69
2018-07-17 10:01:47,557 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:47,580 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:47,760 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:47,867 [main] ERROR main.ServiceLauncher - Unknown application instance : llap0
(definition not found at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/app_config.json
2018-07-17 10:01:47,869 [main] INFO util.ExitUtil - Exiting with status 69
2018-07-17 10:01:52,078 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:52,099 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:52,292 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:52,526 [main] INFO imps.CuratorFrameworkImpl - Starting
2018-07-17 10:01:52,589 [main-EventThread] INFO state.ConnectionStateManager - State change: CONNECTED
2018-07-17 10:01:52,615 [main] INFO client.SliderClient - Destroyed cluster llap0
2018-07-17 10:01:52,618 [main] INFO util.ExitUtil - Exiting with status 0
2018-07-17 10:01:56,788 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:56,805 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:57,017 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:57,174 [main] INFO client.SliderClient - Installing package file:/var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/llap-17Jul2018.zip to hdfs://hdp-1-nn.com:8020/user/hive/.slider/package/LLAP/llap-17Jul2018.zip (overwrite set to true)
2018-07-17 10:01:59,249 [main] INFO tools.SliderUtils - Reading metainfo.xml of size 1998
2018-07-17 10:01:59,251 [main] INFO client.SliderClient - Found XML metainfo file in package
2018-07-17 10:01:59,263 [main] INFO client.SliderClient - Creating summary metainfo file
2018-07-17 10:01:59,297 [main] INFO client.SliderClient - Set application.def in your app config JSON to .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:01:59,298 [main] INFO util.ExitUtil - Exiting with status 0
2018-07-17 10:02:02,707 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:02:02,736 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:02:03,110 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:02:04,458 [main] INFO agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:04,464 [main] INFO agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:04,971 [main] INFO agent.AgentUtils - Got metainfo from summary file
2018-07-17 10:02:05,078 [main] INFO client.SliderClient - No credentials requested
2018-07-17 10:02:05,288 [main] INFO agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:05,309 [main] INFO agent.AgentUtils - Got metainfo from summary file
2018-07-17 10:02:05,401 [main] INFO launch.AbstractLauncher - Setting yarn.resourcemanager.am.retry-count-window-ms to 300000
2018-07-17 10:02:05,406 [main] INFO launch.AbstractLauncher - Log include patterns: .*\.done
2018-07-17 10:02:05,409 [main] INFO launch.AbstractLauncher - Log exclude patterns:
2018-07-17 10:02:05,410 [main] INFO launch.AbstractLauncher - Modified log include patterns: .*\.done
2018-07-17 10:02:05,410 [main] INFO launch.AbstractLauncher - Modified log exclude patterns:
2018-07-17 10:02:05,770 [main] INFO slideram.SliderAMClientProvider - Loading all dependencies for AM.
2018-07-17 10:02:05,772 [main] INFO tools.CoreFileSystem - Loading all dependencies from /hdp/apps/2.6.3.0-235/slider/slider.tar.gz
2018-07-17 10:02:05,775 [main] INFO agent.AgentClientProvider - Automatically uploading the agent tarball at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/tmp/application_1531803503758_0001/agent
2018-07-17 10:02:05,875 [main] INFO agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:05,915 [main] INFO client.SliderClient - Using queue llap for the application instance.
2018-07-17 10:02:05,915 [main] INFO client.SliderClient - Submitting application application_1531803503758_0001
2018-07-17 10:02:05,918 [main] INFO launch.AppMasterLauncher - Submitting application to Resource Manager
2018-07-17 10:02:06,367 [main] INFO impl.YarnClientImpl - Submitted application application_1531803503758_0001
2018-07-17 10:02:06,370 [main] INFO util.ExitUtil - Exiting with status 0
2018-07-17 10:02:06,564 - Submitted LLAP app name : llap0
2018-07-17 10:02:06,564 -
2018-07-17 10:02:06,565 - LLAP status command : /usr/hdp/current/hive-server2-hive2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 400
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
LLAPSTATUS WatchMode with timeout=400 s
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001.
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001. Started 0/1 instances
--------------------------------------------------------------------------------
[... the same "Started 0/1 instances" status line repeated until the 400 s watch expired ...]
--------------------------------------------------------------------------------
{
"amInfo" : {
"appName" : "llap0",
"appType" : "org-apache-slider",
"appId" : "application_1531803503758_0001",
"containerId" : "container_e05_1531803503758_0001_01_000001",
"hostname" : "hdp-3-dn1.com",
"amWebUrl" : "http://hdp-3-dn1.com:45298/"
},
"state" : "LAUNCHING",
"originalConfigurationPath" : "hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/snapshot",
"generatedConfigurationPath" : "hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/generated",
"desiredInstances" : 1,
"liveInstances" : 0,
"appStartTime" : 1531803742224,
"runningThresholdAchieved" : false
}
WARN cli.LlapStatusServiceDriver: Watch timeout 400s exhausted before desired state RUNNING is attained.
2018-07-17 10:09:06,078 - LLAP app 'llap0' current state is LAUNCHING.
2018-07-17 10:09:06,078 - LLAP app 'llap0' current state is LAUNCHING.
2018-07-17 10:09:06,078 - LLAP app 'llap0' deployment unsuccessful.
2018-07-17 10:09:06,078 - Stopping LLAP
2018-07-17 10:09:06,079 - call[['slider', 'stop', 'llap0']] {'logoutput': True, 'user': 'hive', 'stderr': -1}
2018-07-17 10:09:09,551 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:09:09,567 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:09:09,770 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:09:10,176 [main] INFO util.ExitUtil - Exiting with status 0
2018-07-17 10:09:11,164 - call returned (0, '2018-07-17 10:09:09,551 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.\n2018-07-17 10:09:09,567 [main] INFO client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050\n2018-07-17 10:09:09,770 [main] INFO client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200\n2018-07-17 10:09:10,176 [main] INFO util.ExitUtil - Exiting with status 0', '')
2018-07-17 10:09:11,165 - Stopped llap0 application on Slider successfully
2018-07-17 10:09:11,165 - Execute[('slider', 'destroy', 'llap0', '--force')] {'ignore_failures': True, 'user': 'hive', 'timeout': 30}
Command failed after 1 tries

-----------------------------------------------------------------------------------------
Current Hortonworks Multinode Cluster Status

VM specs:
- VM#1: Active NameNode (32 GB RAM, 2 processors/CPUs)
- VM#2: Standby NameNode (12 GB RAM, 1 processor/CPU)
- VM#3: DataNode (12 GB RAM, 1 processor/CPU)

Other details:
- OS: Linux 6.5
- HDP 2.6.3 + Ambari 2.6.0.0
- HDF 3.0.2 (only NiFi, with min 3 GB and max 4 GB, no SSL)

---------------------------------------------------------------------------------------------------
For LLAP, I did the following:
- Pre-emption = Enabled
- Capacity Scheduler:
  - default queue: min 50% and max 50%
  - added a new queue "llap" with min 50% and max 50%
- Memory allocated for all YARN containers on a node = 12 GB
- Minimum Container Size (Memory) = 1 GB
- Maximum Container Size (Memory) = 12 GB
- Tez Container Size = 3 GB
- HiveServer2 Heap Size = 2 GB
- Metastore Heap Size = 2 GB
- Client Heap Size = 1 GB
- Enabled LLAP
- Interactive Query Queue = llap
- Number of nodes used by Hive's LLAP = 1
- Maximum Total Concurrent Queries = 1
- Memory per Daemon = 10240 MB
- In-Memory Cache per Daemon = 7168 MB
- Number of executors per LLAP Daemon = 1
- Installed LLAP on the Active NameNode, as that was taken as the default
- HiveServer2 Interactive = failed

------------------------------------
Not sure what I am missing; I seem to have done most of the things, so I must be missing something specific. Looking forward to a solution. Cheers.
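For context on the numbers above: the LLAP start command in the log requested --size 9035m --cache 3072m --xmx 2457m against a NodeManager reporting 18065 MB. A rough sanity check of that arithmetic, as an illustrative simplification rather than Ambari's exact sizing formula:

# Rough LLAP sizing sanity check; the numbers are taken from the log above.
# This is an illustrative simplification, not Ambari's exact sizing formula.
yarn_nm_mem_mb = 18065   # "YARN NodeManager Memory(18065)" from the placement log line
daemon_size_mb = 9035    # --size from the LLAP start command
cache_mb       = 3072    # --cache
xmx_mb         = 2457    # --xmx

headroom_mb = daemon_size_mb - (cache_mb + xmx_mb)   # left for off-heap overhead
print(f"off-heap headroom inside the daemon: {headroom_mb} MB")   # 3506 MB

# The daemon must fit on the node, and because 9035 > 0.5 * 18065 (= 9032.5),
# Ambari set slider_placement = 0, i.e. at most one daemon per node (see log).
assert cache_mb + xmx_mb <= daemon_size_mb <= yarn_nm_mem_mb

The sizing itself fits on the node, so the failure above is the 400 s watch expiring while the app is still LAUNCHING with 0/1 instances started, i.e. the daemon container was never allocated and started within the window.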
06-23-2018
01:48 PM
I am facing the same problem. I have followed all the steps above but am still unable to get the DELETE command running. Note: I have a table with no sorting.
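For anyone comparing setups: in Hive 1.x/2.x, DELETE only works on transactional (ACID) tables, which must be bucketed and stored as ORC, with the DbTxnManager enabled. A minimal sketch of those prerequisites, shown here through PyHive (the host, credentials, and table name are placeholders):

# Hedged sketch of the ACID prerequisites for DELETE in Hive 1.x/2.x.
# Connection details and names are placeholders; requires the pyhive package.
from pyhive import hive

conn = hive.connect(host="your-hiveserver2-host", port=10000, username="hive")
cur = conn.cursor()

# DELETE needs concurrency support and the ACID transaction manager enabled.
cur.execute("SET hive.support.concurrency=true")
cur.execute("SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager")

# Pre-Hive-3, a table is only deletable if it is bucketed, stored as ORC,
# and flagged transactional at creation time.
cur.execute("""
    CREATE TABLE demo_acid (id INT, name STRING)
    CLUSTERED BY (id) INTO 4 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('transactional'='true')
""")
cur.execute("DELETE FROM demo_acid WHERE id = 1")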