
Cannot start HiveServer2 Interactive (LLAP)

Rising Star

On a freshly installed HDP-2.5 I can't start HiveServer2 Interactive. The cluster is highly available. I tried to install HiveServer2 Interactive on both the active and standby NameNode hosts, but with the same unsuccessful result. I didn't find any obvious exceptions in the logs.

Here is stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 512, in check_llap_app_status
    status = do_retries()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/decorator.py", line 55, in wrapper
    return function(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 505, in do_retries
    raise Fail(status_str)
Fail: LLAP app 'llap0' current state is COMPLETE.
2016-09-07 20:37:48,705 - LLAP app 'llap0' deployment unsuccessful.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 535, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 720, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 123, in start
    raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

stdout is too long, so here are some excerpts:

2016-09-07 20:31:49,638 - Starting LLAP
2016-09-07 20:31:49,643 - Command: /usr/hdp/current/hive-server2-hive2/bin/hive --service llap --instances 1 --slider-am-container-mb 5120 --size 30720m  --cache 0m --xmx 29696m --loglevel INFO --output /var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49 --args " -XX:+AlwaysPreTouch -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:MetaspaceSize=1024m -XX:InitiatingHeapOccupancyPercent=80 -XX:MaxGCPauseMillis=200"
2016-09-07 20:31:49,643 - checked_call['/usr/hdp/current/hive-server2-hive2/bin/hive --service llap --instances 1 --slider-am-container-mb 5120 --size 30720m  --cache 0m --xmx 29696m --loglevel INFO --output /var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49 --args " -XX:+AlwaysPreTouch -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:MetaspaceSize=1024m -XX:InitiatingHeapOccupancyPercent=80 -XX:MaxGCPauseMillis=200"'] {'logoutput': True, 'user': 'hive', 'stderr': -1}
which: no hbase in (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
INFO cli.LlapServiceDriver: LLAP service driver invoked with arguments=--hiveconf
INFO conf.HiveConf: Found configuration file file:/etc/hive2/2.5.0.0-1245/0/conf.server/hive-site.xml
WARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist
WARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]
WARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist
INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
INFO metastore.ObjectStore: ObjectStore, initialize called
WARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist
INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,Database,Type,FieldSchema,Order"
INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
INFO metastore.ObjectStore: Initialized ObjectStore
INFO metastore.HiveMetaStore: Added admin role in metastore
INFO metastore.HiveMetaStore: Added public role in metastore
INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
INFO metastore.HiveMetaStore: 0: get_all_functions
INFO HiveMetaStore.audit: ugi=hive	ip=unknown-ip-addr	cmd=get_all_functions	
WARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/jdk64/jdk1.8.0_77],process jre=[/usr/jdk64/jdk1.8.0_77/jre]
INFO cli.LlapServiceDriver: Using [/usr/jdk64/jdk1.8.0_77] for JAVA_HOME
INFO cli.LlapServiceDriver: Copied hadoop metrics2 properties file from file:/etc/hive2/2.5.0.0-1245/0/conf.server/hadoop-metrics2-llapdaemon.properties
INFO cli.LlapServiceDriver: LLAP service driver finished
Prepared /var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49/run.sh for running LLAP on Slider
2016-09-07 20:32:18,650 - checked_call returned (0, 'Prepared /var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49/run.sh for running LLAP on Slider', 'which: no hbase in (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent)\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nSLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]\nINFO cli.LlapServiceDriver: LLAP service driver invoked with arguments=--hiveconf\nINFO conf.HiveConf: Found configuration file file:/etc/hive2/2.5.0.0-1245/0/conf.server/hive-site.xml\nWARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist\nWARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]\nWARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist\nINFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore\nINFO metastore.ObjectStore: ObjectStore, initialize called\nWARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist\nINFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,Database,Type,FieldSchema,Order"\nINFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL\nINFO metastore.ObjectStore: Initialized ObjectStore\nINFO metastore.HiveMetaStore: Added admin role in metastore\nINFO metastore.HiveMetaStore: Added public role in metastore\nINFO 
metastore.HiveMetaStore: No user is added in admin role, since config is empty\nINFO metastore.HiveMetaStore: 0: get_all_functions\nINFO HiveMetaStore.audit: ugi=hive\tip=unknown-ip-addr\tcmd=get_all_functions\t\nWARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/jdk64/jdk1.8.0_77],process jre=[/usr/jdk64/jdk1.8.0_77/jre]\nINFO cli.LlapServiceDriver: Using [/usr/jdk64/jdk1.8.0_77] for JAVA_HOME\nINFO cli.LlapServiceDriver: Copied hadoop metrics2 properties file from file:/etc/hive2/2.5.0.0-1245/0/conf.server/hadoop-metrics2-llapdaemon.properties\nINFO cli.LlapServiceDriver: LLAP service driver finished')
2016-09-07 20:32:18,651 - Run file path: /var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49/run.sh
2016-09-07 20:32:18,652 - Execute['/var/lib/ambari-agent/tmp/llap-slider2016-09-07_17-31-49/run.sh'] {'user': 'hive'}
2016-09-07 20:32:48,625 - Submitted LLAP app name : llap0
2016-09-07 20:32:48,627 - checked_call['/usr/hdp/current/hive-server2-hive2/bin/hive --service llapstatus --name llap0 --findAppTimeout 0'] {'logoutput': False, 'user': 'hive', 'stderr': -1}
2016-09-07 20:32:59,607 - checked_call returned (0, '{\n  "amInfo" : {\n    "appName" : "llap0",\n    "appType" : "org-apache-slider",\n    "appId" : "application_1473264739795_0004"\n  },\n  "state" : "LAUNCHING",\n  "appStartTime" : 1473269567664\n}', 'which: no hbase in (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent)\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nSLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]\nINFO cli.LlapStatusServiceDriver: LLAP status invoked with arguments = --hiveconf\nINFO conf.HiveConf: Found configuration file file:/etc/hive2/2.5.0.0-1245/0/conf.server/hive-site.xml\nWARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist\nINFO impl.TimelineClientImpl: Timeline service address: http://hdp-nn1.co.vectis.local:8188/ws/v1/timeline/\nINFO client.AHSProxy: Connecting to Application History server at hdp-nn1.co.vectis.local/10.255.242.180:10200\nINFO cli.LlapStatusServiceDriver: LLAP status finished')
2016-09-07 20:32:59,608 - Received 'llapstatus' command 'output' : {
  "amInfo" : {
    "appName" : "llap0",
    "appType" : "org-apache-slider",
    "appId" : "application_1473264739795_0004"
  },
  "state" : "LAUNCHING",
  "appStartTime" : 1473269567664
}
2016-09-07 20:32:59,608 - Marker index for start of JSON data for 'llapsrtatus' comamnd : 0
2016-09-07 20:32:59,610 - LLAP app 'llap0' current state is LAUNCHING.
2016-09-07 20:32:59,611 - Will retry 19 time(s), caught exception: LLAP app 'llap0' current state is LAUNCHING.. Sleeping for 2 sec(s)
2016-09-07 20:33:01,614 - checked_call['/usr/hdp/current/hive-server2-hive2/bin/hive --service llapstatus --name llap0 --findAppTimeout 0'] {'logoutput': False, 'user': 'hive', 'stderr': -1}
2016-09-07 20:33:15,295 - checked_call returned (0, '{\n  "amInfo" : {\n    "appName" : "llap0",\n    "appType" : "org-apache-slider",\n    "appId" : "application_1473264739795_0004",\n    "containerId" : "container_e12_1473264739795_0004_01_000001",\n    "hostname" : "hdp-dn2.co.vectis.local",\n    "amWebUrl" : "http://hdp-dn2.co.vectis.local:40485/"\n  },\n  "state" : "LAUNCHING",\n  "originalConfigurationPath" : "hdfs://prodcluster/user/hive/.slider/cluster/llap0/snapshot",\n  "generatedConfigurationPath" : "hdfs://prodcluster/user/hive/.slider/cluster/llap0/generated",\n  "desiredInstances" : 1,\n  "liveInstances" : 0,\n  "appStartTime" : 1473269583908\n}', 'which: no hbase in (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent)\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/hdp/2.5.0.0-1245/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nSLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]\nINFO cli.LlapStatusServiceDriver: LLAP status invoked with arguments = --hiveconf\nINFO conf.HiveConf: Found configuration file file:/etc/hive2/2.5.0.0-1245/0/conf.server/hive-site.xml\nWARN conf.HiveConf: HiveConf of name hive.llap.daemon.allow.permanent.fns does not exist\nINFO impl.TimelineClientImpl: Timeline service address: http://hdp-nn1.co.vectis.local:8188/ws/v1/timeline/\nINFO client.AHSProxy: Connecting to Application History server at hdp-nn1.co.vectis.local/10.255.242.180:10200\nWARN curator.CuratorZookeeperClient: session timeout [10000] is less than connection timeout [15000]\nINFO 
impl.LlapZookeeperRegistryImpl: Llap Zookeeper Registry is enabled with registryid: llap0\nINFO impl.LlapRegistryService: Using LLAP registry type org.apache.hadoop.hive.llap.registry.impl.LlapZookeeperRegistryImpl@4e6f2bb5\nINFO impl.LlapZookeeperRegistryImpl: UGI security is not enabled, or non-daemon environment. Skipping setting up ZK auth.\nINFO imps.CuratorFrameworkImpl: Starting\nINFO impl.LlapRegistryService: Using LLAP registry (client) type: Service LlapRegistryService in state LlapRegistryService: STARTED\nINFO state.ConnectionStateManager: State change: CONNECTED\nINFO cli.LlapStatusServiceDriver: No information found in the LLAP registry\nINFO cli.LlapStatusServiceDriver: LLAP status finished')
2016-09-07 20:33:15,295 - Received 'llapstatus' command 'output' : {
  "amInfo" : {
    "appName" : "llap0",
    "appType" : "org-apache-slider",
    "appId" : "application_1473264739795_0004",
    "containerId" : "container_e12_1473264739795_0004_01_000001",
    "hostname" : "hdp-dn2.co.vectis.local",
    "amWebUrl" : "http://hdp-dn2.co.vectis.local:40485/"
  },
  "state" : "LAUNCHING",
  "originalConfigurationPath" : "hdfs://prodcluster/user/hive/.slider/cluster/llap0/snapshot",
  "generatedConfigurationPath" : "hdfs://prodcluster/user/hive/.slider/cluster/llap0/generated",
  "desiredInstances" : 1,
  "liveInstances" : 0,
  "appStartTime" : 1473269583908
}
1 ACCEPTED SOLUTION

Rising Star

Solved it! The problem was with the parameters:

hive.llap.daemon.yarn.container.mb 
llap_heap_size 

Ambari sets the default value of llap_heap_size to about 96% of hive.llap.daemon.yarn.container.mb (when I move the "% of Cluster Capacity" slider), although it should be about 80%. Manually setting the correct values allowed HiveServer2 Interactive to start.
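As a sanity check, the ratio between the two values can be verified with a short shell sketch. The container size below is illustrative (it matches the --size 30720m seen in the LLAP start command earlier in this thread), and the ~80% figure is the rule of thumb described above, not an official formula:

```shell
#!/bin/sh
# Illustrative check: llap_heap_size should be roughly 80% of
# hive.llap.daemon.yarn.container.mb, not the ~96% Ambari defaulted to.
container_mb=30720                        # hive.llap.daemon.yarn.container.mb
heap_mb=$(( container_mb * 80 / 100 ))    # suggested llap_heap_size
echo "container:   ${container_mb} MB"
echo "heap (~80%): ${heap_mb} MB"
```

If the value Ambari computed is much closer to the container size than this, that matches the failure mode described in this thread.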

View solution in original post

12 REPLIES

Rising Star

Solved it! The problem was with the parameters:

hive.llap.daemon.yarn.container.mb 
llap_heap_size 

Ambari sets the default value of llap_heap_size to about 96% of hive.llap.daemon.yarn.container.mb (when I move the "% of Cluster Capacity" slider), although it should be about 80%. Manually setting the correct values allowed HiveServer2 Interactive to start.

Rising Star

Hi @Alena Melnikova, how did you solve it? It started once for me, but every other attempt failed with the same

"current state is LAUNCHING"

Expert Contributor

@Alena Melnikova, @Huahua Wei - I'm getting the same issue. On my cluster, llap_heap_size is ~80% of hive.llap.daemon.yarn.container.mb, but HiveServer2 Interactive is not starting up. Any ideas on what else needs to be done?

------error ---------

    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 123, in start
    raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

----------------------

hiveserver2-interactive-notstartingup-0419.pdf

New Contributor

You can try updating your OpenSSL.

New Contributor

The value of hive.llap.daemon.yarn.container.mb can be found in /etc/hive2/conf/hive-site.xml, or in Ambari under Hive -> Config -> Settings -> Interactive Query -> "Memory per daemon". The "LLAP heap size" can be modified in Ambari under the "Advanced hive-interactive-env" section.
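For a quick look outside the Ambari UI, the current value can also be pulled from hive-site.xml with standard tools. This is a rough sketch; the grep/sed pattern assumes the usual one-line `<name>`/`<value>` layout of Hadoop-style config files, and the path is the one mentioned above:

```shell
#!/bin/sh
# Sketch: print hive.llap.daemon.yarn.container.mb from hive-site.xml.
# Assumes <name> and <value> each sit on their own line, as Ambari writes them.
HIVE_SITE=/etc/hive2/conf/hive-site.xml
grep -A1 'hive.llap.daemon.yarn.container.mb' "$HIVE_SITE" \
  | sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
```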

Explorer

I'm also having the same problem.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 512, in check_llap_app_status
    status = do_retries()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/decorator.py", line 55, in wrapper
    return function(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 505, in do_retries
    raise Fail(status_str)
Fail: LLAP app 'llap0' current state is LAUNCHING.
2016-10-14 08:16:01,061 - LLAP app 'llap0' deployment unsuccessful.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 535, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 123, in start
    raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

New Contributor

If nothing else worked, you can try updating your OpenSSL.

New Contributor

I have updated OpenSSL and made the changes to both parameters below. It is still not working. Does anyone have any other ideas on getting Hive Interactive with LLAP working?

  1. hive.llap.daemon.yarn.container.mb
  2. llap_heap_size

I have the same issue. Can anyone advise?

Super Collaborator

I get a different error when I try to start it:

2018-05-17 03:04:51,409 - Removing directory Directory['/var/lib/ambari-agent/tmp/llap-slider2018-05-17_07-14-19'] and all its content
2018-05-17 03:04:51,443 - Starting LLAP
2018-05-17 03:04:51,444 - Setting slider_placement: 4, as llap_daemon_container_size : 50000 <= 0.5 * YARN NodeManager Memory(101376)
2018-05-17 03:04:51,446 - LLAP start command: /usr/hdp/current/hive-server2-hive2/bin/hive --service llap --slider-am-container-mb 6656 --size 50000m --cache 15360m --xmx 20000m --loglevel INFO  --output /var/lib/ambari-agent/tmp/llap-slider2018-05-17_08-04-51 --slider-placement 4 --skiphadoopversion --skiphbasecp --instances 3 --logger query-routing --args " -XX:+AlwaysPreTouch -Xss512k -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:InitiatingHeapOccupancyPercent=40 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=200 -XX:MetaspaceSize=1024m"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value


WARN cli.LlapStatusServiceDriver: Failed to parse application diagnostics from Yarn Diagnostics - Application application_1526542647534_0001 failed 2 times in previous 300000 milliseconds due to AM Container for appattempt_1526542647534_0001_000002 exited with  exitCode: -1000
For more detailed output, check the application tracking page: http://lbkbd1.liberbank.cloud:8088/cluster/app/application_1526542647534_0001 Then click on links to logs of each attempt.
Diagnostics: Not able to initialize app directories in any of the configured local directories for app application_1526542647534_0001
Failing this attempt. Failing the application.
WARN cli.LlapStatusServiceDriver: AppDiagnostics not available for YARN application report
ERROR types.ApplicationDiagnostics: Exception while parsing json : org.codehaus.jackson.JsonParseException: Unexpected character ('A' (code 65)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@659925f4; line: 1, column: 2]
Application application_1526542647534_0001 failed 2 times in previous 300000 milliseconds due to AM Container for appattempt_1526542647534_0001_000002 exited with  exitCode: -1000
For more detailed output, check the application tracking page: http://lbkbd1.liberbank.cloud:8088/cluster/app/application_1526542647534_0001 Then click on links to logs of each attempt.
Diagnostics: Not able to initialize app directories in any of the configured local directories for app application_1526542647534_0001
Failing this attempt. Failing the application.
org.codehaus.jackson.JsonParseException: Unexpected character ('A' (code 65)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@659925f4; line: 1, column: 2]
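For what it's worth, the "Not able to initialize app directories" diagnostic above generally points at the NodeManager local directories (yarn.nodemanager.local-dirs) rather than at LLAP itself. A purely illustrative check of their existence and ownership follows; /hadoop/yarn/local is a common HDP default, not a path confirmed anywhere in this thread:

```shell
#!/bin/sh
# Illustrative only: inspect the NodeManager local dirs referenced by
# yarn.nodemanager.local-dirs. Substitute the value from your yarn-site.xml.
for d in /hadoop/yarn/local; do
  if [ -d "$d" ]; then
    ls -ld "$d" "$d/usercache"    # should exist and be writable by the yarn user
  else
    echo "missing: $d"
  fi
done
```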


HiveServer2 Interactive Service is still not running for me.

I have done everything possible. Can anyone step in and help me and the gentlemen above solve this irritating issue?

Having following error:

stderr: /var/lib/ambari-agent/data/errors-249.txt

/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py:537: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
  Logger.info(e.message)
2018-07-17 10:09:06,078 - LLAP app 'llap0' deployment unsuccessful.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 612, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 119, in start
    raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

stdout: /var/lib/ambari-agent/data/output-249.txt

2018-07-17 10:01:11,976 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:11,986 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,255 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:12,258 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,260 - Group['livy'] {}
2018-07-17 10:01:12,262 - Group['spark'] {}
2018-07-17 10:01:12,262 - Group['hdfs'] {}
2018-07-17 10:01:12,263 - Group['hadoop'] {}
2018-07-17 10:01:12,263 - Group['users'] {}
2018-07-17 10:01:12,264 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,265 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,266 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,267 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,269 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,270 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-07-17 10:01:12,271 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,272 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,274 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,275 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-07-17 10:01:12,276 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,277 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-07-17 10:01:12,279 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,280 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,281 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,283 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-17 10:01:12,283 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-07-17 10:01:12,293 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-07-17 10:01:12,328 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-07-17 10:01:12,328 - Group['hdfs'] {}
2018-07-17 10:01:12,329 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-07-17 10:01:12,330 - FS Type: 
2018-07-17 10:01:12,330 - Directory['/etc/hadoop'] {'mode': 0755}
2018-07-17 10:01:12,368 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,377 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-07-17 10:01:12,411 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-07-17 10:01:12,453 - Skipping Execute[('setenforce', '0')] due to not_if
2018-07-17 10:01:12,453 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-07-17 10:01:12,456 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-07-17 10:01:12,456 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-07-17 10:01:12,464 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2018-07-17 10:01:12,477 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2018-07-17 10:01:12,493 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:12,527 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,528 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-07-17 10:01:12,529 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-07-17 10:01:12,544 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:12,566 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-07-17 10:01:12,958 - MariaDB RedHat Support: false
2018-07-17 10:01:12,962 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-07-17 10:01:12,966 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-07-17 10:01:12,990 - call returned (0, 'hive-server2 - 2.6.3.0-235')
2018-07-17 10:01:12,991 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:12,998 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://hdp-1-nn.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-07-17 10:01:12,999 - Not downloading the file from http://hdp-1-nn.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-07-17 10:01:13,000 - checked_call[('/usr/java/jdk/bin/java', '-cp', '/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', 'jceks://file/var/lib/ambari-agent/cred/conf/hive_server_interactive/hive-site.jceks')] {}
2018-07-17 10:01:14,969 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nJul 17, 2018 10:01:14 AM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nwelcome1')
2018-07-17 10:01:14,982 - HdfsResource['/apps/hive/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0777}
2018-07-17 10:01:14,985 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/apps/hive/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpDNliSV 2>/tmp/tmp0ylQls''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,015 - call returned (0, '')
2018-07-17 10:01:15,015 - Skipping the operation for not managed DFS directory /apps/hive/warehouse since immutable_paths contains it.
2018-07-17 10:01:15,016 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2018-07-17 10:01:15,017 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp_uwRiQ 2>/tmp/tmpPTqVUr''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,048 - call returned (0, '')
2018-07-17 10:01:15,050 - Called copy_to_hdfs tarball: tez_hive2
2018-07-17 10:01:15,050 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2018-07-17 10:01:15,050 - Tarball version was calcuated as 2.6.3.0-235. Use Command Version: True
2018-07-17 10:01:15,050 - Source file: /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz
2018-07-17 10:01:15,051 - HdfsResource['/hdp/apps/2.6.3.0-235/tez_hive2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-07-17 10:01:15,052 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/hdp/apps/2.6.3.0-235/tez_hive2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpQSzSgR 2>/tmp/tmp4LPDgH''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,083 - call returned (0, '')
2018-07-17 10:01:15,084 - HdfsResource['/hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-07-17 10:01:15,085 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://hdp-1-nn.com:50070/webhdfs/v1/hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpce83lX 2>/tmp/tmpbTyAnn''] {'logoutput': None, 'quiet': False}
2018-07-17 10:01:15,112 - call returned (0, '')
2018-07-17 10:01:15,113 - DFS file /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz is identical to /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz, skipping the copying
2018-07-17 10:01:15,113 - Will attempt to copy tez_hive2 tarball from /usr/hdp/2.6.3.0-235/tez_hive2/lib/tez.tar.gz to DFS at /hdp/apps/2.6.3.0-235/tez_hive2/tez.tar.gz.
2018-07-17 10:01:15,114 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://hdp-1-nn.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2018-07-17 10:01:15,114 - Directory['/etc/hive2'] {'mode': 0755}
2018-07-17 10:01:15,114 - Directories to fill with configs: ['/usr/hdp/current/hive-server2-hive2/conf', '/usr/hdp/current/hive-server2-hive2/conf/conf.server']
2018-07-17 10:01:15,116 - Directory['/etc/hive2/2.6.3.0-235/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-07-17 10:01:15,117 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.3.0-235/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,134 - Generating config: /etc/hive2/2.6.3.0-235/0/mapred-site.xml
2018-07-17 10:01:15,134 - File['/etc/hive2/2.6.3.0-235/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,199 - Writing File['/etc/hive2/2.6.3.0-235/0/mapred-site.xml'] because contents don't match
2018-07-17 10:01:15,200 - File['/etc/hive2/2.6.3.0-235/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,201 - File['/etc/hive2/2.6.3.0-235/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,201 - File['/etc/hive2/2.6.3.0-235/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,202 - Directory['/etc/hive2/2.6.3.0-235/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
2018-07-17 10:01:15,203 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.3.0-235/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,215 - Generating config: /etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml
2018-07-17 10:01:15,215 - File['/etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,271 - Writing File['/etc/hive2/2.6.3.0-235/0/conf.server/mapred-site.xml'] because contents don't match
2018-07-17 10:01:15,272 - File['/etc/hive2/2.6.3.0-235/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,273 - File['/etc/hive2/2.6.3.0-235/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,273 - File['/etc/hive2/2.6.3.0-235/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:15,274 - Converted 'hive.llap.io.memory.size' value from '3072 MB' to '3221225472 Bytes' before writing it to config file.
2018-07-17 10:01:15,274 - Skipping setup for Atlas Hook, as it is disabled/ not supported.
2018-07-17 10:01:15,274 - No change done to Hive2/hive-site.xml 'hive.exec.post.hooks' value.
2018-07-17 10:01:15,274 - Retrieved 'tez/tez-site' for merging with 'tez_hive2/tez-interactive-site'.
2018-07-17 10:01:15,274 - XmlConfig['tez-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/tez_hive2/conf', 'mode': 0664, 'configuration_attributes': {'final': {'tez.runtime.shuffle.ssl.enable': 'true'}}, 'owner': 'tez', 'configurations': ...}
2018-07-17 10:01:15,287 - Generating config: /etc/tez_hive2/conf/tez-site.xml
2018-07-17 10:01:15,287 - File['/etc/tez_hive2/conf/tez-site.xml'] {'owner': 'tez', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,413 - Retrieved 'hiveserver2-site' for merging with 'hiveserver2-interactive-site'.
2018-07-17 10:01:15,414 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,426 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hive-site.xml
2018-07-17 10:01:15,426 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,685 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.reporter': 'HADOOP2', 'hive.metastore.metrics.enabled': 'true', 'hive.security.authorization.enabled': 'false', 'hive.service.metrics.hadoop2.component': 'hiveserver2', 'hive.async.log.enabled': 'false'}}
2018-07-17 10:01:15,697 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml
2018-07-17 10:01:15,697 - File['/usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-07-17 10:01:15,707 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,713 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,717 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,722 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,725 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,728 - File['/usr/hdp/current/hive-server2-hive2/conf/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,735 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,741 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,746 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-07-17 10:01:15,748 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_server_interactive/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-07-17 10:01:15,749 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] because contents don't match
2018-07-17 10:01:15,750 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-07-17 10:01:15,761 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml
2018-07-17 10:01:15,761 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:16,028 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.reporter': 'HADOOP2', 'hive.metastore.metrics.enabled': 'true', 'hive.security.authorization.enabled': 'false', 'hive.service.metrics.hadoop2.component': 'hiveserver2', 'hive.async.log.enabled': 'false'}}
2018-07-17 10:01:16,039 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml
2018-07-17 10:01:16,039 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-07-17 10:01:16,049 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,055 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,059 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,063 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,066 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,069 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,076 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,081 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,087 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-07-17 10:01:16,088 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-07-17 10:01:16,090 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-07-17 10:01:16,091 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://hdp-1-nn.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-07-17 10:01:16,092 - Not downloading the file from http://hdp-1-nn.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-07-17 10:01:16,094 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_interactive_script'] {'content': Template('startHiveserver2Interactive.sh.j2'), 'mode': 0755}
2018-07-17 10:01:16,095 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,096 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,096 - Directory['/var/lib/hive2'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-07-17 10:01:16,099 - Determining previous run 'LLAP package' folder(s) to be deleted ....
2018-07-17 10:01:16,099 - Previous run 'LLAP package' folder(s) to be deleted = ['llap-slider2018-07-17_02-41-57']
2018-07-17 10:01:16,099 - Directory['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_02-41-57'] {'action': ['delete'], 'ignore_failures': True}
2018-07-17 10:01:16,100 - Removing directory Directory['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_02-41-57'] and all its content
2018-07-17 10:01:16,125 - Starting LLAP
2018-07-17 10:01:16,125 - Setting slider_placement : 0, as llap_daemon_container_size : 9035 > 0.5 * YARN NodeManager Memory(18065)
2018-07-17 10:01:16,129 - LLAP start command: /usr/hdp/current/hive-server2-hive2/bin/hive --service llap --slider-am-container-mb 1024 --size 9035m --cache 3072m --xmx 2457m --loglevel INFO  --output /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16 --slider-placement 0 --skiphadoopversion --skiphbasecp --instances 1 --logger query-routing --args " -XX:+AlwaysPreTouch -Xss512k -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:InitiatingHeapOccupancyPercent=40 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=200 -XX:MetaspaceSize=1024m"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]
WARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/java/jdk],process jre=[/usr/java/jdk/jre]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
Prepared /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh for running LLAP on Slider
2018-07-17 10:01:39,078 - Run file path: /var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh
2018-07-17 10:01:39,079 - Execute['/var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/run.sh'] {'logoutput': True, 'user': 'hive'}
2018-07-17 10:01:43,244 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:43,264 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:43,484 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:43,637 [main] ERROR main.ServiceLauncher - Unknown application instance : llap0
 (definition not found at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/app_config.json
2018-07-17 10:01:43,639 [main] INFO  util.ExitUtil - Exiting with status 69
2018-07-17 10:01:47,557 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:47,580 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:47,760 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:47,867 [main] ERROR main.ServiceLauncher - Unknown application instance : llap0
 (definition not found at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/app_config.json
2018-07-17 10:01:47,869 [main] INFO  util.ExitUtil - Exiting with status 69
2018-07-17 10:01:52,078 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:52,099 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:52,292 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:52,526 [main] INFO  imps.CuratorFrameworkImpl - Starting
2018-07-17 10:01:52,589 [main-EventThread] INFO  state.ConnectionStateManager - State change: CONNECTED
2018-07-17 10:01:52,615 [main] INFO  client.SliderClient - Destroyed cluster llap0
2018-07-17 10:01:52,618 [main] INFO  util.ExitUtil - Exiting with status 0
2018-07-17 10:01:56,788 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:01:56,805 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:01:57,017 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:01:57,174 [main] INFO  client.SliderClient - Installing package file:/var/lib/ambari-agent/tmp/llap-slider2018-07-17_05-01-16/llap-17Jul2018.zip to hdfs://hdp-1-nn.com:8020/user/hive/.slider/package/LLAP/llap-17Jul2018.zip (overwrite set to true)
2018-07-17 10:01:59,249 [main] INFO  tools.SliderUtils - Reading metainfo.xml of size 1998
2018-07-17 10:01:59,251 [main] INFO  client.SliderClient - Found XML metainfo file in package
2018-07-17 10:01:59,263 [main] INFO  client.SliderClient - Creating summary metainfo file
2018-07-17 10:01:59,297 [main] INFO  client.SliderClient - Set application.def in your app config JSON to .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:01:59,298 [main] INFO  util.ExitUtil - Exiting with status 0
2018-07-17 10:02:02,707 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:02:02,736 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:02:03,110 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:02:04,458 [main] INFO  agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:04,464 [main] INFO  agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:04,971 [main] INFO  agent.AgentUtils - Got metainfo from summary file
2018-07-17 10:02:05,078 [main] INFO  client.SliderClient - No credentials requested
2018-07-17 10:02:05,288 [main] INFO  agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:05,309 [main] INFO  agent.AgentUtils - Got metainfo from summary file
2018-07-17 10:02:05,401 [main] INFO  launch.AbstractLauncher - Setting yarn.resourcemanager.am.retry-count-window-ms to 300000
2018-07-17 10:02:05,406 [main] INFO  launch.AbstractLauncher - Log include patterns: .*\.done
2018-07-17 10:02:05,409 [main] INFO  launch.AbstractLauncher - Log exclude patterns: 
2018-07-17 10:02:05,410 [main] INFO  launch.AbstractLauncher - Modified log include patterns: .*\.done
2018-07-17 10:02:05,410 [main] INFO  launch.AbstractLauncher - Modified log exclude patterns: 
2018-07-17 10:02:05,770 [main] INFO  slideram.SliderAMClientProvider - Loading all dependencies for AM.
2018-07-17 10:02:05,772 [main] INFO  tools.CoreFileSystem - Loading all dependencies from /hdp/apps/2.6.3.0-235/slider/slider.tar.gz
2018-07-17 10:02:05,775 [main] INFO  agent.AgentClientProvider - Automatically uploading the agent tarball at hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/tmp/application_1531803503758_0001/agent
2018-07-17 10:02:05,875 [main] INFO  agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-17Jul2018.zip
2018-07-17 10:02:05,915 [main] INFO  client.SliderClient - Using queue llap for the application instance.
2018-07-17 10:02:05,915 [main] INFO  client.SliderClient - Submitting application application_1531803503758_0001
2018-07-17 10:02:05,918 [main] INFO  launch.AppMasterLauncher - Submitting application to Resource Manager
2018-07-17 10:02:06,367 [main] INFO  impl.YarnClientImpl - Submitted application application_1531803503758_0001
2018-07-17 10:02:06,370 [main] INFO  util.ExitUtil - Exiting with status 0
2018-07-17 10:02:06,564 - Submitted LLAP app name : llap0
2018-07-17 10:02:06,564 - 
2018-07-17 10:02:06,565 - LLAP status command : /usr/hdp/current/hive-server2-hive2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 400
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value

LLAPSTATUS WatchMode with timeout=400 s
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001.
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001. Started 0/1 instances
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001. Started 0/1 instances
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001. Started 0/1 instances
--------------------------------------------------------------------------------
[... the identical "Started 0/1 instances" status line repeats for the rest of the 400 s watch window ...]
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1531803503758_0001. Started 0/1 instances
--------------------------------------------------------------------------------
{
  "amInfo" : {
    "appName" : "llap0",
    "appType" : "org-apache-slider",
    "appId" : "application_1531803503758_0001",
    "containerId" : "container_e05_1531803503758_0001_01_000001",
    "hostname" : "hdp-3-dn1.com",
    "amWebUrl" : "http://hdp-3-dn1.com:45298/"
  },
  "state" : "LAUNCHING",
  "originalConfigurationPath" : "hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/snapshot",
  "generatedConfigurationPath" : "hdfs://hdp-1-nn.com:8020/user/hive/.slider/cluster/llap0/generated",
  "desiredInstances" : 1,
  "liveInstances" : 0,
  "appStartTime" : 1531803742224,
  "runningThresholdAchieved" : false
}
WARN cli.LlapStatusServiceDriver: Watch timeout 400s exhausted before desired state RUNNING is attained.
2018-07-17 10:09:06,078 - LLAP app 'llap0' current state is LAUNCHING.
2018-07-17 10:09:06,078 - LLAP app 'llap0' current state is LAUNCHING.
2018-07-17 10:09:06,078 - LLAP app 'llap0' deployment unsuccessful.
2018-07-17 10:09:06,078 - Stopping LLAP
2018-07-17 10:09:06,079 - call[['slider', 'stop', 'llap0']] {'logoutput': True, 'user': 'hive', 'stderr': -1}
2018-07-17 10:09:09,551 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2018-07-17 10:09:09,567 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050
2018-07-17 10:09:09,770 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200
2018-07-17 10:09:10,176 [main] INFO  util.ExitUtil - Exiting with status 0
2018-07-17 10:09:11,164 - call returned (0, '2018-07-17 10:09:09,551 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.\n2018-07-17 10:09:09,567 [main] INFO  client.RMProxy - Connecting to ResourceManager at hdp-1-nn.com/192.168.100.5:8050\n2018-07-17 10:09:09,770 [main] INFO  client.AHSProxy - Connecting to Application History server at hdp-1-nn.com/192.168.100.5:10200\n2018-07-17 10:09:10,176 [main] INFO  util.ExitUtil - Exiting with status 0', '')
2018-07-17 10:09:11,165 - Stopped llap0 application on Slider successfully
2018-07-17 10:09:11,165 - Execute[('slider', 'destroy', 'llap0', '--force')] {'ignore_failures': True, 'user': 'hive', 'timeout': 30}

Command failed after 1 tries
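
One line in the stdout above is worth decoding: `Setting slider_placement : 0, as llap_daemon_container_size : 9035 > 0.5 * YARN NodeManager Memory(18065)`. A rough sketch of the check behind it (illustrative only, not Ambari's source; the fallback value of 4 is my assumption based on Slider's anti-affinity placement policy):

```python
# Sketch of the slider placement decision logged above: when one LLAP
# daemon needs more than half of a NodeManager's memory, anti-affinity
# placement is dropped (placement 0); otherwise Slider's anti-affinity
# policy (placement 4) is used. Illustrative only, not Ambari's code.

def slider_placement(daemon_container_mb, nm_memory_mb):
    return 0 if daemon_container_mb > 0.5 * nm_memory_mb else 4

# Values from the log: a 9035 MB daemon vs. an 18065 MB NodeManager.
print(slider_placement(9035, 18065))  # -> 0 (9035 > 9032.5)
```

With placement 0 the daemon can land anywhere, but it still has to fit inside the target YARN queue.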

-----------------------------------------------------------------------------------------

Current Hortonworks Multinode Cluster Status

VM Specs

  • VM#1: Active NameNode (32 GB RAM, 2 CPUs)
  • VM#2: Standby NameNode (12 GB RAM, 1 CPU)
  • VM#3: DataNode (12 GB RAM, 1 CPU)

Other details:

  • OS: Linux 6.5
  • HDP 2.6.3 + Ambari 2.6.0.0
  • HDF 3.0.2 (only NiFi with min 3 GB and max 4 GB, No SSL)

---------------------------------------------------------------------------------------------------

For LLAP, I did the following:

  • Pre-emption = Enabled
  • Capacity Scheduler:
    • default: min 50% and max 50%
    • Added a new queue: llap with min 50% and max 50%
  • Memory allocated for all YARN containers on a node = 12 GB
  • Minimum Container Size (Memory) = 1 GB
  • Maximum Container Size (Memory) = 12 GB
  • Tez Container Size = 3 GB
  • HiveServer2 Heap Size = 2 GB
  • Metastore Heap Size= 2 GB
  • Client Heap Size = 1 GB
  • Enabled LLAP
    • Interactive Query Queue = llap
    • Number of nodes used by Hive's LLAP = 1
    • Maximum Total Concurrent Queries = 1
    • Memory per Daemon = 10240 MB
    • In-Memory Cache per Daemon = 7168 MB
    • Number of executors per LLAP Daemon = 1

  • Installed LLAP on the Active NameNode, since that was the default host
  • HiveServer2 Interactive = failed
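
A back-of-the-envelope check against the settings above suggests the `llap` queue may simply be too small for what the start log requests: a 9035 MB daemon container plus a 1024 MB Slider AM, while a 50% queue on a node with 12 GB of YARN memory caps out around 6 GB. A hedged sketch of that arithmetic (the single-NodeManager assumption and the ~512 MB Tez AM figure are mine, not from the post):

```python
# Back-of-the-envelope check: can the 'llap' queue host the Slider AM
# plus one LLAP daemon container? Numbers come from the settings above
# and the start log; this is an illustration, not Ambari's logic.

def llap_queue_fits(nm_memory_mb, num_node_managers, queue_capacity_pct,
                    daemon_container_mb, slider_am_mb, tez_am_mb):
    available_mb = nm_memory_mb * num_node_managers * queue_capacity_pct / 100.0
    needed_mb = daemon_container_mb + slider_am_mb + tez_am_mb
    return available_mb >= needed_mb, available_mb, needed_mb

# 12 GB of YARN memory per node, assuming one NodeManager, llap queue
# at 50%, 9035 MB daemon + 1024 MB Slider AM (both from the log), plus
# an assumed ~512 MB Tez AM for the interactive session.
fits, avail, need = llap_queue_fits(12288, 1, 50, 9035, 1024, 512)
print(fits, avail, need)  # -> False 6144.0 10571
```

If this arithmetic holds, the app would sit in LAUNCHING exactly as the log shows, because YARN can never grant the daemon container.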

------------------------------------

Not sure what I am missing. It looks like I have covered most of the steps, but I must be overlooking something.

Looking forward to a solution.

Cheers.....

Cloudera Employee

Hi, please refer to this article on LLAP sizing and setup:

https://community.hortonworks.com/articles/149486/llap-sizing-and-setup.html
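
The core rule of thumb in that article is that a daemon container splits into executor heap (Xmx), off-heap cache, and JVM headroom. A hedged sketch of that split applied to the 10240 MB / 7168 MB settings from the question (the 6% headroom figure is an approximation, not Ambari's exact formula):

```python
# Rule-of-thumb LLAP daemon memory split: container = Xmx + cache + headroom.
# Approximation for illustration only; Ambari computes these values itself.

def llap_daemon_layout(container_mb, cache_mb, headroom_fraction=0.06):
    headroom_mb = max(1024, int(container_mb * headroom_fraction))
    xmx_mb = container_mb - cache_mb - headroom_mb
    return {"container": container_mb, "cache": cache_mb,
            "headroom": headroom_mb, "xmx": xmx_mb}

# With a 7 GB cache inside a 10 GB daemon, only about 2 GB of heap is
# left for the executors, which is likely too tight.
print(llap_daemon_layout(10240, 7168))  # xmx -> 2048
```

Shrinking the in-memory cache (or growing the daemon) frees heap for executors, which is usually the first adjustment the sizing guide recommends.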
