
HDP310 Spark2 Thrift Server failed (Kerberos)

On an HDP 3.1.0 cluster with Kerberos enabled, the Spark2 Thrift Server fails to start from Ambari. The full Ambari operation log for the start command is below: the Kerberos and HDFS preparation steps all succeed, but the Thrift Server process exits right after launch.

2020-03-29 01:12:29,077 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2020-03-29 01:12:29,091 - Directory['/var/run/spark2'] {'owner': 'spark', 'create_parents': True, 'group': 'hadoop', 'mode': 0775}
2020-03-29 01:12:29,092 - Directory['/var/log/spark2'] {'owner': 'spark', 'group': 'hadoop', 'create_parents': True, 'mode': 0775}
2020-03-29 01:12:29,093 - Directory['/var/lib/spark2'] {'owner': 'spark', 'group': 'hadoop', 'create_parents': True, 'mode': 0775}
2020-03-29 01:12:29,093 - Directory['/var/lib/spark2/shs_db'] {'owner': 'spark', 'group': 'hadoop', 'create_parents': True, 'mode': 0775}
2020-03-29 01:12:29,094 - HdfsResource['/user/spark'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mycluster', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-phsker@REALM', 'user': 'hdfs', 'owner': 'spark', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'hdfs://mycluster/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0775}
2020-03-29 01:12:29,095 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-phsker@REALM'] {'user': 'hdfs'}
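
The kinit above uses the hdfs headless keytab and returns cleanly, so the hdfs principal is not the problem. The same check can be re-run by hand; the keytab path and principal are the ones shown in the log, and REALM is the sanitized realm name:

# list the principals stored in the headless keytab
klist -kt /etc/security/keytabs/hdfs.headless.keytab
# obtain and display a ticket as the hdfs user
su - hdfs -c 'kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-phsker@REALM && klist'
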
2020-03-29 01:12:29,173 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://test-296.local:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpIoCWcQ 2>/tmp/tmpJTiwOG''] {'quiet': False}
2020-03-29 01:12:29,261 - call returned (0, '')
2020-03-29 01:12:29,262 - get_user_call_output returned (0, u'{\n "beans" : [ {\n "name" : "Hadoop:service=NameNode,name=FSNamesystem",\n "modelerType" : "FSNamesystem",\n "tag.Context" : "dfs",\n "tag.HAState" : "active",\n "tag.TotalSyncTimes" : "239 25 ",\n "tag.Hostname" : "test-296.local",\n "MissingBlocks" : 0,\n "MissingReplOneBlocks" : 0,\n "ExpiredHeartbeats" : 0,\n "TransactionsSinceLastCheckpoint" : 5963,\n "TransactionsSinceLastLogRoll" : 37,\n "LastWrittenTransactionId" : 5963,\n "LastCheckpointTime" : 1585465293000,\n "CapacityTotal" : 155640135680,\n "CapacityTotalGB" : 145.0,\n "CapacityUsed" : 8745466874,\n "CapacityUsedGB" : 8.0,\n "CapacityRemaining" : 96322375972,\n "ProvidedCapacityTotal" : 0,\n "CapacityRemainingGB" : 90.0,\n "CapacityUsedNonDFS" : 44881605638,\n "TotalLoad" : 48,\n "SnapshottableDirectories" : 0,\n "Snapshots" : 0,\n "NumEncryptionZones" : 0,\n "LockQueueLength" : 0,\n "BlocksTotal" : 272,\n "NumFilesUnderConstruction" : 12,\n "NumActiveClients" : 7,\n "FilesTotal" : 1143,\n "PendingReplicationBlocks" : 0,\n "PendingReconstructionBlocks" : 0,\n "UnderReplicatedBlocks" : 0,\n "LowRedundancyBlocks" : 0,\n "CorruptBlocks" : 0,\n "ScheduledReplicationBlocks" : 0,\n "PendingDeletionBlocks" : 138,\n "LowRedundancyReplicatedBlocks" : 0,\n "CorruptReplicatedBlocks" : 0,\n "MissingReplicatedBlocks" : 0,\n "MissingReplicationOneBlocks" : 0,\n "HighestPriorityLowRedundancyReplicatedBlocks" : 0,\n "HighestPriorityLowRedundancyECBlocks" : 0,\n "BytesInFutureReplicatedBlocks" : 0,\n "PendingDeletionReplicatedBlocks" : 138,\n "TotalReplicatedBlocks" : 272,\n "LowRedundancyECBlockGroups" : 0,\n "CorruptECBlockGroups" : 0,\n "MissingECBlockGroups" : 0,\n "BytesInFutureECBlockGroups" : 0,\n "PendingDeletionECBlocks" : 0,\n "TotalECBlockGroups" : 0,\n "ExcessBlocks" : 58,\n "NumTimedOutPendingReconstructions" : 24,\n "PostponedMisreplicatedBlocks" : 0,\n "PendingDataNodeMessageCount" : 0,\n "MillisSinceLastLoadedEdits" : 0,\n "BlockCapacity" : 2097152,\n "NumLiveDataNodes" : 5,\n "NumDeadDataNodes" : 0,\n "NumDecomLiveDataNodes" : 0,\n "NumDecomDeadDataNodes" : 0,\n "VolumeFailuresTotal" : 0,\n "EstimatedCapacityLostTotal" : 0,\n "NumDecommissioningDataNodes" : 0,\n "StaleDataNodes" : 0,\n "NumStaleStorages" : 0,\n "TotalSyncCount" : 37,\n "NumInMaintenanceLiveDataNodes" : 0,\n "NumInMaintenanceDeadDataNodes" : 0,\n "NumEnteringMaintenanceDataNodes" : 0\n } ]\n}', u'')
2020-03-29 01:12:29,263 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://test-297.local:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpMQrtQI 2>/tmp/tmpqryck8''] {'quiet': False}
2020-03-29 01:12:29,349 - call returned (0, '')
2020-03-29 01:12:29,350 - get_user_call_output returned (0, u'{\n "beans" : [ {\n "name" : "Hadoop:service=NameNode,name=FSNamesystem",\n "modelerType" : "FSNamesystem",\n "tag.Context" : "dfs",\n "tag.HAState" : "standby",\n "tag.TotalSyncTimes" : "",\n "tag.Hostname" : "test-297.local",\n "MissingBlocks" : 0,\n "MissingReplOneBlocks" : 0,\n "ExpiredHeartbeats" : 0,\n "TransactionsSinceLastCheckpoint" : 5926,\n "TransactionsSinceLastLogRoll" : 0,\n "LastWrittenTransactionId" : 5179,\n "LastCheckpointTime" : 1585465300000,\n "CapacityTotal" : 155640135680,\n "CapacityTotalGB" : 145.0,\n "CapacityUsed" : 8745466874,\n "CapacityUsedGB" : 8.0,\n "CapacityRemaining" : 96322400548,\n "ProvidedCapacityTotal" : 0,\n "CapacityRemainingGB" : 90.0,\n "CapacityUsedNonDFS" : 44881581062,\n "TotalLoad" : 48,\n "SnapshottableDirectories" : 0,\n "Snapshots" : 0,\n "NumEncryptionZones" : 0,\n "LockQueueLength" : 0,\n "BlocksTotal" : 272,\n "NumFilesUnderConstruction" : 12,\n "NumActiveClients" : 7,\n "FilesTotal" : 1143,\n "PendingReplicationBlocks" : 0,\n "PendingReconstructionBlocks" : 0,\n "UnderReplicatedBlocks" : 0,\n "LowRedundancyBlocks" : 0,\n "CorruptBlocks" : 0,\n "ScheduledReplicationBlocks" : 0,\n "PendingDeletionBlocks" : 0,\n "LowRedundancyReplicatedBlocks" : 0,\n "CorruptReplicatedBlocks" : 0,\n "MissingReplicatedBlocks" : 0,\n "MissingReplicationOneBlocks" : 0,\n "HighestPriorityLowRedundancyReplicatedBlocks" : 0,\n "HighestPriorityLowRedundancyECBlocks" : 0,\n "BytesInFutureReplicatedBlocks" : 0,\n "PendingDeletionReplicatedBlocks" : 0,\n "TotalReplicatedBlocks" : 272,\n "LowRedundancyECBlockGroups" : 0,\n "CorruptECBlockGroups" : 0,\n "MissingECBlockGroups" : 0,\n "BytesInFutureECBlockGroups" : 0,\n "PendingDeletionECBlocks" : 0,\n "TotalECBlockGroups" : 0,\n "ExcessBlocks" : 0,\n "NumTimedOutPendingReconstructions" : 0,\n "PostponedMisreplicatedBlocks" : 0,\n "PendingDataNodeMessageCount" : 0,\n "MillisSinceLastLoadedEdits" : 116153,\n "BlockCapacity" : 2097152,\n "NumLiveDataNodes" : 5,\n "NumDeadDataNodes" : 0,\n "NumDecomLiveDataNodes" : 0,\n "NumDecomDeadDataNodes" : 0,\n "VolumeFailuresTotal" : 0,\n "EstimatedCapacityLostTotal" : 0,\n "NumDecommissioningDataNodes" : 0,\n "StaleDataNodes" : 0,\n "NumStaleStorages" : 0,\n "TotalSyncCount" : 0,\n "NumInMaintenanceLiveDataNodes" : 0,\n "NumInMaintenanceDeadDataNodes" : 0,\n "NumEnteringMaintenanceDataNodes" : 0\n } ]\n}', u'')
2020-03-29 01:12:29,351 - NameNode HA states: active_namenodes = [(u'nn1', 'test-296.local:50070')], standby_namenodes = [(u'nn2', 'test-297.local:50070')], unknown_namenodes = []
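
Ambari resolves the NameNode HA state by reading the FSNamesystem JMX bean on both NameNodes: nn1 (test-296.local) is active and nn2 (test-297.local) is standby, so HDFS HA itself looks healthy. The same state can be confirmed directly; the service IDs nn1/nn2 are taken from this log and may be named differently in other hdfs-site.xml setups:

# ask each NameNode for its HA state
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2
# or query the same JMX bean the agent scrapes (needs a valid Kerberos ticket)
curl --negotiate -u : -s 'http://test-296.local:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem' | grep HAState
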
2020-03-29 01:12:29,352 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' --negotiate -u : '"'"'http://test-296.local:50070/webhdfs/v1/user/spark?op=GETFILESTATUS'"'"' 1>/tmp/tmpLuTKxB 2>/tmp/tmpP6895z''] {'logoutput': None, 'quiet': False}
2020-03-29 01:12:29,451 - call returned (0, '')
2020-03-29 01:12:29,452 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":16387,"group":"hdfs","length":0,"modificationTime":1585465369649,"owner":"spark","pathSuffix":"","permission":"775","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
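
The WebHDFS GETFILESTATUS call returns 200 with /user/spark owned by spark and mode 775, so the Spark user's HDFS home directory is in place. A quick equivalent check from the shell, using the host and path taken from the log:

# CLI check of the directory's owner and permissions
hdfs dfs -ls -d /user/spark
# or the same WebHDFS call the agent issues, authenticated via SPNEGO
curl --negotiate -u : 'http://test-296.local:50070/webhdfs/v1/user/spark?op=GETFILESTATUS'
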
2020-03-29 01:12:29,453 - HdfsResource['/apps/spark/warehouse'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mycluster', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-phsker@REALM', 'user': 'hdfs', 'owner': 'spark', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'hdfs://mycluster/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0777}
2020-03-29 01:12:29,454 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-phsker@REALM'] {'user': 'hdfs'}
2020-03-29 01:12:29,529 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://test-296.local:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmprPWeKQ 2>/tmp/tmp9j0Aug''] {'quiet': False}
2020-03-29 01:12:29,618 - call returned (0, '')
2020-03-29 01:12:29,618 - get_user_call_output returned (0, u'{\n "beans" : [ {\n "name" : "Hadoop:service=NameNode,name=FSNamesystem",\n "modelerType" : "FSNamesystem",\n "tag.Context" : "dfs",\n "tag.HAState" : "active",\n "tag.TotalSyncTimes" : "239 25 ",\n "tag.Hostname" : "test-296.local",\n "MissingBlocks" : 0,\n "MissingReplOneBlocks" : 0,\n "ExpiredHeartbeats" : 0,\n "TransactionsSinceLastCheckpoint" : 5963,\n "TransactionsSinceLastLogRoll" : 37,\n "LastWrittenTransactionId" : 5963,\n "LastCheckpointTime" : 1585465293000,\n "CapacityTotal" : 155640135680,\n "CapacityTotalGB" : 145.0,\n "CapacityUsed" : 8745466874,\n "CapacityUsedGB" : 8.0,\n "CapacityRemaining" : 96322375972,\n "ProvidedCapacityTotal" : 0,\n "CapacityRemainingGB" : 90.0,\n "CapacityUsedNonDFS" : 44881605638,\n "TotalLoad" : 48,\n "SnapshottableDirectories" : 0,\n "Snapshots" : 0,\n "NumEncryptionZones" : 0,\n "LockQueueLength" : 0,\n "BlocksTotal" : 272,\n "NumFilesUnderConstruction" : 12,\n "NumActiveClients" : 7,\n "FilesTotal" : 1143,\n "PendingReplicationBlocks" : 0,\n "PendingReconstructionBlocks" : 0,\n "UnderReplicatedBlocks" : 0,\n "LowRedundancyBlocks" : 0,\n "CorruptBlocks" : 0,\n "ScheduledReplicationBlocks" : 0,\n "PendingDeletionBlocks" : 138,\n "LowRedundancyReplicatedBlocks" : 0,\n "CorruptReplicatedBlocks" : 0,\n "MissingReplicatedBlocks" : 0,\n "MissingReplicationOneBlocks" : 0,\n "HighestPriorityLowRedundancyReplicatedBlocks" : 0,\n "HighestPriorityLowRedundancyECBlocks" : 0,\n "BytesInFutureReplicatedBlocks" : 0,\n "PendingDeletionReplicatedBlocks" : 138,\n "TotalReplicatedBlocks" : 272,\n "LowRedundancyECBlockGroups" : 0,\n "CorruptECBlockGroups" : 0,\n "MissingECBlockGroups" : 0,\n "BytesInFutureECBlockGroups" : 0,\n "PendingDeletionECBlocks" : 0,\n "TotalECBlockGroups" : 0,\n "ExcessBlocks" : 58,\n "NumTimedOutPendingReconstructions" : 24,\n "PostponedMisreplicatedBlocks" : 0,\n "PendingDataNodeMessageCount" : 0,\n "MillisSinceLastLoadedEdits" : 0,\n "BlockCapacity" : 2097152,\n "NumLiveDataNodes" : 5,\n "NumDeadDataNodes" : 0,\n "NumDecomLiveDataNodes" : 0,\n "NumDecomDeadDataNodes" : 0,\n "VolumeFailuresTotal" : 0,\n "EstimatedCapacityLostTotal" : 0,\n "NumDecommissioningDataNodes" : 0,\n "StaleDataNodes" : 0,\n "NumStaleStorages" : 0,\n "TotalSyncCount" : 37,\n "NumInMaintenanceLiveDataNodes" : 0,\n "NumInMaintenanceDeadDataNodes" : 0,\n "NumEnteringMaintenanceDataNodes" : 0\n } ]\n}', u'')
2020-03-29 01:12:29,620 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl --negotiate -u : -s '"'"'http://test-297.local:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpqkTKCx 2>/tmp/tmpwlycM5''] {'quiet': False}
2020-03-29 01:12:29,704 - call returned (0, '')
2020-03-29 01:12:29,705 - get_user_call_output returned (0, u'{\n "beans" : [ {\n "name" : "Hadoop:service=NameNode,name=FSNamesystem",\n "modelerType" : "FSNamesystem",\n "tag.Context" : "dfs",\n "tag.HAState" : "standby",\n "tag.TotalSyncTimes" : "",\n "tag.Hostname" : "test-297.local",\n "MissingBlocks" : 0,\n "MissingReplOneBlocks" : 0,\n "ExpiredHeartbeats" : 0,\n "TransactionsSinceLastCheckpoint" : 5926,\n "TransactionsSinceLastLogRoll" : 0,\n "LastWrittenTransactionId" : 5179,\n "LastCheckpointTime" : 1585465300000,\n "CapacityTotal" : 155640135680,\n "CapacityTotalGB" : 145.0,\n "CapacityUsed" : 8745466874,\n "CapacityUsedGB" : 8.0,\n "CapacityRemaining" : 96322400548,\n "ProvidedCapacityTotal" : 0,\n "CapacityRemainingGB" : 90.0,\n "CapacityUsedNonDFS" : 44881581062,\n "TotalLoad" : 48,\n "SnapshottableDirectories" : 0,\n "Snapshots" : 0,\n "NumEncryptionZones" : 0,\n "LockQueueLength" : 0,\n "BlocksTotal" : 272,\n "NumFilesUnderConstruction" : 12,\n "NumActiveClients" : 7,\n "FilesTotal" : 1143,\n "PendingReplicationBlocks" : 0,\n "PendingReconstructionBlocks" : 0,\n "UnderReplicatedBlocks" : 0,\n "LowRedundancyBlocks" : 0,\n "CorruptBlocks" : 0,\n "ScheduledReplicationBlocks" : 0,\n "PendingDeletionBlocks" : 0,\n "LowRedundancyReplicatedBlocks" : 0,\n "CorruptReplicatedBlocks" : 0,\n "MissingReplicatedBlocks" : 0,\n "MissingReplicationOneBlocks" : 0,\n "HighestPriorityLowRedundancyReplicatedBlocks" : 0,\n "HighestPriorityLowRedundancyECBlocks" : 0,\n "BytesInFutureReplicatedBlocks" : 0,\n "PendingDeletionReplicatedBlocks" : 0,\n "TotalReplicatedBlocks" : 272,\n "LowRedundancyECBlockGroups" : 0,\n "CorruptECBlockGroups" : 0,\n "MissingECBlockGroups" : 0,\n "BytesInFutureECBlockGroups" : 0,\n "PendingDeletionECBlocks" : 0,\n "TotalECBlockGroups" : 0,\n "ExcessBlocks" : 0,\n "NumTimedOutPendingReconstructions" : 0,\n "PostponedMisreplicatedBlocks" : 0,\n "PendingDataNodeMessageCount" : 0,\n "MillisSinceLastLoadedEdits" : 116153,\n "BlockCapacity" : 2097152,\n "NumLiveDataNodes" : 5,\n "NumDeadDataNodes" : 0,\n "NumDecomLiveDataNodes" : 0,\n "NumDecomDeadDataNodes" : 0,\n "VolumeFailuresTotal" : 0,\n "EstimatedCapacityLostTotal" : 0,\n "NumDecommissioningDataNodes" : 0,\n "StaleDataNodes" : 0,\n "NumStaleStorages" : 0,\n "TotalSyncCount" : 0,\n "NumInMaintenanceLiveDataNodes" : 0,\n "NumInMaintenanceDeadDataNodes" : 0,\n "NumEnteringMaintenanceDataNodes" : 0\n } ]\n}', u'')
2020-03-29 01:12:29,706 - NameNode HA states: active_namenodes = [(u'nn1', 'test-296.local:50070')], standby_namenodes = [(u'nn2', 'test-297.local:50070')], unknown_namenodes = []
2020-03-29 01:12:29,707 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' --negotiate -u : '"'"'http://test-296.local:50070/webhdfs/v1/apps/spark/warehouse?op=GETFILESTATUS'"'"' 1>/tmp/tmpuNyLiU 2>/tmp/tmpJ_bxGM''] {'logoutput': None, 'quiet': False}
2020-03-29 01:12:29,800 - call returned (0, '')
2020-03-29 01:12:29,801 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":16391,"group":"hdfs","length":0,"modificationTime":1585465311232,"owner":"spark","pathSuffix":"","permission":"777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2020-03-29 01:12:29,802 - HdfsResource[None] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://mycluster', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-phsker@REALM', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'immutable_paths': [u'hdfs://mycluster/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2020-03-29 01:12:29,808 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2020-03-29 01:12:29,809 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-spark2.json
2020-03-29 01:12:29,809 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-spark2.json'] {'content': Template('input.config-spark2.json.j2'), 'mode': 0644}
2020-03-29 01:12:29,810 - PropertiesFile['/usr/hdp/current/spark2-thriftserver/conf/spark-defaults.conf'] {'owner': 'spark', 'key_value_delimiter': ' ', 'group': 'spark', 'mode': 0644, 'properties': ...}
2020-03-29 01:12:29,815 - Generating properties file: /usr/hdp/current/spark2-thriftserver/conf/spark-defaults.conf
2020-03-29 01:12:29,815 - File['/usr/hdp/current/spark2-thriftserver/conf/spark-defaults.conf'] {'owner': 'spark', 'content': InlineTemplate(...), 'group': 'spark', 'mode': 0644, 'encoding': 'UTF-8'}
2020-03-29 01:12:29,838 - Writing File['/usr/hdp/current/spark2-thriftserver/conf/spark-defaults.conf'] because contents don't match
2020-03-29 01:12:29,842 - File['/usr/hdp/current/spark2-thriftserver/conf/spark-env.sh'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2020-03-29 01:12:29,842 - File['/usr/hdp/current/spark2-thriftserver/conf/log4j.properties'] {'content': ..., 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2020-03-29 01:12:29,845 - File['/usr/hdp/current/spark2-thriftserver/conf/metrics.properties'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2020-03-29 01:12:29,845 - XmlConfig['hive-site.xml'] {'owner': 'spark', 'group': 'spark', 'mode': 0644, 'conf_dir': '/usr/hdp/current/spark2-thriftserver/conf', 'configurations': ...}
2020-03-29 01:12:29,854 - Generating config: /usr/hdp/current/spark2-thriftserver/conf/hive-site.xml
2020-03-29 01:12:29,854 - File['/usr/hdp/current/spark2-thriftserver/conf/hive-site.xml'] {'owner': 'spark', 'content': InlineTemplate(...), 'group': 'spark', 'mode': 0644, 'encoding': 'UTF-8'}
2020-03-29 01:12:29,865 - PropertiesFile['/usr/hdp/current/spark2-thriftserver/conf/spark-thrift-sparkconf.conf'] {'owner': 'hive', 'key_value_delimiter': ' ', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2020-03-29 01:12:29,869 - Generating properties file: /usr/hdp/current/spark2-thriftserver/conf/spark-thrift-sparkconf.conf
2020-03-29 01:12:29,869 - File['/usr/hdp/current/spark2-thriftserver/conf/spark-thrift-sparkconf.conf'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2020-03-29 01:12:29,894 - Writing File['/usr/hdp/current/spark2-thriftserver/conf/spark-thrift-sparkconf.conf'] because contents don't match
2020-03-29 01:12:29,898 - File['/usr/hdp/current/spark2-thriftserver/conf/spark-thrift-fairscheduler.xml'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0755}
2020-03-29 01:12:29,898 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/spark.service.keytab spark/test-303.local@REALM; '] {'user': 'spark'}
2020-03-29 01:12:29,977 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/spark.service.keytab spark/test-303.local@REALM; '] {'user': 'spark'}
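
Both kinit calls with the spark service keytab for spark/test-303.local@REALM return without error, so the service principal can authenticate on this host. One thing worth confirming separately is that the key version number (kvno) in the keytab matches what the KDC currently issues; a regenerated principal with a stale keytab is a common cause of this kind of startup failure on Kerberized clusters, though that is a hypothesis rather than something this log shows:

# entries and KVNOs stored in the service keytab
klist -kt /etc/security/keytabs/spark.service.keytab
# with a valid ticket in the cache, ask the KDC for the current KVNO of the SPN and compare
kvno spark/test-303.local@REALM
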
2020-03-29 01:12:30,053 - Execute['/usr/hdp/current/spark2-thriftserver/sbin/start-thriftserver.sh --properties-file /usr/hdp/current/spark2-thriftserver/conf/spark-thrift-sparkconf.conf '] {'environment': {'JAVA_HOME': u'/usr/java/default'}, 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/spark2/spark-spark-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1.pid && ambari-sudo.sh -H -E pgrep -F /var/run/spark2/spark-spark-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1.pid', 'user': 'spark'}
2020-03-29 01:13:02,748 - Check connection to STS is created.
2020-03-29 01:13:02,749 - Execute['! /usr/hdp/current/spark2-thriftserver/bin/beeline -u 'jdbc:hive2://test-303.local:10016/default;principal=spark/test-303.local@REALM;transportMode=binary' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL' -e 'Error: Could not open''] {'path': [u'/usr/hdp/current/spark2-thriftserver/bin/beeline'], 'user': 'spark', 'timeout': 60.0}
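
Ambari's readiness check is only a beeline probe against port 10016 that greps for "Connection refused" / "Invalid URL"; it fails here because nothing ever starts listening on that port. Running the same probe by hand (after a kinit as the spark user) shows the full client-side error instead of the filtered output; the JDBC URL below is copied from the check above:

/usr/hdp/current/spark2-thriftserver/bin/beeline \
  -u 'jdbc:hive2://test-303.local:10016/default;principal=spark/test-303.local@REALM;transportMode=binary' \
  -e 'show databases;'
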
2020-03-29 01:13:04,157 - Connection to STS still is not created.
2020-03-29 01:13:04,157 - Check STS process status.
2020-03-29 01:13:04,158 - Process with pid 49901 is not running. Stale pid file at /var/run/spark2/spark-spark-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1.pid
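
This is the actual failure: the JVM launched by start-thriftserver.sh (pid 49901) exits shortly after launch and leaves a stale pid file, so all of the Kerberos and HDFS preparation above succeeded and the root cause has to be in the Thrift Server's own driver log. That log is written under /var/log/spark2 (the log directory created earlier in this output); the exact file name includes the host name, so the glob below is an assumption:

# find and read the most recent Thrift Server driver log
ls -lt /var/log/spark2/ | head
tail -n 200 /var/log/spark2/spark-spark-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-*.out
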

Command failed after 1 tries
