Member since: 08-27-2018
Posts: 8
Kudos Received: 1
Solutions: 0
10-16-2018 12:11 PM
How can I use R in Spark with a GUI? I already know a couple of ways: Zeppelin, and RStudio with sparklyr. But RStudio with sparklyr seems slow(?). Are there other tools?
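Not from the thread, but one way to tell whether the slowness comes from the RStudio/sparklyr layer or from the cluster itself is to time the same job from the SparkR shell bundled with Spark2. A minimal sketch; the client path is an assumption based on the /usr/hdp layout visible in the logs below, so adjust it to your install:

# start an R REPL backed by Spark on YARN (path assumed; sparkR accepts spark-submit options)
/usr/hdp/current/spark2-client/bin/sparkR --master yarn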
Labels:
- Apache Spark
- Apache Zeppelin
09-05-2018 02:59 AM
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 102, in
JobHistoryServer().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 55, in start
spark_service('jobhistoryserver', upgrade_type=upgrade_type, action='start')
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/spark_service.py", line 106, in spark_service
user = params.hive_user)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/current/hive-client/bin/schematool -dbType mysql -createCatalog spark -catalogDescription 'Default catalog, for Spark' -ifNotExists -catalogLocation hdfs://master.knu.com:8020/apps/spark/warehouse' returned 1. SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Create catalog spark at location hdfs://master.knu.com:8020/apps/spark/warehouse
Metastore connection URL: jdbc:mysql://slave1.knu.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException : Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
SQL Error code: 0
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
stdout:
2018-09-05 11:49:55,418 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:49:55,530 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-09-05 11:49:56,372 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:49:56,423 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-09-05 11:49:56,424 - Group['kms'] {}
2018-09-05 11:49:56,425 - Group['livy'] {}
2018-09-05 11:49:56,425 - Group['spark'] {}
2018-09-05 11:49:56,426 - Group['ranger'] {}
2018-09-05 11:49:56,426 - Group['hdfs'] {}
2018-09-05 11:49:56,426 - Group['zeppelin'] {}
2018-09-05 11:49:56,426 - Group['hadoop'] {}
2018-09-05 11:49:56,426 - Group['users'] {}
2018-09-05 11:49:56,426 - Group['knox'] {}
2018-09-05 11:49:56,427 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,428 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,429 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,430 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,431 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,431 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,474 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,475 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,476 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,477 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,477 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,478 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,479 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,480 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-09-05 11:49:56,481 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-09-05 11:49:56,482 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,483 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,484 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,512 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,513 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-09-05 11:49:56,514 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,515 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-09-05 11:49:56,516 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,516 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,517 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-09-05 11:49:56,518 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-09-05 11:49:56,519 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-09-05 11:49:56,520 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-09-05 11:49:56,695 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-09-05 11:49:56,696 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-09-05 11:49:56,696 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-09-05 11:49:56,698 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-09-05 11:49:56,698 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-09-05 11:49:56,870 - call returned (0, '1011')
2018-09-05 11:49:56,871 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1011'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-09-05 11:49:56,996 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1011'] due to not_if
2018-09-05 11:49:56,996 - Group['hdfs'] {}
2018-09-05 11:49:56,997 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-09-05 11:49:56,997 - FS Type: HDFS
2018-09-05 11:49:56,997 - Directory['/etc/hadoop'] {'mode': 0755}
2018-09-05 11:49:57,041 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-09-05 11:49:57,041 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-09-05 11:49:57,123 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-09-05 11:49:57,395 - Skipping Execute[('setenforce', '0')] due to not_if
2018-09-05 11:49:57,396 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-09-05 11:49:57,397 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-09-05 11:49:57,398 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2018-09-05 11:49:57,398 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-09-05 11:49:57,401 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2018-09-05 11:49:57,402 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2018-09-05 11:49:57,431 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-09-05 11:49:57,438 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-09-05 11:49:57,487 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-09-05 11:49:57,488 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-09-05 11:49:57,491 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2018-09-05 11:49:57,534 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-09-05 11:49:57,593 - Skipping unlimited key JCE policy check and setup since it is not required
2018-09-05 11:49:58,993 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-09-05 11:49:59,052 - Directory['/var/run/spark2'] {'owner': 'spark', 'create_parents': True, 'group': 'hadoop', 'mode': 0775}
2018-09-05 11:49:59,053 - Directory['/var/log/spark2'] {'owner': 'spark', 'group': 'hadoop', 'create_parents': True, 'mode': 0775}
2018-09-05 11:49:59,053 - HdfsResource['/user/spark'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'spark', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0775}
2018-09-05 11:49:59,055 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/user/spark?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp1edMkC 2>/tmp/tmp79pBvj''] {'logoutput': None, 'quiet': False}
2018-09-05 11:50:00,150 - call returned (0, '')
2018-09-05 11:50:00,150 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17406,"group":"hdfs","length":0,"modificationTime":1536115798636,"owner":"spark","pathSuffix":"","permission":"775","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-09-05 11:50:00,151 - HdfsResource['/apps/spark/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'spark', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0777}
2018-09-05 11:50:00,151 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/apps/spark/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpUp29T5 2>/tmp/tmppR9TAf''] {'logoutput': None, 'quiet': False}
2018-09-05 11:50:00,967 - call returned (0, '')
2018-09-05 11:50:00,968 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17408,"group":"hdfs","length":0,"modificationTime":1536115799157,"owner":"spark","pathSuffix":"","permission":"777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-09-05 11:50:00,968 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-09-05 11:50:00,973 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-09-05 11:50:00,973 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-spark2.json
2018-09-05 11:50:00,974 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-spark2.json'] {'content': Template('input.config-spark2.json.j2'), 'mode': 0644}
2018-09-05 11:50:00,974 - PropertiesFile['/usr/hdp/current/spark2-historyserver/conf/spark-defaults.conf'] {'owner': 'spark', 'key_value_delimiter': ' ', 'group': 'spark', 'mode': 0644, 'properties': ...}
2018-09-05 11:50:00,995 - Generating properties file: /usr/hdp/current/spark2-historyserver/conf/spark-defaults.conf
2018-09-05 11:50:00,996 - File['/usr/hdp/current/spark2-historyserver/conf/spark-defaults.conf'] {'owner': 'spark', 'content': InlineTemplate(...), 'group': 'spark', 'mode': 0644, 'encoding': 'UTF-8'}
2018-09-05 11:50:01,093 - Writing File['/usr/hdp/current/spark2-historyserver/conf/spark-defaults.conf'] because contents don't match
2018-09-05 11:50:01,096 - File['/usr/hdp/current/spark2-historyserver/conf/spark-env.sh'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2018-09-05 11:50:01,096 - Writing File['/usr/hdp/current/spark2-historyserver/conf/spark-env.sh'] because contents don't match
2018-09-05 11:50:01,171 - File['/usr/hdp/current/spark2-historyserver/conf/log4j.properties'] {'content': ..., 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2018-09-05 11:50:01,173 - File['/usr/hdp/current/spark2-historyserver/conf/metrics.properties'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0644}
2018-09-05 11:50:01,174 - XmlConfig['hive-site.xml'] {'owner': 'spark', 'group': 'spark', 'mode': 0644, 'conf_dir': '/usr/hdp/current/spark2-historyserver/conf', 'configurations': ...}
2018-09-05 11:50:01,180 - Generating config: /usr/hdp/current/spark2-historyserver/conf/hive-site.xml
2018-09-05 11:50:01,180 - File['/usr/hdp/current/spark2-historyserver/conf/hive-site.xml'] {'owner': 'spark', 'content': InlineTemplate(...), 'group': 'spark', 'mode': 0644, 'encoding': 'UTF-8'}
2018-09-05 11:50:01,254 - PropertiesFile['/usr/hdp/current/spark2-historyserver/conf/spark-thrift-sparkconf.conf'] {'owner': 'hive', 'key_value_delimiter': ' ', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2018-09-05 11:50:01,257 - Generating properties file: /usr/hdp/current/spark2-historyserver/conf/spark-thrift-sparkconf.conf
2018-09-05 11:50:01,257 - File['/usr/hdp/current/spark2-historyserver/conf/spark-thrift-sparkconf.conf'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-09-05 11:50:01,424 - Writing File['/usr/hdp/current/spark2-historyserver/conf/spark-thrift-sparkconf.conf'] because contents don't match
2018-09-05 11:50:01,427 - File['/usr/hdp/current/spark2-historyserver/conf/spark-thrift-fairscheduler.xml'] {'content': InlineTemplate(...), 'owner': 'spark', 'group': 'spark', 'mode': 0755}
2018-09-05 11:50:01,429 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:50:01,429 - Tarball version was calcuated as 3.0.0.0-1634. Use Command Version: True
2018-09-05 11:51:05,230 - Called copy_to_hdfs tarball: spark2
2018-09-05 11:51:05,230 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:51:05,230 - Tarball version was calcuated as 3.0.0.0-1634. Use Command Version: True
2018-09-05 11:51:05,230 - Source file: /tmp/spark2/spark2-hdp-yarn-archive.tar.gz , Dest file in HDFS: /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz
2018-09-05 11:51:05,230 - HdfsResource['/hdp/apps/3.0.0.0-1634/spark2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-09-05 11:51:05,278 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpz7Dtfv 2>/tmp/tmpNQetHb''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:06,615 - call returned (0, '')
2018-09-05 11:51:06,615 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /hdp/apps/3.0.0.0-1634/spark2"}}404', u'')
2018-09-05 11:51:06,616 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2?op=MKDIRS&user.name=hdfs'"'"' 1>/tmp/tmpephO5j 2>/tmp/tmp6_WUUs''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:07,848 - call returned (0, '')
2018-09-05 11:51:07,848 - get_user_call_output returned (0, u'{"boolean":true}200', u'')
2018-09-05 11:51:07,849 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2?op=SETPERMISSION&user.name=hdfs&permission=555'"'"' 1>/tmp/tmpZ7rpnL 2>/tmp/tmptJ9wUa''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:09,026 - call returned (0, '')
2018-09-05 11:51:09,026 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:09,027 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2?op=SETOWNER&owner=hdfs&group=&user.name=hdfs'"'"' 1>/tmp/tmpKbM_6z 2>/tmp/tmp2nIvQG''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:10,120 - call returned (0, '')
2018-09-05 11:51:10,120 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:10,121 - HdfsResource['/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'source': '/tmp/spark2/spark2-hdp-yarn-archive.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'replace_existing_files': True, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-09-05 11:51:10,122 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpoIUAbW 2>/tmp/tmpEh3shC''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:11,496 - call returned (0, '')
2018-09-05 11:51:11,496 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz"}}404', u'')
2018-09-05 11:51:11,497 - Creating new file /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz in DFS
2018-09-05 11:51:11,498 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/tmp/spark2/spark2-hdp-yarn-archive.tar.gz -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'"'"' 1>/tmp/tmpW64eOo 2>/tmp/tmpFZhZH6''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:19,830 - call returned (0, '')
2018-09-05 11:51:19,830 - get_user_call_output returned (0, u'201', u'')
2018-09-05 11:51:19,831 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz?op=SETPERMISSION&user.name=hdfs&permission=444'"'"' 1>/tmp/tmp02gLvd 2>/tmp/tmpIPHBkn''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:21,672 - call returned (0, '')
2018-09-05 11:51:21,672 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:21,673 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz?op=SETOWNER&owner=hdfs&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmple4yDD 2>/tmp/tmp_5a28V''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:22,503 - call returned (0, '')
2018-09-05 11:51:22,503 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:22,503 - Will attempt to copy spark2 tarball from /tmp/spark2/spark2-hdp-yarn-archive.tar.gz to DFS at /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-yarn-archive.tar.gz.
2018-09-05 11:51:22,503 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:51:22,503 - Tarball version was calcuated as 3.0.0.0-1634. Use Command Version: True
2018-09-05 11:51:32,066 - Called copy_to_hdfs tarball: spark2hive
2018-09-05 11:51:32,066 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.0.0-1634 -> 3.0.0.0-1634
2018-09-05 11:51:32,066 - Tarball version was calcuated as 3.0.0.0-1634. Use Command Version: True
2018-09-05 11:51:32,066 - Source file: /tmp/spark2/spark2-hdp-hive-archive.tar.gz , Dest file in HDFS: /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz
2018-09-05 11:51:32,067 - HdfsResource['/hdp/apps/3.0.0.0-1634/spark2'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-09-05 11:51:32,068 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpx1kovB 2>/tmp/tmpcsoGo2''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:33,386 - call returned (0, '')
2018-09-05 11:51:33,386 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":17426,"group":"hdfs","length":0,"modificationTime":1536115874374,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-09-05 11:51:33,387 - HdfsResource['/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'source': '/tmp/spark2/spark2-hdp-hive-archive.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'replace_existing_files': True, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-09-05 11:51:33,388 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp8jdktw 2>/tmp/tmp0AaRRh''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:34,857 - call returned (0, '')
2018-09-05 11:51:34,857 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz"}}404', u'')
2018-09-05 11:51:34,857 - Creating new file /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz in DFS
2018-09-05 11:51:34,858 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/tmp/spark2/spark2-hdp-hive-archive.tar.gz -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'"'"' 1>/tmp/tmp2XUKZL 2>/tmp/tmpUNgUS7''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:37,761 - call returned (0, '')
2018-09-05 11:51:37,761 - get_user_call_output returned (0, u'201', u'')
2018-09-05 11:51:37,762 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz?op=SETPERMISSION&user.name=hdfs&permission=444'"'"' 1>/tmp/tmpMlitMr 2>/tmp/tmpJUH6S6''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:38,290 - call returned (0, '')
2018-09-05 11:51:38,291 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:38,292 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz?op=SETOWNER&owner=hdfs&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmpCXoyTh 2>/tmp/tmpNKJFP4''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:39,734 - call returned (0, '')
2018-09-05 11:51:39,734 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:39,734 - Will attempt to copy spark2hive tarball from /tmp/spark2/spark2-hdp-hive-archive.tar.gz to DFS at /hdp/apps/3.0.0.0-1634/spark2/spark2-hdp-hive-archive.tar.gz.
2018-09-05 11:51:39,735 - HdfsResource['hdfs:///spark2-history/'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'recursive_chmod': True, 'owner': 'spark', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0777}
2018-09-05 11:51:39,736 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp2uNUba 2>/tmp/tmpJPCy9p''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:41,168 - call returned (0, '')
2018-09-05 11:51:41,168 - get_user_call_output returned (0, u'{"RemoteException":{"exception":"FileNotFoundException","javaClassName":"java.io.FileNotFoundException","message":"File does not exist: /spark2-history/"}}404', u'')
2018-09-05 11:51:41,169 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=MKDIRS&user.name=hdfs'"'"' 1>/tmp/tmpT_YFxS 2>/tmp/tmpan9eGR''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:42,472 - call returned (0, '')
2018-09-05 11:51:42,472 - get_user_call_output returned (0, u'{"boolean":true}200', u'')
2018-09-05 11:51:42,473 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=SETPERMISSION&user.name=hdfs&permission=777'"'"' 1>/tmp/tmpOP1v5M 2>/tmp/tmpejuGHK''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:43,091 - call returned (0, '')
2018-09-05 11:51:43,091 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:43,092 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=GETCONTENTSUMMARY&user.name=hdfs'"'"' 1>/tmp/tmpho0h5E 2>/tmp/tmpBEwEKH''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:43,828 - call returned (0, '')
2018-09-05 11:51:43,829 - get_user_call_output returned (0, u'{"ContentSummary":{"directoryCount":1,"fileCount":0,"length":0,"quota":-1,"spaceConsumed":0,"spaceQuota":-1,"typeQuota":{}}}200', u'')
2018-09-05 11:51:43,830 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=LISTSTATUS&user.name=hdfs'"'"' 1>/tmp/tmpksIOEF 2>/tmp/tmpqXcoGI''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:44,834 - call returned (0, '')
2018-09-05 11:51:44,834 - get_user_call_output returned (0, u'{"FileStatuses":{"FileStatus":[\n\n]}}\n200', u'')
2018-09-05 11:51:44,835 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://master.knu.com:50070/webhdfs/v1/spark2-history/?op=SETOWNER&owner=spark&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmpYTKNqc 2>/tmp/tmpMkS9qH''] {'logoutput': None, 'quiet': False}
2018-09-05 11:51:45,455 - call returned (0, '')
2018-09-05 11:51:45,455 - get_user_call_output returned (0, u'200', u'')
2018-09-05 11:51:45,455 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.0.0-1634/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master.knu.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.0.0-1634/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-09-05 11:51:45,458 - Execute['/usr/hdp/current/hive-client/bin/schematool -dbType mysql -createCatalog spark -catalogDescription 'Default catalog, for Spark' -ifNotExists -catalogLocation hdfs://master.knu.com:8020/apps/spark/warehouse'] {'user': 'hive'}
Command failed after 1 tries

What is the problem?
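For what it's worth, the failing step is schematool opening a JDBC connection to jdbc:mysql://slave1.knu.com/hive, and "Communications link failure" with "the last packet sent 0 milliseconds ago" usually means the TCP connection to MySQL was never established at all, rather than a credentials problem. A minimal diagnostic sketch, assuming MySQL is supposed to be running on slave1.knu.com as the metastore URL indicates:

# from the host running schematool: is MySQL's port reachable at all?
nc -vz slave1.knu.com 3306
# can the hive user log in remotely with the configured password?
mysql -h slave1.knu.com -u hive -p
# on slave1.knu.com itself: is the server running, and is it listening beyond localhost?
systemctl status mysqld
grep -ri bind-address /etc/my.cnf /etc/my.cnf.d/ 2>/dev/null   # bind-address=127.0.0.1 would block remote clients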
Labels:
- Apache Ambari
- Apache Spark
08-28-2018 07:48 AM
This is my error log. I don't know what the problem is; please help me.

stderr:
Traceback (most recent call last):
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
web_file = opener.open(req)
File "/usr/lib64/python2.7/urllib2.py", line 437, in open
response = meth(req, response)
File "/usr/lib64/python2.7/urllib2.py", line 550, in http_response
'http', request, response, code, msg, hdrs)
File "/usr/lib64/python2.7/urllib2.py", line 475, in error
return self._call_chain(*args)
File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
result = func(*args)
File "/usr/lib64/python2.7/urllib2.py", line 558, in http_error_default
raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: Not Found
The above exception was the cause of the following exception:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in
HiveClient().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 40, in install
self.configure(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 48, in configure
hive(name='client')
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 114, in hive
jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 628, in jdbc_connector
File(params.downloaded_custom_connector, content = DownloadSource(params.driver_curl_source))
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
content = self._get_content()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
return content()
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
return self.get_content()
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 197, in get_content
raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://master:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
stdout:
2018-08-28 16:39:13,907 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-08-28 16:39:13,912 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-08-28 16:39:13,913 - Group['livy'] {}
2018-08-28 16:39:13,914 - Group['spark'] {}
2018-08-28 16:39:13,914 - Group['hdfs'] {}
2018-08-28 16:39:13,914 - Group['zeppelin'] {}
2018-08-28 16:39:13,914 - Group['hadoop'] {}
2018-08-28 16:39:13,915 - Group['users'] {}
2018-08-28 16:39:13,915 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,916 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,916 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,917 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,918 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-28 16:39:13,918 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,919 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,919 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,920 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-28 16:39:13,921 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,921 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,922 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,922 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,923 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,924 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-08-28 16:39:13,928 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-08-28 16:39:13,928 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-08-28 16:39:13,928 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,929 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,930 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-08-28 16:39:13,935 - call returned (0, '1013')
2018-08-28 16:39:13,936 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1013'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-08-28 16:39:13,940 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1013'] due to not_if
2018-08-28 16:39:13,940 - Group['hdfs'] {}
2018-08-28 16:39:13,941 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-08-28 16:39:13,941 - FS Type: HDFS
2018-08-28 16:39:13,941 - Directory['/etc/hadoop'] {'mode': 0755}
2018-08-28 16:39:13,954 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-08-28 16:39:13,955 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-08-28 16:39:13,968 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,974 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,974 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,975 - Repository['HDP-3.0-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,977 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,978 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,979 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,983 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,983 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,984 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,050 - Skipping installation of existing package unzip
2018-08-28 16:39:14,050 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,060 - Skipping installation of existing package curl
2018-08-28 16:39:14,060 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,070 - Skipping installation of existing package hdp-select
2018-08-28 16:39:14,074 - The repository with version 3.0.0.0-1634 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-08-28 16:39:14,347 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-08-28 16:39:14,365 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-08-28 16:39:14,391 - call returned (0, 'hive-server2 - 3.0.0.0-1634')
2018-08-28 16:39:14,391 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-08-28 16:39:14,419 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://master:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-08-28 16:39:14,421 - Not downloading the file from http://master:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-08-28 16:39:15,945 - Package['hive_3_0_0_0_1634'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:16,066 - Skipping installation of existing package hive_3_0_0_0_1634
2018-08-28 16:39:16,068 - Package['hive_3_0_0_0_1634-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:16,102 - Skipping installation of existing package hive_3_0_0_0_1634-hcatalog
2018-08-28 16:39:16,104 - Directories to fill with configs: [u'/usr/hdp/current/hive-client/conf']
2018-08-28 16:39:16,104 - Directory['/etc/hive/3.0.0.0-1634/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-08-28 16:39:16,105 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.0.0-1634/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-08-28 16:39:16,124 - Generating config: /etc/hive/3.0.0.0-1634/0/mapred-site.xml
2018-08-28 16:39:16,124 - File['/etc/hive/3.0.0.0-1634/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,195 - File['/etc/hive/3.0.0.0-1634/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,196 - File['/etc/hive/3.0.0.0-1634/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-08-28 16:39:16,209 - File['/etc/hive/3.0.0.0-1634/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,211 - File['/etc/hive/3.0.0.0-1634/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,214 - File['/etc/hive/3.0.0.0-1634/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,216 - File['/etc/hive/3.0.0.0-1634/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,229 - File['/etc/hive/3.0.0.0-1634/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,230 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.0.0-1634/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://master.knu.com:2181,slave1.knu.com:2181,slave2.knu.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-08-28 16:39:16,236 - Generating config: /etc/hive/3.0.0.0-1634/0/beeline-site.xml
2018-08-28 16:39:16,236 - File['/etc/hive/3.0.0.0-1634/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,238 - File['/etc/hive/3.0.0.0-1634/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,238 - File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_client/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-08-28 16:39:16,239 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] because contents don't match
2018-08-28 16:39:16,239 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-08-28 16:39:16,248 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
2018-08-28 16:39:16,249 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,465 - File['/usr/hdp/current/hive-client/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-08-28 16:39:16,466 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-08-28 16:39:16,468 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-08-28 16:39:16,468 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://master:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-08-28 16:39:16,468 - Not downloading the file from http://master:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-08-28 16:39:16,469 - File['/var/lib/ambari-agent/tmp/mysql-connector-java.jar'] {'content': DownloadSource('http://master:8080/resources/mysql-connector-java.jar')}
2018-08-28 16:39:16,469 - Downloading the file from http://master:8080/resources/mysql-connector-java.jar
2018-08-28 16:39:16,497 - The repository with version 3.0.0.0-1634 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
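For what it's worth, the traceback shows the agent trying to download http://master:8080/resources/mysql-connector-java.jar and getting a 404, which happens when the MySQL JDBC driver was never registered with the Ambari server, so it is not published under /resources. A sketch of the usual fix, run on the Ambari server host; the driver path below is an assumption, so point it at wherever your connector jar actually lives:

# install the MySQL JDBC driver on the Ambari server host (package name varies by distribution)
yum install -y mysql-connector-java
# register it with Ambari so agents can fetch it from http://master:8080/resources/
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Then retry the failed Hive client install from the Ambari UI.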
Labels:
- Apache Ambari
- Apache Hive
08-27-2018 06:08 AM
My error logs are below.

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Aug 27, 2018 1:06:46 PM com.google.inject.assistedinject.FactoryProvider2 isValidForOptimizedAssistedInject
WARNING: AssistedInject factory org.apache.ambari.server.state.cluster.ClusterFactory will be slow because class org.apache.ambari.server.state.cluster.ClusterImpl has assisted Provider dependencies or injects the Injector. Stop injecting @Assisted Provider<T> (instead use @Assisted T) or Injector to speed things up. (It will be a ~6500% speed bump!) The exact offending deps are: [Key[type=com.google.inject.Injector, annotation=[none]]@org.apache.ambari.server.state.cluster.ClusterImpl.<init>()[1]]
Aug 27, 2018 1:06:46 PM com.google.inject.assistedinject.FactoryProvider2 isValidForOptimizedAssistedInject
WARNING: AssistedInject factory org.apache.ambari.server.controller.ResourceProviderFactory will be slow because class org.apache.ambari.server.controller.internal.HostComponentResourceProvider has assisted Provider dependencies or injects the Injector. Stop injecting @Assisted Provider<T> (instead use @Assisted T) or Injector to speed things up. (It will be a ~6500% speed bump!) The exact offending deps are: [Key[type=com.google.inject.Injector, annotation=[none]]@org.apache.ambari.server.controller.internal.HostComponentResourceProvider.<init>()[1]]
Aug 27, 2018 1:06:46 PM com.google.inject.assistedinject.FactoryProvider2 isValidForOptimizedAssistedInject
WARNING: AssistedInject factory org.apache.ambari.server.state.scheduler.RequestExecutionFactory will be slow because class org.apache.ambari.server.state.scheduler.RequestExecutionImpl has assisted Provider dependencies or injects the Injector. Stop injecting @Assisted Provider<T> (instead use @Assisted T) or Injector to speed things up. (It will be a ~6500% speed bump!) The exact offending deps are: [Key[type=com.google.inject.Injector, annotation=[none]]@org.apache.ambari.server.state.scheduler.RequestExecutionImpl.<init>()[2]]
Aug 27, 2018 1:06:46 PM com.google.inject.assistedinject.FactoryProvider2 isValidForOptimizedAssistedInject
WARNING: AssistedInject factory org.apache.ambari.server.state.scheduler.RequestExecutionFactory will be slow because class org.apache.ambari.server.state.scheduler.RequestExecutionImpl has assisted Provider dependencies or injects the Injector. Stop injecting @Assisted Provider<T> (instead use @Assisted T) or Injector to speed things up. (It will be a ~6500% speed bump!) The exact offending deps are: [Key[type=com.google.inject.Injector, annotation=[none]]@org.apache.ambari.server.state.scheduler.RequestExecutionImpl.<init>()[3]]
Aug 27, 2018 1:06:49 PM com.google.inject.internal.ProxyFactory <init>
WARNING: Method [public void org.apache.ambari.server.orm.dao.RepositoryVersionDAO.create(java.lang.Object)] is synthetic and is being intercepted by [org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor@295bf2a]. This could indicate a bug. The method may be intercepted twice, or may not be intercepted at all.
Aug 27, 2018 1:06:49 PM com.google.inject.internal.ProxyFactory <init>
WARNING: Method [public void org.apache.ambari.server.orm.dao.HostVersionDAO.create(java.lang.Object)] is synthetic and is being intercepted by [org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor@295bf2a]. This could indicate a bug. The method may be intercepted twice, or may not be intercepted at all.
Aug 27, 2018 1:06:50 PM com.google.inject.internal.ProxyFactory <init>
WARNING: Method [public void org.apache.ambari.server.orm.dao.AmbariConfigurationDAO.create(java.lang.Object)] is synthetic and is being intercepted by [org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor@295bf2a]. This could indicate a bug. The method may be intercepted twice, or may not be intercepted at all.
Aug 27, 2018 1:06:50 PM com.google.inject.internal.ProxyFactory <init>
WARNING: Method [public java.lang.Object org.apache.ambari.server.topology.tasks.ConfigureClusterTask.call() throws java.lang.Exception] is synthetic and is being intercepted by [org.apache.ambari.server.security.authorization.internal.InternalAuthenticationInterceptor@265bd546]. This could indicate a bug. The method may be intercepted twice, or may not be intercepted at all.
An unexpected error occured during starting Ambari Server.
org.apache.ambari.server.AmbariException: Current database store version is not compatible with current server version, serverVersion=2.7.0.0, schemaVersion=2.1.2
at org.apache.ambari.server.checks.DatabaseConsistencyCheckHelper.checkDBVersionCompatible(DatabaseConsistencyCheckHelper.java:236)
at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:1096)

After I upgraded ambari-server, it no longer starts. What's the problem?
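The exception states the mismatch directly: the server binaries are 2.7.0.0 while the backing database schema is still 2.1.2, which is what you get when the ambari-server package is upgraded but the database schema is not. A minimal sketch of the usual follow-up, run on the Ambari server host (back up the Ambari database first):

# bring the Ambari database schema up to the new server version
ambari-server upgrade
# then start the server again
ambari-server start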
Labels:
- Apache Ambari