Member since 07-18-2018

- 53 Posts
- 0 Kudos Received
- 2 Solutions
        My Accepted Solutions
| Title | Views | Posted | 
|---|---|---|
|  | 2360 | 07-30-2018 04:19 PM |
|  | 5601 | 07-26-2018 06:53 PM |

08-01-2018 07:25 PM
No, I don't think so, and I don't recall creating a MySQL user for Hive during the Ambari installation process.

root@msl-dpe-perf74:~# mysql -u hive -passWord  hive  -h msl-dpe-perf74.msl.lab
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 2003 (HY000): Can't connect to MySQL server on 'msl-dpe-perf74.msl.lab' (111) 
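For reference, a quick way to check whether a 'hive' account was ever created in this MySQL instance (sketch only, assuming the local root connection over the Unix socket works, as in my other reply):

```bash
# Sketch: list any 'hive' accounts known to MySQL via the working local root login.
mysql -u root -e "SELECT user, host FROM mysql.user WHERE user = 'hive';"
```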
						
					
08-01-2018 07:10 PM
Hi Amarnath,

Here is what I tried on the server where Hive is installed:

root@msl-dpe-perf74:~# mysql -u root -h localhost
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 8
Server version: 5.7.23-0ubuntu0.16.04.1 (Ubuntu)
Copyright (c) 2000, 2018, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| performance_schema |
| sys                |
+--------------------+
4 rows in set (0.00 sec)
mysql> quit
Bye

root@msl-dpe-perf74:~# mysql -u root -h msl-dpe-perf74.msl.lab
ERROR 2003 (HY000): Can't connect to MySQL server on 'msl-dpe-perf74.msl.lab' (111)
root@msl-dpe-perf74:~#  
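Error 2003 with (111) means the TCP connection was refused outright, so a natural next check (sketch only, not something I have run here) is whether mysqld is listening on anything other than the loopback/Unix socket, and what bind-address is configured:

```bash
# Sketch: see which address/port mysqld is actually listening on, and whether
# bind-address or skip-networking restricts it to localhost.
ss -tlnp | grep 3306
grep -E 'bind-address|skip-networking' /etc/mysql/mysql.conf.d/mysqld.cnf
```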
						
					
08-01-2018 06:42 PM
					
Following the suggestions from https://community.hortonworks.com/questions/98770/hive-metastore-does-not-start.html, I changed the value of 'bind-address' in the MySQL configuration file /etc/mysql/mysql.conf.d/mysqld.cnf to '0.0.0.0', but I still get the following errors. I set up the cluster using Ambari on Ubuntu 16.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 203, in <module>
    HiveMetastore().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 56, in start
    create_metastore_schema()
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 417, in create_metastore_schema
    user = params.hive_user
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED] -verbose' returned 1. SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.0-292/hive2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.0-292/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:mysql://msl-dpe-perf74.msl.lab/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: com.mysql.cj.jdbc.exceptions.CommunicationsException : Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
SQL Error code: 0
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:80)
at org.apache.hive.beeline.HiveSchemaTool.getConnectionToMetastore(HiveSchemaTool.java:133)
at org.apache.hive.beeline.HiveSchemaTool.testConnectionToMetastore(HiveSchemaTool.java:187)
at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:291)
at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:277)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:526)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at com.mysql.cj.jdbc.exceptions.SQLError.createCommunicationsException(SQLError.java:174)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:64)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:832)
at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:456)
at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:240)
at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:207)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:76)
... 11 more
Caused by: com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:61)
at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:105)
at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:151)
at com.mysql.cj.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:167)
at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:91)
at com.mysql.cj.NativeSession.connect(NativeSession.java:152)
at com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:952)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:822)
... 17 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at com.mysql.cj.protocol.StandardSocketFactory.connect(StandardSocketFactory.java:173)
at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:65)
... 20 more
*** schemaTool failed *** 
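One step that is easy to miss after editing mysqld.cnf is restarting MySQL so the new bind-address actually takes effect; a sketch of that verification, assuming systemd on Ubuntu 16:

```bash
# Sketch: apply the bind-address change and confirm mysqld now listens on a
# non-loopback address before re-running the Ambari schematool step.
sudo systemctl restart mysql
ss -tlnp | grep 3306                       # expect 0.0.0.0:3306 (or the host address)
mysql -u hive -p -h msl-dpe-perf74.msl.lab # re-test the TCP connection schematool uses
```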
						
					
Labels: Apache Hive

08-01-2018 01:01 AM
		
					
I got the following errors when setting up a 3-node cluster using Ambari (Ubuntu 16.04). The script references content from /usr/hdp/current/slider-client/lib, which I don't have; in the /usr/hdp/current/slider-client directory I only have conf. This issue seems to have been reported earlier, but I did not see a solution: https://community.hortonworks.com/questions/149001/slider-client-install.html

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider_client.py", line 62, in <module>
    SliderClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider_client.py", line 46, in install
    self.configure(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 120, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider_client.py", line 51, in configure
    slider()
  File "/usr/lib/ambari-agent/lib/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider.py", line 92, in slider
    group=params.user_group,
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 120, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/slider-client/lib/slider.tar.gz'] failed, parent directory /usr/hdp/current/slider-client/lib doesn't exist 
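A few checks that might narrow this down (sketch only; the exact package name and hdp-select usage may differ by HDP version):

```bash
# Sketch: see what the slider-client symlink points at and whether a Slider
# package actually installed the missing lib directory.
ls -l /usr/hdp/current/slider-client
hdp-select status | grep -i slider   # where the symlink should resolve
dpkg -l | grep -i slider             # is a slider package installed at all?
```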
						
					
07-30-2018 04:19 PM
					
There are two problems in the Ambari installation document for Ubuntu 16 at https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.0/bk_ambari-installation/content/download_the_ambari_repo_ubuntu16.html:

wget -O /etc/apt/sources.list.d/ambari.list http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates//ambari.list
apt-key adv --recv-keys --keyserver keyserver.ubuntu.com B9733A7A07513CAD
apt-get update

As both Geoffrey and Jay had pointed out, the first sample command is missing the version number, and the document does not instruct you to insert it before executing. For the second command, which retrieves the signing key, the given format results in a timeout due to the syntax used:

root@msl-dpe-perf77:/usr/local/Ambari# apt-key adv --recv-keys --keyserver keyserver.ubuntu.com B9733A7A07513CAD
Executing: /tmp/tmp.1enMMyqYWS/gpg.1.sh --recv-keys
--keyserver
keyserver.ubuntu.com
B9733A7A07513CAD
gpg: requesting key 07513CAD from hkp server keyserver.ubuntu.com
gpg: keyserver timed out
gpg: keyserver receive failed: keyserver error
root@msl-dpe-perf77:/usr/local/Ambari#

The following syntax resolves this error:

root@msl-dpe-perf77:/usr/local/Ambari# apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv B9733A7A07513CAD
Executing: /tmp/tmp.PQTyyilAKr/gpg.1.sh --keyserver
hkp://keyserver.ubuntu.com:80
--recv
B9733A7A07513CAD
gpg: requesting key 07513CAD from hkp server keyserver.ubuntu.com
gpg: key 07513CAD: "Jenkins (HDP Builds) <jenkin@hortonworks.com>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
root@msl-dpe-perf77:/usr/local/Ambari# 
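For completeness, the first command works once the missing version segment is filled in; for example, with the 2.6.2.0 release that this documentation page covers (the version shown here is only an example, use the release you are installing):

```bash
# Example only: the repo URL needs an explicit Ambari version segment (2.6.2.0 here).
wget -O /etc/apt/sources.list.d/ambari.list \
  http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.6.2.0/ambari.list
apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv B9733A7A07513CAD
apt-get update
```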
						
					
07-28-2018 05:08 PM
I am trying to install Ambari with public repositories. At the first step, I got:

root@msl-dpe-perf77:/usr/local/Ambari# wget -O /etc/apt/sources.list.d/ambari.list http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates//ambari.list
--2018-07-28 09:57:26--  http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates//ambari.list
Resolving public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)... 52.84.235.43, 52.84.235.34, 52.84.235.239, ...
Connecting to public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)|52.84.235.43|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2018-07-28 09:57:26 ERROR 404: Not Found.

I had been using the same process earlier in the week and it worked fine. I also tried a machine without a firewall.
						
					
Labels: Apache Ambari

07-26-2018 06:53 PM
The resource_management import error was caused by the Ambari wizard using /usr/lib/python2.6/site-packages. On Ubuntu 16, Python 2.7 does not have this directory on its path. It can be resolved by adding:

PYTHONPATH=/usr/lib/python2.6/site-packages
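A quick way to confirm the fix before re-running the wizard (sketch only, exporting the variable in the shell rather than in any particular Ambari config file):

```bash
# Sketch: with the site-packages path on PYTHONPATH, the import used by the
# Ambari hook scripts should succeed.
export PYTHONPATH=/usr/lib/python2.6/site-packages
python -c "import resource_management; print(resource_management.__file__)"
```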
						
					
07-26-2018 04:41 PM
Thanks Adi and Akhil.

A closer look at the issue seems to point to a failed run of hook.py. Here is the message:

Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-399.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-399.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']Traceback (most recent call last):

I tested this script manually and here is what I got:

harry.li@msl-dpe-perf74:/usr/lib/python2.6/site-packages$ sudo /usr/bin/python '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py'
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 20, in <module>
    from resource_management import *
ImportError: No module named resource_management

I verified that my Python 2.7.12 is installed correctly and that the resource_management directory has been installed correctly too. Is there a setting in Ambari to control the Python import path?

root@msl-dpe-perf74:/usr/lib/python2.6/site-packages# ls -l /usr/lib/ambari-agent/lib
total 20
drwxr-xr-x 3 root root 4096 Jul 25 17:54 ambari_commons
drwxr-xr-x 3 root root 4096 Jul 24 17:29 ambari_jinja2
drwxr-xr-x 2 root root 4096 Jul 24 17:29 ambari_simplejson
drwxr-xr-x 2 root root 4096 Jul 24 17:29 examples
drwxr-xr-x 4 root root 4096 Jul 24 17:29 resource_management
root@msl-dpe-perf74:/usr/lib/python2.6/site-packages# ls -l /usr/lib/ambari-agent/lib/resource_management/
total 16
drwxr-xr-x 5 root root 4096 Jul 24 17:29 core
-rwxrwxrwx 1 root root  887 Feb 23 11:10 __init__.py
-rw-r--r-- 1 root root 1049 Jul 24 17:29 __init__.pyc
drwxr-xr-x 6 root root 4096 Jul 24 17:29 libraries 
						
					
07-25-2018 11:57 PM
I am trying to set up a 2-node cluster with Spark and Hive using the Ambari Cluster Install Wizard. I passed the first 8 steps and got stuck at the "Install, Start and Test" step. Here is the error message from one of the nodes. I am using Ubuntu 16.04.

stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
    setup_users()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 51, in setup_users
    fetch_nonlocal_groups = params.fetch_nonlocal_groups,
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/accounts.py", line 84, in action_create
    shell.checked_call(command, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hadoop,user,spark,git,wheel -g hadoop spark' returned 6. usermod: user 'spark' does not exist in /etc/passwd
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-399.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-399.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
 stdout:
2018-07-25 16:44:16,944 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-07-25 16:44:16,948 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-07-25 16:44:16,948 - Group['livy'] {}
2018-07-25 16:44:16,949 - Group['spark'] {}
2018-07-25 16:44:16,955 - Group['hdfs'] {}
2018-07-25 16:44:16,955 - Group['hadoop'] {}
2018-07-25 16:44:16,956 - Group['users'] {}
2018-07-25 16:44:16,956 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-07-25 16:44:16,957 - Modifying user hive
2018-07-25 16:44:16,973 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-07-25 16:44:16,975 - Modifying user livy
2018-07-25 16:44:16,989 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-07-25 16:44:16,991 - Modifying user zookeeper
2018-07-25 16:44:17,006 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-07-25 16:44:17,008 - Modifying user spark
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-399.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-399.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Command failed after 1 tries

This is the summary configuration reported in the Review step and the screen output for the "Deploy" step:

Cluster Name : HW2N
Total Hosts : 2 (2 new)
Repositories:
    ubuntu16 (HDP-2.6):
    http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.5.0
    ubuntu16 (HDP-2.6-GPL):
    http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/2.x/updates/2.6.5.0
    ubuntu16 (HDP-UTILS-1.1.0.22):
    http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16
Services:
    HDFS
        DataNode : 1 host
        NameNode : msl-dpe-perf77.msl.lab
        NFSGateway : 0 host
        SNameNode : msl-dpe-perf77.msl.lab
    YARN + MapReduce2
        App Timeline Server : msl-dpe-perf77.msl.lab
        NodeManager : 1 host
        ResourceManager : msl-dpe-perf77.msl.lab
    Tez
        Clients : 1 host
    Hive
        Metastore : msl-dpe-perf77.msl.lab
        HiveServer2 : msl-dpe-perf77.msl.lab
        WebHCat Server : msl-dpe-perf77.msl.lab
        Database : New MySQL Database
    HBase
        Master : msl-dpe-perf77.msl.lab
        RegionServer : 1 host
        Phoenix Query Server : 0 host
    Pig
        Clients : 1 host
    ZooKeeper
        Server : msl-dpe-perf77.msl.lab
    Ambari Metrics
        Metrics Collector : msl-dpe-perf77.msl.lab
        Grafana : msl-dpe-perf77.msl.lab
    SmartSense
        Activity Analyzer : msl-dpe-perf77.msl.lab
        Activity Explorer : msl-dpe-perf77.msl.lab
        HST Server : msl-dpe-perf77.msl.lab
    Spark
        Livy Server : 0 host
        History Server : msl-dpe-perf77.msl.lab
        Thrift Server : 0 host
    Spark2
        Livy for Spark2 Server : 0 host
        History Server : msl-dpe-perf77.msl.lab
        Thrift Server : 0 host
    Slider
        Clients : 1 host     
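The root failure in the stderr earlier in this post is usermod returning 6 because the 'spark' user is missing from /etc/passwd; a sketch of how one might confirm that on the failing node before retrying (Ambari normally creates these service accounts itself):

```bash
# Sketch: check which of the service accounts Ambari is trying to modify
# actually exist on this node.
getent passwd spark livy hive zookeeper   # prints one line per existing user
id spark                                  # "no such user" matches the usermod error
```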
						
					
Labels: Apache Ambari

07-24-2018 06:22 PM
Following the instructions from https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.5/bk_ambari-installation/content/set_up_the_ambari_server.html, the package was installed successfully, but "ambari-server setup" failed when creating the schema and user. Here is the message:

root@msl-dpe-perf75:/usr/local# ambari-server setup
Using python  /usr/bin/python
Setup ambari-server
Checking SELinux...
WARNING: Could not run /usr/sbin/sestatus: OK
Customize user account for ambari-server daemon [y/n] (n)? y
Enter user account for ambari-server daemon (root):
Adjusting ambari-server permissions and ownership...
Checking firewall status...
Checking JDK...
Do you want to change Oracle JDK [y/n] (n)? y
[1] Oracle JDK 1.8 + Java Cryptography Extension (JCE) Policy Files 8
[2] Oracle JDK 1.7 + Java Cryptography Extension (JCE) Policy Files 7
[3] Custom JDK
==============================================================================
Enter choice (1): 1
JDK already exists, using /var/lib/ambari-server/resources/jdk-8u112-linux-x64.tar.gz
Installing JDK to /usr/jdk64/
Successfully installed JDK to /usr/jdk64/
JCE Policy archive already exists, using /var/lib/ambari-server/resources/jce_policy-8.zip
Installing JCE policy...
Checking GPL software agreement...
Completing setup...
Configuring database...
Enter advanced database configuration [y/n] (n)? y
Configuring database...
==============================================================================
Choose one of the following options:
[1] - PostgreSQL (Embedded)
[2] - Oracle
[3] - MySQL / MariaDB
[4] - PostgreSQL
[5] - Microsoft SQL Server (Tech Preview)
[6] - SQL Anywhere
[7] - BDB
==============================================================================
Enter choice (1): 1
Database admin user (postgres): 
Database name (ambari): 
Postgres schema (ambari): 
Username (ambari): 
Enter Database Password (bigdata): 
Default properties detected. Using built-in database.
Configuring ambari database...
Checking PostgreSQL...
Configuring local database...
Configuring PostgreSQL...
Backup for pg_hba found, reconfiguration not required
Creating schema and user...
ERROR: Failed to execute command:['ambari-sudo.sh', 'su', 'postgres', '-', '--command=psql -f /var/lib/ambari-server/resources/Ambari-DDL-Postgres-EMBEDDED-CREATE.sql -v username=\'"ambari"\' -v password="\'bigdata\'" -v dbname="ambari"']
ERROR: stderr:psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
ERROR: stdout:
failed to execute queries ...retrying (1)
Creating schema and user...
ERROR: Failed to execute command:['ambari-sudo.sh', 'su', 'postgres', '-', '--command=psql -f /var/lib/ambari-server/resources/Ambari-DDL-Postgres-EMBEDDED-CREATE.sql -v username=\'"ambari"\' -v password="\'bigdata\'" -v dbname="ambari"']
ERROR: stderr:psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
ERROR: stdout:
failed to execute queries ...retrying (2)
Creating schema and user...
ERROR: Exiting with exit code 2. 
REASON: Running database init script failed. Exiting.
root@msl-dpe-perf75:/usr/local#

Can anyone suggest how to resolve this error?
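The psql errors say the server is not accepting connections on /var/run/postgresql/.s.PGSQL.5432, i.e. PostgreSQL itself is not running; a sketch of what one might check before rerunning setup (assuming systemd and the distribution postgresql unit):

```bash
# Sketch: the embedded-database setup only works if PostgreSQL is running and
# its Unix socket exists.
systemctl status postgresql
sudo systemctl start postgresql
ls /var/run/postgresql/        # the .s.PGSQL.5432 socket should appear
ambari-server setup            # then re-run setup
```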
						
					
Labels: Apache Ambari