Created 10-10-2017 09:49 AM
I am basing my install on http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.2.0/ambari.list
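For reference, registering that repository on Ubuntu 16.04 looks roughly like the sketch below (the apt steps are an assumption based on the standard Ambari install flow, not something from this thread; the Hortonworks GPG key also has to be added per the install guide):
wget -O /etc/apt/sources.list.d/ambari.list http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.2.0/ambari.list
apt-get update
apt-get install ambari-server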
I selected Spark2 and all its required dependencies. The following services have an error; I receive the error below when manually starting the History Server:
INFO 2017-10-10 04:57:10,565 logger.py:75 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
INFO 2017-10-10 04:57:10,565 logger.py:75 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
INFO 2017-10-10 04:57:10,681 Hardware.py:176 - Some mount points were ignored: /dev, /run, /, /dev/shm, /run/lock, /sys/fs/cgroup, /boot, /home, /run/user/108, /run/user/1007, /run/user/1005, /run/user/1010, /run/user/1011, /run/user/1012, /run/user/1001
INFO 2017-10-10 04:57:10,682 Controller.py:320 - Sending Heartbeat (id = 4066)
INFO 2017-10-10 04:57:10,688 Controller.py:333 - Heartbeat response received (id = 4067)
INFO 2017-10-10 04:57:10,688 Controller.py:342 - Heartbeat interval is 1 seconds
INFO 2017-10-10 04:57:10,688 Controller.py:380 - Updating configurations from heartbeat
INFO 2017-10-10 04:57:10,688 Controller.py:389 - Adding cancel/execution commands
INFO 2017-10-10 04:57:10,688 Controller.py:475 - Waiting 0.9 for next heartbeat
INFO 2017-10-10 04:57:11,589 Controller.py:482 - Wait for next heartbeat over
WARNING 2017-10-10 04:57:22,205 base_alert.py:138 - [Alert][namenode_hdfs_capacity_utilization] Unable to execute alert. division by zero
INFO 2017-10-10 04:57:27,060 ClusterConfiguration.py:119 - Updating cached configurations for cluster vqcluster
INFO 2017-10-10 04:57:27,071 Controller.py:249 - Adding 1 commands. Heartbeat id = 4085
INFO 2017-10-10 04:57:27,071 ActionQueue.py:113 - Adding EXECUTION_COMMAND for role SPARK2_JOBHISTORYSERVER for service SPARK2 of cluster vqcluster to the queue.
INFO 2017-10-10 04:57:27,081 ActionQueue.py:238 - Executing command with id = 68-0, taskId = 307 for role = SPARK2_JOBHISTORYSERVER of cluster vqcluster.
INFO 2017-10-10 04:57:27,081 ActionQueue.py:279 - Command execution metadata - taskId = 307, retry enabled = False, max retry duration (sec) = 0, log_output = True
WARNING 2017-10-10 04:57:27,083 CommandStatusDict.py:128 - [Errno 2] No such file or directory: '/var/lib/ambari-agent/data/output-307.txt'
INFO 2017-10-10 04:57:32,563 PythonExecutor.py:130 - Command ['/usr/bin/python', u'/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py', u'START', '/var/lib/ambari-agent/data/command-307.json', u'/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package', '/var/lib/ambari-agent/data/structured-out-307.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', ''] failed with exitcode=1
INFO 2017-10-10 04:57:32,577 log_process_information.py:40 - Command 'export COLUMNS=9999 ; ps faux' returned 0. USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
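Since the PythonExecutor line above contains the exact failing command, re-running it by hand as root reproduces the failure and prints the underlying Python error (arguments copied verbatim from the log):
/usr/bin/python /var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/job_history_server.py START /var/lib/ambari-agent/data/command-307.json /var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package /var/lib/ambari-agent/data/structured-out-307.json INFO /var/lib/ambari-agent/tmp PROTOCOL_TLSv1 ''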
Created 10-15-2017 07:03 AM
Sorry to hear you are encountering all these problems. Could you tell me the HDP, Ambari, and OS type and version you are trying to install? I will try to guide you.
Created 10-18-2017 06:20 AM
During the install I get the following error:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 211, in <module>
    HiveMetastore().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 61, in start
    create_metastore_schema()
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 382, in create_metastore_schema
    user = params.hive_user
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED] -verbose' returned 1. SLF4J: Class path contains multiple SLF4J bindings.
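The same failure can be reproduced outside Ambari with the command from the traceback (a sketch; <hive_password> is a placeholder standing in for the [PROTECTED] value):
export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server
/usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType mysql -userName hive -passWord <hive_password> -verbose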
Can you provide detailed information on the Hive installation requirements, including a simple MySQL/PostgreSQL configuration?
Created 10-18-2017 07:31 AM
I assume you have installed PostgreSQL or MySQL and have run the corresponding setup on host ubuntu17:
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar
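Note that the driver jars must already exist at those paths before running ambari-server setup. On Ubuntu 16.04 they typically come from the distribution packages below (package and jar names are an assumption; verify what actually lands under /usr/share/java/):
apt-get install libmysql-java
apt-get install libpostgresql-jdbc-java
ls /usr/share/java/ | grep -Ei 'mysql|postgresql'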
I have used the same user, password and database name for simplicity.
For Postgres
As root, switch to the postgres user:
# su - postgres
postgres@ubuntu17:~$ psql
psql (9.5.9)
Type "help" for help.
Hive user/database setup
postgres=# DROP DATABASE IF EXISTS hive;
postgres=# CREATE USER hive PASSWORD 'hive';
postgres=# CREATE DATABASE hive OWNER hive;
postgres=# GRANT ALL PRIVILEGES ON DATABASE hive TO hive;
postgres=# \q
Oozie user/database setup
postgres=# DROP DATABASE IF EXISTS oozie;
postgres=# CREATE USER oozie PASSWORD 'oozie';
postgres=# CREATE DATABASE oozie OWNER oozie;
postgres=# GRANT ALL PRIVILEGES ON DATABASE oozie TO oozie;
postgres=# \q
Ranger user/database setup
postgres=# DROP DATABASE IF EXISTS rangerdb;
postgres=# CREATE USER rangerdb PASSWORD 'rangerdb';
postgres=# CREATE DATABASE rangerdb OWNER rangerdb;
postgres=# GRANT ALL PRIVILEGES ON DATABASE rangerdb TO rangerdb;
postgres=# \q
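You can confirm the users and databases exist before continuing (a quick check run from the postgres account):
psql -c '\du' | grep -E 'hive|oozie|rangerdb'
psql -c '\l' | grep -E 'hive|oozie|rangerdb'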
Edit the pg_hba.conf
vi /etc/postgresql/9.5/main/pg_hba.conf
Add the entries below at the end of the file. In the example below my ambari, hive, oozie, ranger, and mapred users are using the Postgres database:
local all ambari,hive,oozie,ranger,mapred md5
host  all ambari,hive,oozie,ranger,mapred 0.0.0.0/0 md5
host  all ambari,hive,oozie,ranger,mapred ::/0 md5
Then restart postgres
/etc/init.d/postgresql restart
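One caveat: the host lines in pg_hba.conf only help remote clients if Postgres also listens on external interfaces, so it is worth checking postgresql.conf (path assumed for 9.5 on Ubuntu):
grep listen_addresses /etc/postgresql/9.5/main/postgresql.conf
# expect listen_addresses = '*' or at least the Ambari/Hive host addresses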
For MySQL
In this example I assume the root password is hadoop.
Hive user setup
# mysql -u root -phadoop
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost';
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
CREATE USER 'hive'@'<HIVEMETASTORE_FQDN>' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'<HIVEMETASTORE_FQDN>';
FLUSH PRIVILEGES;
Create the Hive database
The Hive database must be created before loading the Hive database schema.
mysql -u root -phadoop
CREATE DATABASE hive;
quit;
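A quick sanity check with the credentials created above should show the new database:
mysql -u hive -phive -e "SHOW DATABASES LIKE 'hive';"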
Oozie user setup
mysql -u root -phadoop
CREATE USER 'oozie'@'%' IDENTIFIED BY 'oozie';
GRANT ALL PRIVILEGES ON *.* TO 'oozie'@'%';
FLUSH PRIVILEGES;
Create the Oozie database
The Oozie database must be created before loading the Oozie database schema.
mysql -u root -phadoop
CREATE DATABASE oozie;
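And the same sanity check for Oozie:
mysql -u oozie -poozie -e "SHOW DATABASES LIKE 'oozie';"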
After either the MySQL or Postgres setup, go to the Hive setup in the Ambari UI (see the attached screenshots); you will need to use the credentials set up earlier. Choose "Use existing PostgreSQL/MySQL database".
In the initial setup it will ask you to test the connectivity between Hive and the database; this MUST succeed. In both setups, just make sure the correct values are chosen for the entries below.
Hive Database
Database Name
Database Username
Database Password
JDBC Driver class
Database URL
Hive Database Type
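For example, with the MySQL conventions used above the values would look like this (the driver class and URL format are standard JDBC values, shown as a sketch rather than taken from the screenshots):
Database Name: hive
Database Username: hive
Database Password: hive
JDBC Driver class: com.mysql.jdbc.Driver (Postgres: org.postgresql.Driver)
Database URL: jdbc:mysql://<metastore-host>/hive (Postgres: jdbc:postgresql://<metastore-host>:5432/hive)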
Hope that helps
Created 10-18-2017 07:39 AM
A few minutes before I saw this post I successfully solved the problem. I had two issues. One: I did not create the Hive DB:
CREATE DATABASE hive;
I based it on your post from
https://community.hortonworks.com/answers/107905/view.html
The other issue I had was in the DB URL connection; I changed it to localhost.
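For reference, with the database on the same host as the metastore the URL looks like this (assuming MySQL and the database name hive):
jdbc:mysql://localhost/hive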
I am trying to accept your answer but I can't; I don't have a button for it.
The next stage is to try it with a non-root install.