Member since: 09-29-2015
Posts: 286
Kudos Received: 601
Solutions: 60

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 11516 | 03-21-2017 07:34 PM |
| | 2913 | 11-16-2016 04:18 AM |
| | 1626 | 10-18-2016 03:57 PM |
| | 4299 | 09-12-2016 03:36 PM |
| | 6304 | 08-25-2016 09:01 PM |
01-26-2016
04:04 AM
@David Yee Please accept the answer that helped and close the question.
01-26-2016
03:09 AM
3 Kudos
That step tries to install the Ambari agent and register it with the server. Since this is a single node, your hostname may not be correct. Is that hostname in /etc/hosts? Is it the hostname returned by hostname -f on that node? If not, update /etc/hosts to match the output of hostname -f. You can also install the Ambari agent manually; see http://docs.hortonworks.com/HDPDocuments/Ambari-2.1.0.0/bk_ambari_reference_guide/content/_install_the_ambari_agents_manually.html. Or set up passwordless SSH; see http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_Installing_HDP_AMB/content/_set_up_password-less_ssh.html
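A minimal sketch of those checks and a manual agent install, assuming a CentOS/RHEL node and an Ambari server at ambari.example.com (a hypothetical hostname):

# Verify the FQDN the node reports, and that /etc/hosts agrees with it
hostname -f
grep "$(hostname -f)" /etc/hosts

# Manual agent install (assumes the Ambari repo is already configured)
yum install -y ambari-agent

# Point the agent at the Ambari server (the hostname entry in the [server] section)
sed -i 's/^hostname=.*/hostname=ambari.example.com/' /etc/ambari-agent/conf/ambari-agent.ini
ambari-agent start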
01-25-2016
06:54 PM
What if the Database is in a VM Image?
01-25-2016
06:47 PM
2 Kudos
This link has some great write-ups on Oozie production recommendations. It does mention: do not use the same Hive Server MySQL database for Oozie. What I want to know is whether I can use an existing database I have in my organization, perhaps on a node outside the cluster, for Oozie, Ranger, Hive, etc., or whether all the databases should be located in the cluster. What are the cons of using an existing database on a separate node or network outside the cluster for these components?
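For reference, pointing Oozie at an external database is mainly a JDBC configuration change in oozie-site. A minimal sketch, assuming MySQL on a hypothetical host dbhost.example.com:

# In oozie-site (e.g. via Ambari > Oozie > Configs):
oozie.service.JPAService.jdbc.driver = com.mysql.jdbc.Driver
oozie.service.JPAService.jdbc.url = jdbc:mysql://dbhost.example.com:3306/oozie
oozie.service.JPAService.jdbc.username = oozie
oozie.service.JPAService.jdbc.password = <password>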
Labels:
- Apache Hive
- Apache Oozie
01-24-2016
08:38 PM
1 Kudo
Other solutions if this does not work: set Promiscuous Mode to Allow. Also try disabling your firewall. Finally, try changing the network adapter setting to Bridged Adapter, which will assign a whole new <ip-address>. https://community.hortonworks.com/questions/10149/installed-sandbox-but-cant-get-the-welcome-hdp-pag.html
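If you prefer the command line to the VirtualBox UI, the same changes can be scripted with VBoxManage; a minimal sketch, assuming the VM is named "Hortonworks Sandbox" (hypothetical) and is powered off:

# Allow promiscuous mode on the first NIC
VBoxManage modifyvm "Hortonworks Sandbox" --nicpromisc1 allow-all

# Switch the first NIC to a bridged adapter (en0 is a hypothetical host interface)
VBoxManage modifyvm "Hortonworks Sandbox" --nic1 bridged --bridgeadapter1 en0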
01-24-2016
08:01 PM
Communication failure with server while initializing kadmin interface

Cause: The host that was specified for the admin server, also called the master KDC, did not have the kadmind daemon running.

Solution: Make sure that you specified the correct host name for the master KDC. If you specified the correct host name, make sure that kadmind is running on the master KDC that you specified.

From http://docs.oracle.com/cd/E19253-01/816-4557/trouble-6/index.html
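A quick way to run both checks, assuming an MIT Kerberos setup and a hypothetical master KDC at kdc.example.com:

# Which host does this client think the admin server is?
grep admin_server /etc/krb5.conf

# Is kadmind actually running on that host?
ssh root@kdc.example.com 'ps -ef | grep [k]admind'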
01-23-2016
07:32 PM
3 Kudos
There is no one best tool. For an open-source option, use the Hortonworks Connector for Teradata, which is the Sqoop implementation for moving data from Teradata to HDP. Documentation for the connector is here. Hortonworks documentation is here. The Teradata driver can be found here. Save the files from the download to the Sqoop library folder /usr/hdp/current/sqoop-client/lib.

# Set the classpath
export HIVE_HOME=/usr/hdp/current/hiveserver2
export HADOOP_HOME=/usr/hdp/current/
export SQOOP_HOME=/usr/hdp/current/sqoop-client
export HADOOP_CLASSPATH=$(hcat -classpath)
export LIB_JARS=$(echo ${HADOOP_CLASSPATH} | sed -e 's/::*/,/g')

Some command examples:

# Hive import:
sqoop import --hive-import --hive-overwrite --create-hive-table --hive-table <table-name> --null-string '\\N' --null-non-string '\\N'

# Define a Hive table based on one in a database (e.g. MySQL).
# Important because the Teradata connector needs the table to exist before importing data:
sqoop create-hive-table --connect jdbc:mysql://db.example.com/corp --table employees --hive-table emps

# Store as an ORC file:
sqoop import -libjars ${LIB_JARS} \
  -Dteradata.db.input.target.table.schema="cust_id int, acct_type string, acct_nbr string, acct_start_date date, acct_end_date date" \
  -Dteradata.db.input.file.format=orcfile \
  --connect jdbc:teradata://<teradata host ip address>/Database=financial \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username dbc --password dbc --table accts \
  --hive-import --hive-table financial.accts

sqoop import --connect jdbc:teradata://192.168.1.13/Database=retail \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username dbc --password dbc --table accts \
  --hive-import --hive-table financial.accts

Info on Teradata offerings beyond Sqoop can be found at these links: https://community.hortonworks.com/questions/4418/access-modes-for-teradata-beyond-sqoop-ingestion.html and https://community.hortonworks.com/questions/8411/sqoop-job-too-slow-importing-data-from-teradata-to.html. An HDFS-to-Teradata example can be found here: https://community.hortonworks.com/articles/6161/hdfs-to-teradata-example.html. Other ETL tools available are Talend Big Data Edition, Pentaho, and Oracle Data Integrator.
01-22-2016
01:15 AM
1 Kudo
From researching on HCC I found out that Ambari may not handle custom symlinks. So when you install via Ambari, if you created different symlinks, things would likely break. See this article on the Hortonworks Community site: https://community.hortonworks.com/questions/8776/customized-hdp-installation.html

Someone mentioned that it may be a permission issue on the symlink: the rwx settings on the symlink could cause it to fail to execute. This is related to umask, and umask is very important. Have it at 0022 during install and setup; when the install is done, change it to 0027. See https://community.hortonworks.com/questions/8614/zookeeper-service-fails-to-start.html

Maybe we can add something to the hdp-select script so custom symlinks work; that is the script that manages the symlinks. Maybe I can add "html" to a list of options somewhere in conf_select.py, inspired by https://community.hortonworks.com/questions/5811/install-of-hdp-fails-with-valueerror-invalid-liter.html. Or maybe it's a case where we need to clear the repos and folders from an old or failed install: https://community.hortonworks.com/questions/1848/python-errors-or-script-does-not-exist-while-insta.html

I am still interested in thoughts, or in hearing whether anyone has done this successfully with custom symlinks, since Ops folks don't like using the /usr mount.
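A minimal sketch of the umask switch described above (0022 and 0027 are the values from the linked thread):

# Check the current value
umask

# During install and setup
umask 0022

# After the install completes
umask 0027

# One common way to make it persistent (the path is an assumption; adjust for your distro)
echo "umask 0027" >> /etc/profile.d/umask.sh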
01-22-2016
12:29 AM
Failure in Ambari install. Note that /usr/hdp is a symlink to another location (/p01/app/had). Will that symlink break things? Ambari cluster service deployment is failing.

As an example, the traceback from the RegionServer install is:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 26, in hook
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py", line 187, in <module>
    hadoop_conf_dir = conf_select.get_hadoop_conf_dir(force_latest_on_upgrade=True)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 374, in get_hadoop_conf_dir
    select(stack_name, "hadoop", version)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 252, in select
    shell.checked_call(get_cmd("set-conf-dir", package, version), logoutput=False, quiet=False, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'conf-select set-conf-dir --package hadoop --stack-version 2.3.0.0-2557 --conf-version 0' returned 1. Traceback (most recent call last):
  File "/usr/bin/conf-select", line 182, in <module>
    setConfDir(options.pname, options.sver, options.cver)
  File "/usr/bin/conf-select", line 138, in setConfDir
    check(sver, pname, cver, "set")
  File "/usr/bin/conf-select", line 100, in check
    chksVer(sver)
  File "/usr/bin/conf-select", line 78, in chksVer
    result[tuple(map(int, versionRegex.split(f)))] = f
ValueError: invalid literal for int() with base 10: 'html'
Error: Error: Unable to run the custom hook script ['/usr/bin/python2', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-1718.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-1718.json', 'INFO', '/var/lib/ambari-agent/tm

Note that on this host, /usr/hdp is a symlink to another location (/p01/app/had). Will that symlink break things in this case?
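The ValueError suggests conf-select parses every entry name under the stack root as a dotted version number, so a non-version entry such as an 'html' directory breaks chksVer. A quick, non-destructive diagnostic sketch:

# List what conf-select will try to parse as versions;
# anything that is not a version string here can trigger the ValueError above
ls /usr/hdp

# Confirm where the symlink actually points
ls -ld /usr/hdp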
Labels:
- Apache Ambari
01-21-2016
07:31 PM
1 Kudo
@Predrag Minovic why not run Supervisors on their own nodes rather than on DataNodes?