Member since
09-29-2015
286
Posts
601
Kudos Received
60
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
| 11505 | 03-21-2017 07:34 PM
| 2907 | 11-16-2016 04:18 AM
| 1625 | 10-18-2016 03:57 PM
| 4294 | 09-12-2016 03:36 PM
| 6269 | 08-25-2016 09:01 PM
02-02-2016
07:58 AM
1 Kudo
@rich
02-02-2016
07:46 AM
@Kibrom GebreHiwot Perhaps your /etc/hosts is wrong.
What's in your /etc/hosts? Is 10.0.225.60 there? What is the eth0 IP address when you run ifconfig? Perhaps your ambari-server really did not start. What is your ambari-server status after executing
> ambari-server status
If it is not running, do
> ambari-server stop
> rm -f /var/run/ambari-server/ambari-server.pid
> ambari-server start
Do you see an ambari-server process after executing ps -ef | grep ambari-server? If not, what is in the log file /var/log/ambari-server/ambari-server.log? If you don't see any errors, follow the steps here: https://community.hortonworks.com/articles/6227/sandbox-1270018080-not-accessible.html If NAT does not work, try Bridged networking, get the IP address using ifconfig, restart ambari-server, and use that IP address in the browser (http://<ip-address>:8080). Finally, check the firewalls on your host.
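The check-and-restart sequence above can be sketched as a single shell session (paths are the Ambari defaults mentioned in this thread; adjust for your environment):

```shell
# If the Ambari server is not up, restart it cleanly.
ambari-server status || {
    ambari-server stop
    rm -f /var/run/ambari-server/ambari-server.pid   # clear a stale PID file
    ambari-server start
}

# Confirm the process is actually running; the [a] trick keeps grep
# from matching its own process entry.
ps -ef | grep [a]mbari-server

# If it still is not running, inspect the tail of the server log.
tail -n 50 /var/log/ambari-server/ambari-server.log
```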
02-02-2016
12:23 AM
I assume this is a completely different question, right? You are able to view the Hive table since you used the --hive-import Sqoop option, and you can now see the ORC table you created subsequently and query the data after running an "INSERT OVERWRITE TABLE", correct? If you are just asking how to import an existing HDFS file/folder into a Hive table, then @Artem Ervits is correct. Just do, for example, create external table employee2 LOCATION '/tmp/task/employees' and query it using the Hive view. For syntax see https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL
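A fuller version of that DDL might look like the following; the column names and delimiter here are illustrative assumptions, not taken from the original data:

```sql
-- Hypothetical schema: replace the columns and delimiter with the
-- actual layout of the files under /tmp/task/employees.
CREATE EXTERNAL TABLE employee2 (
  emp_id INT,
  name   STRING,
  salary DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/tmp/task/employees';
```

Because the table is EXTERNAL, dropping it removes only the metadata; the files in HDFS are left in place.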
02-02-2016
12:01 AM
3 Kudos
Yes. I do recommend the following course of action:
1. Set up Ambari with both LDAP and SSL first.
2. Then set up HiveServer2 with LDAP. See the following on HCC: https://community.hortonworks.com/questions/1522/how-to-setup-hiveserver2-authentication-with-ldap.html and the HW docs.
3. Then you can turn SSL on for Hive as well. See http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_Security_Guide/content/ch_wire-hiveserver2.html
4. Then test LDAP authentication with ssl=true in Beeline.
5. Optionally, you can also set up Knox with another HiveServer2 instance for HTTPS only.
6. Then run the Kerberos Wizard.
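The Beeline test in the steps above might look like this; the host name, port, truststore path, and credentials are placeholders, not values from the original thread:

```shell
# Connect to HiveServer2 over SSL with LDAP username/password auth.
# hs2.example.com, port 10000, and the truststore path are illustrative.
beeline -u "jdbc:hive2://hs2.example.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hiveserver2.jks;trustStorePassword=changeit" \
        -n ldap_user -p ldap_password
```

If the connection succeeds with an LDAP account and fails with a local-only account, both LDAP and SSL are wired up correctly.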
02-01-2016
11:41 PM
@keerthana gajarajakumar Let's see if I can make it clear.
You asked how to Sqoop data into HDP/Hive and then create an ORC table from it.
I suggested the following:
INSTEAD of Sqooping data into a table and THEN creating an ORC table, you can simply create the ORC table directly with Sqoop, hence the following options in the Sqoop call:
--hcatalog-database default --hcatalog-table my_table_orc --create-hcatalog-table --hcatalog-storage-stanza "stored as orcfile"
OR you can use normal Sqoop without the ORC options, create a normal table as you have done, and then create an OrcEmployees table from it.
02-01-2016
11:05 PM
1 Kudo
You can also review the docs here: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_dataintegration/content/moving_data_from_hdfs_to_hive_external_table_method.html
02-01-2016
10:59 PM
I just realized you had a comment under your original question. I will edit it so that it is the actual question.
I was suggesting you can use Sqoop to DIRECTLY create an ORC Table.
Try this also:
INSERT OVERWRITE TABLE orcemployees SELECT * FROM employees;
What is your output after you have done so?
02-01-2016
10:42 PM
1 Kudo
@keerthana gajarajakumar You have two options:
1. Use Sqoop to save directly as an ORC file:
$ sqoop import --connect jdbc:mysql://localhost/employees --username hive --password hive --table departments --hcatalog-database default --hcatalog-table my_table_orc --create-hcatalog-table --hcatalog-storage-stanza "stored as orcfile"
2. Since you already created the employee table, create the ORC table (employee_orc) from it, either by creating employee_orc first and then inserting into it with a SELECT from your employee Hive table, or in one step:
create table employee_orc stored as orc as select * from employee_hive;
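To confirm the new table really is stored as ORC and holds all the rows, a quick check like the following should work (table names follow the example above):

```sql
-- Show the storage format (InputFormat/OutputFormat) and location.
DESCRIBE FORMATTED employee_orc;

-- Spot-check that the row counts match between source and ORC copy.
SELECT COUNT(*) FROM employee_hive;
SELECT COUNT(*) FROM employee_orc;
```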
02-01-2016
05:07 PM
3 Kudos
The requirement is to extract data from Salesforce and to ingest into Hive.
Is this a good use case for HDF? The main requirement is to pull data from Salesforce. Which processors are appropriate here? The InvokeHTTP processor and/or the ExtractText processor?
Labels:
- Apache NiFi
01-31-2016
11:55 PM
@keerthana gajarajakumar
Try this:
grant all privileges on *.* to 'username'@'%' identified by 'userpassword';
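Run that in the MySQL shell on the database host; the user name, host pattern, and password below are placeholders:

```sql
-- Grant the Sqoop user access from any host ('%'); narrow the host
-- pattern in production. Values here are illustrative.
GRANT ALL PRIVILEGES ON *.* TO 'username'@'%' IDENTIFIED BY 'userpassword';

-- Reload the grant tables. Not strictly required after a GRANT
-- statement, but harmless and often recommended.
FLUSH PRIVILEGES;
```

Afterwards, verify connectivity from the Sqoop host with sqoop list-tables against the same JDBC URL and credentials.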