Member since: 09-30-2016
Posts: 11
Kudos Received: 1
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4892 | 11-14-2016 02:49 PM |
| | 1352 | 11-03-2016 07:29 PM |
11-14-2016 02:49 PM
I finally figured this out and thought it would be friendly of me to post the solution. It's one of those problems where, once you finally get it, you think, "Ugh, that was so obvious." One important note: if you are having trouble with Hive, make sure to check the YARN logs too! My solution to this (and so many other issues) was making sure every node had all of the other nodes' IP addresses in its hosts file, which ensures Ambari picks up the correct IP for each hostname. I am on Ubuntu, so I edited the hosts file on each node:

$ vim /etc/hosts

And then the file came out looking like this:

127.0.0.1 localhost
#127.0.1.1 ambarihost.com ambarihost
# Assigning static IP here so ambari gets it right
192.168.0.20 ambarihost.com ambarihost
#Other hadoop nodes
192.168.0.21 kafkahost.com kafkahost
192.168.0.22 hdfshost.com hdfshost
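
A couple of quick checks (not from the original post, just a sketch using the example hostnames above) to confirm each node resolves its peers from /etc/hosts, plus the standard way to pull the YARN application logs mentioned earlier:

# Should print the static IPs from /etc/hosts, not the 127.0.1.1 loopback entry
$ getent hosts ambarihost.com kafkahost.com hdfshost.com

# Should print this node's fully qualified hostname
$ hostname -f

# Fetch YARN logs for a failed Hive query; <application_id> is a placeholder
$ yarn logs -applicationId <application_id>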
11-03-2016 07:29 PM
Okay, I found a workaround. Since I don't really need this table to be transactional (it was just a nice-to-have), I created the table without buckets and without TBLPROPERTIES, and now it works as expected.

CREATE TABLE stgicplogs (
  actdatetime TIMESTAMP,
  server      VARCHAR(10),
  pid         VARCHAR(25),
  level       VARCHAR(50),
  type        VARCHAR(50),
  details     VARCHAR(8000)
)
PARTITIONED BY (actdate DATE)
STORED AS ORC;
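
For contrast, here is a minimal sketch of what a transactional (ACID) version of the same table would typically look like in Hive of that era: stored as ORC, bucketed, and flagged with the transactional table property. The table name and bucket column are illustrative, not the poster's original definition:

-- Illustrative ACID variant (assumed, not the original DDL):
-- Hive ACID tables must be ORC, marked transactional, and bucketed on older Hive releases.
CREATE TABLE stgicplogs_acid (
  actdatetime TIMESTAMP,
  server      VARCHAR(10),
  pid         VARCHAR(25),
  level       VARCHAR(50),
  type        VARCHAR(50),
  details     VARCHAR(8000)
)
PARTITIONED BY (actdate DATE)
CLUSTERED BY (server) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');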