Member since: 05-09-2016
Posts: 280
Kudos Received: 58
Solutions: 31
My Accepted Solutions
Title | Views | Posted
---|---|---
| 3524 | 03-28-2018 02:12 PM
| 2956 | 01-09-2018 09:05 PM
| 1552 | 12-13-2016 05:07 AM
| 4803 | 12-12-2016 02:57 AM
| 4038 | 12-08-2016 07:08 PM
07-08-2016
07:21 PM
1 Kudo
Hi @Timothy Spann, there is a bug from the Ambari perspective: it is not generating hiveserver2-site.xml. So any changes made in the Advanced hiveserver2-site section in Ambari are not reflected (we make changes in hiveserver2-site.xml for Ranger). If you disable authorization from the general settings as mentioned above, you will be able to run the Hive CLI, but Ranger policies will not work as expected. This issue has been raised and should be resolved in an upcoming release of the Sandbox. For now, you can use Hive, but without any Ranger policies.
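For reference, the "disable authorization" toggle in the general settings corresponds to this Hive property (a sketch; on a correctly working install it would land in hiveserver2-site.xml, the file this Sandbox bug fails to generate):

```
hive.security.authorization.enabled=false
```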
07-08-2016
05:38 AM
Thanks a lot for the quick reply and the great explanation. Yes, you are right, I was running it through the HDP 2.5 command line. Now I have installed Squirrel on my local Mac, but it is a little confusing: how do I point it to the Phoenix server on my Sandbox? squirrelhomepage.png
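For context, pointing a local Squirrel install at the Sandbox usually means registering the Phoenix driver and then creating an alias with a Phoenix JDBC URL. A minimal sketch of the settings, assuming the Sandbox is reachable on 127.0.0.1 and uses the default unsecured ZooKeeper znode (host, port, and jar path here are assumptions, not taken from this thread):

```
Driver class:    org.apache.phoenix.jdbc.PhoenixDriver
Extra classpath: /path/to/phoenix-client.jar
JDBC URL:        jdbc:phoenix:127.0.0.1:2181:/hbase-unsecure
```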
07-08-2016
04:06 AM
I have downloaded and installed Squirrel from here. I copied the Phoenix client jar to the lib directory of Squirrel and it installed successfully, but I cannot figure out how to start it. I ran squirrel-sql.sh from the HDP Sandbox terminal, but nothing happens after that. Please help.
Labels:
- Apache Phoenix
07-05-2016
06:29 PM
@Sunile Manjee Grafana is not yet installed on the HDP 2.5 technical preview. It will be installed and tested very soon. @rmolina, please check this out.
07-05-2016
05:42 PM
Also, if you stop the Phoenix shell and then open it again, the warning message will not appear.
06-13-2016
04:34 PM
Hi @Sunile Manjee, I am following an HBase export table technique. I did an export and created a Hive table stored as sequencefile, but when I load the sequence file data into the Hive table, it gives me this error: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: org.apache.hadoop.hbase.io.ImmutableBytesWritable. It would be really helpful if you could let me know the solution. Thanks.
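In case it helps later readers: that class lives in the HBase client jar, so the error usually means Hive cannot see HBase's classes at query time. One common workaround (a sketch only, not verified against this exact setup; the jar path is an assumption and must match your installed HBase version) is to add the jar to Hive's auxiliary classpath:

```
# hive-site.xml, or passed via --hiveconf on the command line
hive.aux.jars.path=file:///usr/hdp/current/hbase-client/lib/hbase-client.jar
```

An equivalent session-level approach is Hive's ADD JAR statement with the same path.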
06-03-2016
06:59 PM
2 Kudos
@BRivas garriv It takes some manual effort, but: first change the working directory to / (cd /), then run du -h on each folder to find its disk usage (for example, "du -h var" or "du -h usr"). Locate the folder that is taking up all the disk space and try to delete irrelevant files from it.
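The steps above can be sketched as a single pipeline (a sketch, assuming GNU coreutils as shipped on the Sandbox: `du -sh` prints one summary line per directory, and `sort -rh` orders the human-readable sizes largest-first):

```shell
# Summarize each top-level directory under / and list the biggest consumers first.
cd /
du -sh ./*/ 2>/dev/null | sort -rh | head
```

Repeat the same pipeline inside the largest directory to drill down to the files actually eating the space.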
05-27-2016
09:44 PM
@Wrangler Data Please refer to this link, which will guide you through the upgrade process for both Ambari and HDP: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_upgrading_Ambari/content/_upgrade_ambari.html Let us know if you face any issues.
05-18-2016
08:37 PM
Has anyone experienced this before?