Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2573 | 11-01-2016 05:43 PM |
| | 8532 | 11-01-2016 05:36 PM |
| | 4877 | 07-01-2016 03:20 PM |
| | 8199 | 05-25-2016 11:36 AM |
| | 4348 | 05-24-2016 05:27 PM |
11-30-2018
12:21 AM
Hi Neeraj, I am able to install Presto but my queries are failing. Did you face a similar error before?

presto:default> show tables;
Query 20181130_001533_00002_5gf9c failed: 10.xxx.xx.xx: null
presto:default> exit
Caused by: org.apache.thrift.transport.TTransportException: 10.xxx.xx.xx:: null
at com.facebook.presto.hive.HiveMetastoreClientFactory.rewriteException(HiveMetastoreClientFactory.java:58)
at com.facebook.presto.hive.HiveMetastoreClientFactory.access$000(HiveMetastoreClientFactory.java:33)
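A TTransportException raised from HiveMetastoreClientFactory usually means the Presto coordinator cannot reach the Hive metastore Thrift endpoint. As a first check, a minimal sketch of the Hive catalog file (the hostname here is a placeholder for your cluster; 9083 is the usual metastore Thrift port):

# etc/catalog/hive.properties (sketch; values are assumptions for your environment)
connector.name=hive-hadoop2
hive.metastore.uri=thrift://metastore-host.example.com:9083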
06-18-2019
06:09 PM
It is strange. I relaunched the "Registering host" step again and this time it passed!? Could it be a performance problem with memory (RAM) or the network?
02-27-2016
05:03 PM
Only apply patches if necessary and when instructed by support. In case you don't have a support contract, here are Pivotal's instructions for patching Ambari; we don't provide steps ourselves, for the reasons above: http://hawq.docs.pivotal.io/docs-hawq/topics/hdp-prerequisites.html Needless to say, it's at your own risk.
03-15-2017
04:51 PM
At this time, column-level security is only possible when accessing data through Hive.
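As an illustration only (the table, view, and role names are hypothetical), one common way to get column-level restrictions through Hive is to expose just the permitted columns in a view and grant access on the view:

-- sensitive columns such as ssn stay hidden behind the view
CREATE VIEW customers_public AS SELECT id, name FROM customers;
GRANT SELECT ON TABLE customers_public TO ROLE analysts;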
02-02-2016
07:39 PM
@niraj nagle are you still having issues with this? Can you accept the best answer or provide your own solution?
02-04-2016
05:09 PM
@Gerd Koenig good job, re-accepted to give you credit
01-18-2016
05:52 AM
Thank you Neeraj Sabharwal. Yes, this is my fault for not going through the documentation first. I always check the documentation before touching any technology, a habit I got from Oracle. I don't know why I skipped the documentation with Hortonworks; maybe it is new for me and a bit difficult to find the proper documentation. But I promise I will check it from now on. I really appreciate your kind replies and support. Thank you so much.
01-16-2016
01:59 AM
5 Kudos
1) Set up an Azure account.
2) Set up a CloudBreak account.

Very important steps (applies to Azure only): create a test network in Azure before you start creating CloudBreak credentials.

On your local machine, run the following and accept the default values:

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout azuretest.key -out azuretest.pem

You will see 2 files as listed below:

-rw-r--r-- 1 nsabharwal staff 1346 May 7 17:00 azuretest.pem --> We need this file to create credentials in CloudBreak.
-rw-r--r-- 1 nsabharwal staff 1679 May 7 17:00 azuretest.key --> We need this to log in to the host after cluster deployment.

chmod 400 azuretest.key --> otherwise you will receive a bad-permissions error, for example when running: ssh -i azuretest.key ubuntu@<server>

Very important: check your openssl version. If it is the latest version, run the following and use azuretest_login.key to log in:

openssl rsa -in azuretest.key -out azuretest_login.key

hw11326:jumk nsabharwal$ openssl version
OpenSSL 0.9.8zc 15 Oct 2014

The latest version of openssl creates a .key that starts with -----BEGIN PRIVATE KEY-----
Old openssl creates keys with (we need this) -----BEGIN RSA PRIVATE KEY-----

Log in to the CloudBreak portal and create an Azure credential. Once you fill in the information and hit "create credentials", you will get a file from CloudBreak that needs to be uploaded into the Azure portal. I saved it as azuretest.cert.

Log in to the Azure portal (switch to classic mode in case you are using the new portal), click Settings --> Manage Certificates, then upload at the bottom of the screen.

There are 2 more actions in the CloudBreak window:
1) Create a template. You can change the instance type & volume type as per your setup.
2) Create a blueprint. You can grab sample blueprints here. (You may have to reformat the blueprint in case there is any issue.)

Once all this is done, you are all set to deploy the cluster: select the credential and hit "create cluster" in the Create cluster window.

Handy commands to log in to docker. Log in to your host:

ssh -i azuretest.key ubuntu@fqdn

"New announcement: Just found out that the user needs to be cloudbreak instead of ubuntu"

ssh -i azuretest.key cloudbreak@fqdn

Once you are in the shell:

sudo su -
docker ps
docker exec -it <container id> bash

[root@azuretest ~]# docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
f493922cd629 sequenceiq/docker-consul-watch-plugn:1.7.0-consul "/start.sh" 2 hours ago Up 2 hours consul-watch
100e7c0b6d3d sequenceiq/ambari:2.0.0-consul "/start-agent" 2 hours ago Up 2 hours ambari-agent
d05b85859031 sequenceiq/consul:v0.4.1.ptr "/bin/start -adverti 2 hours ago Up 2 hours consul
[root@test~]# docker exec -it 100e7c0b6d3d bash
bash-4.1# docker commands

Happy Hadooping!!!!

Note: For the latest information and changes, please see https://github.com/sequenceiq/cloudbreak
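A quick way to verify which key format your openssl produced before logging in (a sketch; the filenames follow the post above):

head -1 azuretest.key
# "-----BEGIN PRIVATE KEY-----"     -> newer openssl, PKCS#8 format
# "-----BEGIN RSA PRIVATE KEY-----" -> older openssl, the format we need

# Convert to the traditional RSA format if needed, then log in with the converted key:
openssl rsa -in azuretest.key -out azuretest_login.key
chmod 400 azuretest_login.key
ssh -i azuretest_login.key cloudbreak@<fqdn>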
Labels: Hadoop, Cloud Computing, Big Data