Member since
09-18-2015
3274
Posts
1159
Kudos Received
426
Solutions
01-16-2016
01:59 AM
5 Kudos
Original

1) Set up an Azure account
2) Set up a Cloudbreak account

Very important (applies to Azure only): create a test network in Azure before you start creating Cloudbreak credentials.

On your local machine, run the following and accept the default values:

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout azuretest.key -out azuretest.pem

You will see the two files listed below:

-rw-r--r-- 1 nsabharwal staff 1346 May 7 17:00 azuretest.pem --> we need this file to create the credential in Cloudbreak
-rw-r--r-- 1 nsabharwal staff 1679 May 7 17:00 azuretest.key --> we need this to log in to the host after cluster deployment

chmod 400 azuretest.key --> otherwise you will get a bad-permissions error, for example when running: ssh -i azuretest.key ubuntu@<server>

Very important: check your openssl version. If it is a recent version, run the following and use azuretest_login.key to log in:

openssl rsa -in azuretest.key -out azuretest_login.key

hw11326:jumk nsabharwal$ openssl version
OpenSSL 0.9.8zc 15 Oct 2014

Recent versions of openssl create keys that begin with
-----BEGIN PRIVATE KEY-----
Older versions create keys that begin with (this is the format we need)
-----BEGIN RSA PRIVATE KEY-----

Log in to the Cloudbreak portal and create an Azure credential. Once you fill in the information and hit "create credentials", Cloudbreak gives you a file that needs to be uploaded into the Azure portal; I saved it as azuretest.cert. Log in to the Azure portal (switch to classic mode in case you are using the new portal), click Settings --> Manage Certificates, then upload at the bottom of the screen.

There are two more actions in the Cloudbreak window:

1) Create a template - you can change the instance type and volume type as per your setup.
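The PKCS#8 vs. traditional RSA header check above can be scripted so you only run the conversion when it is actually needed. A minimal sketch, assuming the file names from this post; the `key_format` and `ensure_rsa_key` helper names are my own, not Cloudbreak or openssl commands:

```shell
# Report whether a private key file is in PKCS#8 ("BEGIN PRIVATE KEY")
# or traditional RSA ("BEGIN RSA PRIVATE KEY") format.
key_format() {
    if grep -q -- "-----BEGIN RSA PRIVATE KEY-----" "$1"; then
        echo rsa
    elif grep -q -- "-----BEGIN PRIVATE KEY-----" "$1"; then
        echo pkcs8
    else
        echo unknown
    fi
}

# Convert to the traditional RSA format only when needed,
# writing the result to azuretest_login.key.
ensure_rsa_key() {
    if [ "$(key_format "$1")" = pkcs8 ]; then
        openssl rsa -in "$1" -out azuretest_login.key
    else
        cp "$1" azuretest_login.key
    fi
}
```

Then `ensure_rsa_key azuretest.key` gives you a key you can pass to `ssh -i` regardless of which openssl version generated it.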
2) Create a blueprint - you can grab sample blueprints here (you may have to reformat the blueprint in case there is any issue).

Once all this is done, you are all set to deploy the cluster: select the credential and hit "create cluster".

Create cluster window

Handy commands to log in to Docker. Log in to your host:

ssh -i azuretest.key ubuntu@fqdn

" New announcement: Just found out that the user needs to be cloudbreak instead of ubuntu "

ssh -i azuretest.key cloudbreak@fqdn

Once you are in the shell:

sudo su -
docker ps
docker exec -it <container id> bash

[root@azuretest ~]# docker ps
CONTAINER ID   IMAGE                                               COMMAND                CREATED       STATUS       PORTS   NAMES
f493922cd629   sequenceiq/docker-consul-watch-plugn:1.7.0-consul   "/start.sh"            2 hours ago   Up 2 hours           consul-watch
100e7c0b6d3d   sequenceiq/ambari:2.0.0-consul                      "/start-agent"         2 hours ago   Up 2 hours           ambari-agent
d05b85859031   sequenceiq/consul:v0.4.1.ptr                        "/bin/start -adverti   2 hours ago   Up 2 hours           consul

[root@test ~]# docker exec -it 100e7c0b6d3d bash
bash-4.1#   <-- you can now run commands inside the container

Happy Hadooping!!!!

Note: For the latest information and changes, please see https://github.com/sequenceiq/cloudbreak

Hadoop
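Picking the right container ID out of `docker ps` output by hand gets old quickly. A minimal sketch of a helper that greps it out by image name (the `container_id_for` / `parse_container_id` names are my own, not Docker commands, and the parsing assumes the column layout shown above):

```shell
# Extract the ID of the first container whose IMAGE column matches a
# pattern, from `docker ps`-style output on stdin (header on line 1,
# ID in column 1, image in column 2).
parse_container_id() {
    awk -v pat="$1" 'NR > 1 && $2 ~ pat { print $1; exit }'
}

# Live wrapper around docker ps.
container_id_for() {
    docker ps | parse_container_id "$1"
}

# Usage: jump straight into the Ambari agent container.
# docker exec -it "$(container_id_for ambari)" bash
```

On the output shown in this post, `container_id_for ambari` would return 100e7c0b6d3d.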
Cloud Computing
Big Data
05-28-2018
02:44 PM
Hi Neeraj, We are trying to test the GCS connector with HDP 2.6.5 (https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-gcp.html) with GCS as the storage. When trying to create a Hive external table, it fails with the following error:

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hdpuser1, path="gs://hive_metastore/":hive:hive:drwx------) (state=08S01,code=1)

Syntax: CREATE EXTERNAL TABLE test1256(name string, id int) LOCATION 'gs://hive_metastore/';
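The error above is a plain Unix-style permission check: the path is reported as owned by hive:hive with mode drwx------, so hdpuser1 (neither the owner nor in the hive group) gets no access bits at all. A minimal sketch of that check in shell, just to illustrate the logic behind the message; the `can_access` helper is hypothetical, not part of Hive or the GCS connector:

```shell
# Decide whether a user may access a directory, given its owner, group,
# mode string (e.g. drwx------), the user's name and the user's groups.
can_access() {
    owner=$1; group=$2; mode=$3; user=$4; user_groups=$5
    if [ "$user" = "$owner" ]; then
        bits=$(printf '%s' "$mode" | cut -c2-4)      # owner bits
    elif printf '%s\n' $user_groups | grep -qx "$group"; then
        bits=$(printf '%s' "$mode" | cut -c5-7)      # group bits
    else
        bits=$(printf '%s' "$mode" | cut -c8-10)     # other bits
    fi
    case $bits in *x*) echo yes ;; *) echo no ;; esac
}
```

Run against the values in the error, `can_access hive hive drwx------ hdpuser1 "hdpuser1"` says no, which matches the AccessControlException; hdpuser1 would need to own the path, be in the hive group with group bits set, or have the mode relaxed.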
01-30-2016
07:28 PM
@Shaofeng Shi Thanks for sharing all the comments. I wonder if it's possible to post them as an article... Please
12-24-2015
03:41 PM
4 Kudos
Part 2
Linkedin Post
Extending Blog 2 to look for Star Wars tweets
Searching for Yoda, Love & Hate
Let's see tweets/data with the word YODA in the Star Wars stream
Keyword: LOVE in STARWARS
Source Giphy
Word Hate
Happy Hadooping!!!
04-13-2016
01:09 AM
Hi Neeraj, Trying to follow this demo. My dashboard is empty. Also in the PutSolrContentStream processor, there are zero records that are written to output although 64 records have been input. How do I debug to see what is stopping from writing the records into Solr?
03-07-2017
06:06 AM
Hi, I'm getting an authentication error (refer to the attachments nifi-puthdfs-error.png and nifi-config.png). Can you tell me what could be the reason? Regards, Subramaniyam.KMV
12-23-2015
11:07 AM
5 Kudos
Original post

A web-based notebook that enables interactive data analytics. You can make beautiful data-driven, interactive and collaborative documents with SQL, Scala and more. In a few words: "It's a really cool tool to interact with data." HDFS, Hive, Spark, Kylin, Flink.

This is from the latest HDP Sandbox. Continue to Blog 3 on NiFi.

Let's analyze the Star Wars data.

Hive Demo
Table definition and top 10 users based on tweet count
Top 10 users who used the word "love" in #starwars
Word "hate" used in #starwars
Word "yoda" used in #starwars
You can see the tweet sent by my id in the Zeppelin output.

Spark
I used this for the sentiment analysis. Replace %hive with %sql (assuming that you have set up Zeppelin correctly).

Links
Zeppelin
Hortonworks and Zeppelin

Happy Hadooping!!!
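The "top users who tweeted a word" queries run in Hive above can be sketched on flat text with standard shell tools, which is handy for a quick sanity check outside the notebook. A minimal sketch; the user<TAB>tweet_text input layout and the `top_users_for_word` name are assumptions of mine, not the actual table format:

```shell
# Count tweets per user containing a keyword (case-insensitive),
# highest count first. Input: TSV lines of user<TAB>tweet_text.
top_users_for_word() {
    word=$1
    awk -F'\t' -v w="$word" 'tolower($2) ~ tolower(w) { n[$1]++ }
        END { for (u in n) print n[u], u }' |
    sort -rn
}
```

Piping the tweet dump through `top_users_for_word love | head -10` approximates the "top 10 users who used the word love" query.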
03-16-2016
01:35 AM
Hi Neeraj, Is there a recent HDP-Teradata connector for HDP 2.3.4? https://community.hortonworks.com/questions/22957/sqoop-export-with-hdp-teradata-connector-error.html
02-23-2016
05:26 PM
@Neeraj Sabharwal I just created an updated integration guide using the latest HDP version 2.3.4/Ambari 2.2 and Centrify Server Suite 2016. (all worked great) We will be publishing my updates publicly in the next week or two but I have extensive notes on many of the configurations and common problems @rgarcia detailed here. If anyone needs assistance or has any questions regarding Centrify components, I will now be here to help.