Member since: 03-21-2017
Posts: 197
Kudos Received: 6
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2660 | 07-02-2018 11:25 AM
 | 1196 | 05-29-2018 07:20 AM
 | 2746 | 05-09-2018 10:18 AM
11-25-2019
03:14 AM
I have created a Cloudbreak VM on Azure. The default Cloudbreak version is 2.7.1, and I have upgraded it to 2.9.1. When I try to deploy a cluster it gives the error below:
Infrastructure creation failed. Reason: Failed to retrieve the server's certificate
How can I burn a new image for Cloudbreak 2.9.1?
Tags: Cloudbreak, error, VM
Labels: Hortonworks Cloudbreak
04-26-2019
11:26 AM
Hi, There are many ETL tools on the market. I did some research on the Pentaho Data Integration tool, in which Kettle is the ETL engine. How can we compare Kettle with Hadoop? Does Kettle (or any ETL tool) replace Hadoop? When do we need an ETL tool on top of Hadoop? Please help me clear up my doubts. Thanks, Heta
Labels: Apache Hadoop
03-26-2019
07:45 AM
Hi, In LDAP we create users and sync them with the cluster, Ambari and Ranger, so the same user can be used everywhere. I have configured my cluster for Kerberos. How can I achieve centralized user authentication in a Kerberos-enabled cluster, the way I described for LDAP above? I do not have an LDAP server. I want to do both authentication and authorization using Kerberos only. Thanks, Heta
Labels: Apache Ambari, Apache Ranger, Kerberos
03-25-2019
09:47 AM
@Jonathan Sneep After downloading JCE 8 I am not able to unzip it. It shows the error below:
Archive: jce_policy-8.zip
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of jce_policy-8.zip or
jce_policy-8.zip.zip, and cannot find jce_policy-8.zip.ZIP, period.
I am using this doc.
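A quick sketch of how I would check this (assuming the file name jce_policy-8.zip from the error above): Oracle's JCE download requires accepting the license first, and downloading the link without doing so usually saves an HTML page instead of the real archive, which produces exactly this "End-of-central-directory signature not found" error.
# Check what was actually downloaded; a failed download is often HTML or only a few KB
file jce_policy-8.zip
ls -lh jce_policy-8.zip
# A valid archive should list local_policy.jar and US_export_policy.jar
unzip -l jce_policy-8.zip
If file reports HTML or ASCII text, re-download the zip from Oracle in a browser after accepting the license agreement and copy it to the node, then unzip again.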
03-25-2019
07:30 AM
Hi, I want to provide user authentication and resource/service authorization for my cluster. I do not have an LDAP server, and I want to use Kerberos for my cluster security. How can I configure and start using Kerberos without LDAP? I have followed this link, but I don't know what the next step is. How can I integrate Kerberos with Ambari? For now I am trying this on the HDP 2.6.5 sandbox. Thanks, Heta
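Not an official procedure, just a rough sketch of the usual approach (the realm name EXAMPLE.COM and running everything on the single sandbox node are assumptions): install an MIT KDC, create the Kerberos database and an admin principal, then run Ambari's Enable Kerberos wizard and point it at that existing MIT KDC so Ambari creates the service and user principals for you.
# On the KDC host (CentOS/RHEL), as root
yum install -y krb5-server krb5-libs krb5-workstation
# Set the realm (e.g. EXAMPLE.COM) in /etc/krb5.conf and /var/kerberos/krb5kdc/kdc.conf first
kdb5_util create -s                          # create the Kerberos database
kadmin.local -q "addprinc admin/admin"       # admin principal for Ambari to use
systemctl start krb5kdc kadmin
systemctl enable krb5kdc kadmin
# Then in Ambari: Admin > Kerberos > Enable Kerberos > "Existing MIT KDC"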
Labels: Hortonworks Data Platform (HDP)
03-16-2019
11:49 AM
Hi, I am using the Cloudbreak data platform on Azure. My company has its own authorization tool for user authorization. Is it possible to use this tool for user security management on a cluster created with Cloudbreak? The same question applies to HDInsight on Azure. Thanks, Heta
Labels: Hortonworks Cloudbreak
03-14-2019
02:07 PM
I think Docker is running, but there is no directory named start_scripts under the /root directory.
03-14-2019
12:38 PM
@Jay Kumar SenSharma I am doing ssh to root@localhost on port 2122 using PuTTY, but I am still not able to do it.
03-14-2019
12:01 PM
@Jay Kumar SenSharma When I try to ssh on port 2122 it shows connection refused.
03-14-2019
11:21 AM
@Jay Kumar SenSharma I have the same issue and am not able to solve it. I have raised a question about it. Please help me.
03-14-2019
11:10 AM
Hi, I am using the HDP 2.6.5 sandbox in VirtualBox. I have configured Kylin 2.6.0. Kylin started successfully, but I am not able to open the Kylin Web UI, so I need to add port 7070 to the sandbox. For that I am following this article, which is for HDP 2.5. As described in the article, to add a port I have to edit the start_sandbox.sh file under the /root/start_scripts directory, but there is no directory named start_scripts under /root. I am facing the same issue on HDP 3.0. So, how can I add a new port in HDP 2.6.5/3.0? Thanks, Heta
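Since the start_scripts approach from the HDP 2.5 article no longer exists in newer sandboxes, one workaround (a sketch, not the official sandbox method; the port numbers are assumptions based on the default sandbox setup) is to tunnel the port over the SSH connection the sandbox already exposes instead of changing the container's port mappings:
# From the host machine: forward local port 7070 into the sandbox container
# (the sandbox container's SSH is normally reachable on port 2222)
ssh -p 2222 root@localhost -L 7070:localhost:7070 -N
# While this runs, http://localhost:7070/kylin on the host reaches Kylin inside the sandbox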
Labels: Hortonworks Data Platform (HDP)
03-14-2019
08:14 AM
@Jay Kumar SenSharma I can successfully log in to Ambari in Incognito mode in Google Chrome. I am running one HDP sandbox instance at a time, but I am using different versions, so I think the issue I was facing was due to caching. Thank you so much.
03-14-2019
07:54 AM
@Jay Kumar SenSharma In the browser, when I try to log in to Ambari with my login credentials it shows "server error".
03-14-2019
07:32 AM
Hi, I have 3 versions of HDP in Oracle VirtualBox: 2.5, 2.6.5 and 3.0. When I start the VM for HDP 3.0 the Ambari UI works fine, but if I then start the VM for a lower HDP version (2.5 or 2.6.5) I cannot open the Ambari UI; it shows a server error. Based on the requirement I need to use different versions of HDP. How can I resolve this issue? Thanks, Heta
03-05-2019
02:05 PM
1 Kudo
# TO CHANGE THE KYLIN UI PORT
1. cd /usr/local/apache-kylin-2.6.0-bin/tomcat/conf
2. Open server.xml and change the port attribute of the Connector element. As shown below, port 7070 is assigned; you can give another port instead of 7070.
<Connector port="7070" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="7443"
           compression="on"
           compressionMinSize="2048"
           noCompressionUserAgents="gozilla,traviata"
           compressableMimeType="text/html,text/xml,text/javascript,application/javascript,application/json,text/css,text/plain"
/>
Thanks
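One small addition (assuming the same install directory as above): Kylin's embedded Tomcat only reads server.xml at startup, so restart Kylin after changing the port.
/usr/local/apache-kylin-2.6.0-bin/bin/kylin.sh stop
/usr/local/apache-kylin-2.6.0-bin/bin/kylin.sh start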
02-26-2019
12:14 PM
@Gorka Zárate When I did the same configuration on Cloudbreak for HDP on Azure, the Kylin UI is working.
02-21-2019
11:41 AM
Hi, I am using the Hortonworks sandbox HDP 2.6.5. I have installed Apache Kylin 2.6.0-bin on the sandbox. Kylin has started successfully, but I am not able to open the Kylin web interface. Port 7070 is registered, and when I check with the command below it shows the port is open:
netstat -tupln | grep 7070
tcp 0 0 0.0.0.0:7070 0.0.0.0:* LISTEN 4092/java
When I enter http://localhost:7070/kylin in Chrome it says the page can't be found. How can I resolve this? Thanks
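Since netstat shows 7070 listening inside the sandbox, the browser on the host probably can't reach it because VirtualBox is not forwarding that port. A sketch of adding a NAT port-forwarding rule (the VM name below is an assumption; use whatever name VBoxManage list vms shows for your sandbox):
# List VMs to get the exact name
VBoxManage list vms
# Add a NAT rule while the VM is running: host 7070 -> guest 7070
VBoxManage controlvm "Hortonworks Docker Sandbox HDP" natpf1 "kylin,tcp,,7070,,7070"
# If the VM is stopped, use: VBoxManage modifyvm "<vm name>" --natpf1 "kylin,tcp,,7070,,7070"
If Kylin runs inside the sandbox's Docker container rather than on the VM itself, an SSH local port forward through port 2222 also works and avoids touching the container's port mappings.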
Labels: Hortonworks Data Platform (HDP)
02-15-2019
07:14 AM
Hi,
I have created a cluster on Cloudbreak for Hortonworks Data Platform on Azure. Initially I created the cluster with 1 master node and 1 worker node, and Hive was running perfectly from the Hive view as well as the Hive shell.
After I added 2 additional worker nodes to the cluster, the Hive view in Ambari no longer opens and throws the error below:
Cannot open a hive connection with connect string jdbc:hive2://xx.xx.xx.xx.xx:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2;hive.server2.proxy.user=admin
When I try to insert data into a Hive table from the Hive shell (CLI), the job gets stuck:
Query ID = cloudbreak_20190215071908_8e0a113e-2bfb-488a-95cd-8bacdb45abbd
Total jobs = 1
Launching Job 1 out of 1
Status: Running (Executing on YARN cluster with App id application_1550214881831_0001)
How do I resolve this error? Thank you.
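A couple of checks I would start with (a sketch; the ZooKeeper host, namespace and user are taken from the error message above and may need adjusting): verify that HiveServer2 re-registered in ZooKeeper after the scaling operation by connecting with beeline using the same discovery string, and confirm that the YARN application the stuck query launched is actually getting containers.
# Test HiveServer2 discovery through ZooKeeper with the same connect string the view uses
beeline -u "jdbc:hive2://<zk-host>:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2" -n admin
# Check whether the query's YARN application is RUNNING or stuck in ACCEPTED (waiting for resources)
yarn application -list -appStates RUNNING,ACCEPTED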
Labels: Apache Ambari, Apache Hive
02-12-2019
11:50 AM
Hi, I have launched Cloudbreak for Hortonworks Data Platform on Azure and created a cluster on top of Cloudbreak. I want to connect the Hive database with SSRS, so I am trying to connect using the Hortonworks Hive ODBC driver, but I am not able to connect. While entering the configuration parameters I am confused about which values to enter for host name, port, authentication mechanism, username/password and transport mode. Can anyone help me configure the ODBC driver for Hive? Thank you
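A rough sketch of the values that usually go into a Hive ODBC DSN (exact values depend on your cluster; the port and transport below are assumptions based on a default, non-Kerberos HiveServer2):
Host: public IP or FQDN of the node running HiveServer2 (Hive > Summary in Ambari)
Port: 10000 for binary transport, 10001 if hive.server2.transport.mode is http
Database: default
Hive Server Type: Hive Server 2
Mechanism / Authentication: User Name and Password (a cluster user such as the Ambari admin)
Thrift Transport: SASL for binary transport, or HTTP with HTTP Path = cliservice when transport mode is http
The current transport mode and port are visible in Ambari under Hive > Configs.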
Labels: Apache Hive, Hortonworks Cloudbreak
02-12-2019
07:39 AM
When I use WASB as storage, do I only need a master node and compute nodes while creating the cluster? Is there no need for worker nodes, since I am using WASB instead of HDFS?
02-12-2019
07:37 AM
@lnardai It worked for me. What does "wrap it as a service" mean?
02-12-2019
07:17 AM
@Geoffrey Shelton Okot When I need to accept a parameter from a third-party application/portal and fetch data from Hive based on that parameter, how can I do that?
02-12-2019
07:15 AM
@Geoffrey Shelton Okot I am using SSRS. I am trying to use the Hortonworks Hive ODBC driver to connect Hive with SSRS, but I am facing some issues.
02-11-2019
10:44 AM
Hi, I have a dashboard for reporting. I want to create a customized report from the data stored in Hive: the user selects some parameters, the data is fetched from Hive based on those parameters, and the report is generated by the reporting tool. Is this possible or not? If yes, how can I achieve it? Thank you.
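Yes, this is a common pattern: the reporting tool (or a small script it calls) passes the user's selection into a parameterized Hive query. A minimal sketch using beeline variable substitution (the host, table and column names are made up for illustration):
# The dashboard passes the selected value, e.g. a region, into the query
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/default" -n admin \
  --hivevar region=EMEA \
  -e 'SELECT * FROM sales WHERE region = "${hivevar:region}"'
SSRS can achieve the same thing by pushing the report parameter into the WHERE clause of the dataset query it sends over the ODBC connection.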
Labels: Apache Hive
02-11-2019
09:41 AM
Hi, I have launched Cloudbreak for Hortonworks Data Platform on Azure. The URL was working properly before the VM restarted; after restarting the VM the URL no longer works. Is there any other way to open the Cloudbreak UI? Thank you
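One thing worth trying (a sketch; the directory below is an assumption, the default location used by the Azure quickstart): SSH to the Cloudbreak controller VM and check/start the Cloudbreak deployer containers, in case they did not come back up after the VM restart.
ssh <user>@<controller-vm-public-ip>
cd /var/lib/cloudbreak-deployment     # assumed default deployer directory
sudo cbd doctor                       # report the state of the deployment
sudo cbd start                        # (re)start the Cloudbreak containers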
Labels: Hortonworks Cloudbreak
02-11-2019
09:37 AM
So there is no replication mechanism if I use WASB as the default storage?
02-11-2019
07:19 AM
@Dominika Bialek So I can use WASB as the default storage, right? And the compute nodes can read data from WASB for processing?
02-08-2019
06:26 AM
Hi, I am trying to create a cluster using Cloudbreak. I am facing the error below:
Infrastructure creation failed. Reason: Error in provisioning stack cdrcluster3: com.fasterxml.jackson.core.JsonParseException: Illegal unquoted character ((CTRL-CHAR, code 10)): has to be escaped using backslash to be included in string value
at [Source: java.io.StringReader@3e479b2e; line: 47, column: 63]
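CTRL-CHAR code 10 is a raw newline inside a JSON string value, which Jackson rejects; it has to be written as \n or the value kept on one line. This usually means a line break was pasted into a blueprint, recipe or template field. A small sketch of spotting it (the file name blueprint.json is just an example):
# Invalid: a literal line break inside the string value
#   "script": "echo hello
#   echo world"
# Valid: escape the break as \n
#   "script": "echo hello\necho world"
# Validate the file before uploading it to Cloudbreak; json.tool reports the line/column of the problem
python -m json.tool blueprint.json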
Labels: Hortonworks Cloudbreak
02-08-2019
05:56 AM
Hi, I have the following questions regarding Cloudbreak HDP on Azure:
1. Does Azure Blob Storage work as the default HDFS for the cluster?
2. What is the difference between a worker node and a compute node?
3. Suppose I am pulling data from an SFTP source and want to store it in HDFS; where will the data be stored?
4. When I process this data, on which node does it get processed, the worker node or the compute node?
Please help me clear up these doubts. Thank you
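On the first point, a sketch of what "WASB as default storage" looks like at the Hadoop level (the container, account and key values are placeholders): the cluster's fs.defaultFS points at the Blob Storage container instead of hdfs://, and the storage key is supplied, so jobs running on the worker/compute nodes read and write WASB paths directly while durability and replication are handled by Azure storage rather than HDFS.
# core-site.xml properties (e.g. via Ambari > HDFS > Configs > Custom core-site)
fs.defaultFS=wasb://<container>@<storage-account>.blob.core.windows.net
fs.azure.account.key.<storage-account>.blob.core.windows.net=<storage-account-key>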