Member since: 04-24-2019
- 20 Posts
- 1 Kudos Received
- 1 Solution
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 4085 | 10-15-2018 06:26 AM |
08-04-2019
11:40 PM
Hi,
Is there any way to set up a Cloudera cluster using Ansible?
I've tried, but everything installed except the Cloudera Management Service.
Please let me know; I really need a solution.
Thanks,
Labels:
- Cloudera Manager
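For what it's worth, a minimal sketch of finishing the Cloudera Management Service setup over the Cloudera Manager REST API after Ansible has installed everything else. The hostname, credentials, and API version (`v33`) below are placeholders, not from the original post, and the `echo`s keep this a dry run:

```shell
#!/bin/sh
# Dry-run sketch: define and start the Cloudera Management Service
# through the Cloudera Manager REST API. Host and credentials are placeholders.
CM_HOST="cm.example.com"
API="http://${CM_HOST}:7180/api/v33"

# 1. Create the management service itself (type MGMT)
echo curl -u admin:admin -X PUT "${API}/cm/service" \
  -H "Content-Type: application/json" -d '{"type":"MGMT"}'

# 2. Start it once its roles (Service Monitor, Host Monitor, etc.) are assigned
echo curl -u admin:admin -X POST "${API}/cm/service/commands/start"
```

Dropping the `echo`s would issue the real requests; check the API version your CM build serves at `/api/version` first.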
05-28-2019
09:50 AM
Thanks, Geoffrey. I'm able to send data from one cluster to the other, but not in the reverse direction (it shows the same error as mentioned above). Also, whenever I add capaths to the /etc/krb5.conf file, it starts showing an error. Please help me figure that out too. Thanks again.
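In case it helps anyone comparing notes: for two-way cross-realm trust, both KDCs need matching `krbtgt/REALM2@REALM1` and `krbtgt/REALM1@REALM2` principals (same password and kvno on both sides), and then a `[capaths]` section in krb5.conf on every node of both clusters. The realm names below are placeholders, not from the original post:

```
[capaths]
CLUSTER1.COM = {
    CLUSTER2.COM = .
}
CLUSTER2.COM = {
    CLUSTER1.COM = .
}
```

The `.` means the path between the two realms is direct, with no intermediate realm. If adding this section triggers an error, check for a stray character or unbalanced brace and confirm the realm spelling matches the `[realms]` section exactly.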
05-27-2019
08:50 PM
Hi, I have two clusters running HDP 3.1; both consist of a single node and are Kerberized. I am trying to send a single file using distcp, but it fails with an error (pictures attached). I am even unable to list the HDFS directories of the other cluster from either cluster. Thanks.
Labels:
- Apache Hadoop
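A sketch of the copy I would try between two Kerberized clusters once cross-realm trust is in place. The NameNode hosts and paths are placeholders, `dfs.namenode.kerberos.principal.pattern=*` is a property commonly needed when the clusters live in different realms, and the `echo`s keep this a dry run:

```shell
#!/bin/sh
# Placeholder NameNode addresses for the two HDP 3.1 single-node clusters
SRC="hdfs://nn1.example.com:8020/tmp/test.txt"
DST="hdfs://nn2.example.com:8020/tmp/"

# First confirm that a ticket from this cluster can list the remote HDFS at all
echo hdfs dfs -ls "${DST}"

# Then the copy; relax the principal pattern check for cross-realm access
echo hadoop distcp -Ddfs.namenode.kerberos.principal.pattern='*' "${SRC}" "${DST}"
```

If the `hdfs dfs -ls` step already fails, the problem is Kerberos trust, not distcp itself.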
05-19-2019
11:07 PM
Thanks for the reply. Actually, I can send data from Cloudera to Hortonworks but not in the reverse direction. Other than that, I can list HDFS directories (of the Hortonworks cluster) from Cloudera and vice versa, but copying data from Hortonworks to Cloudera fails. I'm still looking for a solution.
05-17-2019
07:26 AM
I've applied all the steps you've written above, but unfortunately the result is the same. I don't know why my ticket is not being authenticated.
05-17-2019
07:24 AM
Operating system: SUSE Linux Enterprise Server (SLES) 12 SP3, Cloudera CDH 6.2. Details of the files are mentioned below:

1: krb5.conf

```
[libdefaults]
default_realm = ABCDATA.ORG
dns_lookup_kdc = false
dns_lookup_realm = false
ticket_lifetime = 86400
renew_lifetime = 604800
forwardable = true
default_tgs_enctypes = aes256-cts-hmac-sha1-96
default_tkt_enctypes = aes256-cts-hmac-sha1-96
permitted_enctypes = aes256-cts-hmac-sha1-96
udp_preference_limit = 1
kdc_timeout = 3000

[realms]
ABCDATA.ORG = {
    kdc = cloudera.abcdata.org
    admin_server = cloudera.abcdata.org
}
```

2: kdc.conf

```
[kdcdefaults]
kdc_ports = 88
kdc_tcp_ports = 88

[realms]
ABCDATA.ORG = {
    database_name = /var/lib/kerberos/krb5kdc/principal
    admin_keytab = FILE:/var/lib/kerberos/krb5kdc/kadm5.keytab
    acl_file = /var/lib/kerberos/krb5kdc/kadm5.acl
    dict_file = /var/lib/kerberos/krb5kdc/kadm5.dict
    key_stash_file = /var/lib/kerberos/krb5kdc/.k5.EXAMPLE.COM
    kdc_ports = 88
    max_life = 1d
    max_renewable_life = 7d
}

[logging]
kdc = FILE:/var/log/krb5/krb5kdc.log
admin_server = FILE:/var/log/krb5/kadmind.log
```

3: kadm5.acl

```
###############################################################################
#Kerberos_principal permissions [target_principal] [restrictions]
###############################################################################
# */admin@ABCDATA.ORG *
```
05-16-2019
09:03 AM
Hi developers, I have enabled Kerberos on my cluster. Although the cluster is green and running, I'm not able to access HDFS even after generating a ticket; the error is the same as when no ticket has been generated. Probably the authentication server is not authenticating my ticket. Here is the error:

"19/05/16 08:53:07 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS] ls: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "FQDN/X.X.X.X"; destination host is: "FQDN":PORT;"

Thanks.
Labels:
- Apache Hadoop
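A debugging sequence worth sketching for "Client cannot authenticate via:[TOKEN, KERBEROS]". The principal is a placeholder, `klist -e` shows the ticket's encryption types (useful since the krb5.conf above pins everything to aes256-cts-hmac-sha1-96), and `-Dsun.security.krb5.debug=true` makes the JVM print the Kerberos handshake. The `echo`s keep it a dry run:

```shell
#!/bin/sh
# Placeholder principal; use whatever principal owns your HDFS keytab
PRINC="hdfs@ABCDATA.ORG"

# 1. Get a fresh ticket, then confirm its enctype matches krb5.conf
echo kinit "${PRINC}"
echo klist -e

# 2. Re-run the failing command with Kerberos tracing turned on
echo 'HADOOP_OPTS="-Dsun.security.krb5.debug=true" hdfs dfs -ls /'
```

If `klist -e` shows a valid aes256 ticket but the `ls` still fails, the trace usually points at the mismatch (wrong service principal, clock skew, or an enctype the JVM cannot use).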
04-26-2019
05:18 AM
@Geoffrey Shelton Okot Thanks for the reply. Actually, I am trying to send data from the Hortonworks cluster to the Cloudera cluster, and the dfs.exclude file is empty. The distcp command you've written has an issue: we can't attach the 50070 port to an hdfs:// prefix; to do that, write webhdfs:// instead of hdfs://. My distcp command is:

hadoop --config (path of directory containing the hdfs-site.xml or core-site.xml files of the target cluster) distcp hdfs://nn1/path hdfs://nn2/path

The aforementioned command can send data from Cloudera to Hortonworks, but I want to do it in the reverse direction.
04-25-2019
03:30 AM
I am using the distcp command to send data from one cluster to another remote cluster, but it fails with the error "File could only be written to 0 of the 1 minReplication (=1) nodes." I'm stuck on this; please suggest a solution. Thanks in advance.
Labels:
- Apache Hadoop
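For the "written to 0 of the 1 minReplication (=1) nodes" error, the usual suspects are a dead or excluded DataNode on the target cluster, or the remote DataNode advertising an address the writing cluster cannot reach. A dry-run check sketch with placeholder hosts (the exclude-file path is the common default; verify `dfs.hosts.exclude` in your own hdfs-site.xml):

```shell
#!/bin/sh
# 1. Check DataNode health and capacity on the target cluster
echo hdfs dfsadmin -report

# 2. List any hosts the NameNode is excluding from writes
echo cat /etc/hadoop/conf/dfs.exclude

# 3. Retry with clients resolving DataNodes by hostname rather than
#    a possibly unroutable internal IP; target cluster via webhdfs
echo hadoop distcp -Ddfs.client.use.datanode.hostname=true \
  hdfs://nn1.example.com:8020/tmp/f webhdfs://nn2.example.com:50070/tmp/
```

"1 node(s) are excluded in this operation" with only one DataNode means the write has nowhere at all to land, so step 1 and 2 come first.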
04-25-2019
03:01 AM
Please guide me. I'm trying to send a simple file from Hortonworks to Cloudera using the distcp command but getting the error "could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation." I am looking forward to hearing from you. Thanks.
04-24-2019
11:20 PM
I am trying to send a simple small file from the Hortonworks cluster to the Cloudera cluster using the distcp command, but it fails with the error "could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation." I've made several attempts to get rid of this but haven't succeeded yet. Please suggest a solution. Thanks in advance.
Labels:
- Apache YARN
- HDFS
- MapReduce
11-13-2018
07:27 AM
Hi developers, I have two clusters, c1 and c2. NiFi is installed on c1 and MiNiFi on c2. I want to ingest Snort log data (the alerts.csv file) from MiNiFi to NiFi. I have given all permissions on the path leading to this file, but I'm still getting no data, and the error shown in the NiFi GUI is "unable to open /var/log/snort/alerts.csv; will attempt to access file again after the configured Yield Duration has elapsed: {}". The same problem occurs while ingesting Squid logs. Thanks.
Labels:
- Apache MiNiFi
- Apache NiFi
11-07-2018
09:13 PM
Hi developers, I have a file named alerts.csv that grows automatically with network log data. I have to watch for new log events on it and transmit each new entry to another cluster, which is why I'm using the NiFi TailFile processor. But it doesn't seem to work with CSV and produces the error mentioned below:

Attempted to position reader at current position in file /var/log/snort/alerts.csv but failed to do so due to java.nio.file.NoSuchFileException: /var/log/snort/alerts.csv

Thanks.
Labels:
- Apache NiFi
10-15-2018
06:26 AM
1 Kudo
Hi @Hamza, if you are not using a security protocol like Kerberos/TLS, then use the Kafka processor for versions below 0.9, which is GetKafka. Those who have Kerberos security installed must use the processors for versions above 0.9, like ConsumeKafka_0_10 and above. It's mandatory.
10-12-2018
04:10 PM
I'm stuck on this problem and nothing has worked for me. I want to establish a communication channel between NiFi and Kafka.
1. Four processors in NiFi: (GenerateFlowFile -> PublishKafka) and (ConsumeKafka -> LogAttribute).
2. PublishKafka and ConsumeKafka are not communicating with Apache Kafka (pictures are attached along with their errors).
3. Apache Kafka security = PLAINTEXT.
4. Kerberos is not installed.
Please suggest a solution. Thanks in advance.
Labels:
- Apache Ambari
- Apache Kafka
- Apache NiFi
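When the broker is PLAINTEXT with no Kerberos, it helps to prove the broker itself works before suspecting the NiFi processors. A dry-run sketch using Kafka's stock console tools, with a placeholder broker address; the `echo`s keep it from actually connecting:

```shell
#!/bin/sh
# Placeholder broker and topic; Kafka's console tools round-trip a message
BROKER="kafka1.example.com:9092"
TOPIC="nifi-test"

# Produce a message interactively into the topic
echo "kafka-console-producer.sh --broker-list ${BROKER} --topic ${TOPIC}"

# Read it back from the beginning of the topic
echo "kafka-console-consumer.sh --bootstrap-server ${BROKER} --topic ${TOPIC} --from-beginning"
```

If the round trip works from the CLI but the NiFi processors still fail, the mismatch is usually the processor version versus the broker version (e.g. PublishKafka_0_10 against a newer broker), or the broker address in the processor not matching what the broker advertises.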