Member since
05-25-2018
77
Posts
2
Kudos Received
0
Solutions
02-19-2018
05:15 PM
Thank you very much, Harald, for addressing my questions. Regards, JJ
02-18-2018
09:47 AM
Hi Guru, can you please clarify a few Kafka architecture questions? Please answer here rather than pointing to links (I already read those and could not understand them). I just want to understand where the Kafka partition structure is created, FIRST:
1) Is it created i) in memory, or ii) on disk in the log.dirs location?
2) Do consumers read the partitions from memory or from disk?
3) Some search results say "Kafka purges the messages as per the retention policies, regardless of whether the messages have been consumed". Does this mean that consumers read the topics from disk only and not from memory?
4) What is the relation among batch.size vs. log.flush.interval.messages vs. log.segment.bytes?
4a) The article https://community.hortonworks.com/articles/80813/kafka-best-practices-1.html says Kafka writes data to files as soon as log.flush.interval.messages messages have been received. Where is this file created: in memory, or on disk, and in which location?
4b) When the log file reaches log.segment.bytes, it is flushed to disk. Where is this log file first created: in memory, or in some other temporary location?
Thanks, JJ
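For anyone landing on this thread: the three settings in question 4 are set in different places. A sketch of where each one lives, with illustrative (not recommended) values:

```properties
# Broker side (server.properties)
log.dirs=/kafka-logs                 # partition segment files are created here, on disk
log.segment.bytes=1073741824         # roll to a new segment file at ~1 GiB
log.flush.interval.messages=10000    # force an fsync after this many messages
                                     # (by default Kafka leaves flushing to the OS page cache)

# Producer side (producer config, not server.properties)
batch.size=16384                     # max bytes batched per partition before a send
```

Note that log.dirs and the log.* settings are broker configuration, while batch.size is a producer configuration; they are not tuned in the same file.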
Labels:
- Apache Kafka
02-17-2018
07:39 PM
We have a Kafka and Spark Streaming setup. This is a continuous job, running 24x7. How can we configure the Kerberos token to never expire for continuous, never-ending streaming jobs? What configuration parameters need to be set in the KDC and in the Spark job definition to keep the token from expiring?
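One common approach for long-running Spark jobs on YARN, rather than making tickets never expire in the KDC, is to hand the job a keytab so Spark can periodically re-authenticate itself and refresh its delegation tokens. A sketch, where the principal name, keytab path, class, and jar are all hypothetical placeholders:

```shell
# Hypothetical principal/keytab; adjust to your realm and paths.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal streaming-user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/streaming-user.keytab \
  --class com.example.StreamingJob \
  streaming-job.jar
```

With --principal and --keytab supplied, Spark logs in from the keytab on its own schedule, so the job does not die when the original ticket's renewable lifetime runs out.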
Labels:
- Apache Kafka
- Apache Spark
- Kerberos
10-31-2017
03:09 PM
Hi Raju, I tried both of your options and it still did not work. I have this problem only with the Ranger ldaptool; the Unix-level ldapsearch utility works fine. Regards, JJ
10-31-2017
03:55 AM
The Unix ldapsearch works fine, but the Ranger ldaptool is failing.

This ldapsearch works fine:
ldapsearch -h free-ipa-dev-01.uat.txdc.datastax.com -x -b "dc=txdc,dc=datastax,dc=com" -W hadoopadmin

But the Ranger ldaptool fails:
[root@dev-rng-001 ~]# cd /usr/hdp/current/ranger-usersync/ldaptool
[root@dev-rng-001 ldaptool]# ./run.sh -d users
Ldap url [ldap://ldap.example.com:389]: ldaps://free-ipa-dev-01.uat.txdc.datastax.com:636
Bind DN [cn=admin,ou=users,dc=example,dc=com]: hadoopadmin
Bind Password:
User Search Base [ou=users,dc=example,dc=com]: dc=txdc,dc=datastax,dc=com
User Search Filter [cn=user1]: cn=*
Reading ldap properties from input.properties
ERROR: Failed to perfom ldap bind. Please verify values for ranger.usersync.ldap.binddn and ranger.usersync.ldap.ldapbindpassword
javax.naming.CommunicationException: simple bind failed: free-ipa-dev-01.uat.txdc.datastax.com:636 [Root exception is javax.net.ssl.SSLException: Connection has been shutdown: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target]

Can you please help? Regards, JJ
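The "PKIX path building failed" part of that stack trace usually means the JVM truststore used by the ldaptool does not contain the FreeIPA CA certificate (ldapsearch succeeds because it uses the OS trust store, not Java's). A sketch of importing the CA cert with keytool, where the cert path and JDK location are assumptions for your environment:

```shell
# Paths below are assumptions; use your actual CA cert file and the JDK
# that ranger-usersync/ldaptool actually runs with.
JAVA_HOME=/usr/lib/jvm/java   # hypothetical JDK path
keytool -importcert \
  -alias freeipa-ca \
  -file /etc/ipa/ca.crt \
  -keystore "$JAVA_HOME/jre/lib/security/cacerts" \
  -storepass changeit
```

After the CA certificate is trusted, the ldaps:// handshake on port 636 should complete and the bind error should change or disappear.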
Labels:
- Apache Ranger
09-21-2017
05:05 PM
Thanks, Sonu, it helps me a lot. Keep up this spirit!
09-21-2017
05:04 PM
Thank you very much, Geoffrey, for your insights and for supporting the community.
09-20-2017
07:38 PM
In our stack we installed HDFS and YARN, version 2.7.1.2.5. Do we still need to install MapReduce2 (which has two components, History Server and MapReduce2 Clients)? If so, on which nodes do we need to install the MapReduce2 Clients (only on the data nodes, the YARN hosts, or the MapReduce2 host only)? Regards, JJ
Labels:
- Apache Hadoop
- Apache YARN
09-07-2017
05:00 AM
Thanks, Jay, for your perfect solution.
08-20-2017
05:42 AM
Hi, I am using MySQL as the database for Ambari version 2.4.2.0 and would like to change the Ambari database user password. From MySQL we can change the password; that is easy. The question is how to update
1) server.jdbc.rca.user.passwd=${alias=ambari.db.password}
2) server.jdbc.user.passwd=${alias=ambari.db.password}
in the file /etc/ambari-server/conf/ambari.properties. Note, I am not trying to change the Ambari admin password, which is used to log in to the Ambari UI. Please help. Regards, JJ
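Since ${alias=ambari.db.password} points into Ambari's credential store rather than holding the password itself, one approach (a sketch, assuming a standard Ambari 2.4 install) is to re-run the interactive setup after changing the password in MySQL, so Ambari re-stores the new value under the same alias:

```shell
# Sketch: after changing the user's password in MySQL, re-run Ambari setup
# so it re-stores the new password under the ambari.db.password alias.
ambari-server stop
ambari-server setup    # answer "y" to the advanced database configuration
                       # prompt, keep the existing MySQL settings, and enter
                       # the new database password when asked
ambari-server start
```

The two ambari.properties lines themselves should not need editing; only the stored credential behind the alias changes.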
Labels:
- Apache Ambari