Member since: 04-17-2016
Posts: 75
Kudos Received: 9
Solutions: 0
02-23-2017 08:35 PM
Hi Frank, thanks for your reply. I will check as you suggested. I have also attached a screenshot of the Kafka broker for your reference (attachment: kafka-error.png). Thanks again!
02-23-2017 08:30 PM
Hi amankumbare, thanks for your reply. In Ambari the broker status shows as started; there are 2 brokers and both are started. But I am still getting the error, and because of this I am stuck at work.
02-23-2017 08:18 PM
Hi there, this is the command I ran:
bin/kafka-topics.sh --create --zookeeper cmtoldhwdpadm03.dev.bmocm.com:2181 --replication-factor 1 --partitions 1 --topic test21
It fails with the following error:
Error while executing topic command : replication factor: 1 larger than available brokers: 0
ERROR kafka.admin.AdminOperationException: replication factor: 1 larger than available brokers: 0
    at kafka.admin.AdminUtils$.assignReplicasToBrokers(AdminUtils.scala:117)
    at kafka.admin.AdminUtils$.createTopic(AdminUtils.scala:403)
    at kafka.admin.TopicCommand$.createTopic(TopicCommand.scala:110)
    at kafka.admin.TopicCommand$.main(TopicCommand.scala:61)
    at kafka.admin.TopicCommand.main(TopicCommand.scala)
    (kafka.admin.TopicCommand$)
But in Ambari I can see that two brokers have been started. Please help me. Thanks.
Labels:
- Apache Kafka
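For context, "available brokers: 0" means kafka-topics.sh found no broker registrations under the ZooKeeper path it was given, even if the broker processes themselves are running. A minimal diagnostic sketch follows; the hostname is taken from the post, but the /kafka chroot in the last command is an assumption to illustrate the fix, not a confirmed setting on this cluster:

# Open a ZooKeeper shell against the same ensemble the command used:
bin/zookeeper-shell.sh cmtoldhwdpadm03.dev.bmocm.com:2181
# At the prompt, list the registered broker ids:
ls /brokers/ids
# Two live brokers should show something like [1001, 1002]. If the
# list is empty, check zookeeper.connect in each broker's
# server.properties; if the brokers register under a chroot such as
# /kafka (an assumed value here), pass the same chroot to kafka-topics.sh:
bin/kafka-topics.sh --create --zookeeper cmtoldhwdpadm03.dev.bmocm.com:2181/kafka --replication-factor 1 --partitions 1 --topic test21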
01-25-2017 05:25 PM
Hi Ravi, thank you very much for your prompt reply; it helped me a lot. I understand the operation now. Once again, thanks.
01-25-2017 03:14 PM
Hi there,
How can I automate a Sqoop incremental import using a Sqoop job?
As far as I know, a Sqoop job remembers the last value.
If we create a Sqoop job like
sqoop job --create myjob -- import --connect blah blah..
sqoop job --exec myjob
and automate it, the create step runs each time it executes, so we get a "job already exists" error.
So, is it possible to automate a Sqoop incremental import using a Sqoop job?
Please share what you know about this. Thanks in advance.
Labels:
- Apache Hadoop
- Apache Sqoop
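For reference, a minimal sketch of one common approach: create the saved job once, then let the scheduler run only the exec step. The connection string, table, and check column below are placeholders for illustration, not details from the post:

# One-time setup: create the saved job. Sqoop keeps the incremental
# state in its metastore and updates --last-value after every run.
sqoop job --create myjob -- import \
  --connect jdbc:mysql://dbhost/salesdb \
  --table orders \
  --incremental append \
  --check-column order_id \
  --last-value 0 \
  --target-dir /data/orders

# Recurring step: automate only the exec, e.g. with a daily cron entry.
# Re-running exec reuses the saved job, so there is no
# "job already exists" error:
#   0 2 * * * sqoop job --exec myjob
sqoop job --exec myjob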
11-28-2016 03:17 AM
Hi there, I know Jenkins and Git in general, but I'm not aware of what role Jenkins and Git play in Hadoop projects. Please share any information you have on this. Thanks in advance. Regards, Jee
Labels:
- Apache Hadoop
11-16-2016 01:01 PM
1 Kudo
Hi there, I have an idea of multi-threading in general, but I am not sure how it is used in Hadoop. To my knowledge, YARN is responsible for managing and controlling the resources of Spark/MapReduce jobs, and I can't see where multi-threading comes in there. I am also not sure whether it is used anywhere else in the Hadoop ecosystem. I would appreciate it if anybody could provide some information on this. Many thanks,
Labels:
- Apache Hadoop
- Apache Spark
10-21-2016 06:33 PM
I tried the implicit function the way you suggested and it works very well. Thank you for explaining why implicit functions are kept in companion objects; you have done a very good job. One more request: could you please send me a shell script, if you have one, to automate a Sqoop incremental import job? Once again, thank you very much!
10-21-2016 04:30 PM
Dear jfrazee, thank you very much for your prompt reply. I had figured out the answer in a different way, by defining the implicit function within the object that contains the main function, but outside the main function itself. I'll try your approach too. Once again, thank you very much; I appreciate your support.
10-21-2016 04:14 PM
Hi, I'm new to Scala and learning it now. I'm trying a program with implicit conversions, but it shows an error in the main program. I've attached the program file for your reference (attachment: implicit-error.txt). Please note that the Rational class and the main object are both in the same project, so I don't have to import the Rational class into the main program. I'm trying these programs in the Scala Eclipse IDE. Please share any information you have on this. Thanks in advance. Regards, Jeeva
Labels:
- Apache Spark
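For reference, a minimal Scala sketch of the pattern discussed in this thread. The attached implicit-error.txt is not visible here, so the Rational class below is a reconstruction for illustration, not the original code. Defining the conversion in the companion object places it in the compiler's implicit scope for Rational, so the calling code needs no import:

import scala.language.implicitConversions

class Rational(val numer: Int, val denom: Int) {
  require(denom != 0, "denominator must be non-zero")
  def +(that: Rational): Rational =
    new Rational(numer * that.denom + that.numer * denom, denom * that.denom)
  override def toString: String = s"$numer/$denom"
}

object Rational {
  // Found automatically via Rational's implicit scope wherever a
  // Rational is expected, with no explicit import in the calling code.
  implicit def intToRational(i: Int): Rational = new Rational(i, 1)
}

object Main extends App {
  val half = new Rational(1, 2)
  println(half + 1) // the Int 1 is implicitly converted to 1/1; prints 3/2
}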