Member since
11-04-2018
16
Posts
1
Kudos Received
0
Solutions
02-19-2019
06:40 AM
Hello, your requirement can be fulfilled in two ways.

1. Use the query below:

--query "select order_id, order_date, order_customer_id, concat('\"', order_status, '\"') from orders WHERE \$CONDITIONS and order_date LIKE '2014-01-%'"

2. Further process your actual output in Spark to get the expected result.

Thanks
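For option 2, a minimal shell sketch of the same transformation (the column order and sample row are assumptions for illustration; a Spark job would apply the equivalent logic): wrap the fourth field, order_status, in double quotes, mirroring what concat('"', order_status, '"') does in the query.

```shell
#!/bin/sh
# Assumed column order: order_id,order_date,order_customer_id,order_status.
# Wrap the 4th field in double quotes, like concat('"', order_status, '"').
quote_status() {
  awk -F, 'BEGIN{OFS=","} {$4 = "\"" $4 "\""; print}'
}

echo '1,2014-01-01,11599,CLOSED' | quote_status
# prints: 1,2014-01-01,11599,"CLOSED"
```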
11-16-2018
11:42 PM
Hi Nick, can you please share what you mean by "not working"? Error messages would be helpful here. Cheers
11-16-2018
07:30 PM
@Nikhil Vemula Have you been able to get this working? If you have the date-range info, you can use a script like the one below to import the data based on that range (note: the += string append requires bash, so the shebang should be bash rather than sh):

#!/bin/bash
mindate=$1
maxdate=$2
# Build the query; keep \$CONDITIONS escaped so sqoop, not the shell, substitutes it
querytorun="select * from <TABLENAME> where date >= $mindate and date <= $maxdate"
querytorun+=" and \$CONDITIONS"
sqoop import --connect jdbc:mysql://<DBIPADDRESS>/<DBNAME> --username <USERNAME> --password <PASSWORD> --query "$querytorun" --split-by "<SPLITBYKEY>" --delete-target-dir ......
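As a standalone check of how the query string is assembled (the table name orders and the dates here are illustrative assumptions; the real script takes them as arguments), note that the dates must carry their own SQL quotes and \$CONDITIONS must stay escaped so sqoop substitutes it at run time:

```shell
#!/bin/sh
# Illustrative values; in the actual script they come from $1 and $2.
mindate="'2014-01-01'"
maxdate="'2014-01-31'"
querytorun="select * from orders where date >= $mindate and date <= $maxdate"
# POSIX-safe append (equivalent to bash's querytorun+=...):
querytorun="$querytorun and \$CONDITIONS"
echo "$querytorun"
# The echoed query keeps the literal $CONDITIONS token for sqoop to replace.
```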
06-15-2018
11:34 PM
Hey @Nikhil Vemula! I'm not sure about your error, but are you able to connect to MySQL from your NiFi host? telnet <mysql> <port> Could you also share your configuration from the QueryDatabaseTable processor, and check whether there's more detail in NiFi's log? PS: Did you set up a DBCPConnectionPool in the controller services? Hope this helps!
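The telnet check above can also be scripted. A small sketch using bash's /dev/tcp pseudo-device (an assumption: it requires bash, and the host/port shown are placeholders for your MySQL endpoint):

```shell
#!/bin/bash
# Return 0 if a TCP connection to $1:$2 succeeds, non-zero otherwise.
# Equivalent in spirit to: telnet <host> <port>
check_port() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if check_port 127.0.0.1 3306; then
  echo "mysql reachable"
else
  echo "mysql NOT reachable - check firewall and bind-address"
fi
```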
04-26-2018
04:08 AM
@Nikhil Vemula You can also use localhost:4200, and it will link you directly to the Docker container. Username: root, Password: hadoop. If you want to access anything else, they specifically mention the tutorial port, which is 8888. So you just need to enter the web URL 127.0.0.1:8888 and you'll have the information about all the ports.
05-16-2018
01:35 PM
1 Kudo
@Nikhil Vemula I took the HDPCA certification exam last year, and I can tell you the key to passing this exam is practice, practice, practice.

Before the exam (time to get prepared):
- Take each subject from the HDPCA objectives and try to stress every possible scenario.
- Hortonworks has brilliant documentation; use it in your preparation (I used it in mine).
- By the way, there is a practice exam available to check whether you are prepared to take the real one: https://br.hortonworks.com/services/training/certification/hdpca-certification/

During the exam:
- Read each question carefully.
- You will have 2 hours to finish the exam; this is enough time to do it and still check your work.
- If you find any question you do not know, you can use the documentation that is available within the exam environment.

After the exam:
- Wait for your results.

For more information, please check: https://br.hortonworks.com/services/training/certification/hdp-certified-developer-faq-page/
04-10-2018
12:16 AM
@Nikhil Vemula Please check here
04-04-2018
05:00 AM
1 Kudo
You need to add spark-core as a dependency; it will download all the required jar files. You can refer to the code from https://github.com/databricks/learning-spark, which has a WordCount sample program: https://github.com/databricks/learning-spark/blob/master/mini-complete-example/src/main/scala/com/oreilly/learningsparkexamples/mini/scala/WordCount.scala

Steps:
1. Clone or download the repo from https://github.com/databricks/learning-spark
2. Extract the repo and go to the mini-complete-example directory.
3. Build the project: mvn clean install
4. Import the project into your IDE and run WordCount.scala.

Thanks
Shubham
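Before wiring up the Spark build, the core word-count logic can be sanity-checked with plain shell tools. This is a simplified sketch of what WordCount.scala's flatMap/reduceByKey chain computes, not the Spark program itself:

```shell
#!/bin/sh
# Split input on whitespace, then count occurrences of each word.
count_words() {
  tr -s '[:space:]' '\n' | grep -v '^$' | sort | uniq -c | awk '{print $2" "$1}'
}

printf 'hello world\nhello spark\n' | count_words
# prints:
# hello 2
# spark 1
# world 1
```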
12-09-2018
07:56 AM
It was a firewall issue: adding the firewall rule on the Ambari server fixed the passwordless login issue, and the host could then be registered.