Member since: 08-03-2019
Posts: 186
Kudos Received: 34
Solutions: 26
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1956 | 04-25-2018 08:37 PM |
 | 5877 | 04-01-2018 09:37 PM |
 | 1591 | 03-29-2018 05:15 PM |
 | 6759 | 03-27-2018 07:22 PM |
 | 2005 | 03-27-2018 06:14 PM |
04-02-2018 03:15 AM
2 Kudos
@Sri Kumaran Thiruppathy I don't think so! Sqoop and Spark SQL both use JDBC connectivity to fetch data from RDBMS engines, but Sqoop has an edge here since it is purpose-built to migrate data between RDBMS and HDFS. Every option available in Sqoop has been fine-tuned to get the best performance during data ingestion. You can start by discussing the option -m, which controls the number of mappers; this is what fetches data in parallel from the RDBMS. Can I do it in Spark SQL? Of course, yes, but the developer would need to take care of the "multithreading" that Sqoop handles automatically (see the sketch below). And the list goes on! Hope that helps!
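For illustration, here is a minimal sketch of what that manual work looks like on the Spark side, using the Spark 2.x DataFrameReader JDBC options. The connection URL, table, and column names are hypothetical; with Sqoop, `-m 4 --split-by order_id` would do the equivalent for you:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("jdbc-parallel-read").getOrCreate()

// The four partitioning options below are the manual equivalent of
// Sqoop's "-m 4 --split-by order_id".
val orders = spark.read.format("jdbc")
  .option("url", "jdbc:mysql://dbhost:3306/sales") // hypothetical URL
  .option("dbtable", "orders")                     // hypothetical table
  .option("user", "etl_user")
  .option("password", "etl_password")
  .option("partitionColumn", "order_id") // column to split on
  .option("lowerBound", "1")             // min value of the split column
  .option("upperBound", "1000000")       // max value of the split column
  .option("numPartitions", "4")          // like Sqoop's -m 4
  .load()

orders.write.parquet("/data/orders")
```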
04-01-2018 09:37 PM
1 Kudo
@Mahendra Hegde In the snapshot you attached, all the processors are stopped. Start all your processors and also verify the following:

1. Is your processor running on all the nodes? Sometimes your flow files sit on a node other than the Primary Node while the downstream processor is running only on the Primary Node, which can result in exactly this behavior.
2. If #1 does not apply and all your processors are running correctly on all nodes (or whatever the correct configuration is for your flow), verify how many threads are running for your processor(s).
3. Check your JVM usage for NiFi and verify whether it is too high for whatever reason (a quick way to check this is sketched below).
4. Check the scheduling of your processors.

Let me know if you see anything unusual from the above basic debugging steps. Hope that helps!
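For step 3, one quick way to read the JVM heap usage is NiFi's REST API (GET /nifi-api/system-diagnostics). A minimal sketch, assuming an unsecured NiFi instance on a hypothetical host and port:

```scala
import scala.io.Source

// Fetch NiFi's system diagnostics; the response JSON reports JVM heap
// usage under systemDiagnostics.aggregateSnapshot (e.g. heapUtilization).
val url = "http://nifi-host:8080/nifi-api/system-diagnostics"
val json = Source.fromURL(url).mkString
println(json)
```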
04-01-2018 09:27 PM
@Yassine Looking at your log, it seems like you are trying to change the datatype in Spark. Is this the case? If yes, use a statement like val a = sqlContext.sql("alter table tableName change col col bigint") As for the issue you are facing while converting the column's type, you need to understand the available datatypes and the implicit casts allowed between them. Whenever you issue a command like alter table tableName change columnName columnName <newDataType>; keep in mind that the column's existing data stays as it is. If the column currently holds strings and you change its type to int or similar, any value that cannot be implicitly cast to the new type will not be accessible and will be returned as NULL. Check the Hive documentation on datatypes for the implicit cast options available between them.
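A small hypothetical illustration of that NULL behavior (table and column names are made up; assumes a text-format Hive table and the same sqlContext as above):

```scala
// Column `price` is currently STRING and holds "42", "abc", "n/a".
sqlContext.sql("alter table products change price price bigint")

// Values that cannot be implicitly cast to BIGINT come back as NULL:
sqlContext.sql("select price from products").show()
// +-----+
// |price|
// +-----+
// |   42|
// | null|
// | null|
// +-----+
```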
04-01-2018 04:36 PM
@Sudha Chandrika Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!
04-01-2018 04:35 PM
You can use multiple options, but they all have their ifs and buts! 🙂 Here is the best option that I can think of: use MergeContent to merge multiple flow files into one bigger file, put the bigger flow file on the local disk (e.g. with PutFile), and load it with MySQL's "LOAD DATA" statement. It will be very fast! A sketch of the final step follows below. Let me know if you need additional help on the topic! If the answer helped you resolve your query, please mark the answer as Accepted!
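For reference, a minimal sketch of the LOAD DATA step over JDBC. The host, database, table, and file path are hypothetical; it assumes the MySQL Connector/J driver is on the classpath and that LOAD DATA LOCAL is enabled on both client and server:

```scala
import java.sql.DriverManager

val conn = DriverManager.getConnection(
  "jdbc:mysql://dbhost:3306/mydb?allowLoadLocalInfile=true",
  "user", "password")
try {
  val stmt = conn.createStatement()
  // One bulk load of the merged file is far faster than row-by-row inserts.
  stmt.execute(
    """LOAD DATA LOCAL INFILE '/data/nifi/merged/output.csv'
      |INTO TABLE my_table
      |FIELDS TERMINATED BY ','
      |LINES TERMINATED BY '\n'""".stripMargin)
} finally {
  conn.close()
}
```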
04-01-2018 04:27 PM
@rajdip chaudhuri Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!
04-01-2018 04:24 PM
@heta desai Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!
04-01-2018 04:24 PM
@vishal dutt Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!
04-01-2018 04:23 PM
1 Kudo
@ANKIT PATEL Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!
04-01-2018 04:19 PM
@Vinitkumar Pandey Did the answer help resolve your query? If so, please close the thread by marking the answer as Accepted!