Transfer files from HDFS to Kudu

Contributor

Hello,

I want to transfer files from HDFS to Kudu. I tried doing it through Talend Fabric and its components, but I get this error: Cannot run anywhere due to node and executor blacklist.

Can you help me please? Thanks a lot.

1 ACCEPTED SOLUTION

Master Collaborator

Hi @drgenious

Are you getting an error similar to the one reported in KUDU-2633? That looks like an open JIRA reported in the community:

ERROR core.JobRunShell: Job DEFAULT.EventKpisConsumer threw an unhandled Exception: 
org.apache.spark.SparkException: Job aborted due to stage failure: Aborting TaskSet 109.0 because task 3 (partition 3) cannot run anywhere due to node and executor blacklist.  Blacklisting behavior can be configured via spark.blacklist.*.
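
As the message itself notes, this behavior is controlled by the spark.blacklist.* settings. As a purely diagnostic step (it does not fix whatever is failing on those nodes), you could temporarily disable blacklisting when submitting the job and let the underlying task failure surface instead. The class and jar below are placeholders for your own application:

# Diagnostic only: disable Spark task blacklisting so the real
# task failure is reported instead of the blacklist abort.
spark2-submit --master yarn \
  --conf spark.blacklist.enabled=false \
  --class <your main class> \
  <your application jar>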

If you have the data in HDFS in CSV, Avro, or Parquet format, you can use the command below to import the files into a Kudu table.

Prerequisite: a Kudu jar of a compatible version (1.6 or higher); see the Kudu documentation for more details.

spark2-submit --master <yarn/local> \
  --class org.apache.kudu.spark.tools.ImportExportFiles \
  <path of kudu jar>/kudu-spark2-tools_2.11-1.6.0.jar \
  --operation=import \
  --format=<parquet/avro/csv> \
  --master-addrs=<kudu master host>:<port number> \
  --path=<hdfs path for data> \
  --table-name=impala::<table name>
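
For example, with hypothetical values filled in (a CSV file in HDFS, a Kudu master listening on kudu-master-1:7051, and an Impala-created table default.my_table), the call would look like this:

# Hypothetical invocation; adjust the jar path, master address,
# HDFS path, and table name to match your cluster.
spark2-submit --master yarn \
  --class org.apache.kudu.spark.tools.ImportExportFiles \
  /opt/kudu/kudu-spark2-tools_2.11-1.6.0.jar \
  --operation=import \
  --format=csv \
  --master-addrs=kudu-master-1:7051 \
  --path=hdfs:///user/etl/input/data.csv \
  --table-name=impala::default.my_table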
Hope this helps. Please accept the answer and vote up if it did.
