Spark command:
spark-submit --master yarn --deploy-mode cluster --jars json-simple-1.1.jar --packages com.googlecode.json-simple:json-simple:1.1 --files /home/hadoop/Property_Analytics_UAP_Dev.json,/home/hadoop/Property_dl_Store_Inc.json --class com.vfc.uap.App_dl_ld_Store_Inc datalake101.jar Property_Analytics_UAP_Dev.json Property_dl_Store_Inc.json yes 99990119
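For context, in cluster mode the driver runs on a YARN node rather than the submitting host, so the Hive metastore configuration (hive-site.xml) must be visible to that node and Hive support must be enabled on the SparkSession; otherwise Spark falls back to its default in-memory catalog and reports "Table or view not found". Below is a minimal sketch of the session setup this usually requires. It is an assumption, not the actual application code: the object name is a placeholder, and the query merely mirrors the lookup shown in the plan.

```scala
// Sketch only (assumed, not the actual job code): a SparkSession with
// Hive catalog support. Without enableHiveSupport() -- or with
// hive-site.xml unavailable on the driver node in cluster mode --
// Spark uses its in-memory catalog, and a Hive table such as
// `vfuap_support`.`etl_process_status` cannot be resolved.
import org.apache.spark.sql.SparkSession

object HiveSessionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("App_dl_ld_Store_Inc")
      .enableHiveSupport() // required so spark.sql can see Hive tables
      .getOrCreate()

    // The lookup that fails in the stack trace above:
    spark.sql(
      """SELECT file_name
        |FROM vfuap_support.etl_process_status
        |WHERE file_source_system_name = 'IP'""".stripMargin
    ).show()

    spark.stop()
  }
}
```

If the session already enables Hive support, the usual remaining step in cluster mode is shipping the metastore config with the job, e.g. adding /etc/hive/conf/hive-site.xml (path is an assumption; it varies by distribution) to the --files list on the spark-submit line.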
Error:
18/01/23 17:02:18 INFO Client:
client token: N/A
diagnostics: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: `vfuap_support`.`etl_process_status`; line 1 pos 160;
'Project ['src_filename]
+- 'Filter isnull('core.trg_filename)
+- 'SubqueryAlias core
+- 'Project ['a.file_name AS src_filename#7, 'b.file_name AS trg_filename#8]
+- 'Join LeftOuter, ('a.file_name = 'b.file_name)
:- SubqueryAlias a
: +- SubqueryAlias filelist_tempreceipt
: +- Project [value#1 AS file_name#3]
: +- LocalRelation <empty>, [value#1]
+- 'SubqueryAlias b
+- 'Project ['file_name]
+- 'Filter ('b.file_source_system_name = IP)
+- 'SubqueryAlias b
+- 'UnresolvedRelation `vfuap_support`.`etl_process_status`
ApplicationMaster host: 10.0.140.6
ApplicationMaster RPC port: 0
queue: default
start time: 1516726891460
final status: FAILED
tracking URL: http://ip-10-0-141-23.ec2.internal:20888/proxy/application_1516721691577_0009/
Could you please help me understand why I am not able to connect to the Hive table when running in cluster mode?