Member since: 01-25-2017
Posts: 11
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 3790 | 02-13-2017 07:27 AM |
03-15-2019 04:56 AM
Dear @AnisurRehman, you can import data from an RDBMS into HDFS only with Sqoop. If you then want to work with this table through impala-shell, you only need to run the following command from a machine where Impala is installed: impala-shell -d db_name -q "INVALIDATE METADATA tablename"; You need INVALIDATE METADATA because the table is new to the Impala daemons' metadata. If you later append new data files to the existing tablename table, a refresh is enough: impala-shell -d db_name -q "REFRESH tablename"; REFRESH suffices because you do not need to reload the whole metadata for the table, only the block locations of the new data files. After that you can query the table through impala-shell or the Impala query editor.
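A minimal sketch of the whole workflow, assuming a hypothetical MySQL source; the JDBC URL, username, and target directory are placeholders, not values from the original post:

```shell
# Import the table from the RDBMS into HDFS with Sqoop
# (connection string, credentials, and target-dir are assumptions)
sqoop import \
  --connect jdbc:mysql://rdbms-host/db_name \
  --username etl_user -P \
  --table tablename \
  --target-dir /user/hive/warehouse/db_name.db/tablename

# Tell Impala about the brand-new table (full metadata load for it)
impala-shell -d db_name -q "INVALIDATE METADATA tablename"

# After appending new data files, a lighter refresh is enough:
# only the new files' block locations are picked up
impala-shell -d db_name -q "REFRESH tablename"
```

These commands must run on a host with Sqoop and Impala clients configured against the cluster, so they are shown as a sketch rather than a runnable standalone script.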
05-26-2017 10:48 AM
Thank you, sir, for helping me walk through the profile. This is very informative.
01-25-2017 11:57 AM
I am asking whether your Spark service is Spark2 or not. Cloudera Manager does not yet support configuring Hive to depend on Spark2 directly, so you might want to try Spark1 and see.