Created 10-16-2015 12:22 PM
I am trying to install Cloudera Live on AWS with Tableau. The stack creation is complete, and I see 6 instances running on my account, but I did not receive any email with instructions on how to access Cloudera. Can someone suggest how I can check whether the installation is complete?
Mark
Created 10-29-2015 10:00 AM
Hi Sean,
Is there any way I can restore just the orders table in HDFS? I have been making changes to this table, and it looks like I have corrupted it. I know I need to import the table from the MySQL database using Sqoop. I would appreciate your reply.
Mark
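For reference, re-importing a single table from MySQL with Sqoop looks roughly like this. This is a minimal sketch for the Cloudera Live quickstart setup; the connection URL, credentials, database name, and target directory are assumptions to adjust for your environment.

```shell
# Re-import only the "orders" table from MySQL into HDFS.
# Host, database, username, and password below are placeholders.
sqoop import \
  --connect jdbc:mysql://localhost/retail_db \
  --username retail_dba \
  --password cloudera \
  --table orders \
  --delete-target-dir \
  --target-dir /user/hive/warehouse/orders \
  -m 1
```

`--delete-target-dir` removes any corrupted copy already sitting in HDFS before the fresh import runs.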
Created 10-29-2015 11:28 AM
Hi Sean,
I am sorry for another posting. I found the command to move just one table from MySQL to HDFS, but I am running into one issue.
I dropped the table using Hive, but its data still shows up under /user/hive/warehouse/ when I run hadoop fs -ls. I tried to delete the file, and I don't seem to have the right permissions. Can you delete this file for me?
Mark
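When DROP TABLE leaves the data behind (which happens with EXTERNAL tables, since Hive only removes their metadata), the leftover directory can be removed by hand. A minimal sketch, assuming the table directory is the orders one discussed in this thread; running as the hdfs superuser works around the permission error an ordinary user hits:

```shell
# Remove the leftover table directory; the "orders" path is assumed from the thread.
# sudo -u hdfs runs the command as the HDFS superuser to avoid permission errors.
sudo -u hdfs hadoop fs -rm -r /user/hive/warehouse/orders
```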
Created 10-29-2015 01:56 PM
Hi Sean,
I was able to import just one table after I deleted the HDFS file from the directory. Thanks for your help.
Mark
Created 10-30-2015 07:32 AM
Hi Sean,
I am using the Metastore Manager to copy the HDFS file and create a new table in Hive. But this table's directory (orders) contains 3 different Parquet files. I am using the "create a new table from a file" utility, and I am getting the error message shown below. Can you tell me how I can create a Hive table from this kind of file?
Mark
"
Failed to open file '/user/hive/warehouse/orders_new': [Errno 21] Is a directory: '/user/hive/warehouse/orders_new'
None
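The "create a new table from a file" utility expects a single file, but `/user/hive/warehouse/orders_new` is a directory of Parquet files. One way around this is to define an external Hive table over the directory itself. A sketch only; the column names and types below are hypothetical and must match the actual Parquet schema:

```shell
# Define a Hive table over the existing Parquet directory (not a single file).
# The columns below are hypothetical -- match them to the real orders schema.
hive -e "
CREATE EXTERNAL TABLE orders_new (
  order_id INT,
  order_date STRING,
  order_customer_id INT,
  order_status STRING
)
STORED AS PARQUET
LOCATION '/user/hive/warehouse/orders_new';
"
```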
Created 10-30-2015 08:15 AM
This thread's getting quite long and touching a lot of different types of questions - I'd suggest you post that issue under the Hue forum - I don't know the answer. I'd show them the output of `hadoop fs -ls /user/hive/warehouse/orders_new` so they can see the specific files under that directory and permissions, etc.
Created 10-30-2015 08:46 AM
Hi Sean,
Thanks for your suggestion. I will create a new post.
Mark
Created 10-20-2015 01:47 PM
1. For the Impala issue: try refreshing the metadata by running the command below in Impala.
invalidate metadata;
2. Regarding the error you hit after connecting through PuTTY as ec2-user and running the script:
-bash: import-all-tables: command not found
Are you able to invoke Sqoop? I am not sure which MySQL DB you are connecting to; that's why I mentioned you might need to change the password, username, DB name, etc.
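That "command not found" error suggests `import-all-tables` was run as a standalone command; it is a Sqoop subcommand and has to be invoked through `sqoop`. A sketch, with the connection details assumed for the quickstart MySQL database:

```shell
# import-all-tables is a Sqoop subcommand, not a shell command by itself.
# Connection URL, credentials, and warehouse dir below are placeholders.
sqoop import-all-tables \
  --connect jdbc:mysql://localhost/retail_db \
  --username retail_dba \
  --password cloudera \
  --warehouse-dir /user/hive/warehouse \
  -m 1
```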
Created 10-20-2015 02:09 PM
Just to clarify:
I used the private IP to connect using the Cloudera Live ODBC driver. I tried different user IDs, including ec2-user, and I was not able to connect to HiveServer2.
When I try to connect to Impala directly in Tableau, I can connect using ec2-user as the ID, but I cannot access the tables that I can see through Hue/Hive.
Please let me know what I am missing in the workflow.
Mark
Created 10-20-2015 02:15 PM