01-03-2015 06:19 PM
I am a bit confused by these lines:
"You should first log in to the Master Node of your cluster using SSH - you can get the credentials using the instructions on Your Cloudera Cluster. Once you are logged in, you can launch the Sqoop job."
"You should first open a terminal, which you can do by clicking the black "Terminal" icon at the top of your screen."
I am supposed to be doing this via the GoGrid console, as I can't see the "Terminal" icon via Cloudera Manager.
I have the Master node IP address and have tried to follow the tutorial, but to no avail. I also noticed that the tutorial available as part of Cloudera Live has slightly different instructions from the following - http://www.cloudera.com/content/cloudera/en/developers/home/developer-admin-resources/get-started-wi... - but that didn't help much either.
Any help would be appreciated.
01-06-2015 06:47 AM
The line that says to use the black "Terminal" icon is actually only relevant for the QuickStart VM. It's not supposed to show up in the "Cloudera Live" environment, so just ignore it (and I'll look into why it's showing up for you when it shouldn't). You should be able to log in with a terminal using SSH from Mac OS / Linux, or a tool like PuTTY from Windows.
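For anyone else hitting this, a minimal SSH invocation from a Mac OS / Linux terminal might look like the sketch below. The IP address and username here are placeholders, not from the tutorial; use the actual credentials from the Your Cloudera Cluster page.

```shell
# Placeholder values: substitute the master node IP and login user
# shown on the "Your Cloudera Cluster" page.
MASTER_IP="172.16.0.10"
SSH_USER="root"
# Print the command for review; run it yourself to open the session.
echo "ssh ${SSH_USER}@${MASTER_IP}"
```

On Windows, PuTTY takes the same host and username in its Session dialog.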
01-06-2015 11:39 AM
Thanks for your reply.
After a bit of trial and error I managed to get through all the tutorials.
There were a couple of other things that I noticed that might be useful for others to know in regards to the "Cloudera Live" environment.
- The MySQL service isn't started by default, so it needs to be started before importing the data into HDFS (sudo service mysqld start)
- I also found that using the IP address of the master node didn't work when trying to map the schema metadata into Impala. I've pasted what worked for me below; the part I changed was in TBLPROPERTIES, where I replaced the master node's IP address with the hostname f9b00-cldramaster-01:
-- LOCATION and the .avsc path below are placeholders; use the paths from the tutorial
CREATE EXTERNAL TABLE categories
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs://f9b00-cldramaster-01/path/to/categories'
TBLPROPERTIES ('avro.schema.url'='hdfs://f9b00-cldramaster-01/path/to/categories.avsc');
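One thing worth adding (an assumption on my part, not something from the tutorial text above): after creating tables through Hive, Impala may not see them until its metadata cache is refreshed. A sketch, reusing the hostname from the workaround:

```shell
# Hostname assumed from the workaround above; impala-shell's -i flag
# takes host[:port] and defaults to port 21000.
IMPALA_HOST="f9b00-cldramaster-01"
# Print the refresh command for review; run it on the cluster.
echo "impala-shell -i ${IMPALA_HOST} -q 'INVALIDATE METADATA'"
```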
01-13-2015 08:11 AM
Thanks again for reporting the issue and sharing your workaround with other users. Just wanted to let you know that as of noon PT yesterday, new deployments should not have this problem. You should be able to access MySQL via the public IP from any of the instances now, and the tutorial now gives you a working command for the Sqoop import and the Hive metadata.
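For reference, a hedged sketch of what a Sqoop import along these lines might look like with the hostname workaround. The database name, user, and password here are placeholders I've filled in for illustration; use the exact command from the updated tutorial.

```shell
# Placeholders: DB_NAME, DB_USER, and the password come from the tutorial.
DB_HOST="f9b00-cldramaster-01"
DB_NAME="retail_db"
DB_USER="retail_dba"
# Print the command for review before running it on the master node.
cat <<EOF
sqoop import-all-tables -m 1 \
  --connect jdbc:mysql://${DB_HOST}:3306/${DB_NAME} \
  --username=${DB_USER} --password=<password> \
  --as-avrodatafile --warehouse-dir=/user/hive/warehouse
EOF
```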