Hi Sean,

Thanks for your reply. After a bit of trial and error I managed to get through all the tutorials. There were a couple of other things I noticed that might be useful for others to know regarding the "Cloudera Live" environment:

- The MySQL service isn't started by default, so it needs to be started before importing the data into HDFS with Sqoop (sudo service mysqld start).
- Using the IP address of the master node didn't seem to work when mapping the schema metadata into Impala, so I have pasted what I used below; this seemed to work for me. The only part I changed was in TBLPROPERTIES, where I replaced the IP address of the master node with the hostname f9b00-cldramaster-01:

CREATE EXTERNAL TABLE categories
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/categories'
TBLPROPERTIES ('avro.schema.url'='hdfs://f9b00-cldramaster-01/user/examples/sqoop_import_categories.avsc');

Regards,
Scott.
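In case it helps anyone scripting this, the steps above can be sketched as one shell sequence. This is just a sketch under the assumptions from my post: the hostname f9b00-cldramaster-01 is specific to my cluster (yours may differ), and the cluster-only commands (service start, impala-shell) are shown commented out since they need the Live environment:

```shell
# Internal hostname of the master node -- taken from my cluster, yours may differ.
MASTER_HOST="f9b00-cldramaster-01"

# 1. Start MySQL before running the Sqoop import (it is not running by default):
#    sudo service mysqld start
#    (then run the Sqoop import command from the tutorial)

# 2. Build the Impala DDL, substituting the master hostname into avro.schema.url
#    instead of the IP address:
DDL="CREATE EXTERNAL TABLE categories
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/categories'
TBLPROPERTIES ('avro.schema.url'='hdfs://${MASTER_HOST}/user/examples/sqoop_import_categories.avsc');"

# 3. Run the DDL through impala-shell (commented out; needs the cluster):
#    echo "$DDL" | impala-shell -i ${MASTER_HOST}

echo "$DDL"
```

The only moving part is MASTER_HOST; if your cluster uses a different internal hostname, changing that one variable updates the schema URL.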
Hi,

I am a bit confused by these lines:

"You should first log in to the Master Node of your cluster using SSH - you can get the credentials using the instructions on Your Cloudera Cluster. Once you are logged in, you can launch the Sqoop job."

"You should first open a terminal, which you can do by clicking the black "Terminal" icon at the top of your screen."

I assume I am supposed to be doing this via the GoGrid console, as I can't see a "Terminal" icon in Cloudera Manager. I have the master node's IP address and have tried to follow the tutorial, but to no avail.

I also noticed that the tutorial included with Cloudera Live has slightly different instructions to the one at http://www.cloudera.com/content/cloudera/en/developers/home/developer-admin-resources/get-started-with-hadoop-tutorial/exercise-1.html, but that didn't help much either.

Any help would be appreciated.

Regards,
Scott.