Member since
09-29-2015
44
Posts
33
Kudos Received
8
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2499 | 03-07-2017 02:30 PM
 | 1272 | 12-14-2016 02:53 AM
 | 8857 | 12-07-2015 03:58 PM
 | 3361 | 11-06-2015 07:40 PM
 | 1963 | 10-26-2015 05:59 PM
03-07-2017
02:32 PM
Once you recreate the Hive table, try rerunning your Pig script. Don't forget to add the -useHCatalog argument when invoking Pig.
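A minimal sketch of the invocation (the script name riskfactor.pig is hypothetical; substitute your own):

```shell
# Run the Pig script with HCatalog integration enabled so it can
# read and write Hive tables. "riskfactor.pig" is a placeholder name.
pig -useHCatalog riskfactor.pig
```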
03-07-2017
02:30 PM
2 Kudos
Hello @voca voca, I ran into the same problem but realized that the totmiles column in Hive should be a DOUBLE and not an INT as described in the tutorial. So if you take the block of code below and rerun it in the Hive view, it should work for you.

drop table riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float)
STORED AS ORC;
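The same DDL can also be run non-interactively from the sandbox shell; a sketch, assuming the hive CLI is on the PATH:

```shell
# Recreate the table from the command line; -e runs the quoted
# statements and exits. Same DDL as above, with totmiles as DOUBLE.
hive -e "DROP TABLE IF EXISTS riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float)
STORED AS ORC;"
```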
03-02-2017
01:51 PM
Great to hear that you got things working @Prasanna G
03-01-2017
02:24 PM
1 Kudo
@Prasanna G
You'll have to first sudo su to root (or just prefix the single command: sudo docker ps). See screenshot.
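A minimal sketch of the two options, run interactively on the sandbox host VM:

```shell
# Option 1: become root first, then list running containers
sudo su -
docker ps

# Option 2: prefix the one command with sudo instead of switching users
sudo docker ps
```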
02-28-2017
09:10 PM
@Adedayo Adekeye Ambari is the web UI used to administer, monitor, and provision a Hadoop cluster. It also has the concept of Views, which allow for browsing the Hadoop Distributed File System (HDFS) as well as querying data through Hive and writing Pig scripts, among other things (it is even extensible so you can build something custom). Within Ambari (example link: http://<AZURE PUBLIC IP>:8080/#/main/views/FILES/1.0.0/AUTO_FILES_INSTANCE) you can log in as raj_ops (with password raj_ops) to get to the Files view. Don't just click the link; you'll have to change it to match your Azure sandbox's public IP address. This also assumes you have port 8080 open in Azure's Network Security Group settings. You may also want to follow the instructions on how to reset the Ambari admin password: https://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/#setup-ambari-admin-password Hope this helps
02-28-2017
03:14 PM
Hi @Prasanna G You'll have to first copy the file from the local filesystem into the Docker container, like below.

First I created a directory in my Docker container:

docker exec sandbox mkdir /dan/

(You can then run docker exec sandbox ls to see that your mkdir worked.)

Then I copied the file to the directory I just created:

docker cp /home/drice/test2.txt sandbox:dan/test2.txt

(Here sandbox is the name of the Docker container running HDP; you can get a list of containers by running docker ps.)

Once the file is in the Docker container you can then copy it to Hadoop:

docker exec sandbox hadoop fs -put /dan/test2.txt /test2.txt

[root@sandbox drice]# docker exec sandbox hadoop fs -ls /
Found 13 items
drwxrwxrwx   - yarn   hadoop  0 2016-10-25 08:10 /app-logs
drwxr-xr-x   - hdfs   hdfs    0 2016-10-25 07:54 /apps
drwxr-xr-x   - yarn   hadoop  0 2016-10-25 07:48 /ats
drwxr-xr-x   - hdfs   hdfs    0 2016-10-25 08:01 /demo
drwxr-xr-x   - hdfs   hdfs    0 2016-10-25 07:48 /hdp
drwxr-xr-x   - mapred hdfs    0 2016-10-25 07:48 /mapred
drwxrwxrwx   - mapred hadoop  0 2016-10-25 07:48 /mr-history
drwxr-xr-x   - hdfs   hdfs    0 2016-10-25 07:47 /ranger
drwxrwxrwx   - spark  hadoop  0 2017-02-28 15:05 /spark-history
drwxrwxrwx   - spark  hadoop  0 2016-10-25 08:14 /spark2-history
-rw-r--r--   1 root   hdfs   15 2017-02-28 15:04 /test.txt
drwxrwxrwx   - hdfs   hdfs    0 2016-10-25 08:11 /tmp
drwxr-xr-x   - hdfs   hdfs    0 2016-10-25 08:11 /user

NOTE: Another way to do this is to just use the Ambari Files view to copy files graphically.
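The steps above can be sketched end to end; the container name (sandbox) and file paths are the example values from this post, so substitute your own:

```shell
# 1. Create a staging directory inside the container
docker exec sandbox mkdir -p /dan

# 2. Copy the file from the host's local filesystem into the container
docker cp /home/drice/test2.txt sandbox:/dan/test2.txt

# 3. From inside the container, push the file into HDFS
docker exec sandbox hadoop fs -put /dan/test2.txt /test2.txt

# 4. Verify that the file landed in HDFS
docker exec sandbox hadoop fs -ls /
```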
12-14-2016
02:53 AM
@Kiran Kumar have a look at this page from SAS here. It looks like it requires HDP 2.5 or greater.
06-16-2016
05:35 AM
Same issue Mark reported on the HDP 2.4 Sandbox using sqoop import on a single table.

Example command:

sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1

NOTE: Mark's workaround worked. New command:

sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1 --driver com.mysql.jdbc.Driver
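The working command, reflowed with line continuations for readability; the host, credentials, and table names are the example values from this post:

```shell
sqoop import \
  --connect jdbc:mysql://192.168.1.17:3306/test \
  --username drice \
  --password hadoop \
  --table client \
  --hive-table default.client \
  --hive-import \
  -m 1 \
  --driver com.mysql.jdbc.Driver   # naming the driver class explicitly is the workaround
```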
04-07-2016
01:38 PM
1 Kudo
Tom is correct, but if you are using a virtualization client such as VMware Fusion you'll have to do it differently. Grab the MAC address of the VM by selecting the VM -> Virtual Machine -> Settings -> Network Adapter -> Advanced Options and copying the MAC address. Then edit the DHCP configuration (the path contains a space, so quote it):

vi "/Library/Preferences/VMware Fusion/vmnet8/dhcp.conf"

Add this at the bottom of the conf file:

host hdp24 {
  hardware ethernet 00:0C:29:42:61:D7;
  fixed-address 192.168.245.133;
}
03-22-2016
04:39 PM
@Ram Note that disks are required for the NN as well. See this post related to sizing of the NN: https://community.hortonworks.com/questions/1692/any-recommendation-on-how-to-partition-disk-space.html#answer-1762