Member since: 09-29-2015
Posts: 44
Kudos Received: 33
Solutions: 8
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1617 | 03-07-2017 02:30 PM |
| | 778 | 12-14-2016 02:53 AM |
| | 7146 | 12-07-2015 03:58 PM |
| | 2662 | 11-06-2015 07:40 PM |
| | 1229 | 10-26-2015 05:59 PM |
03-07-2017
02:32 PM
Once you recreate the Hive table, try rerunning your Pig script. When you run the Pig script, don't forget to add the `-useHCatalog` argument...
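In case it helps, here is what that looks like from the sandbox shell. The script name `riskfactor.pig` is just a placeholder for your own script, and the guard clauses only skip the run on machines where Pig or the script isn't present:

```shell
# Run a Pig script with HCatalog support so it can read Hive tables.
# "riskfactor.pig" is a hypothetical name -- substitute your own script.
SCRIPT=riskfactor.pig
# Only invoke Pig if it is installed here and the script actually exists.
if command -v pig >/dev/null 2>&1 && [ -f "$SCRIPT" ]; then
  pig -useHCatalog "$SCRIPT"
fi
```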
03-07-2017
02:30 PM
2 Kudos
Hello @voca voca, I ran into the same problem but realized that the totalmiles column in Hive should be a DOUBLE and not an INT as described in the tutorial. So if you take the block of code below and rerun it in the Hive view, it should work for you:

```sql
DROP TABLE riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float)
STORED AS ORC;
```
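A sketch of applying that same DDL from the sandbox shell instead of the Hive view (this assumes the `hive` CLI is on the PATH; the statements themselves are the ones from the post above):

```shell
# Write the corrected DDL to a file, then apply it with the Hive CLI.
cat > /tmp/riskfactor.ddl <<'SQL'
DROP TABLE IF EXISTS riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float)
STORED AS ORC;
SQL
# Only invoke Hive if it is actually installed on this machine.
if command -v hive >/dev/null 2>&1; then
  hive -f /tmp/riskfactor.ddl
fi
```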
03-02-2017
01:51 PM
Great to hear that you got things working, @Prasanna G.
03-01-2017
02:24 PM
1 Kudo
@Prasanna G
You'll have to first become root with `sudo su` (or just use `sudo docker ps`). See the screenshot.
02-28-2017
09:10 PM
@Adedayo Adekeye Ambari is the web UI used to administer, monitor, and provision a Hadoop cluster. It also has the concept of Views, which allow browsing the Hadoop Distributed File System (HDFS) as well as querying data through Hive and writing Pig scripts, among other things (it's even extensible, so you can build something custom). Within Ambari (example link: `http://<YOUR AZURE PUBLIC IP>:8080/#/main/views/FILES/1.0.0/AUTO_FILES_INSTANCE`) you can log in as raj_ops (with password raj_ops) to get to the Files view. Don't just click the link; you'll have to change it to match your Azure sandbox's public IP address. This also assumes you have port 8080 open in Azure's Network Security Group settings. You may also want to follow the instructions on how to reset the Ambari admin password: https://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/#setup-ambari-admin-password Hope this helps.
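To make the link substitution concrete, here is a tiny sketch of building the Files-view URL. The IP `203.0.113.10` is a hypothetical placeholder, not a real sandbox address:

```shell
# Build the Ambari Files-view URL from your sandbox's public IP.
# 203.0.113.10 is a placeholder -- replace it with your Azure public IP.
AMBARI_HOST=203.0.113.10
AMBARI_URL="http://${AMBARI_HOST}:8080/#/main/views/FILES/1.0.0/AUTO_FILES_INSTANCE"
echo "$AMBARI_URL"
```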
02-28-2017
03:14 PM
Hi @Prasanna G, you'll have to first copy the file from the local filesystem into the Docker container, like below.

First, create a directory in the Docker container:

```shell
docker exec sandbox mkdir /dan/
```

(You can then run `docker exec sandbox ls` to see that your `mkdir` worked.)

Then copy the file to the directory you just created, where `sandbox` is the name of the Docker container running HDP (you can get a list of containers by running `docker ps`):

```shell
docker cp /home/drice/test2.txt sandbox:dan/test2.txt
```

Once the file is in the Docker container, you can copy it to Hadoop:

```shell
docker exec sandbox hadoop fs -put /dan/test2.txt /test2.txt
```

```shell
[root@sandbox drice]# docker exec sandbox hadoop fs -ls /
Found 13 items
drwxrwxrwx   - yarn   hadoop   0 2016-10-25 08:10 /app-logs
drwxr-xr-x   - hdfs   hdfs     0 2016-10-25 07:54 /apps
drwxr-xr-x   - yarn   hadoop   0 2016-10-25 07:48 /ats
drwxr-xr-x   - hdfs   hdfs     0 2016-10-25 08:01 /demo
drwxr-xr-x   - hdfs   hdfs     0 2016-10-25 07:48 /hdp
drwxr-xr-x   - mapred hdfs     0 2016-10-25 07:48 /mapred
drwxrwxrwx   - mapred hadoop   0 2016-10-25 07:48 /mr-history
drwxr-xr-x   - hdfs   hdfs     0 2016-10-25 07:47 /ranger
drwxrwxrwx   - spark  hadoop   0 2017-02-28 15:05 /spark-history
drwxrwxrwx   - spark  hadoop   0 2016-10-25 08:14 /spark2-history
-rw-r--r--   1 root   hdfs    15 2017-02-28 15:04 /test.txt
drwxrwxrwx   - hdfs   hdfs     0 2016-10-25 08:11 /tmp
drwxr-xr-x   - hdfs   hdfs     0 2016-10-25 08:11 /user
```

NOTE: Another way to do this is to just use the Ambari Files view to copy files graphically.
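The same three steps, collected into one parameterized sketch. The container name and paths are taken from the example above, and the Docker commands are guarded so the sketch does nothing on a machine where Docker or the container isn't available:

```shell
# Copy a local file into the HDP container, then into HDFS.
CONTAINER=sandbox                 # name of the container running HDP
LOCAL_FILE=/home/drice/test2.txt  # file on the host to transfer
STAGE_DIR=/dan                    # staging directory inside the container
NAME=$(basename "$LOCAL_FILE")
# Run only if Docker exists and a container with that name is running.
if command -v docker >/dev/null 2>&1 \
   && docker ps --format '{{.Names}}' | grep -qx "$CONTAINER"; then
  docker exec "$CONTAINER" mkdir -p "$STAGE_DIR"
  docker cp "$LOCAL_FILE" "$CONTAINER:$STAGE_DIR/$NAME"
  docker exec "$CONTAINER" hadoop fs -put "$STAGE_DIR/$NAME" "/$NAME"
fi
```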
12-14-2016
02:53 AM
@Kiran Kumar, have a look at this page from SAS here. It looks like it requires HDP 2.5 or greater.
09-19-2016
02:11 PM
2 Kudos
Repo Description: Simple sample program on how to read/write data to HDFS.
Repo Info:
GitHub Repo URL: https://github.com/drnice/HDFSReadWrite
GitHub account name: drnice
Repo name: HDFSReadWrite
08-22-2016
12:28 PM
3 Kudos
Repo Description: A utility for testing Avro file formats to ensure they are well formed.
Repo Info:
GitHub Repo URL: https://github.com/drnice/AvroTest
GitHub account name: drnice
Repo name: AvroTest
06-17-2016
09:50 PM
2 Kudos
Repo Description: This Spark Streaming example connects a client application to a Spark stream. As data is written by the client application to a specific socket (9087), a Spark stream connects to the port, reads the data, word-counts each line entered, and outputs the result.
Repo Info:
GitHub Repo URL: https://github.com/drnice/Spark-Streaming
GitHub account name: drnice
Repo name: Spark-Streaming