Member since
12-10-2015
58
Posts
24
Kudos Received
6
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1804 | 02-17-2016 04:12 AM |
| | 3019 | 02-03-2016 05:15 AM |
| | 1677 | 01-27-2016 09:13 AM |
| | 4219 | 01-27-2016 07:00 AM |
| | 2139 | 01-02-2016 03:29 PM |
06-23-2020
06:24 AM
Please watch the video below: https://www.youtube.com/watch?v=fBeN8VQm0_Q If you found it helpful, please like, comment, share, and subscribe to my channel.
12-20-2016
06:42 PM
3 Kudos
@Suresh Bonam Not out of the box; you would have to build something custom. CSV is still an option. If your source streams data in real time, Flume is a reasonable option; an alternative is Apache NiFi. Assuming real-time streaming and a willingness to use Flume, the target files stored in HDFS will have a similar structure to the source (no transformation in flight). Apache NiFi can perform some transformation in flight, so the file at the target is easier to consume, e.g. via Hive external tables. You could achieve something similar with Flume, but with coding and pain involved. If your Excel file is static, you should use something else, such as a MapReduce or Spark job.
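A minimal sketch of the Flume route described above, assuming CSV exports dropped into a local spooling directory and a plain HDFS sink with no in-flight transformation. The agent name, directory, and NameNode address are hypothetical placeholders, not values from this thread:

```properties
# Hypothetical agent "a1": watch a local directory of CSV drops and land them in HDFS
a1.sources = src1
a1.channels = ch1
a1.sinks = snk1

# Spooling-directory source: each completed CSV file is ingested once
a1.sources.src1.type = spooldir
a1.sources.src1.spoolDir = /data/csv-drop
a1.sources.src1.channels = ch1

# In-memory channel between source and sink
a1.channels.ch1.type = memory
a1.channels.ch1.capacity = 10000

# HDFS sink: write events through as-is (DataStream = no SequenceFile wrapping)
a1.sinks.snk1.type = hdfs
a1.sinks.snk1.channel = ch1
a1.sinks.snk1.hdfs.path = hdfs://namenode:8020/landing/csv
a1.sinks.snk1.hdfs.fileType = DataStream
```

With this layout the files land in HDFS structurally unchanged, which is exactly the "no transformation in flight" behavior mentioned above; restructuring the records before landing is where NiFi is the easier tool.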
02-17-2016
04:12 AM
I completed this task by downloading the hwi.*.war file from Hive 0.12, since I didn't find it in 0.13 or 0.14.
02-09-2016
04:13 PM
1 Kudo
Check your connection details; they should look like this:
Host: <Give IP of HiveServer2>
Port: 10000
Database: optional (you can specify your target database; if you don't, the default db is shown)
Hive Server Type: Hive Server 2
Mechanism: User Name and Password
User Name: your user name; ensure that user has access on that edge node
Password: XXXXX
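For reference, the same settings can be checked from a shell on the edge node with Beeline, since the ODBC fields map directly onto a HiveServer2 JDBC URL. The host and user below are placeholders, not values from this thread:

```
# Placeholders in angle brackets; -p with no value prompts for the password
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/default" -n <user> -p
```

If Beeline connects with these details but the ODBC client does not, the problem is on the client side (driver version or DSN fields) rather than with HiveServer2 itself.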
01-29-2016
03:39 PM
@Suresh Bonam Let me know if that works for you, and please close the thread :).
01-05-2016
05:03 PM
2 Kudos
You can do something like this: hdfs dfs -put *filename*[0-9].txt /tmp
For example: $ touch ${RANDOM}filename-$(date +"%F").txt ${RANDOM}filename.txt
$ ls *filename*.txt
17558filename-2016-01-05.txt 27880filename.txt
$ hdfs dfs -put *filename*[0-9].txt /tmp
$ hdfs dfs -ls /tmp
-rw-r--r-- 3 hdfs hdfs 0 2016-01-05 16:39 /tmp/17558filename-2016-01-05.txt
If that doesn't work, run this first: set +f (it re-enables shell glob expansion in case it was turned off).
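The `set +f` tip works because the glob is expanded by the local shell, not by `hdfs`: if globbing has been disabled (`set -f`, e.g. in a login script), the literal pattern is passed through and nothing matches. A quick way to see this, independent of HDFS:

```shell
# Work in an empty scratch directory so only our test files match
cd "$(mktemp -d)"

# With globbing off, the pattern is passed through literally
set -f
echo *filename*[0-9].txt

# set +f re-enables expansion, so matching files are substituted
set +f
touch 9filename9.txt
echo *filename*[0-9].txt
```

The first `echo` prints the pattern itself; after `set +f` and the `touch`, the second prints the matching file name.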
01-05-2016
03:49 PM
You can enable HCatalog commands in Pig programmatically; use the steps in the following article: https://community.hortonworks.com/questions/1954/hcatbin-is-not-defined-define-it-to-be-your-hcat-s.html
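If a one-off invocation is enough, the standard way to pull the HCatalog jars into Pig is the `-useHCatalog` flag; the path below is the usual HDP layout and is an assumption that may differ on your cluster:

```shell
# Usual HDP location for the webhcat/hcatalog install (adjust for your cluster)
export HCAT_HOME=/usr/hdp/current/hive-webhcat

# -useHCatalog adds the HCatalog jars so HCatLoader/HCatStorer work in the script
pig -useHCatalog myscript.pig
```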
01-04-2016
02:15 PM
@Benjamin Leonhardi Thank you. Yes, the script sets the environment variable and then executes the Pig script in $PIG_HOME, like: exec /usr/hdp/2.2.8.0-3150/pig/bin/pig.distro "$@"
01-04-2016
01:59 PM
It looks like a very useful command for debugging; I've never used it before. A shame it seems to be broken.
02-03-2016
05:17 AM
Thanks @Suresh Bonam... I have accepted this as the best answer.