Member since: 05-18-2016
Posts: 71
Kudos Received: 39
Solutions: 6
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3495 | 12-16-2016 06:12 PM |
| | 1202 | 11-02-2016 05:35 PM |
| | 4620 | 10-06-2016 04:32 PM |
| | 1957 | 10-06-2016 04:21 PM |
| | 1710 | 09-12-2016 05:16 PM |
12-22-2017 06:27 PM
If you are using version 2.6 or later, you can turn on ACID and execute DELETE commands. You can audit your deletes via your app if needed; otherwise, DELETE, MERGE, and UPDATE can run on Hive directly with ACID.
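For reference, a minimal sketch of what that looks like via beeline, assuming ACID is already enabled on the cluster; the JDBC URL, table name `events`, and predicate are placeholders:

```
# Minimal sketch: client-side ACID settings plus a DELETE, run via beeline.
# The JDBC URL, table name, and predicate below are hypothetical.
# Note: DELETE requires a table stored as ORC and created with
# TBLPROPERTIES ('transactional'='true').
beeline -u jdbc:hive2://localhost:10000 -e "
  SET hive.support.concurrency=true;
  SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
  DELETE FROM events WHERE event_date < '2017-01-01';
"
```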
04-17-2017 01:24 AM
This works perfectly with Field Cloud. If you want to run some queries on Phoenix, following this along with the Phoenix and HBase tutorials makes for awesome demoable material.
04-03-2017 06:27 PM
Also, how big is the data in your table? Are you using some sort of LIMIT or WHERE clause when you run the query?
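For example (the table and column names here are hypothetical), bounding the scan while you test:

```
# Hypothetical examples: check the table size, then keep test queries
# bounded with a WHERE clause and LIMIT.
beeline -u jdbc:hive2://localhost:10000 -e "SELECT COUNT(*) FROM mytable;"
beeline -u jdbc:hive2://localhost:10000 -e "SELECT * FROM mytable WHERE id < 100 LIMIT 10;"
```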
04-03-2017 06:18 PM
Are these external tables? In the case of external tables, you would have to manually clean up the data by removing the files and folders that are referenced by the table (using the hadoop fs -rm command).
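A sketch of that cleanup, assuming a hypothetical database, table, and location; check DESCRIBE FORMATTED for the real path:

```
# Find the external table's data directory (see LOCATION in the output).
hive -e "DESCRIBE FORMATTED mydb.mytable;"

# Remove the files and folders the table references; this path is a
# placeholder. Add -skipTrash to free the space immediately.
hadoop fs -rm -r /data/external/mytable
```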
12-16-2016 06:12 PM
1 Kudo
Ambari runs on port 8080 by default, so connecting to http://localhost:8080 should take you directly to the Ambari login. Zeppelin runs on port 9995.
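As a quick sanity check (assuming you are on the host itself; adjust the hostname otherwise), you can probe both ports:

```
# Probe the default ports; an HTTP status code back means the UI is up.
curl -s -o /dev/null -w "Ambari:   %{http_code}\n" http://localhost:8080
curl -s -o /dev/null -w "Zeppelin: %{http_code}\n" http://localhost:9995
```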
12-15-2016 09:07 PM
Run sudo su - hdfs, then execute your commands. The xdl3 user does not have write access to the /xdl/tmp directory. Also, I hope you don't have any ACLs set up.
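A sketch of how the hdfs superuser could open up that directory; the group name and permission bits below are assumptions, so adjust them to your policy:

```
# As the hdfs superuser (sudo su - hdfs, or sudo -u hdfs as below),
# grant xdl3 write access; the group and mode here are assumptions.
sudo -u hdfs hdfs dfs -chown -R xdl3:hdfs /xdl/tmp
sudo -u hdfs hdfs dfs -chmod -R 775 /xdl/tmp
# Confirm no ACL entries are overriding the basic permissions.
sudo -u hdfs hdfs dfs -getfacl /xdl/tmp
```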
12-15-2016 09:02 PM
Your host name is set to the cluster name, which is incorrect. Instead of hdfs://clustername/folder/file, use hdfs://hostname/folder/file, substituting your actual NameNode hostname.
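For example (the hostname below is a placeholder for your actual NameNode host):

```
# Use the NameNode's hostname (and optionally its RPC port, 8020 by
# default) in the URI; "namenode.example.com" is a placeholder.
hdfs dfs -ls hdfs://namenode.example.com:8020/folder/file
```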
11-17-2016 03:30 PM
This is a great article. Can we do Atlas tagging on fields in HBase by tagging the external table? And can you apply Ranger policies to that?
11-02-2016 05:35 PM
As long as you are able to get the task accomplished, either manually or via Ambari, you will be OK.
10-06-2016 04:32 PM
Hi "Kaliyug Antagonist!!" Try setting the sqoop import as a sqoop job. The incremental data import is supported via sqoop job.. and not directly via sqoop import. check out the link for more examples https://community.hortonworks.com/questions/10710/sqoop-incremental-import-working-fine-now-i-want-k.html Hopefully this helps out.