Member since: 09-15-2015
Posts: 294
Kudos Received: 764
Solutions: 81

My Accepted Solutions

Title | Views | Posted
---|---|---
 | 1586 | 07-27-2017 04:12 PM
 | 4310 | 06-29-2017 10:50 PM
 | 2010 | 06-21-2017 06:29 PM
 | 2270 | 06-20-2017 06:22 PM
 | 2058 | 06-16-2017 06:46 PM
03-22-2017
06:01 PM
1 Kudo
The posts below also talk about backing up Hive tables: https://community.hortonworks.com/questions/394/what-are-best-practices-for-setting-up-backup-and.html https://community.hortonworks.com/questions/78292/backup-specific-hive-table.html
03-22-2017
05:53 PM
8 Kudos
I can open the link just fine. Please see the attached screenshot: http://hortonworks.com/wp-content/uploads/2015/08/DataSheet_HDPCD_Java_2.2.pdf Make sure you don't have any connection issues.
03-22-2017
01:54 PM
5 Kudos
@Lifeng Ai - For your question: "I solved the problem by removing '-D fs.s3a.fast.upload=true' but still don't know the reason. Does anyone know why fs.s3a.fast.upload=true would cause a Java heap space problem?" Yes, fs.s3a.fast.upload=true can cause heap-related issues. Read the document below: https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#Stabilizing:_S3A_Fast_Upload HADOOP-13560 addresses the problem by introducing the property fs.s3a.fast.upload.buffer, but this is not yet released as part of HDP and should ship in HDP 2.6. Hope this clarifies the issue!
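Building on the HADOOP-13560 note above, here is a hedged sketch of the relevant core-site.xml properties. The "disk" value buffers upload blocks on local disk instead of the JVM heap; verify the property names against the Hadoop version you are actually running:

```xml
<!-- core-site.xml sketch: move S3A fast-upload buffering off the JVM heap -->
<property>
  <name>fs.s3a.fast.upload</name>
  <value>true</value>
</property>
<property>
  <name>fs.s3a.fast.upload.buffer</name>
  <!-- "disk" buffers blocks on local disk; "array" and "bytebuffer" keep them in memory -->
  <value>disk</value>
</property>
<property>
  <!-- optional: cap how many blocks a single stream can queue, bounding resource use -->
  <name>fs.s3a.fast.upload.active.blocks</name>
  <value>4</value>
</property>
```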
03-21-2017
10:35 PM
1 Kudo
The link below is a good read on determining HDP memory configurations: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/determine-hdp-memory-config.html
03-21-2017
09:48 PM
3 Kudos
Yes, you can use ext4 disks to store HDFS data. Once you have installed HDP, set the property dfs.datanode.data.dir in hdfs-site.xml to point to the locations where you want the data stored. The post below covers the same topic: https://community.hortonworks.com/questions/89786/file-uri-required-for-dfsdatanodedatadir.html For your second question, I am not aware of any RPM that can convert existing data to HDFS. You would have to migrate the existing data into HDFS. There are many possible approaches, depending on where the data resides (on disk, in a database, etc.). Below is one basic example of loading data: https://hortonworks.com/hadoop-tutorial/loading-data-into-the-hortonworks-sandbox/
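To illustrate the dfs.datanode.data.dir setting mentioned above, here is a minimal hdfs-site.xml sketch. The mount paths are hypothetical examples; substitute the directories on your own ext4 volumes:

```xml
<!-- hdfs-site.xml sketch: comma-separated local directories where the DataNode stores blocks -->
<property>
  <name>dfs.datanode.data.dir</name>
  <!-- example paths only; list one directory per ext4 mount for best throughput -->
  <value>/grid/0/hadoop/hdfs/data,/grid/1/hadoop/hdfs/data</value>
</property>
```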
03-21-2017
09:10 PM
5 Kudos
@Sylvain Robert - This looks like a duplicate of: https://community.hortonworks.com/questions/89875/expression-language-for-kafka-broker-of-publishkaf.html Can you please resolve this one? Thanks!
03-21-2017
08:56 PM
1 Kudo
Below is one doc for Enabling Audit Logging for HDFS and Solr: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_installing_manually_book/content/enabling_audit_logging_hdfs_solr.html
03-21-2017
08:27 PM
9 Kudos
@Faruk Berksoz - The query below should work fine (Hive requires an alias on a derived table, hence the trailing T): SELECT * FROM (SELECT * FROM TB.TABLE1) T;
03-21-2017
08:00 PM
1 Kudo
If you want the entire HDP stack (which includes major Hadoop components like HDFS) for learning or tutorial purposes, you can download and install the Hortonworks Sandbox: https://hortonworks.com/products/sandbox/ There is another article that covers a custom install of HDP: https://community.hortonworks.com/articles/16763/cheat-sheet-and-tips-for-a-custom-install-of-horto.html