Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2125 | 11-01-2016 05:43 PM |
| | 6437 | 11-01-2016 05:36 PM |
| | 4112 | 07-01-2016 03:20 PM |
| | 7050 | 05-25-2016 11:36 AM |
| | 3423 | 05-24-2016 05:27 PM |
10-10-2015
04:44 PM
@Vedant Jain Just to confirm: passwordless SSH from the Ambari server to all of the hosts works fine, correct? Did you use the correct private key in the Ambari console while installing?
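A quick way to verify from the Ambari server (the key path and host names below are only examples; substitute your own):

```bash
# BatchMode=yes makes ssh fail instead of prompting, so a missing key shows up
# as an error rather than hanging the check at a password prompt.
for h in node1 node2 node3; do
  ssh -i /root/.ssh/id_rsa -o BatchMode=yes root@"$h" hostname
done
```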
10-10-2015
12:53 PM
@rxu@hortonworks.com Please provide log entries from Ambari, HBase, and the particular region server to start with. Also share the Ambari and HDP versions and the OS. Is Kerberos in place?
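If it helps, these are the typical default log locations on an HDP install (adjust paths and hosts if your layout differs):

```bash
# On the Ambari server
tail -n 200 /var/log/ambari-server/ambari-server.log
# On the affected host
tail -n 200 /var/log/ambari-agent/ambari-agent.log
# Region server logs usually live here
ls /var/log/hbase/
```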
10-10-2015
12:45 PM
@amcbarnett@hortonworks.com https://gist.github.com/nsabharwal/f57bb9e607114833df9b Important: an Ambari reinstall will give you the following command while running the host checks during the cluster install, and it will save you a lot of time. I run it with --skip=users: python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users
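A rough sketch of running it across the hosts before retrying the install (host names are placeholders; --skip=users keeps the existing service accounts):

```bash
for h in node1 node2 node3; do
  ssh root@"$h" "python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users"
done
```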
10-10-2015
12:40 PM
1 Kudo
@Ronald McCollam Hive views. If we look at other technology stacks, row-level security there is also based on views.
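As a minimal sketch of the idea (the table, column, and view names are made up for illustration): create a view that exposes only the permitted rows, then grant users access to the view rather than the base table.

```bash
# Users query sales_us instead of the underlying sales table.
hive -e "CREATE VIEW sales_us AS SELECT * FROM sales WHERE region = 'US'"
```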
10-10-2015
12:37 PM
@Saumil Mayani Please use/test different config groups. That is a reference to the old docs.
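For reference, the Ambari REST API exposes config groups, so you can list what already exists; a rough example (credentials, host, and cluster name are placeholders):

```bash
curl -u admin:admin -H "X-Requested-By: ambari" \
  "http://ambari-host:8080/api/v1/clusters/mycluster/config_groups"
```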
10-10-2015
12:35 PM
@rxu@hortonworks.com Thanks for posting the resolution. To give more context, please also share a few lines pointing to the errors from the Ambari logs, as it will help the forums when someone searches for the relevant errors.
10-10-2015
12:32 PM
1 Kudo
@awatson@hortonworks.com Yes, both versions are supported, and thanks for sharing the doc link. About sizing: for Ambari, Oozie, and Hive on small to large clusters, the database size and load won't be high unless there is something wrong with the setup. Coming from an Oracle DBA background, I would say the following configuration is a good start:

- 64 to 128 GB memory
- 16 to 32 CPUs (dual core)
- OS disk: standard size (RAID 10)
- /oracle: 50 to 100 GB (RAID 10, Oracle binaries)
- /oraclelogs: anything near 500 GB (RAID 1; for Oracle redo log files, not archive logs)
- /archivelogs: 300 to 500 GB (RAID 1, Oracle archive logs; make sure archive log rotation is in place)
- /data: 500 GB to 1 TB (RAID 10, Oracle tablespaces)

Export backups are a MUST. Any good DBA will stick with both RMAN and export backups.
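As a rough sketch of the two backup types mentioned above (connection details, directory, and file names are placeholders, not a tuned backup strategy):

```bash
# RMAN full backup including archive logs
rman target / <<'EOF'
BACKUP DATABASE PLUS ARCHIVELOG;
EOF

# Data Pump full export as the second, logical backup
expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full_export.dmp logfile=full_export.log
```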
10-10-2015
12:15 PM
@bsaini@hortonworks.com I would start with the queues. Have you configured YARN queues to allow multiple jobs? Please check if there are any other jobs running. By default, if there is one long job running, the Hive CLI won't respond until that job finishes.
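A quick way to check and work around it (the queue name "etl" is just an example and assumes such a queue exists in your capacity scheduler config):

```bash
# See whether another long-running job is occupying the queue
yarn application -list -appStates RUNNING

# Point the Hive job at a different queue
hive --hiveconf mapreduce.job.queuename=etl -e "SELECT COUNT(*) FROM my_table"
```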
10-10-2015
12:14 PM
@awatson@hortonworks.com I would stick with Elasticsearch 1.7.2, and for the rest of the pieces you can follow the blog.