Member since: 05-03-2017
Posts: 74
Kudos Received: 4
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1633 | 07-15-2020 06:25 AM
 | 3211 | 04-21-2020 06:25 AM
04-01-2020
07:08 AM
@stevenmatison Thank you, sir. We are on HDP 2.6.x, and if we can get a Hue 4.x management pack it would be really helpful. Thanks, Bharad
03-31-2020
02:48 PM
Team,
I am following the link below to install Hue managed by Ambari.
HDP version: 2.6.5
Ambari version:
https://github.com/EsharEditor/ambari-hue-service
After following the initial steps, when I try to install the Hue service from Ambari I get the error below.
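For reference, registering that repository as a custom Ambari service generally follows the pattern below. The target directory and stack version are assumptions based on the standard HDP stack layout and the repo README, not something confirmed on my cluster:

# Clone the service definition into the HDP 2.6 stack and restart Ambari Server (paths assumed)
sudo git clone https://github.com/EsharEditor/ambari-hue-service.git /var/lib/ambari-server/resources/stacks/HDP/2.6/services/HUE
sudo ambari-server restart
# After the restart, HUE should appear under Actions > Add Service in the Ambari UI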
2020-03-31 16:43:07,626 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2020-03-31 16:43:07,630 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-03-31 16:43:07,631 - Group['kms'] {}
2020-03-31 16:43:07,634 - Group['livy'] {}
2020-03-31 16:43:07,634 - Group['spark'] {}
2020-03-31 16:43:07,634 - Group['ranger'] {}
2020-03-31 16:43:07,635 - Group['hue'] {}
2020-03-31 16:43:07,642 - Adding group Group['hue']
2020-03-31 16:43:07,670 - Group['hdfs'] {}
2020-03-31 16:43:07,671 - Group['zeppelin'] {}
2020-03-31 16:43:07,672 - Group['hadoop'] {}
2020-03-31 16:43:07,673 - Group['users'] {}
2020-03-31 16:43:07,673 - Group['knox'] {}
2020-03-31 16:43:07,675 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,679 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,682 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,685 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,687 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,688 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,690 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2020-03-31 16:43:07,691 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,693 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2020-03-31 16:43:07,694 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,696 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,697 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,699 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,700 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-03-31 16:43:07,701 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,703 - User['hue'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:07,721 - Adding user User['hue']
2020-03-31 16:43:08,285 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2020-03-31 16:43:08,288 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,292 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,295 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,297 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,298 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,301 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-03-31 16:43:08,303 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,307 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-03-31 16:43:08,317 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-03-31 16:43:08,317 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-03-31 16:43:08,318 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,319 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-03-31 16:43:08,320 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-03-31 16:43:08,333 - call returned (0, '57467')
2020-03-31 16:43:08,333 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 57467'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-03-31 16:43:08,340 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 57467'] due to not_if
2020-03-31 16:43:08,340 - Group['hdfs'] {}
2020-03-31 16:43:08,342 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2020-03-31 16:43:08,343 - User['admin'] {'fetch_nonlocal_groups': True}
2020-03-31 16:43:08,345 - FS Type:
2020-03-31 16:43:08,346 - Directory['/etc/hadoop'] {'mode': 0755}
2020-03-31 16:43:08,362 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2020-03-31 16:43:08,363 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2020-03-31 16:43:08,363 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-03-31 16:43:08,377 - Repository['HDP-2.6-repo-301'] {'append_to_file': False, 'base_url': 'http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-301', 'mirror_list': None}
2020-03-31 16:43:08,384 - File['/etc/yum.repos.d/ambari-hdp-301.repo'] {'content': '[HDP-2.6-repo-301]\nname=HDP-2.6-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-03-31 16:43:08,385 - Writing File['/etc/yum.repos.d/ambari-hdp-301.repo'] because contents don't match
2020-03-31 16:43:08,385 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.1050 is not created due to its tags: set([u'GPL'])
2020-03-31 16:43:08,385 - Repository['HDP-UTILS-1.1.0.22-repo-301'] {'append_to_file': True, 'base_url': 'http://private-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-301', 'mirror_list': None}
2020-03-31 16:43:08,388 - File['/etc/yum.repos.d/ambari-hdp-301.repo'] {'content': '[HDP-2.6-repo-301]\nname=HDP-2.6-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.128-2\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-301]\nname=HDP-UTILS-1.1.0.22-repo-301\nbaseurl=http://private-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-03-31 16:43:08,388 - Writing File['/etc/yum.repos.d/ambari-hdp-301.repo'] because contents don't match
2020-03-31 16:43:08,388 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:08,836 - Skipping installation of existing package unzip
2020-03-31 16:43:08,836 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:08,919 - Skipping installation of existing package curl
2020-03-31 16:43:08,919 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,003 - Skipping installation of existing package hdp-select
2020-03-31 16:43:09,007 - The repository with version 2.6.5.128-2 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-03-31 16:43:09,011 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
2020-03-31 16:43:09,257 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-03-31 16:43:09,260 - Package['wget'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,428 - Skipping installation of existing package wget
2020-03-31 16:43:09,429 - Package['tar'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,561 - Skipping installation of existing package tar
2020-03-31 16:43:09,562 - Package['asciidoc'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-03-31 16:43:09,646 - Installing package asciidoc ('/usr/bin/yum -d 0 -e 0 -y install asciidoc')
2020-03-31 16:43:10,410 - Execution of '/usr/bin/yum -d 0 -e 0 -y install asciidoc' returned 1. Error: Nothing to do
Loaded plugins: product-id
Cannot upload enabled repos report, is this client registered?
2020-03-31 16:43:10,410 - Failed to install package asciidoc. Executing '/usr/bin/yum clean metadata'
2020-03-31 16:43:10,623 - Retrying to install package asciidoc after 30 seconds
2020-03-31 16:43:41,768 - The repository with version 2.6.5.128-2 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-03-31 16:43:41,773 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
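The failure itself seems to come from yum rather than from the Hue service script: '/usr/bin/yum -d 0 -e 0 -y install asciidoc' returns 'Error: Nothing to do' along with 'Cannot upload enabled repos report, is this client registered?'. A minimal check one might run on the node (assuming CentOS 7 with yum; pointing at the OS base/EPEL repos is an assumption, since asciidoc is an OS package rather than an HDP package):

# Verify whether any enabled repo can actually provide asciidoc
yum clean metadata
yum repolist enabled
yum list available asciidoc
# If it resolves, install it manually and then retry the Hue install from Ambari
yum -y install asciidoc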
Any advice or solution is highly appreciated.
Regards,
Bharad
Labels:
- Apache Ambari
- Cloudera Hue
11-28-2017
02:39 AM
@Matt Clarke Please let me know your thoughts ...
11-28-2017
02:38 AM
We have a two-node HDF cluster. Recently a couple of projects were migrated to production that use several process groups which need to run on the primary node only. Lately I have seen the primary node change constantly, which is causing a lot of issues for those projects. I can also see the server is busy with GC. I want to understand how processors configured to "run on primary node only" behave when the primary node keeps changing, and whether there is a workaround to manually select the primary node.
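If this is NiFi 1.x, my understanding is that the primary node is elected through ZooKeeper, so a node stalled in long GC pauses can lose its ZooKeeper session and trigger a re-election, which would match what we are seeing. One change we are considering is raising the NiFi JVM heap in conf/bootstrap.conf; the sketch below uses placeholder values and assumes a default HDF layout:

# conf/bootstrap.conf -- NiFi JVM heap; equal -Xms/-Xmx sized for the flow can reduce GC pressure
java.arg.2=-Xms4g
java.arg.3=-Xmx4g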
Labels:
- Apache NiFi
07-03-2017
07:46 PM
@Matt Clarke
Is there any calculation for how to set those numbers?
06-30-2017
04:57 PM
@Wynner Even after making the change, I don't see any FlowFiles getting generated.
06-30-2017
03:59 PM
@Matt Clarke Please find the screenshots attached (capture3.png, capture4.png). I don't see anything in the app logs about this processor.
06-30-2017
03:18 PM
@Shashank Chandhok For some reason I am not able to upload the screenshot. It is a simple GenerateFlowFile processor. I don't see any errors in the bulletin board or in the app logs.
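For what it is worth, when I say I checked the app logs, I mean something along these lines (the log path is assumed from a default HDF layout and may differ on your install):

# Search the NiFi application log for any mention of the GenerateFlowFile processor
grep -i "GenerateFlowFile" /var/log/nifi/nifi-app.log | tail -n 20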
06-30-2017
02:58 PM
Hi, I am trying to generate a sample flow file using the GenerateFlowFile processor. I have scheduled it to run every 5 seconds, but I don't see any file getting generated. Does this need any additional grants? Please find the screenshot of the properties attached. Sample flow:
Labels:
- Apache NiFi