Member since: 11-21-2016
Posts: 79
Kudos Received: 3
Solutions: 10
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 1336 | 02-27-2019 07:46 AM |
|  | 560 | 05-30-2017 05:34 PM |
|  | 1901 | 05-30-2017 11:47 AM |
|  | 703 | 05-20-2017 03:55 PM |
|  | 648 | 05-17-2017 02:44 PM |
02-27-2019
07:46 AM
./import-hive.sh -Dsun.security.jgss.debug=true -Djavax.security.auth.useSubjectCredsOnly=false -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/etc/atlas/conf/atlas_jaas.conf
- There was a typo in a script parameter; after correcting it, the import-hive.sh script completed.
- Now I can see the imported entities in the Atlas UI.
02-27-2019
05:34 AM
Following this link - https://community.hortonworks.com/articles/61274/import-hive-metadata-into-atlas.html - I am still getting the reported error. Can the community help with this?
02-27-2019
05:30 AM
./import-hive.sh -Dsun.security.jgss.debug=true -Djavax.security.auth.useSubjectCredsOnly=false -Djava.security.krb5.conf=/etc/krb5.cont f -Djava.security.auth.login.config=/etc/atlas/conf/atlas_jaas.conf
Using Hive configuration directory [/etc/hive/conf]
Log file for import is /var/log/atlas/import-hive.log
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.PatternLayout.
Exception in thread "main" org.apache.atlas.hook.AtlasHookException: HiveMetaStoreBridge.main() failed.
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:650)
Caused by: java.lang.IllegalArgumentException: Can't get Kerberos realm
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:306)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:291)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:846)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:816)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:689)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:633)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:88)
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
    ... 6 more
Caused by: KrbException: Cannot locate default realm
    at sun.security.krb5.Config.getDefaultRealm(Config.java:1029)
    ... 12 more
Failed to import Hive Data Model!!!
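For anyone hitting the same "Cannot locate default realm" error: a quick pre-flight check is to confirm that the file passed via -Djava.security.krb5.conf actually exists and defines a default_realm (a minimal sketch; the /etc/krb5.conf path is just the common default, and the check does not replace a real kinit test):

```shell
# Sketch: sanity-check the krb5.conf passed to import-hive.sh via
# -Djava.security.krb5.conf (a typo in that path is one common cause
# of "Cannot locate default realm").
check_krb5_conf() {
    conf="$1"
    # File must exist and define default_realm (normally under
    # [libdefaults]); that is where the JVM resolves the realm from.
    [ -f "$conf" ] && grep -q '^[[:space:]]*default_realm' "$conf"
}

# Example usage (path is an assumption; use whatever you pass to the JVM):
if check_krb5_conf /etc/krb5.conf; then
    echo "krb5.conf looks usable"
else
    echo "krb5.conf missing or has no default_realm" >&2
fi
```

If the check fails, fix the -Djava.security.krb5.conf path or the file contents before re-running import-hive.sh.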
Labels:
- Apache Atlas
- Apache Hive
06-12-2017
02:38 PM
@Sami Ahmad Practice the exam build with the basics to get an idea of how the exam environment works. Yes, if it is required you will have to install it in the exam too; it depends on the tasks you get. Make sure you practice all of the exam objectives.
06-08-2017
01:29 AM
Thank you Jay...
06-08-2017
01:21 AM
Thank you Jay, that's true. I noticed that after setting up the host XML files manually, restarting the services/components through the Ambari UI pushes the older XML configuration back. In my case I updated all the host XML configurations manually and it still did not work; some settings were missing, or the Ambari UI and the manual process were out of sync.
06-07-2017
07:39 PM
HDP experts, any recommendations or advice on this?
06-05-2017
12:35 PM
Yes Sami, that should be fine. Either way you can do the task, via the Ambari UI or the manual process; you just have to complete the task.
05-31-2017
02:47 PM
Please find the link below (also in the Reference tab) for related topic links. There is no single combined PDF document because the components are all different; you have to go through each of them. https://2xbbhjxc6wk3v21p62t8n4d4-wpengine.netdna-ssl.com/wp-content/uploads/2016/08/ExamObjectives-HCAssociate.pdf
05-31-2017
12:24 PM
Just to confirm: you are trying to install ambari-server, am I right? Please provide the output of the following:
- cat /etc/yum.repos.d/ambari.repo
- cat /etc/yum.repos.d/<OS>.repo (operating system repos)
- yum repolist
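For reference, a healthy ambari.repo has roughly this shape (a sketch with placeholder values; substitute your own mirror host, path, and Ambari version, and place the file in /etc/yum.repos.d/):

```shell
# Sketch of a minimal local ambari.repo. YOUR_MIRROR_HOST and the path
# are placeholders, not values from the original post.
cat > ambari.repo <<'EOF'
[ambari]
name=ambari
baseurl=http://YOUR_MIRROR_HOST/ambari/centos7/
enabled=1
gpgcheck=0
EOF

# Quick structural check: exactly one baseurl line is expected.
grep -c '^baseurl=' ambari.repo   # prints 1
```

After dropping the file into /etc/yum.repos.d/, `yum repolist` should list the ambari repo with a non-zero package count.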
05-31-2017
03:06 AM
Follow the HCA exam objectives document below. https://2xbbhjxc6wk3v21p62t8n4d4-wpengine.netdna-ssl.com/wp-content/uploads/2017/05/HCA_Data_Sheet.pdf

EXAM OBJECTIVES
To be fully prepared for an exam, candidates should be able to perform all of the exam's objectives. View the complete list of objectives for the HCA exam at: https://2xbbhjxc6wk3v21p62t8n4d4-wpengine.netdna-ssl.com/wp-content/uploads/2016/08/ExamObjectives-HCAssociate.pdf

PREREQUISITES
Candidates for the HCA exam should be able to perform each of the tasks in the list of exam objectives: https://2xbbhjxc6wk3v21p62t8n4d4-wpengine.netdna-ssl.com/wp-content/uploads/2016/08/ExamObjectives-HCAssociate.pdf
05-30-2017
05:34 PM
No; only the Hortonworks documentation pages are allowed. Candidates for the HDPCD exam are provided access to the Pig, Hive, Sqoop and Flume documentation pages. Candidates for the HDPCA exam are provided access to the HDP 2.3 documentation pages. Candidates for the HDPCD:Java exam will have access to the Apache Hadoop documentation. HDP FAQ ==> https://hortonworks.com/services/training/certification/hdp-certified-developer-faq-page/
05-30-2017
12:02 PM
@Daniel Allardice, is the configuration issue still occurring? If yes, please provide the information/logs and I will analyze further.
05-30-2017
11:51 AM
Repository / Ambari server configuration docs:
https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.0.0/bk_Installing_HDP_AMB/content/_using_a_local_repository.html
https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.0.0/bk_Installing_HDP_AMB/content/_download_the_ambari_repo_lnx6.html
05-30-2017
11:47 AM
2017-05-30 13:47:55,985 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://172.21.0.188/HDP-UTILS/\n\npath=/\nenabled=1\ngpgcheck=0'}
The repo files (ambari, OS repo files) are not configured properly; try to set them up correctly and then check with: yum repolist
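A quick way to see which baseurl a generated repo file actually points at before retrying the install (a small sketch; the HDP-UTILS.repo path in the comment is the one from the log above):

```shell
# Sketch: print the baseurl of a yum repo file so you can eyeball it
# (and curl it) before re-running the installation.
repo_baseurl() {
    # Strip the 'baseurl=' prefix from the first baseurl line.
    sed -n 's/^baseurl=//p' "$1" | head -n 1
}

# Example against the file from the log above:
# repo_baseurl /etc/yum.repos.d/HDP-UTILS.repo
```

If the printed URL does not serve a repodata/ directory when opened in a browser or with curl, the repo setup is the problem rather than the package itself.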
05-29-2017
05:48 AM
Configuring the Capacity Scheduler via the Ambari UI works fine and the changes show up as expected. When I configure it manually, following steps like updating the .xml and running refreshQueues/refreshNodes, the changes do not show up in the Ambari UI.
Status after the manual update:
<ip>:8088 ==> showing up
yarn queue -status DW ==> showing up
<ip>:8080 ==> not showing up in the Ambari UI
If I do the reverse, updating/adding in Ambari, it shows up in all places. Any recommendation for making manual updates show the same values in the Ambari UI?
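For reference, the kind of manual edit described above is a queue entry in capacity-scheduler.xml along these lines (a sketch only: the DW queue name comes from the post, but the capacity values and the default-queue split are made up for illustration):

```xml
<!-- Sketch: defining a DW queue next to default in capacity-scheduler.xml.
     The 30/70 capacity split is illustrative, not from the original post. -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,DW</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.DW.capacity</name>
  <value>30</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>70</value>
</property>
```

After editing the file, `yarn rmadmin -refreshQueues` applies the change without a ResourceManager restart. Note that on an Ambari-managed cluster, Ambari keeps its own copy of this configuration and pushes it back on service restart, which matches the behavior described above; edits made outside Ambari are not reflected in its UI.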
Labels:
- Apache Ambari
- Apache YARN
05-29-2017
05:35 AM
Can you try test_name instead of test-name? Special characters are not allowed here. You could also use a space: Test Name instead of test-name.
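A tiny check along those lines (a sketch; the exact allowed character set is an assumption based on the advice above, i.e. letters, digits, underscores and spaces):

```shell
# Sketch: accept names made of letters, digits, underscores and spaces;
# reject hyphens and other special characters, per the advice above.
valid_name() {
    printf '%s' "$1" | grep -Eq '^[A-Za-z0-9_ ]+$'
}

valid_name "test_name" && echo "test_name ok"
valid_name "test-name" || echo "test-name rejected"
```

Running this prints "test_name ok" and "test-name rejected", matching the suggestion to swap the hyphen for an underscore or a space.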
05-25-2017
12:16 PM
Refer to the docs for the Zeppelin steps:
http://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.1.0/bk_installation/content/installing_zeppelin.html
https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.1.0/bk_analytics/content/setting_up_zeppelin_analyze.html

Importing the Zeppelin Notebook Using Ambari: if you would like to install Zeppelin, complete the following steps after you have successfully installed HCP. For more information about Zeppelin, see Analyzing Enriched Data Using Apache Zeppelin.
1. Log in to Ambari at http://$AMBARI_HOST:8080.
2. In Ambari, click Metron > Service Actions > Zeppelin Notebook Import. Ambari imports the Zeppelin Notebook.
3. Log in to Zeppelin at http://$ZEPPELIN_HOST:9995.
4. Search for the notebook named Metron - YAF Telemetry.
05-25-2017
11:57 AM
2017-05-25 12:11:18,585 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://10.95.70.117/HDP-UTILS-1.1.0.21/repos/centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-25 12:11:18,585 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-25 12:11:18,714 - Skipping installation of existing package unzip
2017-05-25 12:11:18,714 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-25 12:11:18,747 - Skipping installation of existing package curl

Repo file unavailability caused the installation to fail. Go to the Ambari UI -> Admin -> Stack and Versions tab -> edit -> validate the Repository Base URLs:
HDP: baseurl=http://<ip>/HDP/centos7/2.x/updates/2.5.3.0
HDP-UTILS: baseurl=http://<ip>/HDP-UTILS-1.1.0.21/repos/centos7
If you are installing a service via the Ambari UI, validate the repo base URLs; it is not required to copy any HDP repo files to the local server. During service/package installation those files are pushed/copied to the local server, and if they already exist on the local machine they are overwritten based on how the repo base URL is set up in Ambari. Also make sure the CentOS repo files are configured, since the installation needs to pull in OS-dependent packages. Please provide the output of:
yum -y clean all
ls -ltra /etc/yum.repos.d/
cat ambari.repo
cat HDP-UTILS.repo
cat HDP.repo
cat CentOS-Base.repo
yum repolist
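The checklist above can be scripted as a quick sweep (a sketch; the file names follow the list in the post and the directory is the standard yum location):

```shell
# Sketch: report which of the repo files from the checklist above are
# present in a yum repo directory.
check_repos() {
    dir="$1"
    for f in ambari.repo HDP.repo HDP-UTILS.repo CentOS-Base.repo; do
        if [ -f "$dir/$f" ]; then
            echo "present: $f"
        else
            echo "MISSING: $f"
        fi
    done
}

# Example usage:
check_repos /etc/yum.repos.d
```

Any MISSING line points at a repo file to create or fix before re-running `yum repolist` and the install.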
05-25-2017
02:30 AM
==Error log== 'base_url': 'http://10.95.70.117/HDP/centos7/', 'action': ['create'],
'components': [u'HDP', 'main'], 'repo_template':
'[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list
%}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif
%}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP',
'mirror_list': None} 2017-05-24 18:33:41,145 -
File['/etc/yum.repos.d/HDP.repo'] {'content':
'[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://10.95.70.117/HDP/centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-24 18:33:41,146 - Repository['HDP-UTILS-1.1.0.21'] {'base_url':
'http://10.95.70.117/HDP-UTILS-1.1.0.21/repos/centos7/', 'action':
['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template':
'[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list
%}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif
%}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS',
'mirror_list': None} 2017-05-24 18:33:41,150 -
File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content':
'[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://10.95.70.117/HDP-UTILS-1.1.0.21/repos/centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-24 18:33:41,150 - Package['unzip']
{'retry_on_repo_unavailability': False, ---------------------------- As per the log, it shows configured local repo and try to install.. please configure repo files for Ambari/OS in /etc/yum.repo.d/....also check & setup in ambari url --> versions -> Base URL for HDP & HDP-Utils repos. output : yum repolist
05-24-2017
09:34 PM
@Apurva Pathak - Don't worry if something is not working. Are you able to see the screen after starting the VM? If you cannot see the welcome screen and are stuck with some services failed or showing as starting up in the HDP sandbox, the best option is to delete it completely and re-import. Once you have successfully imported it again, try the steps below.
ssh root@127.0.0.1 -p 2222; java -version
or use the Web Client Shell:
url: 127.0.0.1:4200, login root/hadoop
05-24-2017
09:27 PM
Error: Package: hadoop_2_5_0_0_1245-hdfs-2.7.3.2.5.0.0-1245.el6.x86_64 (HDP-2.5)
Requires: hadoop_2_5_0_0_1245 = 2.7.3.2.5.0.0-1245.el6
Available: hadoop_2_5_0_0_1245-2.7.3.2.5.0.0-1245.el6.x86_64 (HDP-2.5)
hadoop_2_5_0_0_1245 = 2.7.3.2.5.0.0-1245.el6
You could try using --skip-broken to work around the problem.
You could try running: rpm -Va --nofiles --nodigest
- The installation needs to pull in dependent OS packages.
- Set up the CentOS repo file, then proceed to install HDP. Check with: yum repolist
05-24-2017
07:49 PM
@cduby I used the environment below. Sandbox file: HDP_2.5_virtualbox.ova, VM version: 5.1.6
05-24-2017
03:33 AM
Refer to the document links below and configure the network settings.
https://hortonworks.com/wp-content/uploads/2013/03/InstallingHortonworksSandboxonWindowsUsingVMwarePlayerv2.pdf
https://community.hortonworks.com/articles/98459/how-to-configure-networks-on-the-virtualbox-hdp-26.html
Can you please share your "Settings" --> "Network" settings/screenshots?
05-24-2017
03:26 AM
1 Kudo
Yes, during deployment it will re-create the directory /usr/hdp/hadoop/. From where are you trying to deploy HDP on the host machine: through the Ambari UI or a manual process? Have you set up the repo files correctly? Please provide the output of: yum repolist
05-24-2017
03:14 AM
At which step did you get this error?
- Did the sandbox load successfully in the VM?
- After loading, are you getting any error message?
Can you share a screenshot? It may help us understand and decide on further action.
05-24-2017
03:09 AM
Connect to the Sandbox via SSH with:
ssh root@127.0.0.1 -p 2222; java -version
or use the Web Client Shell:
url: 127.0.0.1:4200, login root/password
Connect to the sandbox, then perform all your commands in the sandbox shell.
05-24-2017
03:04 AM
Many sandbox users have reported the same issue over the past couple of days, and it needs attention. Good article; this one helped to resolve it.
05-23-2017
05:37 PM
Refer to the docs below related to upgrading Metron 0.3.0 to 0.3.1:
https://dist.apache.org/repos/dist/release/incubator/metron/0.3.1/book-site/Upgrading.html
https://dist.apache.org/repos/dist/release/incubator/metron/0.3.1/CHANGES
05-23-2017
04:53 PM
Refer: https://community.hortonworks.com/articles/98459/how-to-configure-networks-on-the-virtualbox-hdp-26.html
Try setting the VM's Network Adapter to NAT instead of the Host-only option.