Member since: 10-05-2016 · Posts: 10 · Kudos Received: 0 · Solutions: 0
12-11-2018
06:31 AM
In the middleware (a REST API using Spring Boot) we are developing, we need to connect to several Kerberized services (Oozie, Solr, Hive Server, etc.) using their Java clients (Oozie Java client, SolrJ client, Kerberos-enabled JDBC, etc.). I managed to connect to Solr and Hive Server separately (by having separate jaas.conf files and keytabs).
But now we need to connect to these different services within the same JVM process; different REST endpoints can access all these services in parallel.
1) Is it possible to connect to different Kerberized services with different principals (same realm)?
2) Is this supported by JAAS? My jaas.conf for connecting to Solr looks like this:
SolrJClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="./ambari-infra-solr.service.keytab"
  storeKey=true
  useTicketCache=true
  debug=true
  doNotPrompt=true
  principal="infra-solr/server-yy-hdp-stg001.stg.xxx.zzz.local@C6KHDPSTG.LOCAL";
};
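For reference, JAAS supports multiple named entries in a single jaas.conf, so each service can get its own keytab and principal. A sketch of a second entry alongside SolrJClient (the HiveJDBCClient entry name, keytab path, and principal are placeholders I made up, not from a real setup):

```
// Hypothetical second entry; name, keytab and principal are placeholders.
HiveJDBCClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="./hive.service.keytab"
  storeKey=true
  useTicketCache=false
  doNotPrompt=true
  principal="hive/host.example.com@EXAMPLE.COM";
};
```

Each client then selects its entry by name (for example, `new LoginContext("HiveJDBCClient", handler)` in plain JAAS). With two principals in one JVM, setting useTicketCache=false, or pointing each entry at a distinct ticket cache, avoids both entries competing for the default credential cache.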
07-18-2018
02:02 AM
Greetings, I am trying to find the Atlas REST API (v2) documentation. I found it here: https://atlas.apache.org/api/v2/index.html and the Swagger docs here: https://atlas.apache.org/api/v2/ui/index.html#!/TypesREST/createAtlasTypeDefs Notice that none of the POST requests have request models defined in the documentation. Where can I find the request models for these APIs?
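For anyone else looking: the v2 typedef endpoints accept an AtlasTypesDef-shaped body. A minimal sketch of what a createAtlasTypeDefs request body might look like (the type name and attribute below are made-up examples, and the exact field set should be verified against the AtlasTypesDef model in the Atlas source):

```json
{
  "enumDefs": [],
  "structDefs": [],
  "classificationDefs": [],
  "entityDefs": [
    {
      "name": "sample_dataset_type",
      "superTypes": ["DataSet"],
      "attributeDefs": [
        {
          "name": "owner",
          "typeName": "string",
          "isOptional": true,
          "cardinality": "SINGLE",
          "isUnique": false,
          "isIndexable": true
        }
      ]
    }
  ]
}
```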
Labels:
- Apache Atlas
10-07-2016
07:41 AM
Thank you for the prompt reply. As you mentioned, the issue is with the SmartSense ID: Exception in thread "main" com.hortonworks.smartsense.activity.ActivityException: Failed to analyze activities
at com.hortonworks.smartsense.activity.ActivityAnalyzerFacade.main(ActivityAnalyzerFacade.java:190)
Caused by: com.hortonworks.smartsense.activity.ActivityException: Invalid smartsense ID unspecified. Please configure a vaid SmartSense ID to proceed.
at com.hortonworks.smartsense.activity.util.ActivityUtil.validateSmartSenseId(ActivityUtil.java:398)
at com.hortonworks.smartsense.activity.ActivityAnalyzerFacade.main(ActivityAnalyzerFacade.java:177)
10-07-2016
07:33 AM
Greetings, I managed to install a 19-node cluster using the typical Ambari wizard. I use 2 dedicated servers for the RM and NN, and 3 servers to install other services like ZK, HiveServer, SmartSense, etc.
I have the following other services running on the same node where I installed the SmartSense Activity Analyzer:
- Activity Analyzer / SmartSense
- Hive Metastore / Hive
- HiveServer2 / Hive
- Infra Solr Instance / Ambari Infra
- MySQL Server / Hive
- WebHCat Server / Hive
- ZooKeeper Server / ZooKeeper
- HST Agent / SmartSense
- Metrics Monitor / Ambari Metrics
When I go to the details for "Activity Analyzer", I see the following:
- Activity Analyzer: Stopped
- Activity Explorer: Started
- HST Server: Started
- HST Agents: 17/17 HST Agents Live
When I try to start "Activity Analyzer", it fails with the output below. Any hint what's wrong with my setup?
Failed to execute command: /usr/sbin/hst activity-analyzer start ; Exit code: 255; stdout: SmartSense Activity PID at: /var/run/smartsense-activity-analyzer/activity-analyzer.pid
SmartSense Activity out at: /var/log/smartsense-activity/activity-analyzer.out
SmartSense Activity log at: /var/log/smartsense-activity/activity-analyzer.log
Waiting for activity analyzer to start...................
SmartSense Activity Analyzer failed to start, with exitcode -1. Check /var/log/smartsense-activity/activity-analyzer.log for more information.
; stderr:
2016-10-07 03:07:13,695 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-10-07 03:07:13,697 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-10-07 03:07:13,698 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-10-07 03:07:13,713 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-10-07 03:07:13,713 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-10-07 03:07:13,727 - checked_call returned (0, '')
2016-10-07 03:07:13,728 - Ensuring that hadoop has the correct symlink structure
2016-10-07 03:07:13,728 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-07 03:07:13,823 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-10-07 03:07:13,824 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-10-07 03:07:13,826 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-10-07 03:07:13,841 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-10-07 03:07:13,841 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-10-07 03:07:13,856 - checked_call returned (0, '')
2016-10-07 03:07:13,857 - Ensuring that hadoop has the correct symlink structure
2016-10-07 03:07:13,857 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-07 03:07:13,857 - Group['livy'] {}
2016-10-07 03:07:13,858 - Group['spark'] {}
2016-10-07 03:07:13,858 - Group['zeppelin'] {}
2016-10-07 03:07:13,859 - Group['hadoop'] {}
2016-10-07 03:07:13,859 - Group['users'] {}
2016-10-07 03:07:13,859 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,859 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,860 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,860 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,860 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,861 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-10-07 03:07:13,861 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,861 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,862 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,862 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-10-07 03:07:13,863 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,863 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,863 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,864 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,864 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-07 03:07:13,864 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-07 03:07:13,865 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-10-07 03:07:13,869 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-10-07 03:07:13,869 - Group['hdfs'] {}
2016-10-07 03:07:13,870 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-10-07 03:07:13,870 - FS Type:
2016-10-07 03:07:13,870 - Directory['/etc/hadoop'] {'mode': 0755}
2016-10-07 03:07:13,878 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-10-07 03:07:13,879 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-10-07 03:07:13,892 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-10-07 03:07:13,897 - Skipping Execute[('setenforce', '0')] due to not_if
2016-10-07 03:07:13,897 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2016-10-07 03:07:13,899 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2016-10-07 03:07:13,899 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2016-10-07 03:07:13,902 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-10-07 03:07:13,903 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-10-07 03:07:13,903 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-10-07 03:07:13,911 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2016-10-07 03:07:13,911 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-10-07 03:07:13,911 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2016-10-07 03:07:13,914 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-10-07 03:07:13,917 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
Command: mkdir -p /var/lib/smartsense/activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/lib/smartsense/activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /etc/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /etc/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /etc/smartsense-activity/conf
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /etc/smartsense-activity/conf
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Writing configs to : /etc/smartsense-activity/conf/activity.ini
2016-10-07 03:07:14,088 - File['/etc/smartsense-activity/conf/activity-env.sh'] {'owner': 'root', 'content': InlineTemplate(...), 'group': 'root'}
2016-10-07 03:07:14,090 - File['/etc/smartsense-activity/conf/log4j.properties'] {'content': InlineTemplate(...), 'mode': 0644}
Command: mkdir -p /var/lib/smartsense/activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/lib/smartsense/activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /etc/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /etc/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/log/smartsense-activity
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /etc/smartsense-activity/conf
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /etc/smartsense-activity/conf
Exit code: 0
Std Out: None
Std Err: None
Command: mkdir -p /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: chown -R 'root':'root' /var/run/smartsense-activity-analyzer
Exit code: 0
Std Out: None
Std Err: None
Command: /usr/sbin/hst activity-analyzer start
Exit code: 255
Std Out: SmartSense Activity PID at: /var/run/smartsense-activity-analyzer/activity-analyzer.pid
SmartSense Activity out at: /var/log/smartsense-activity/activity-analyzer.out
SmartSense Activity log at: /var/log/smartsense-activity/activity-analyzer.log
Waiting for activity analyzer to start...................
SmartSense Activity Analyzer failed to start, with exitcode -1. Check /var/log/smartsense-activity/activity-analyzer.log for more information.
Std Err: None
Command failed after 1 tries
Labels:
- Apache Ambari
- Hortonworks SmartSense