Member since: 04-03-2019
Posts: 962
Kudos Received: 1743
Solutions: 146
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 14988 | 03-08-2019 06:33 PM |
| | 6169 | 02-15-2019 08:47 PM |
| | 5098 | 09-26-2018 06:02 PM |
| | 12584 | 09-07-2018 10:33 PM |
| | 7443 | 04-25-2018 01:55 AM |
01-08-2017
02:07 PM
1 Kudo
@Sankar T Below is one example of an Oozie command line to get the status of all SUCCEEDED jobs. Note - you can redirect this output to a file, do the time conversion there, take the resulting job IDs, and put them in Hive/HBase as per your requirement. Hope this helps!
[root@prodnode1 ~]# oozie jobs -oozie http://prodnode2:11000/oozie -len 1000000 -filter status=SUCCEEDED
Job ID App Name Status User Group Started Ended
------------------------------------------------------------------------------------------------------------------------------------
0000007-161206152234228-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplication1SUCCEEDED falcon - 2016-12-14 07:28 GMT 2016-12-14 07:28 GMT
------------------------------------------------------------------------------------------------------------------------------------
0000006-161206152234228-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplication1SUCCEEDED falcon - 2016-12-13 07:28 GMT 2016-12-13 07:28 GMT
------------------------------------------------------------------------------------------------------------------------------------
0000005-161206152234228-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplication1SUCCEEDED falcon - 2016-12-12 07:28 GMT 2016-12-12 07:28 GMT
------------------------------------------------------------------------------------------------------------------------------------
0000004-161206152234228-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplication1SUCCEEDED falcon - 2016-12-11 07:28 GMT 2016-12-11 07:28 GMT
------------------------------------------------------------------------------------------------------------------------------------
0000003-161206152234228-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplication1SUCCEEDED falcon - 2016-12-10 07:28 GMT 2016-12-10 07:28 GMT
-----------------------------------------------------------------------------------------
...
[Output truncated]
0000003-160926083516131-oozie-oozi-W FALCON_FEED_RETENTION_hdfsreplicationSUCCEEDED falcon - 2016-10-08 05:55 GMT 2016-10-08 07:27 GMT
------------------------------------------------------------------------------------------------------------------------------------
[root@prodnode1 ~]#
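For example, to capture the listing in a file and extract just the workflow IDs for loading into Hive/HBase later, something like the below can be used (a rough sketch; the file paths and the awk column are only illustrative):
#Example (illustrative paths): save the listing, then keep only the workflow job IDs
[root@prodnode1 ~]# oozie jobs -oozie http://prodnode2:11000/oozie -len 1000000 -filter status=SUCCEEDED > /tmp/oozie_succeeded_jobs.txt
[root@prodnode1 ~]# grep 'oozie-oozi-W' /tmp/oozie_succeeded_jobs.txt | awk '{print $1}' > /tmp/succeeded_job_ids.txt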
01-05-2017
05:35 PM
@Uvaraj Seerangan - Please check the answers given below and accept the appropriate one once your issue is resolved.
01-05-2017
05:32 PM
2 Kudos
@Sankar T Oozie stores workflow definitions in its backend DB as blobs, so it would be complex to fetch them from the DB directly. The best way is to use the Oozie command line. Please refer to the link below for more details about the Oozie CLI: https://oozie.apache.org/docs/3.1.3-incubating/DG_CommandLineTool.html Or, if you are interested in the Oozie REST API, here is the doc: https://oozie.apache.org/docs/4.0.0/WebServicesAPI.html Hope this information helps! Please revert if you need any other details about Oozie! 🙂
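For example, assuming the same Oozie server as in my earlier example, the CLI's -definition option should print a workflow's XML definition (the job ID below is just a placeholder taken from that example):
#Example: print the workflow XML definition (job ID is a placeholder)
[root@prodnode1 ~]# oozie job -oozie http://prodnode2:11000/oozie -definition 0000007-161206152234228-oozie-oozi-W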
01-05-2017
05:23 PM
2 Kudos
@chitrartha sur In addition to the above answers, please refer to the article below and let us know if you face any further issues: https://community.hortonworks.com/articles/40658/configure-hive-view-for-kerberized-cluster.html
01-04-2017
03:15 PM
2 Kudos
@chennuri gouri shankar Please launch the Hive shell in DEBUG mode with the command below, try to run the same query, and post the logs here.
hive --hiveconf hive.root.logger=DEBUG,console
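To make the logs easier to share, the console output can also be captured to a file, for example (the log path is only illustrative):
#Example: capture DEBUG console output to a file (path is illustrative)
hive --hiveconf hive.root.logger=DEBUG,console 2>&1 | tee /tmp/hive_debug_console.log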
01-02-2017
03:47 PM
2 Kudos
SYMPTOM
While adding a new service to the HDP cluster, it fails with the below error:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 37, in <module>
AfterInstallHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
setup_hdp_symlinks()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 49, in setup_hdp_symlinks
hdp_select.select_all(version)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/hdp_select.py", line 122, in select_all
Execute(command, only_if = only_if_command)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.4.2.0-258 | tail -1`' returned 1. Traceback (most recent call last):
File "/usr/bin/hdp-select", line 382, in <module>
printVersions()
File "/usr/bin/hdp-select", line 239, in printVersions
result[tuple(map(int, versionRegex.split(f)))] = f
ValueError: invalid literal for int() with base 10: 'smartsense'
ERROR: set command takes 2 parameters, instead of 1
usage: hdp-select [-h] [<command>] [<package>] [<version>]
Set the selected version of HDP.
positional arguments:
<command> One of set, status, versions, or packages
<package> the package name to set
<version> the HDP version to set
optional arguments:
-h, --help show this help message and exit
-r, --rpm-mode if true checks if there is symlink exists and creates the symlink if it doesn't
Commands:
set : set the package to a specified version
status : show the version of the package
versions : show the currently installed versions
packages : show the individual package names
ROOT CAUSE
There is an extra, unwanted directory under /usr/hdp on the Ambari server. Please see the below output:
[root@prodnode1 hdp]# ls -lrt /usr/hdp/
total 16
drwxr-xr-x. 19 root root 4096 Sep 26 08:31 2.4.2.0-258
drwxr-xr-x. 2 root root 4096 Nov 21 13:28 current
drwxr-xr-x. 2 root root 4096 Jan 2 15:37 smartsense
drwxr-xr-x. 3 root root 4096 Jan 2 15:39 share
[root@prodnode1 hdp]#
Note - It's not recommended to put any directory under /usr/hdp/ other than 'share', 'current', and the versioned directories.
RESOLUTION
Move /usr/hdp/<unwanted-directory> to some location other than /usr/hdp. See the below output:
#Before moving directory
[root@prodnode1 hdp]# hdp-select versions
Traceback (most recent call last):
File "/usr/bin/hdp-select", line 382, in <module>
printVersions()
File "/usr/bin/hdp-select", line 239, in printVersions
result[tuple(map(int, versionRegex.split(f)))] = f
ValueError: invalid literal for int() with base 10: 'smartsense'
[root@prodnode1 hdp]#
#Move directory outside
[root@prodnode1 hdp]# mv /usr/hdp/smartsense/ /root/
[root@prodnode1 hdp]#
#After moving directory
[root@prodnode1 hdp]# hdp-select versions
2.4.2.0-258
Please comment if you have any questions. Happy Hadooping!! 🙂
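As a quick way to spot such stray entries on other hosts, something like the below can help (a rough sketch; the grep pattern simply whitelists 'current', 'share', and version-style directory names):
#Example check: list anything under /usr/hdp that is not current/share/a version directory
[root@prodnode1 hdp]# ls /usr/hdp/ | grep -Ev '^(current|share|[0-9])'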
12-29-2016
02:45 PM
Thank you so much @irfan aziz for the confirmation. I'm accepting the answer given by @Michael Young. Please feel free to accept the appropriate answer if required.
... View more
12-21-2016
05:23 PM
2 Kudos
SYMPTOM
We get the below error while installing the new HDP version packages, before upgrading to the latest HDP version, on SUSE Linux:
2016-12-21 13:46:47,919 - Package Manager failed to install packages. Error: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm livy_2_3_2_0_2950' returned 104. File 'repomd.xml' from repository 'AMBARI-2.4.1.0.repo' is unsigned, continue? [yes/no] (no): no
Error building the cache:
[|] Valid metadata not found at specified URL(s)
Warning: Disabling repository 'AMBARI-2.4.1.0.repo' because of the above error.
File 'repomd.xml' from repository 'HDP.repo' is unsigned, continue? [yes/no] (no): no
Error building the cache:
[|] Valid metadata not found at specified URL(s)
Warning: Disabling repository 'HDP.repo' because of the above error.
No provider of 'livy_2_3_2_0_2950' found.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 376, in install_packages
retry_count=agent_stack_retry_count
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 58, in action_upgrade
self.upgrade_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py", line 62, in upgrade_package
return self.install_package(name, use_repos, skip_repos, is_upgrade)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py", line 57, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm livy_2_3_2_0_2950' returned 104. File 'repomd.xml' from repository 'AMBARI-2.4.1.0.repo' is unsigned, continue? [yes/no] (no): no
Error building the cache:
[|] Valid metadata not found at specified URL(s)
Warning: Disabling repository 'AMBARI-2.4.1.0.repo' because of the above error.
File 'repomd.xml' from repository 'HDP.repo' is unsigned, continue? [yes/no] (no): no
Error building the cache:
ROOT CAUSE
This is a bug, reported under https://issues.apache.org/jira/browse/AMBARI-19186, which affects SUSE Linux when an unsigned repo is used.
WORKAROUND
N/A
RESOLUTION
Apply the patch given at https://issues.apache.org/jira/browse/AMBARI-19186
Steps to apply the patch:
1. Take a backup of /usr/lib/ambari-agent/lib/resource_management/libraries/functions/packages_analyzer.py
2. Edit /usr/lib/ambari-agent/lib/resource_management/libraries/functions/packages_analyzer.py with your favorite editor (I use vim)
3. Find the line with "--installed-only", e.g.:
["sudo", "zypper", "search", "--installed-only", "--details"],
4. Replace it with:
["sudo", "zypper", "--no-gpg-checks", "search", "--installed-only", "--details"],
5. Find the line with "--uninstalled-only":
["sudo", "zypper", "search", "--uninstalled-only", "--details"],
6. Replace it with:
["sudo", "zypper", "--no-gpg-checks", "search", "--uninstalled-only", "--details"],
Note - If the host where you are having this issue is an ambari-agent, you only need to apply the patch to the below file:
/usr/lib/ambari-agent/lib/resource_management/libraries/functions/packages_analyzer.py
If the host where you are having the issue is the ambari-server, you need to apply the patch to the below files:
/usr/lib/ambari-server/lib/resource_management/libraries/functions/packages_analyzer.py
/usr/lib/ambari-agent/lib/resource_management/libraries/functions/packages_analyzer.py
Hope this information helps! Please comment if you have any questions. Happy Hadooping!! 🙂
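For those who prefer a one-liner, both edits above can also be applied with sed (a rough sketch, assuming the lines appear exactly as shown above; the .bak suffix keeps a backup copy):
#Optional sed one-liner (sketch; assumes the lines look exactly as shown above)
sed -i.bak 's/"zypper", "search"/"zypper", "--no-gpg-checks", "search"/g' /usr/lib/ambari-agent/lib/resource_management/libraries/functions/packages_analyzer.py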
12-20-2016
08:33 PM
2 Kudos
@irfan aziz Can you please check your ResourceManager logs to see what went wrong? The log location (by default) would be /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-<hostname>.log. You can post the error here if you need any further help!
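For example, something like the below should pull the most recent errors out of the log (the wildcard covers the hostname part of the file name):
#Example: show recent errors from the ResourceManager log
grep -i error /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-*.log | tail -n 50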
12-20-2016
04:14 PM
@Vishal Prakash Shah - Can you please post the value of your yarn.timeline-service.ttl-ms?
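If you're not sure where to find it, something like the below should show the current value from the client configuration (assuming the default config directory /etc/hadoop/conf; the property can also be checked in the YARN configs in Ambari):
#Example: read the value from the client yarn-site.xml (default config dir assumed)
grep -A1 'yarn.timeline-service.ttl-ms' /etc/hadoop/conf/yarn-site.xml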