Member since: 01-15-2016
Posts: 37
Kudos Received: 13
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6550 | 04-24-2017 01:48 PM
 | 2632 | 06-21-2016 12:05 PM
 | 3075 | 03-07-2016 10:43 PM
06-22-2016
05:42 PM
@rmolina The WebHCat response time did end up being the root of the issue. I believe that being able to block off a certain amount of memory for WebHCat specifically would help (this may already be possible with templeton.mapper.memory.mb, but that only covers the mapper memory and I haven't looked much further into it). When no other users are using the cluster, the Pig View runs fine, but since that will not be the case for most prod clusters that we deploy, being able to set a reserve specifically in webhcat-env or webhcat-site could prove useful in making sure the resources are properly allocated.
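For reference, the one property mentioned above lives in webhcat-site.xml. A minimal sketch of an override, assuming the value below (1024 MB is illustrative, not from the thread):

```xml
<!-- Hedged sketch: templeton.mapper.memory.mb is the property named above;
     the value 1024 is only an illustrative assumption. -->
<property>
  <name>templeton.mapper.memory.mb</name>
  <value>1024</value>
  <description>Memory (MB) for the map task WebHCat launches to run a submitted job.</description>
</property>
```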
06-21-2016
12:05 PM
From what I can tell, this ended up being an available-resources issue. I logged back in at midnight when all users had left, and everything seems to be working correctly.
Some of the time the Pig job will say that it failed to start, yet the stderr/stdout will show the results of the DUMP I am trying to perform. Since the script was working fine in the grunt CLI, this was a very tricky problem to uncover.
06-20-2016
06:10 PM
@Rahul Pathak This is the current listing of hadoop.proxyuser properties:
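The listing itself did not survive the archive. For reference, proxyuser entries for Ambari views generally take the following shape in core-site.xml; the `root` user and wildcard values here are placeholders, not the poster's actual configuration:

```xml
<!-- Hedged sketch: user name and wildcard values are assumptions. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```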
06-20-2016
04:49 PM
@Artem Ervits Do you have any ideas?
06-20-2016
04:25 PM
Hello All,
I will preface by saying that I have seen multiple questions of a similar nature and have tried each of their solutions, but to no avail, and I feel a more in-depth explanation may help others as well if they ever arrive at a similar issue. The Pig View fails but the grunt> CLI runs fine, so I am thinking it may be a Pig View configuration error. I started by researching the JIRA at https://issues.apache.org/jira/browse/AMBARI-12738
I am trying to use the Pig View in Ambari 2.2.1 on HDP 2.4.2 and am running into a multitude of errors. The script I am running is:
logs = LOAD 'server_logs.error_logs' USING org.apache.hive.hcatalog.pig.HCatLoader();
DUMP logs;
The job fails with a "Job failed to start" error whose only stack trace is:
java.net.SocketTimeoutException: Read timed out
In the history logs within the view, I receive only the following error: File /user/admin/pig/jobs/errlogs_20-06-2016-15-11-39/stderr not found. I have tried this as both the hdfs and admin users and the same problem remains. I have also tried simply loading a file with PigStorage('|'), but that returned the same issue, and I receive the same error with both the Tez and MR ExecTypes. The NameNode and ResourceManager are both in High Availability mode, and I have added the appropriate proxyuser configs to both core-site and hcat-site in the HDFS and Hive configurations.
I have restarted all services and the Ambari Server. The stderr file is created within the /user/admin/pig/jobs/errlogs_20-06-2016-15-11-39/ directory, but nothing is written to it. The admin/pig/ directory has full go+w 777 -R permissions, yet when the stderr file is created it shows only 644 permissions. Against my better judgement, I issued an hdfs dfs -chmod -R 777 /user command to see if it was an underlying permissions issue on a file unbeknownst to me, but that also left me with the same outcome. The ResourceManager logs show that the application is submitted and continues to hang in the RUNNING state even after the job has been noted as "Failed to Start" through Ambari, while yarn application -list shows no running applications. Has anyone figured out a solution to this problem? The stack traces are not helpful, given they do not output more than 1-2 lines of information. My Pig View cluster configuration is as follows:
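A few commands that could help narrow down symptoms like the above. This is a hedged diagnostic sketch: the host name is a placeholder, and the HDFS path is the one quoted in the post:

```shell
# Hypothetical diagnostics; webhcat-host.example.com is a placeholder.
# The Pig View submits through WebHCat (Templeton), so check that it responds:
curl "http://webhcat-host.example.com:50111/templeton/v1/status?user.name=admin"

# Inspect the job directory and the stderr file's actual permissions:
hdfs dfs -ls /user/admin/pig/jobs/errlogs_20-06-2016-15-11-39/

# Cross-check YARN's view of the submitted application:
yarn application -list -appStates RUNNING,ACCEPTED
```

If the WebHCat status call itself times out, that would point at WebHCat rather than the view's HDFS configuration.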
Labels: Apache Ambari, Apache Pig
03-07-2016
10:43 PM
1 Kudo
The problem ended up being the Ambari NiFi service instance. I used the Ambari API to delete the service from Ambari, reinstalled the packages, and everything worked as planned. Thank you for all your help.
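The post doesn't show the exact calls used, but deleting a service via the Ambari REST API generally looks like the sketch below. Host, credentials, cluster name, and the NIFI service name are placeholder assumptions:

```shell
# Hedged sketch of removing a service through the Ambari REST API.
AMBARI=http://ambari-host.example.com:8080

# Stop the service first (Ambari refuses to delete a running service):
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop NIFI"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  "$AMBARI/api/v1/clusters/mycluster/services/NIFI"

# Then delete it:
curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  "$AMBARI/api/v1/clusters/mycluster/services/NIFI"
```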
03-07-2016
02:41 PM
Including StdOut as per request: ...
2016-03-07 08:06:30,467 - Installing package snappy ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 snappy')
2016-03-07 08:06:32,147 - Package['snappy-devel'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:32,147 - Installing package snappy-devel ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 snappy-devel')
2016-03-07 08:06:33,333 - Package['hadoop_2_4_*-libhdfs'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:33,334 - Installing package hadoop_2_4_*-libhdfs ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'hadoop_2_4_*-libhdfs'')
2016-03-07 08:06:34,689 - Package['zip'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:34,690 - Installing package zip ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 zip')
2016-03-07 08:06:35,745 - Package['extjs'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:35,746 - Installing package extjs ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 extjs')
2016-03-07 08:06:36,864 - Package['oozie_2_4_*'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:36,865 - Installing package oozie_2_4_* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'oozie_2_4_*'')
2016-03-07 08:06:38,149 - Package['falcon_2_4_*'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:38,149 - Installing package falcon_2_4_* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'falcon_2_4_*'')
2016-03-07 08:06:39,233 - Package['tez_2_4_*'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:39,234 - Installing package tez_2_4_* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'tez_2_4_*'')
2016-03-07 08:06:40,344 - Package['flume_2_4_*'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:40,345 - Installing package flume_2_4_* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'flume_2_4_*'')
2016-03-07 08:06:41,570 - Package['git'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:41,571 - Installing package git ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 git')
2016-03-07 08:06:42,551 - Package['java-1.7.0-openjdk-devel'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:42,551 - Installing package java-1.7.0-openjdk-devel ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 java-1.7.0-openjdk-devel')
2016-03-07 08:06:43,608 - Package['apache-maven-3.2*'] {'use_repos': ['HDP-2.4.0.0-169', 'HDP-UTILS-2.4.0.0-169'], 'skip_repos': ['HDP-*']}
2016-03-07 08:06:43,608 - Installing package apache-maven-3.2* ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'apache-maven-3.2*'')
2016-03-07 08:06:44,706 - Package Manager failed to install packages. Error: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'apache-maven-3.2*'' returned 1. Error: Nothing to do
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 376, in install_packages
skip_repos=[self.REPO_FILE_NAME_PREFIX + "*"] if OSCheck.is_redhat_family() else [])
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.4.0.0-169,HDP-UTILS-2.4.0.0-169 'apache-maven-3.2*'' returned 1. Error: Nothing to do
2016-03-07 08:06:45,247 - Installation of packages failed. Checking if installation was partially complete
2016-03-07 08:06:45,247 - Old versions: ['2.3.4.0-3485', '2.4.0.0-169']
2016-03-07 08:06:45,270 - New versions: ['2.3.4.0-3485', '2.4.0.0-169']
2016-03-07 08:06:45,434 - Deltas: set([])
03-07-2016
02:38 PM
I am really just confused as to why Ambari is trying to install an Apache Maven package. I have Apache Maven 3.2.5 installed, and "mvn -v" confirms that.
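One way to check for this kind of mismatch: a Maven installed manually from a tarball is invisible to yum/rpm, so yum may still try (and fail) to satisfy the `apache-maven-3.2*` pattern from the HDP-UTILS repo. A hedged sketch, with the repo name taken from the log above:

```shell
# Does rpm know about any maven package? (A tarball install will not appear here.)
rpm -qa | grep -i maven

# Does the HDP-UTILS repo actually offer a package matching the pattern yum was given?
yum --disablerepo='*' --enablerepo=HDP-UTILS-2.4.0.0-169 list available 'apache-maven-3.2*'
```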
03-07-2016
01:47 PM
I used the repo file given for my OS and tried yum install ambari-server and yum upgrade ambari-server, but both responded with "Package ambari-server-2.2.1.0-161.x86_64 already installed and latest version".
03-07-2016
03:15 AM
1 Kudo
Thank you! That did it for me. The application host is separate from the HDP cluster, so I had to scp the .xml over, but it then worked as planned.