Member since: 03-13-2017
Posts: 13
Kudos Received: 1
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1387 | 02-11-2018 11:39 PM |
| | 2803 | 03-14-2017 12:40 AM |
02-11-2018
11:39 PM
1 Kudo
I re-ran it today and everything works.
02-08-2018
10:53 PM
Hi people,
Can someone help me with this issue? We are installing HDP 2.6, and one of our cluster servers is failing.
The error message is:
Failed to execute command: rpm -qa | grep smartsense- || yum -y install smartsense-hst || rpm -i /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm; Exit code: 1; stdout: Loaded plugins: product-id, rhnplugin, search-disabled-repos, subscription-
: manager
This system is receiving updates from RHN Classic or Red Hat Satellite.
Resolving Dependencies
--> Running transaction check
---> Package smartsense-hst.x86_64 0:1.4.4.2.6.1.0-143 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
================================================================================
Package Arch Version Repository Size
================================================================================
Installing:
smartsense-hst x86_64 1.4.4.2.6.1.0-143 ambari-2.6.1.0 284 M
Transaction Summary
================================================================================
Install 1 Package
Total download size: 284 M
Installed size: 295 M
Downloading packages:
; stderr: http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.1.0/smartsense/smartsense-hst-1.4.4.2.6.1.0-143.x86_64.rpm: [Errno -1] Package does not match intended download. Suggestion: run yum --enablerepo=ambari-2.6.1.0 clean metadata
Trying other mirror.
Error downloading packages:
smartsense-hst-1.4.4.2.6.1.0-143.x86_64: [Errno 256] No more mirrors to try.
error: File not found by glob: /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
stdout: /var/lib/ambari-agent/data/output-586.txt
2018-02-09 09:49:24,353 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-09 09:49:24,358 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-09 09:49:24,359 - Group['hdfs'] {}
2018-02-09 09:49:24,360 - Group['hadoop'] {}
2018-02-09 09:49:24,360 - Group['users'] {}
2018-02-09 09:49:24,361 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,364 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,366 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,368 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,370 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-09 09:49:24,372 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-09 09:49:24,374 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,374 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-02-09 09:49:24,375 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,376 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,377 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,378 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,379 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-09 09:49:24,380 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-09 09:49:24,382 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-02-09 09:49:24,390 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-02-09 09:49:24,390 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-02-09 09:49:24,391 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-09 09:49:24,393 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-09 09:49:24,394 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-02-09 09:49:24,407 - call returned (0, '1020')
2018-02-09 09:49:24,408 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-02-09 09:49:24,416 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] due to not_if
2018-02-09 09:49:24,417 - Group['hdfs'] {}
2018-02-09 09:49:24,418 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-02-09 09:49:24,419 - FS Type:
2018-02-09 09:49:24,419 - Directory['/etc/hadoop'] {'mode': 0755}
2018-02-09 09:49:24,436 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-02-09 09:49:24,437 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-02-09 09:49:24,453 - Repository['HDP-2.6-repo-5'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-5', 'mirror_list': None}
2018-02-09 09:49:24,465 - File['/etc/yum.repos.d/ambari-hdp-5.repo'] {'content': '[HDP-2.6-repo-5]\nname=HDP-2.6-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-09 09:49:24,466 - Writing File['/etc/yum.repos.d/ambari-hdp-5.repo'] because contents don't match
2018-02-09 09:49:24,466 - Repository['HDP-2.6-GPL-repo-5'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-5', 'mirror_list': None}
2018-02-09 09:49:24,469 - File['/etc/yum.repos.d/ambari-hdp-5.repo'] {'content': '[HDP-2.6-repo-5]\nname=HDP-2.6-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-5]\nname=HDP-2.6-GPL-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-09 09:49:24,470 - Writing File['/etc/yum.repos.d/ambari-hdp-5.repo'] because contents don't match
2018-02-09 09:49:24,470 - Repository['HDP-UTILS-1.1.0.22-repo-5'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-5', 'mirror_list': None}
2018-02-09 09:49:24,473 - File['/etc/yum.repos.d/ambari-hdp-5.repo'] {'content': '[HDP-2.6-repo-5]\nname=HDP-2.6-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-5]\nname=HDP-2.6-GPL-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-5]\nname=HDP-UTILS-1.1.0.22-repo-5\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-09 09:49:24,474 - Writing File['/etc/yum.repos.d/ambari-hdp-5.repo'] because contents don't match
2018-02-09 09:49:24,474 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-09 09:49:25,475 - Skipping installation of existing package unzip
2018-02-09 09:49:25,476 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-09 09:49:25,497 - Skipping installation of existing package curl
2018-02-09 09:49:25,497 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-09 09:49:25,518 - Skipping installation of existing package hdp-select
2018-02-09 09:49:25,523 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-09 09:49:25,529 - Skipping stack-select on SMARTSENSE because it does not exist in the stack-select package structure.
thisUser:root.
configuredUserName:root.
returning userMap:
installing using command: {sudo} rpm -qa | grep smartsense- || {sudo} yum -y install smartsense-hst || {sudo} rpm -i /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
Command: rpm -qa | grep smartsense- || yum -y install smartsense-hst || rpm -i /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
Exit code: 1
Std Out: Loaded plugins: product-id, rhnplugin, search-disabled-repos, subscription-
: manager
This system is receiving updates from RHN Classic or Red Hat Satellite.
Resolving Dependencies
--> Running transaction check
---> Package smartsense-hst.x86_64 0:1.4.4.2.6.1.0-143 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
================================================================================
Package Arch Version Repository Size
================================================================================
Installing:
smartsense-hst x86_64 1.4.4.2.6.1.0-143 ambari-2.6.1.0 284 M
Transaction Summary
================================================================================
Install 1 Package
Total download size: 284 M
Installed size: 295 M
Downloading packages:
Std Err: http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.1.0/smartsense/smartsense-hst-1.4.4.2.6.1.0-143.x86_64.rpm: [Errno -1] Package does not match intended download. Suggestion: run yum --enablerepo=ambari-2.6.1.0 clean metadata
Trying other mirror.
Error downloading packages:
smartsense-hst-1.4.4.2.6.1.0-143.x86_64: [Errno 256] No more mirrors to try.
error: File not found by glob: /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
2018-02-09 09:49:35,454 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-09 09:49:35,459 - Skipping stack-select on SMARTSENSE because it does not exist in the stack-select package structure.
Command failed after 1 tries
When I run the command manually on the server, I get this error message:
smartsense-hst-1.4.4.2.6.1.0-1 FAILED
http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.1.0/smartsense/smartsense-hst-1.4.4.2.6.1.0-143.x86_64.rpm: [Errno -1] Package does not match intended download. Suggestion: run yum --enablerepo=ambari-2.6.1.0 clean metadata
Trying other mirror.
Error downloading packages:
smartsense-hst-1.4.4.2.6.1.0-143.x86_64: [Errno 256] No more mirrors to try.
error: File not found by glob: /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
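The log's own suggestion points at stale repository metadata: the cached checksum for the RPM no longer matches what the mirror serves. A minimal sketch of the cleanup to try on the failing node (repo id and package name are taken from the log above; this is a suggestion, not a confirmed fix):

yum --enablerepo=ambari-2.6.1.0 clean metadata   # command suggested in the error output itself
yum clean all                                    # optionally drop all cached metadata and packages
yum -y install smartsense-hst                    # retry the install Ambari was attempting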
Labels:
- Hortonworks Data Platform (HDP)
07-04-2017
05:16 AM
False alarm, everyone: my data quality is not good.
06-30-2017
01:26 AM
Here is my code:

%spark.spark
val file = sc.textFile("/apps/hive/migration/unique_visitors_RT_craigieburn.csv")

case class RT_craigieburn(
  event_year: Integer,
  event_month: Integer,
  event_day: Integer,
  event_hour: Integer,
  event_minute: Integer,
  event_second: Integer,
  mac_address: String)

val realtimeTable = file.map(s => s.split(",")).filter(s => s(0) != "event_year").map(
  s => RT_craigieburn(
    s(0).toInt,
    s(1).toInt,
    s(2).toInt,
    s(3).toInt,
    s(4).toInt,
    s(5).toInt,
    s(6).replaceAll("\"", "")
  )
).toDF()

realtimeTable.registerTempTable("craigieburn")

When I run it, the output is:

file: org.apache.spark.rdd.RDD[String] = /apps/hive/migration/unique_visitors_RT_craigieburn.csv MapPartitionsRDD[46] at textFile at <console>:29
defined class RT_craigieburn
realtimeTable: org.apache.spark.sql.DataFrame = [event_year: int, event_month: int, event_day: int, event_hour: int, event_minute: int, event_second: int, mac_address: string]

But when I try to run Spark SQL:

%spark.sql
select event_year
from craigieburn

I get:

java.lang.NumberFormatException: For input string: ""
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:592)
at java.lang.Integer.parseInt(Integer.java:615)
at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
at scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
at $line115675017136.$read$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$anonfun$3.apply(<console>:34)
at $line115675017136.$read$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$anonfun$3.apply(<console>:34)
at scala.collection.Iterator$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.sql.execution.SparkPlan$anonfun$5.apply(SparkPlan.scala:212)
at org.apache.spark.sql.execution.SparkPlan$anonfun$5.apply(SparkPlan.scala:212)
at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:1857)
at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:1857)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)

Can someone help me? Thanks
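Note that the Scala cell above only appears to succeed because RDD transformations are lazy: the toInt calls don't run until the SQL query forces an action, which is why the failure surfaces in %spark.sql. The NumberFormatException on the empty string "" points at blank numeric fields in the CSV. A minimal defensive-parsing sketch, reusing the same file and case class, and assuming rows with blank fields can simply be skipped:

%spark.spark
// Split with limit -1 so trailing empty fields are preserved, drop the header row,
// then skip any row whose six numeric columns are blank before calling toInt.
val cleaned = sc.textFile("/apps/hive/migration/unique_visitors_RT_craigieburn.csv")
  .map(_.split(",", -1))
  .filter(s => s(0) != "event_year")
  .filter(s => s.length >= 7 && s.take(6).forall(_.trim.nonEmpty))
  .map(s => RT_craigieburn(
    s(0).trim.toInt, s(1).trim.toInt, s(2).trim.toInt,
    s(3).trim.toInt, s(4).trim.toInt, s(5).trim.toInt,
    s(6).replaceAll("\"", "")))
  .toDF()
cleaned.registerTempTable("craigieburn")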
Labels:
- Apache Spark
03-20-2017
06:40 AM
My CREATE TABLE script:

CREATE TABLE IF NOT EXISTS ruckus.UV_MACADD_RT (
  event_year INT,
  event_month INT,
  event_day INT,
  event_hour INT,
  event_minute INT,
  event_second INT,
  mac_address String
)
COMMENT 'Visitors Realtime - Mac Address - using 15 Minutes'
PARTITIONED BY (venue_id String)
CLUSTERED BY (event_year) INTO 3 BUCKETS
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS orc
TBLPROPERTIES ('transactional'='true');

The table is created successfully, but when I try to delete:

delete from ruckus.uv_macadd_rt where venue_id='abc';

I get this error message:

org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

FYI: I have turned on ACID transactions using Ambari. Any idea what is happening?
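Error 10294 is typically raised when the session's transaction manager is not the ACID-capable DbTxnManager, so even with ACID switched on in Ambari it is worth verifying that the client session actually picked up the transaction settings. A minimal sketch of the standard settings Hive's ACID DML requires (values as documented for Hive transactions; server-side changes need a HiveServer2 restart):

SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
SET hive.enforce.bucketing=true;                 -- required on Hive 1.x only
SET hive.exec.dynamic.partition.mode=nonstrict;
-- metastore side (hive-site.xml), needed so compaction runs:
--   hive.compactor.initiator.on=true
--   hive.compactor.worker.threads=1

delete from ruckus.uv_macadd_rt where venue_id='abc';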
Labels:
- Apache Ambari
- Apache Hive
03-14-2017
12:40 AM
I found it. There was a permission issue with the temporary folder. Running this command fixed it:

hdfs dfs -chmod -R 777 /tmp/hive
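For anyone hitting the same error, a quick way to confirm the scratch-directory permissions before (and after) changing them, assuming the default Hive scratch dir /tmp/hive:

hdfs dfs -ls -d /tmp/hive         # -d lists the directory itself, showing its owner and mode
hdfs dfs -chmod -R 777 /tmp/hive  # the fix described above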
03-13-2017
11:59 PM
Sorry, I copied the wrong CREATE TABLE script. This is the correct one, and it has 8 fields:

CREATE TABLE IF NOT EXISTS UV_MACADD_RT (
  venue_id String,
  event_year INT,
  event_month INT,
  event_day INT,
  event_hour INT,
  event_minute INT,
  event_second INT,
  mac_address String
);

And this is the insert script:

use ruckus;
INSERT INTO TABLE uv_macadd_rt VALUES ('centre01',2017,1,15,23,45,0,'00037F000000');

It is still throwing the same exception.
03-13-2017
06:41 AM
Hi, I am quite new to Big Data and am working on Hive. I have a simple table in Hive and am trying to insert a record into it:

CREATE TABLE IF NOT EXISTS UV_MACADD_RT (
  venue_id String,
  event_year INT,
  event_month INT,
  event_day INT,
  event_hour INT,
  mac_address String
)
COMMENT 'Visitors - Mac Address - using 15 Minutes'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

Then I tried to run this:

USE ruckus;
INSERT INTO TABLE uv_macadd_rt VALUES ('centre01',2017,1,15,23,45,0,'00037F000000');

But when I run it, I get this; can someone help?

java.lang.Exception: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
java.lang.Exception: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.ambari.view.hive2.resources.jobs.JobService.getOne(JobService.java:142)
at sun.reflect.GeneratedMethodAccessor990.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1507)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter.doFilter(AmbariAuthorizationFilter.java:257)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.jwt.JwtAuthenticationFilter.doFilter(JwtAuthenticationFilter.java:96)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.apache.ambari.server.security.authentication.AmbariAuthenticationFilter.doFilter(AmbariAuthenticationFilter.java:88)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariUserAuthorizationFilter.doFilter(AmbariUserAuthorizationFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.MethodOverrideFilter.doFilter(MethodOverrideFilter.java:72)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.AmbariPersistFilter.doFilter(AmbariPersistFilter.java:47)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.AmbariViewsMDCLoggingFilter.doFilter(AmbariViewsMDCLoggingFilter.java:54)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.ViewThrottleFilter.doFilter(ViewThrottleFilter.java:161)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:294)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:499)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:427)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:212)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:201)
at org.apache.ambari.server.controller.AmbariHandlerList.handle(AmbariHandlerList.java:139)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:973)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1035)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:641)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:231)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
at org.apache.hive.jdbc.HiveStatement.waitForOperationToComplete(HiveStatement.java:348)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:251)
at org.apache.ambari.view.hive2.HiveJdbcConnectionDelegate.execute(HiveJdbcConnectionDelegate.java:49)
at org.apache.ambari.view.hive2.actor.StatementExecutor.runStatement(StatementExecutor.java:87)
at org.apache.ambari.view.hive2.actor.StatementExecutor.handleMessage(StatementExecutor.java:70)
at org.apache.ambari.view.hive2.actor.HiveActor.onReceive(HiveActor.java:38)
at akka.actor.UntypedActor$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Labels:
- Apache Hive