01-02-2017
12:36 PM
2 Kudos
HDP-2.5.0.0 (2.5.0.0-1245). Query 1 returns 43 rows, which include both equipment numbers 0552094 and 1857547:

select equipmentnumber, dimensionid, max(datemessage) maxdtmsg
from group_datascientist.fact_rtc_se
where (equipmentnumber = 0552094 or equipmentnumber = 1857547) and datemessage < '2016-03-01'
group by equipmentnumber, dimensionid
order by equipmentnumber, maxdtmsg;

+-------------------------+--------------------+-------------+--+
| equipmentnumber | dimensionid | maxdtmsg |
+-------------------------+--------------------+-------------+--+
| 0552094 | 33393 | 2016-01-11 |
| 0552094 | 23537 | 2016-01-15 |
| 0552094 | 26115 | 2016-01-28 |
| 0552094 | 23680 | 2016-01-29 |
| 0552094 | 23664 | 2016-01-29 |
| 0552094 | 73714 | 2016-01-29 |
| 0552094 | 23530 | 2016-02-02 |
| 0552094 | 23742 | 2016-02-03 |
| 0552094 | 840502 | 2016-02-04 |
| 0552094 | 322547 | 2016-02-18 |
| 0552094 | 24234 | 2016-02-19 |
| 0552094 | 24419 | 2016-02-19 |
| 0552094 | 324917 | 2016-02-24 |
| 0552094 | 1371670 | 2016-02-25 |
| 0552094 | 156684 | 2016-02-26 |
| 0552094 | 86745 | 2016-02-26 |
| 0552094 | 687957 | 2016-02-28 |
| 0552094 | 103128 | 2016-02-29 |
| 0552094 | 36081 | 2016-02-29 |
| 0552094 | 18943 | 2016-02-29 |
| 1857547 | 927956 | 2016-01-08 |
| 1857547 | 749597 | 2016-01-15 |
| 1857547 | 312955 | 2016-01-15 |
| 1857547 | 1117802 | 2016-01-20 |
| 1857547 | 903606 | 2016-01-27 |
| 1857547 | 196616 | 2016-01-29 |
| 1857547 | 621571 | 2016-02-05 |
| 1857547 | 175172 | 2016-02-08 |
| 1857547 | 663615 | 2016-02-10 |
| 1857547 | 194722 | 2016-02-11 |
| 1857547 | 175415 | 2016-02-12 |
| 1857547 | 241920 | 2016-02-15 |
| 1857547 | 1292068 | 2016-02-15 |
| 1857547 | 185040 | 2016-02-16 |
| 1857547 | 181682 | 2016-02-17 |
| 1857547 | 1234825 | 2016-02-18 |
| 1857547 | 1444875 | 2016-02-18 |
| 1857547 | 1175541 | 2016-02-19 |
| 1857547 | 179475 | 2016-02-19 |
| 1857547 | 1363760 | 2016-02-23 |
| 1857547 | 203597 | 2016-02-24 |
| 1857547 | 815551 | 2016-02-29 |
| 1857547 | 18943 | 2016-02-29 |
+-------------------------+--------------------+-------------+--+
43 rows selected

In query 2 I only replaced the OR conditions with an IN list for better readability, but now only 23 rows are returned, all for equipment number 1857547:

select equipmentnumber, dimensionid, max(datemessage) maxdtmsg
from group_datascientist.fact_rtc_se
where equipmentnumber IN (0552094, 1857547) and datemessage < '2016-03-01'
group by equipmentnumber, dimensionid
order by equipmentnumber, maxdtmsg;

+-------------------------+--------------------+-------------+--+
| equipmentnumber | dimensionid | maxdtmsg |
+-------------------------+--------------------+-------------+--+
| 1857547 | 927956 | 2016-01-08 |
| 1857547 | 749597 | 2016-01-15 |
| 1857547 | 312955 | 2016-01-15 |
| 1857547 | 1117802 | 2016-01-20 |
| 1857547 | 903606 | 2016-01-27 |
| 1857547 | 196616 | 2016-01-29 |
| 1857547 | 621571 | 2016-02-05 |
| 1857547 | 175172 | 2016-02-08 |
| 1857547 | 663615 | 2016-02-10 |
| 1857547 | 194722 | 2016-02-11 |
| 1857547 | 175415 | 2016-02-12 |
| 1857547 | 241920 | 2016-02-15 |
| 1857547 | 1292068 | 2016-02-15 |
| 1857547 | 185040 | 2016-02-16 |
| 1857547 | 181682 | 2016-02-17 |
| 1857547 | 1234825 | 2016-02-18 |
| 1857547 | 1444875 | 2016-02-18 |
| 1857547 | 179475 | 2016-02-19 |
| 1857547 | 1175541 | 2016-02-19 |
| 1857547 | 1363760 | 2016-02-23 |
| 1857547 | 203597 | 2016-02-24 |
| 1857547 | 18943 | 2016-02-29 |
| 1857547 | 815551 | 2016-02-29 |
+-------------------------+--------------------+-------------+--+
23 rows selected

What am I missing?
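One thing worth checking is the type of equipmentnumber. The literal 0552094 has a leading zero, which only survives if both sides of the comparison are treated as strings. The following is a hedged variant under the assumption that equipmentnumber is a STRING column (verify with `describe group_datascientist.fact_rtc_se`); quoting the IN-list values keeps them as strings and avoids any implicit numeric coercion:

```sql
-- Assumption: equipmentnumber is a STRING column, so the leading zero in
-- 0552094 is significant. Quoting the literals prevents Hive from coercing
-- the IN list to a numeric type that would turn 0552094 into 552094.
select equipmentnumber, dimensionid, max(datemessage) maxdtmsg
from group_datascientist.fact_rtc_se
where equipmentnumber IN ('0552094', '1857547')
  and datemessage < '2016-03-01'
group by equipmentnumber, dimensionid
order by equipmentnumber, maxdtmsg;
```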
Labels:
- Apache Hive
12-22-2016
12:32 PM
1 Kudo
HDP-2.5.0.0 using Ambari 2.4.0.1, Spark 2.0.1. I have Scala code that reads a 108 MB CSV file and trains a RandomForest model. I run the following command:

/usr/hdp/current/spark2-client/bin/spark-submit --class samples.FuelModel --master yarn --deploy-mode cluster --driver-memory 8g spark-assembly-1.0.jar

The console output:

16/12/22 09:07:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/22 09:07:25 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/12/22 09:07:26 INFO TimelineClientImpl: Timeline service address: http://l4326pp.sss.com:8188/ws/v1/timeline/
16/12/22 09:07:26 INFO AHSProxy: Connecting to Application History server at l4326pp.sss.com/138.106.33.132:10200
16/12/22 09:07:26 INFO Client: Requesting a new application from cluster with 4 NodeManagers
16/12/22 09:07:26 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (204800 MB per container)
16/12/22 09:07:26 INFO Client: Will allocate AM container, with 9011 MB memory including 819 MB overhead
16/12/22 09:07:26 INFO Client: Setting up container launch context for our AM
16/12/22 09:07:26 INFO Client: Setting up the launch environment for our AM container
16/12/22 09:07:26 INFO Client: Preparing resources for our AM container
16/12/22 09:07:27 INFO YarnSparkHadoopUtil: getting token for namenode: hdfs://prodhadoop/user/ojoqcu/.sparkStaging/application_1481607361601_8315
16/12/22 09:07:27 INFO DFSClient: Created HDFS_DELEGATION_TOKEN token 79178 for ojoqcu on ha-hdfs:prodhadoop
16/12/22 09:07:28 INFO metastore: Trying to connect to metastore with URI thrift://l4327pp.sss.com:9083
16/12/22 09:07:29 INFO metastore: Connected to metastore.
16/12/22 09:07:29 INFO YarnSparkHadoopUtil: HBase class not found java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
16/12/22 09:07:29 INFO Client: Source and destination file systems are the same. Not copying hdfs:/lib/spark2_2.0.1.tar.gz
16/12/22 09:07:29 INFO Client: Uploading resource file:/localhome/ojoqcu/code/debug/Rikard/spark-assembly-1.0.jar -> hdfs://prodhadoop/user/ojoqcu/.sparkStaging/application_1481607361601_8315/spark-assembly-1.0.jar
16/12/22 09:07:29 INFO Client: Uploading resource file:/tmp/spark-ff9db580-00db-476e-9086-377c60bc7e2a/__spark_conf__1706674327523194508.zip -> hdfs://prodhadoop/user/ojoqcu/.sparkStaging/application_1481607361601_8315/__spark_conf__.zip
16/12/22 09:07:29 WARN Client: spark.yarn.am.extraJavaOptions will not take effect in cluster mode
16/12/22 09:07:29 INFO SecurityManager: Changing view acls to: ojoqcu
16/12/22 09:07:29 INFO SecurityManager: Changing modify acls to: ojoqcu
16/12/22 09:07:29 INFO SecurityManager: Changing view acls groups to:
16/12/22 09:07:29 INFO SecurityManager: Changing modify acls groups to:
16/12/22 09:07:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ojoqcu); groups with view permissions: Set(); users with modify permissions: Set(ojoqcu); groups with modify permissions: Set()
16/12/22 09:07:29 INFO Client: Submitting application application_1481607361601_8315 to ResourceManager
16/12/22 09:07:30 INFO YarnClientImpl: Submitted application application_1481607361601_8315
16/12/22 09:07:31 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:07:31 INFO Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: AM container is launched, waiting for AM container to Register with RM
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: dataScientist
start time: 1482394049862
final status: UNDEFINED
tracking URL: http://l4327pp.sss.com:8088/proxy/application_1481607361601_8315/
user: ojoqcu
16/12/22 09:07:32 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:07:33 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:07:34 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:07:35 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
16/12/22 09:07:35 INFO Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: N/A
ApplicationMaster host: 138.106.33.145
ApplicationMaster RPC port: 0
queue: dataScientist
start time: 1482394049862
final status: UNDEFINED
tracking URL: http://l4327pp.sss.com:8088/proxy/application_1481607361601_8315/
user: ojoqcu
16/12/22 09:07:36 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
... (the same RUNNING application report repeats every second until 09:08:51) ...
16/12/22 09:08:51 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
16/12/22 09:08:52 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:08:52 INFO Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: AM container is launched, waiting for AM container to Register with RM
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: dataScientist
start time: 1482394049862
final status: UNDEFINED
tracking URL: http://l4327pp.sss.com:8088/proxy/application_1481607361601_8315/
user: ojoqcu
16/12/22 09:08:53 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:08:54 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:08:55 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:08:56 INFO Client: Application report for application_1481607361601_8315 (state: ACCEPTED)
16/12/22 09:08:57 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
16/12/22 09:08:57 INFO Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: N/A
ApplicationMaster host: 138.106.33.144
ApplicationMaster RPC port: 0
queue: dataScientist
start time: 1482394049862
final status: UNDEFINED
tracking URL: http://l4327pp.sss.com:8088/proxy/application_1481607361601_8315/
user: ojoqcu
16/12/22 09:08:58 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
... (the same RUNNING application report repeats every second until 09:10:08) ...
16/12/22 09:10:08 INFO Client: Application report for application_1481607361601_8315 (state: RUNNING)
16/12/22 09:10:09 INFO Client: Application report for application_1481607361601_8315 (state: FINISHED)
16/12/22 09:10:09 INFO Client:
client token: Token { kind: YARN_CLIENT_TOKEN, service: }
diagnostics: User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 28.0 failed 4 times, most recent failure: Lost task 1.3 in stage 28.0 (TID 59, l4328pp.sss.com): ExecutorLostFailure (executor 3 exited caused by one of the running tasks) Reason: Container marked as failed: container_e63_1481607361601_8315_02_000005 on host: l4328pp.sss.com. Exit status: 143. Diagnostics: Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
Killed by external signal
Driver stacktrace:
ApplicationMaster host: 138.106.33.144
ApplicationMaster RPC port: 0
queue: dataScientist
start time: 1482394049862
final status: FAILED
tracking URL: http://l4327pp.sss.com:8088/proxy/application_1481607361601_8315/
user: ojoqcu
Exception in thread "main" org.apache.spark.SparkException: Application application_1481607361601_8315 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1132)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1175)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/12/22 09:10:09 INFO ShutdownHookManager: Shutdown hook called
16/12/22 09:10:09 INFO ShutdownHookManager: Deleting directory /tmp/spark-ff9db580-00db-476e-9086-377c60bc7e2a

Partial output from the YARN log on one of the nodes:

2016-12-22 09:08:54,205 INFO container.ContainerImpl (ContainerImpl.java:handle(1163)) - Container container_e63_1481607361601_8315_02_000001 transitioned from LOCALIZED to RUNNING
2016-12-22 09:08:54,206 INFO runtime.DelegatingLinuxContainerRuntime (DelegatingLinuxContainerRuntime.java:pickContainerRuntime(67)) - Using container runtime: DefaultLinuxContainerRuntime
2016-12-22 09:08:56,779 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(375)) - Starting resource-monitoring for container_e63_1481607361601_8315_02_000001
2016-12-22 09:08:56,810 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 545.1 MB of 12 GB physical memory used; 10.3 GB of 25.2 GB virtual memory used
2016-12-22 09:08:59,850 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 804.6 MB of 12 GB physical memory used; 10.4 GB of 25.2 GB virtual memory used
2016-12-22 09:09:02,886 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 973.3 MB of 12 GB physical memory used; 10.4 GB of 25.2 GB virtual memory used
... (physical memory usage for container_e63_1481607361601_8315_02_000001 grows gradually from 1.2 GB to 1.7 GB of 12 GB over the following three minutes) ...
2016-12-22 09:10:00,611 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 1.7 GB of 12 GB physical memory used; 10.5 GB of 25.2 GB virtual memory used
2016-12-22 09:10:02,059 INFO ipc.Server (Server.java:saslProcess(1538)) - Auth successful for appattempt_1481607361601_8315_000002 (auth:SIMPLE)
2016-12-22 09:10:02,061 INFO authorize.ServiceAuthorizationManager (ServiceAuthorizationManager.java:authorize(137)) - Authorization successful for appattempt_1481607361601_8315_000002 (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.api.ContainerManagementProtocolPB
2016-12-22 09:10:02,062 INFO containermanager.ContainerManagerImpl (ContainerManagerImpl.java:startContainerInternal(810)) - Start request for container_e63_1481607361601_8315_02_000005 by user ojoqcu
2016-12-22 09:10:02,063 INFO application.ApplicationImpl (ApplicationImpl.java:transition(304)) - Adding container_e63_1481607361601_8315_02_000005 to application application_1481607361601_8315
2016-12-22 09:10:02,063 INFO container.ContainerImpl (ContainerImpl.java:handle(1163)) - Container container_e63_1481607361601_8315_02_000005 transitioned from NEW to LOCALIZING
2016-12-22 09:10:02,063 INFO containermanager.AuxServices (AuxServices.java:handle(215)) - Got event CONTAINER_INIT for appId application_1481607361601_8315
2016-12-22 09:10:02,063 INFO yarn.YarnShuffleService (YarnShuffleService.java:initializeContainer(184)) - Initializing container container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:02,063 INFO yarn.YarnShuffleService (YarnShuffleService.java:initializeContainer(270)) - Initializing container container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:02,063 INFO container.ContainerImpl (ContainerImpl.java:handle(1163)) - Container container_e63_1481607361601_8315_02_000005 transitioned from LOCALIZING to LOCALIZED
2016-12-22 09:10:02,063 INFO nodemanager.NMAuditLogger (NMAuditLogger.java:logSuccess(89)) - USER=ojoqcu IP=138.106.33.144 OPERATION=Start Container Request TARGET=ContainerManageImpl RESULT=SUCCESS APPID=application_1481607361601_8315 CONTAINERID=container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:02,078 INFO container.ContainerImpl (ContainerImpl.java:handle(1163)) - Container container_e63_1481607361601_8315_02_000005 transitioned from LOCALIZED to RUNNING
2016-12-22 09:10:02,078 INFO runtime.DelegatingLinuxContainerRuntime (DelegatingLinuxContainerRuntime.java:pickContainerRuntime(67)) - Using container runtime: DefaultLinuxContainerRuntime
2016-12-22 09:10:03,612 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(375)) - Starting resource-monitoring for container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:03,647 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 34005 for container-id container_e63_1481607361601_8315_02_000005: 452.4 MB of 4 GB physical memory used; 3.1 GB of 8.4 GB virtual memory used
2016-12-22 09:10:03,673 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 1.7 GB of 12 GB physical memory used; 10.5 GB of 25.2 GB virtual memory used
2016-12-22 09:10:06,708 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 34005 for container-id container_e63_1481607361601_8315_02_000005: 815.3 MB of 4 GB physical memory used; 3.1 GB of 8.4 GB virtual memory used
2016-12-22 09:10:06,738 INFO monitor.ContainersMonitorImpl (ContainersMonitorImpl.java:run(464)) - Memory usage of ProcessTree 32969 for container-id container_e63_1481607361601_8315_02_000001: 1.8 GB of 12 GB physical memory used; 10.6 GB of 25.2 GB virtual memory used
2016-12-22 09:10:09,094 WARN privileged.PrivilegedOperationExecutor (PrivilegedOperationExecutor.java:executePrivilegedOperation(170)) - Shell execution returned exit code: 143. Privileged Execution Operation Output:
main : command provided 1
main : run as user is ojoqcu
main : requested yarn user is ojoqcu
Getting exit code file...
Creating script paths...
Writing pid file...
Writing to tmp file /opt/hdfsdisks/sdh/yarn/local/nmPrivate/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005/container_e63_1481607361601_8315_02_000005.pid.tmp
Writing to cgroup task files...
Creating local dirs...
Launching container...
Getting exit code file...
Creating script paths...
Full command array for failed execution:
[/usr/hdp/current/hadoop-yarn-nodemanager/bin/container-executor, ojoqcu, ojoqcu, 1, application_1481607361601_8315, container_e63_1481607361601_8315_02_000005, /opt/hdfsdisks/sdg/yarn/local/usercache/ojoqcu/appcache/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005, /opt/hdfsdisks/sdb/yarn/local/nmPrivate/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005/launch_container.sh, /opt/hdfsdisks/sdk/yarn/local/nmPrivate/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005/container_e63_1481607361601_8315_02_000005.tokens, /opt/hdfsdisks/sdh/yarn/local/nmPrivate/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005/container_e63_1481607361601_8315_02_000005.pid, /opt/hdfsdisks/sdb/yarn/local%/opt/hdfsdisks/sdc/yarn/local%/opt/hdfsdisks/sdd/yarn/local%/opt/hdfsdisks/sde/yarn/local%/opt/hdfsdisks/sdf/yarn/local%/opt/hdfsdisks/sdg/yarn/local%/opt/hdfsdisks/sdh/yarn/local%/opt/hdfsdisks/sdi/yarn/local%/opt/hdfsdisks/sdj/yarn/local%/opt/hdfsdisks/sdk/yarn/local%/opt/hdfsdisks/sdl/yarn/local%/opt/hdfsdisks/sdm/yarn/local, /opt/hdfsdisks/sdb/yarn/log%/opt/hdfsdisks/sdc/yarn/log%/opt/hdfsdisks/sdd/yarn/log%/opt/hdfsdisks/sde/yarn/log%/opt/hdfsdisks/sdf/yarn/log%/opt/hdfsdisks/sdg/yarn/log%/opt/hdfsdisks/sdh/yarn/log%/opt/hdfsdisks/sdi/yarn/log%/opt/hdfsdisks/sdj/yarn/log%/opt/hdfsdisks/sdk/yarn/log%/opt/hdfsdisks/sdl/yarn/log%/opt/hdfsdisks/sdm/yarn/log, cgroups=none]
2016-12-22 09:10:09,095 WARN runtime.DefaultLinuxContainerRuntime (DefaultLinuxContainerRuntime.java:launchContainer(107)) - Launch container failed. Exception:
org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.privileged.PrivilegedOperationException: ExitCodeException exitCode=143:
at org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.privileged.PrivilegedOperationExecutor.executePrivilegedOperation(PrivilegedOperationExecutor.java:175)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DefaultLinuxContainerRuntime.launchContainer(DefaultLinuxContainerRuntime.java:103)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DelegatingLinuxContainerRuntime.launchContainer(DelegatingLinuxContainerRuntime.java:89)
at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:392)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: ExitCodeException exitCode=143:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:933)
at org.apache.hadoop.util.Shell.run(Shell.java:844)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1123)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.privileged.PrivilegedOperationExecutor.executePrivilegedOperation(PrivilegedOperationExecutor.java:150)
... 9 more
2016-12-22 09:10:09,095 WARN nodemanager.LinuxContainerExecutor (LinuxContainerExecutor.java:launchContainer(400)) - Exit code from container container_e63_1481607361601_8315_02_000005 is : 143
2016-12-22 09:10:09,095 INFO container.ContainerImpl (ContainerImpl.java:handle(1163)) - Container container_e63_1481607361601_8315_02_000005 transitioned from RUNNING to EXITED_WITH_FAILURE
2016-12-22 09:10:09,095 INFO launcher.ContainerLaunch (ContainerLaunch.java:cleanupContainer(425)) - Cleaning up container container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:09,095 INFO runtime.DelegatingLinuxContainerRuntime (DelegatingLinuxContainerRuntime.java:pickContainerRuntime(67)) - Using container runtime: DefaultLinuxContainerRuntime
2016-12-22 09:10:09,114 INFO nodemanager.LinuxContainerExecutor (LinuxContainerExecutor.java:deleteAsUser(537)) - Deleting absolute path : /opt/hdfsdisks/sdb/yarn/local/usercache/ojoqcu/appcache/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:09,114 INFO nodemanager.LinuxContainerExecutor (LinuxContainerExecutor.java:deleteAsUser(537)) - Deleting absolute path : /opt/hdfsdisks/sdc/yarn/local/usercache/ojoqcu/appcache/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:09,115 INFO nodemanager.LinuxContainerExecutor (LinuxContainerExecutor.java:deleteAsUser(537)) - Deleting absolute path : /opt/hdfsdisks/sdd/yarn/local/usercache/ojoqcu/appcache/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:09,115 INFO nodemanager.LinuxContainerExecutor (LinuxContainerExecutor.java:deleteAsUser(537)) - Deleting absolute path : /opt/hdfsdisks/sde/yarn/local/usercache/ojoqcu/appcache/application_1481607361601_8315/container_e63_1481607361601_8315_02_000005
2016-12-22 09:10:09,115 WARN nodemanager.NMAuditLogger (NMAuditLogger.java:logFailure(150)) - USER=ojoqcu OPERATION=Container Finished - Failed TARGET=ContainerImpl RESULT=FAILURE DESCRIPTION=Container failed with state: EXITED_WITH_FAILURE APPID=application_1481607361601_8315 CONTAINERID=container_e63_1481607361601_8315_02_000005

At least one container is killed, probably due to an OutOfMemory error as the heap space overflows. Attached are the screenshots from the Spark UI. How shall I proceed?
Would using higher values for options like --driver-memory and --executor-memory help? Is there some Spark setting that needs to be changed via Ambari?
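Raising the container sizes at submit time is usually the first lever to try. A sketch only (the values are illustrative and the application name is hypothetical; tune against the NodeManager's yarn.nodemanager.resource.memory-mb and the queue limits):

```shell
# Sketch, not a verified command for this cluster: illustrative memory values.
# The executor container also carries off-heap overhead
# (spark.yarn.executor.memoryOverhead on Spark 1.6.x), so the JVM heap
# requested here must leave headroom below the YARN container limit.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 8g \
  --executor-memory 10g \
  --num-executors 10 \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  my_app.jar
```

The same properties can be set cluster-wide via Ambari (spark-defaults), but per-job flags are easier to experiment with.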
Labels:
- Apache Spark
- Apache YARN
11-23-2016
08:03 AM
2 Kudos
Production system: HDP-2.5.0.0 using Ambari 2.4.0.1. Plenty of demands are coming in for executing a range of code (Java MR etc., Scala, Spark, R) atop the HDP cluster but from a desktop Windows machine's IDE, i.e. write the code locally in the IDE but have it submitted to and run on the remote cluster, with the output printed back. For Spark and R, we have an RStudio set-up. The challenge lies with Java, Scala and so on; also, people use a range of IDEs from Eclipse to IntelliJ IDEA. I am aware that the Eclipse Hadoop plugin is NOT actively maintained and also has plenty of bugs when working with the latest versions of Hadoop; for IntelliJ IDEA, I couldn't find reliable inputs on the official website. I believe the Hive and HBase client APIs are a reliable way to connect from Eclipse etc., but I am skeptical about executing MR or other custom Java/Scala code. I referred to several threads like this and this; however, I still have the question: does any IDE like Eclipse/IntelliJ IDEA have official support for Hadoop? Even Spring Data for Hadoop seems to have lost traction; it didn't work as expected 2 years ago anyway 😉 As a realistic alternative, which tool/plugin/library should be used to test MR and other Java/Scala code 'locally', i.e. on the desktop machine using a standalone version of the cluster? Note: I do not wish to work against/in the sandbox; it's about connecting to the prod. cluster directly.
11-04-2016
09:47 AM
HDP-2.5.0.0 using Ambari 2.4.0.1. I was trying to execute a Hive query like:

SELECT DISTINCT ...... ,ROW_NUMBER() OVER(PARTITION BY .....)

when I got the error:

Error: Error while compiling statement: FAILED: SemanticException SELECT DISTINCT not allowed in the presence of windowing functions when CBO is off (state=42000,code=40000)

I ended up at this Apache Hive JIRA case. I have the following questions:
1. The said JIRA case talks about Hive versions 2.x. Given the current HDP version, I am using Hive 1.2.1.2.5, yet I get the same error (which is possible, logically).
2. The JIRA case was 'resolved' in mid-October and the patch seems to be available, but I guess it is for Hive 2.x. Are there ways to get notified of, track and implement these patches (here it's Hive but, in general, for all services) between two releases? I.e. now I am using 2.5, but do I have to live without the patches till the next version is released?
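Until the patch is available in the shipped Hive, a common workaround is to replace the DISTINCT with a GROUP BY in a subquery and apply the window function over the de-duplicated rows. A sketch only, against a hypothetical table t(col1, col2); not tested on this cluster:

```sql
-- Original shape (fails when CBO is off):
--   SELECT DISTINCT col1, col2,
--          ROW_NUMBER() OVER (PARTITION BY col1 ORDER BY col2)
--   FROM t;
-- Workaround: de-duplicate first with GROUP BY, then window over the result.
SELECT col1, col2,
       ROW_NUMBER() OVER (PARTITION BY col1 ORDER BY col2) AS rn
FROM (
  SELECT col1, col2
  FROM t
  GROUP BY col1, col2
) dedup;
```

Note this is only equivalent when the DISTINCT covers the whole non-windowed select list, as above.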
Labels:
- Hortonworks Data Platform (HDP)
10-13-2016
07:45 AM
I think your approach of 'load the data into orc partitioned tables using dynamic partitions pulling from the external tables' is good enough to achieve partitioning. I'm curious to try out whether a Hive-managed, dynamically PARTITIONED ORC table can be created directly from the external Avro-based table, so that one can specify only the partitioning key AND not the whole set of columns. Current Step 3:

create table dimoriginal_orc ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde' STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat' TBLPROPERTIES ('orc.compress'='ZLIB') AS select * from dimoriginal_avro_compressed;
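For reference, Hive 1.2 does not allow CTAS to target a partitioned table, so the usual pattern is a two-step create-then-insert with dynamic partitioning. A sketch assuming a hypothetical partition column load_date and placeholder columns:

```sql
-- Sketch only: the full column list still has to be spelled out in the DDL;
-- only the insert side benefits from dynamic partitioning.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

CREATE TABLE dimoriginal_orc (
  col1 STRING,
  col2 INT
)
PARTITIONED BY (load_date STRING)   -- hypothetical partition key
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'ZLIB');

-- The partition column must come last in the SELECT list.
INSERT OVERWRITE TABLE dimoriginal_orc PARTITION (load_date)
SELECT col1, col2, load_date
FROM dimoriginal_avro_compressed;
```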
10-12-2016
06:57 PM
4 Kudos
Upgraded to HDP-2.5.0.0 using Ambari 2.4.0.1. There are several SQL Server and Oracle database schemas that need to be imported to HDFS/Hive. The current approach is working fine:

1. Sqoop import from the RDBMS to HDFS in Avro format
2. Creation of a Hive external table atop the Avro files
3. Copying the data from the Hive external table into a managed ORC table via 'CREATE TABLE ... AS SELECT * FROM ...'

Many tables in the SQL Server and Oracle schemas are partitioned. As per the Sqoop documentation, at least for Oracle, it seems that the data on HDFS can be 'partitioned' based on the source table partitions; similar options don't seem to exist for SQL Server. I have the following questions:

1. Can Sqoop figure out the column(s) on which the source table is partitioned?
2. Irrespective of the source db, can the files resulting from Step 1 above be 'partitioned' (stored in different directories) on HDFS?
3. Assuming that partitioning won't help in Step 1, would it make sense in Step 2? If yes, will the ORC table in Step 3 inherit the partitions?
4. Assuming that partitioning is possible only in Step 3: a repetition of question 1. Can the table's partitioning column be determined automatically and used automatically as well? The Sqoop create-hive-table tool doesn't help with partitioning; also, this approach means hitting the source db again, even though just for the metadata.
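For what it's worth, Sqoop's Hive options only support a static partition value per run; the key and value must be supplied manually, Sqoop does not detect the source partitioning. A sketch with hypothetical schema/table/column names, not a verified command:

```shell
# Sketch only: --hive-partition-key/--hive-partition-value write ONE static
# partition per run, so a partitioned source table needs one invocation per
# source partition, with --where selecting the matching slice.
sqoop import \
  --connect 'jdbc:sqlserver://server_name;database=SOME_DB' \
  --username uname --password-file /user/uname/.pw \
  --table SomeTable \
  --where "load_date = '2016-10-01'" \
  --hive-import --hive-table some_db.sometable \
  --hive-partition-key load_date \
  --hive-partition-value '2016-10-01'
```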
Labels:
- Apache Hive
- Apache Sqoop
10-11-2016
10:35 AM
It worked :)
Can you provide the JIRA bug link?
10-11-2016
09:36 AM
HDP-2.5.0.0 using Ambari 2.4.0.1. A Sqoop import to Avro fails with the following error:

16/10/11 08:26:32 INFO mapreduce.Job: Job job_1476162030393_0002 running in uber mode : false
16/10/11 08:26:32 INFO mapreduce.Job: map 0% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job: map 25% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:41 INFO mapreduce.Job: map 0% reduce 0%
16/10/11 08:26:42 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:46 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:48 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:52 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:57 INFO mapreduce.Job: map 100% reduce 0%
16/10/11 08:26:57 INFO mapreduce.Job: Job job_1476162030393_0002 failed with state FAILED due to: Task failed task_1476162030393_0002_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0

The YARN application log ends with:

FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
at org.apache.sqoop.mapreduce.AvroOutputFormat.getRecordWriter(AvroOutputFormat.java:97)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:647)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

The original installation had the following libraries under /usr/hdp/2.5.0.0-1245/sqoop/lib: avro-mapred-1.8.0-hadoop2.jar, parquet-avro-1.4.1.jar, avro-1.8.0.jar. I first tried replacing (only one jar at a time under lib) avro-mapred-1.8.0-hadoop2.jar with avro-mapred-1.8.1-hadoop2.jar and avro-mapred-1.7.7-hadoop2.jar. When that didn't help, I tried using the jars from the HDP 2.4 distribution, viz. avro-1.7.5.jar and avro-mapred-1.7.5-hadoop2.jar, yet the error persisted. How shall I fix the error?
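One commonly suggested remedy for this class of NoSuchMethodError (Sqoop ships Avro 1.8.x while Hadoop's own classpath carries an older Avro without addLogicalTypeConversion) is to make the task containers prefer the job's jars over Hadoop's. A sketch with hypothetical connection details, not verified on this cluster:

```shell
# Sketch only: -D flags must come right after 'import'.
# mapreduce.job.user.classpath.first=true makes the MR tasks load the jars
# submitted with the job (Sqoop's Avro 1.8.x) before Hadoop's older Avro.
sqoop import -Dmapreduce.job.user.classpath.first=true \
  --connect 'jdbc:sqlserver://server_name;database=SOME_DB' \
  --username uname --password-file /user/uname/.pw \
  --table SomeTable \
  --as-avrodatafile \
  --target-dir /user/uname/sometable_avro
```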
Labels:
- Apache Sqoop
09-23-2016
07:39 AM
1 Kudo
HDP-2.5.0.0 using Ambari 2.4.0.1 (both upgraded from HDP 2.4 installations). I had used the following command to import a SQL Server table into an HCatalog MANAGED table in ORC format:

sqoop import --null-string '\\N' --null-non-string '\\N' --hive-delims-replacement '\0D' --num-mappers 8 --hcatalog-home /usr/hdp/current/hive-webhcat --hcatalog-database VERA_ODP_DW_dbo --hcatalog-table DimSnapshot --create-hcatalog-table --hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="ZLIB")' --validate --connect 'jdbc:sqlserver://server_name;database=ODP_DW' --username uname --password passwd --table DimSnapshot -- --schema dbo 2>&1| tee -a ODP_DW_dbo.DimSnapshot.log

Now, I wish to do an incremental load IN A SINGLE STEP, but I am facing the following challenges (I am testing with just one row to start with):
1. Sqoop doesn't support HCatalog incremental load:

sqoop import --null-string '\\N' --null-non-string '\\N' --hive-delims-replacement '\0D' --hcatalog-home /usr/hdp/current/hive-webhcat --hcatalog-database VERA_ODP_DW_dbo --hcatalog-table dimsnapshot --incremental append --table dimsnapshot --check-column SnapShot_Id --last-value 10456476 --connect 'jdbc:sqlserver://server_name;database=ODP_DW' --username uname --password passwd -- --schema dbo
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/09/22 16:36:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
16/09/22 16:36:47 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/09/22 16:36:47 WARN tool.BaseSqoopTool: Output field/record delimiter options are not useful in HCatalog jobs for most of the output types except text based formats is text. It is better to use --hive-import in those cases. For non text formats,
Append mode for imports is not compatible with HCatalog. Please remove the parameter--append-mode
2. The --hive-import works (I was forced to use --map-column-hive for a binary column in the source db which was imported without such a mapping during the first HCatalog import) but fails for the ORC format; the part files are, however, created on HDFS under /user/<user-name>/dimsnapshot:

-bash-4.2$ sqoop import --null-string '\\N' --null-non-string '\\N' --hive-delims-replacement '\0D' --incremental append --table dimsnapshot --check-column SnapShot_Id --last-value 10456476 --hive-import --hive-table vera_odp_dw_dbo.dimsnapshot --map-column-hive ENGINE_RUNTIME_UNIT=binary --connect 'jdbc:sqlserver://server_name;database=ODP_DW' --username uname --password passwd --table dimsnapshot -- --schema dbo
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/09/23 09:11:44 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
16/09/23 09:11:44 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/09/23 09:11:44 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/09/23 09:11:44 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/09/23 09:11:44 INFO manager.SqlManager: Using default fetchSize of 1000
16/09/23 09:11:44 INFO manager.SQLServerManager: We will use schema dbo
16/09/23 09:11:44 INFO tool.CodeGenTool: Beginning code generation
16/09/23 09:11:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [dbo].[dimsnapshot] AS t WHERE 1=0
16/09/23 09:11:45 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-ojoqcu/compile/61154b6349bbdd9a60aba2b4a1d4a919/dimsnapshot.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/09/23 09:11:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-ojoqcu/compile/61154b6349bbdd9a60aba2b4a1d4a919/dimsnapshot.jar
16/09/23 09:11:48 INFO tool.ImportTool: Maximal id query for free form incremental import: SELECT MAX([SnapShot_Id]) FROM [dbo].[dimsnapshot]
16/09/23 09:11:48 INFO tool.ImportTool: Incremental import based on column [SnapShot_Id]
16/09/23 09:11:48 INFO tool.ImportTool: Lower bound value: 10456476
16/09/23 09:11:48 INFO tool.ImportTool: Upper bound value: 10456477
16/09/23 09:11:48 INFO mapreduce.ImportJobBase: Beginning import of dimsnapshot
16/09/23 09:11:48 INFO impl.TimelineClientImpl: Timeline service address: http://l4373t.sss.com:8188/ws/v1/timeline/
16/09/23 09:11:48 INFO client.RMProxy: Connecting to ResourceManager at l4283t.sss.com/138.106.9.80:8050
16/09/23 09:11:49 INFO client.AHSProxy: Connecting to Application History server at l4373t.sss.com/138.106.5.5:10200
16/09/23 09:11:50 INFO db.DBInputFormat: Using read commited transaction isolation
16/09/23 09:11:50 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN([SnapShot_Id]), MAX([SnapShot_Id]) FROM [dbo].[dimsnapshot] WHERE ( [SnapShot_Id] > 10456476 AND [SnapShot_Id] <= 10456477 )
16/09/23 09:11:50 INFO db.IntegerSplitter: Split size: 0; Num splits: 4 from: 10456477 to: 10456477
16/09/23 09:11:50 INFO mapreduce.JobSubmitter: number of splits:1
16/09/23 09:11:50 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1474453939755_0013
16/09/23 09:11:51 INFO impl.YarnClientImpl: Submitted application application_1474453939755_0013
16/09/23 09:11:51 INFO mapreduce.Job: The url to track the job: http://l4283t.sss.com:8088/proxy/application_1474453939755_0013/
16/09/23 09:11:51 INFO mapreduce.Job: Running job: job_1474453939755_0013
16/09/23 09:11:58 INFO mapreduce.Job: Job job_1474453939755_0013 running in uber mode : false
16/09/23 09:11:58 INFO mapreduce.Job: map 0% reduce 0%
16/09/23 09:12:04 INFO mapreduce.Job: map 100% reduce 0%
16/09/23 09:12:04 INFO mapreduce.Job: Job job_1474453939755_0013 completed successfully
16/09/23 09:12:04 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=164186
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=131
HDFS: Number of bytes written=217
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Launched map tasks=1
Other local map tasks=1
Total time spent by all maps in occupied slots (ms)=3973
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=3973
Total vcore-milliseconds taken by all map tasks=3973
Total megabyte-milliseconds taken by all map tasks=16273408
Map-Reduce Framework
Map input records=1
Map output records=1
Input split bytes=131
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=71
CPU time spent (ms)=1440
Physical memory (bytes) snapshot=234348544
Virtual memory (bytes) snapshot=5465473024
Total committed heap usage (bytes)=217055232
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=217
16/09/23 09:12:04 INFO mapreduce.ImportJobBase: Transferred 217 bytes in 16.2893 seconds (13.3216 bytes/sec)
16/09/23 09:12:04 INFO mapreduce.ImportJobBase: Retrieved 1 records.
16/09/23 09:12:04 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
16/09/23 09:12:04 INFO util.AppendUtils: Appending to directory dimsnapshot
16/09/23 09:12:04 INFO util.AppendUtils: Using found partition 4
16/09/23 09:12:04 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [dbo].[dimsnapshot] AS t WHERE 1=0
16/09/23 09:12:04 WARN hive.TableDefWriter: Column od_import_dt had to be cast to a less precise type in Hive
16/09/23 09:12:04 WARN hive.TableDefWriter: Column snapshot_export_dt had to be cast to a less precise type in Hive
16/09/23 09:12:04 WARN hive.TableDefWriter: Column INSERTION_DATE had to be cast to a less precise type in Hive
16/09/23 09:12:04 WARN hive.TableDefWriter: Column utcDate had to be cast to a less precise type in Hive
16/09/23 09:12:04 INFO hive.HiveImport: Loading uploaded data into Hive
16/09/23 09:12:04 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
16/09/23 09:12:04 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
Logging initialized using configuration in jar:file:/usr/hdp/2.5.0.0-1245/hive/lib/hive-common-1.2.1000.2.5.0.0-1245.jar!/hive-log4j.properties
OK
Time taken: 2.031 seconds
FAILED: SemanticException [Error 30019]: The file that you are trying to load does not match the file format of the destination table. Destination table is stored as ORC but the file being loaded is not a valid ORC file.
-bash-4.2$

How shall I proceed?
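The SemanticException above arises because --hive-import writes text files, which Hive then tries to LOAD into an ORC-backed table. A workaround often used for this limitation is a two-step load: let the incremental Sqoop import land in a plain text staging table, then append into the ORC table from Hive. A sketch with a hypothetical managed, text-format staging table name (dimsnapshot_staging), not verified on this cluster:

```sql
-- Sketch only: dimsnapshot_staging is a hypothetical text-format staging table
-- that the incremental Sqoop --hive-import targets instead of the ORC table.
-- LOAD DATA cannot convert formats, but INSERT ... SELECT rewrites rows as ORC.
INSERT INTO TABLE vera_odp_dw_dbo.dimsnapshot
SELECT * FROM vera_odp_dw_dbo.dimsnapshot_staging;

-- Clear the staging table for the next incremental run.
TRUNCATE TABLE vera_odp_dw_dbo.dimsnapshot_staging;
```

It is two steps rather than one, but both can be wrapped in a single driver script around the Sqoop job.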
Labels:
- Apache HCatalog
- Apache Sqoop
09-09-2016
02:46 PM
'a defined HDP version - not necessarily the latest': yes, that's correct. 'possibly a "built in" data set that is provisioned with the VM': the 'built-in' data set will be different/customized for each VM spawned (as it will be used by different roles). The tutorial seems informative, but I have a question: can Vagrant connect to the prod. cluster WITHOUT MAJOR changes to the prod. machines and spawn VMs as required with custom data sets? Apologies if it sounds stupid, but I'm unable to visualize how Vagrant will work with the prod. cluster.