Ivy Default Cache set to: /home/spark/.ivy2/cache
The jars for the packages stored in: /home/spark/.ivy2/jars
http://repo.hortonworks.com/content/groups/public/ added as a remote repository with the name: repo-1
:: loading settings :: url = jar:file:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.hortonworks#shc-core added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found com.hortonworks#shc-core;1.0.1-1.6-s_2.10 in repo-1
    found org.apache.hbase#hbase-server;1.1.2 in repo-1
    found org.apache.hbase#hbase-protocol;1.1.2 in repo-1
    found org.apache.hbase#hbase-annotations;1.1.2 in repo-1
    found com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 in repo-1
    found log4j#log4j;1.2.17 in repo-1
    found junit#junit;4.11 in repo-1
    found org.hamcrest#hamcrest-core;1.3 in repo-1
    found com.google.protobuf#protobuf-java;2.5.0 in repo-1
    found org.apache.hbase#hbase-procedure;1.1.2 in repo-1
    found com.google.guava#guava;12.0.1 in repo-1
    found com.google.code.findbugs#jsr305;1.3.9 in repo-1
    found org.apache.hbase#hbase-client;1.1.2 in repo-1
    found commons-codec#commons-codec;1.9 in repo-1
    found commons-io#commons-io;2.4 in repo-1
    found commons-lang#commons-lang;2.6 in repo-1
    found io.netty#netty-all;4.0.23.Final in repo-1
    found org.apache.zookeeper#zookeeper;3.4.6 in repo-1
    found org.slf4j#slf4j-api;1.7.7 in repo-1
    found org.slf4j#slf4j-log4j12;1.6.1 in repo-1
    found org.apache.htrace#htrace-core;3.1.0-incubating in repo-1
    found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in repo-1
    found org.codehaus.jackson#jackson-core-asl;1.9.13 in repo-1
    found org.jruby.jcodings#jcodings;1.0.8 in repo-1
    found org.jruby.joni#joni;2.1.2 in repo-1
    found commons-httpclient#commons-httpclient;3.1 in repo-1
    found commons-collections#commons-collections;3.2.1 in repo-1
    found com.yammer.metrics#metrics-core;2.2.0 in repo-1
    found com.sun.jersey#jersey-core;1.9 in repo-1
    found com.sun.jersey#jersey-server;1.9 in repo-1
    found commons-cli#commons-cli;1.2 in repo-1
    found org.apache.commons#commons-math;2.2 in repo-1
    found org.mortbay.jetty#jetty;6.1.26 in repo-1
    found org.mortbay.jetty#jetty-util;6.1.26 in repo-1
    found org.mortbay.jetty#jetty-sslengine;6.1.26 in repo-1
    found org.mortbay.jetty#jsp-2.1;6.1.14 in repo-1
    found org.mortbay.jetty#jsp-api-2.1;6.1.14 in repo-1
    found org.mortbay.jetty#servlet-api-2.5;6.1.14 in repo-1
    found org.codehaus.jackson#jackson-jaxrs;1.9.13 in repo-1
    found tomcat#jasper-compiler;5.5.23 in repo-1
    found org.jamon#jamon-runtime;2.3.1 in repo-1
    found com.lmax#disruptor;3.3.0 in repo-1
    found org.apache.hbase#hbase-prefix-tree;1.1.2 in repo-1
    found org.mortbay.jetty#servlet-api;2.5-20081211 in repo-1
    found tomcat#jasper-runtime;5.5.23 in repo-1
    found commons-el#commons-el;1.0 in repo-1
    found org.apache.hbase#hbase-common;1.1.2 in repo-1
    found org.apache.avro#avro;1.7.6 in repo-1
    found com.thoughtworks.paranamer#paranamer;2.3 in repo-1
    found org.xerial.snappy#snappy-java;1.0.5 in repo-1
    found org.apache.commons#commons-compress;1.4.1 in repo-1
    found org.tukaani#xz;1.0 in repo-1
:: resolution report :: resolve 7767ms :: artifacts dl 27ms
    :: modules in use:
    com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 from repo-1 in [default]
    com.google.code.findbugs#jsr305;1.3.9 from repo-1 in [default]
    com.google.guava#guava;12.0.1 from repo-1 in [default]
    com.google.protobuf#protobuf-java;2.5.0 from repo-1 in [default]
    com.hortonworks#shc-core;1.0.1-1.6-s_2.10 from repo-1 in [default]
    com.lmax#disruptor;3.3.0 from repo-1 in [default]
    com.sun.jersey#jersey-core;1.9 from repo-1 in [default]
    com.sun.jersey#jersey-server;1.9 from repo-1 in [default]
    com.thoughtworks.paranamer#paranamer;2.3 from repo-1 in [default]
    com.yammer.metrics#metrics-core;2.2.0 from repo-1 in [default]
    commons-cli#commons-cli;1.2 from repo-1 in [default]
    commons-codec#commons-codec;1.9 from repo-1 in [default]
    commons-collections#commons-collections;3.2.1 from repo-1 in [default]
    commons-el#commons-el;1.0 from repo-1 in [default]
    commons-httpclient#commons-httpclient;3.1 from repo-1 in [default]
    commons-io#commons-io;2.4 from repo-1 in [default]
    commons-lang#commons-lang;2.6 from repo-1 in [default]
    io.netty#netty-all;4.0.23.Final from repo-1 in [default]
    junit#junit;4.11 from repo-1 in [default]
    log4j#log4j;1.2.17 from repo-1 in [default]
    org.apache.avro#avro;1.7.6 from repo-1 in [default]
    org.apache.commons#commons-compress;1.4.1 from repo-1 in [default]
    org.apache.commons#commons-math;2.2 from repo-1 in [default]
    org.apache.hbase#hbase-annotations;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-client;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-common;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-prefix-tree;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-procedure;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-protocol;1.1.2 from repo-1 in [default]
    org.apache.hbase#hbase-server;1.1.2 from repo-1 in [default]
    org.apache.htrace#htrace-core;3.1.0-incubating from repo-1 in [default]
    org.apache.zookeeper#zookeeper;3.4.6 from repo-1 in [default]
    org.codehaus.jackson#jackson-core-asl;1.9.13 from repo-1 in [default]
    org.codehaus.jackson#jackson-jaxrs;1.9.13 from repo-1 in [default]
    org.codehaus.jackson#jackson-mapper-asl;1.9.13 from repo-1 in [default]
    org.hamcrest#hamcrest-core;1.3 from repo-1 in [default]
    org.jamon#jamon-runtime;2.3.1 from repo-1 in [default]
    org.jruby.jcodings#jcodings;1.0.8 from repo-1 in [default]
    org.jruby.joni#joni;2.1.2 from repo-1 in [default]
    org.mortbay.jetty#jetty;6.1.26 from repo-1 in [default]
    org.mortbay.jetty#jetty-sslengine;6.1.26 from repo-1 in [default]
    org.mortbay.jetty#jetty-util;6.1.26 from repo-1 in [default]
    org.mortbay.jetty#jsp-2.1;6.1.14 from repo-1 in [default]
    org.mortbay.jetty#jsp-api-2.1;6.1.14 from repo-1 in [default]
    org.mortbay.jetty#servlet-api;2.5-20081211 from repo-1 in [default]
    org.mortbay.jetty#servlet-api-2.5;6.1.14 from repo-1 in [default]
    org.slf4j#slf4j-api;1.7.7 from repo-1 in [default]
    org.slf4j#slf4j-log4j12;1.6.1 from repo-1 in [default]
    org.tukaani#xz;1.0 from repo-1 in [default]
    org.xerial.snappy#snappy-java;1.0.5 from repo-1 in [default]
    tomcat#jasper-compiler;5.5.23 from repo-1 in [default]
    tomcat#jasper-runtime;5.5.23 from repo-1 in [default]
    :: evicted modules:
    org.slf4j#slf4j-api;1.6.4 by [org.slf4j#slf4j-api;1.7.7] in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   53  |   7   |   7   |   1   ||   52  |   0   |
    ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 52 already retrieved (0kB/41ms)
17/01/31 10:40:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 10:40:28 INFO SecurityManager: Changing view acls to: spark
17/01/31 10:40:28 INFO SecurityManager: Changing modify acls to: spark
17/01/31 10:40:28 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
17/01/31 10:40:28 INFO HttpServer: Starting HTTP Server
17/01/31 10:40:28 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/31 10:40:28 INFO AbstractConnector: Started SocketConnector@0.0.0.0:45774
17/01/31 10:40:28 INFO Utils: Successfully started service 'HTTP class server' on port 45774.
17/01/31 10:40:31 INFO SparkContext: Running Spark version 1.6.1
17/01/31 10:40:31 INFO SecurityManager: Changing view acls to: spark
17/01/31 10:40:31 INFO SecurityManager: Changing modify acls to: spark
17/01/31 10:40:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
17/01/31 10:40:31 INFO Utils: Successfully started service 'sparkDriver' on port 38436.
17/01/31 10:40:32 INFO Slf4jLogger: Slf4jLogger started
17/01/31 10:40:32 INFO Remoting: Starting remoting
17/01/31 10:40:32 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 52800.
17/01/31 10:40:32 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@167.114.241.46:52800]
17/01/31 10:40:32 INFO SparkEnv: Registering MapOutputTracker
17/01/31 10:40:32 INFO SparkEnv: Registering BlockManagerMaster
17/01/31 10:40:32 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-8c2dc144-a89d-4d85-a52a-87d9ea6ab311
17/01/31 10:40:32 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
17/01/31 10:40:32 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/31 10:40:32 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/31 10:40:32 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
17/01/31 10:40:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/31 10:40:32 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://167.114.241.46:4040
17/01/31 10:40:32 INFO HttpFileServer: HTTP File server directory is /tmp/spark-c48591a5-750f-4b19-852a-1ece60fa652c/httpd-642cdc30-5261-4c7d-8c8b-21757cf856db
17/01/31 10:40:32 INFO HttpServer: Starting HTTP Server
17/01/31 10:40:32 INFO Server: jetty-8.y.z-SNAPSHOT
17/01/31 10:40:32 INFO AbstractConnector: Started SocketConnector@0.0.0.0:46726
17/01/31 10:40:32 INFO Utils: Successfully started service 'HTTP file server' on port 46726.
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/usr/hdp/current/phoenix-client/phoenix-server.jar at http://167.114.241.46:46726/jars/phoenix-server.jar with timestamp 1485859232676
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.hortonworks_shc-core-1.0.1-1.6-s_2.10.jar at http://167.114.241.46:46726/jars/com.hortonworks_shc-core-1.0.1-1.6-s_2.10.jar with timestamp 1485859232678
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-server-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-server-1.1.2.jar with timestamp 1485859232689
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-common-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-common-1.1.2.jar with timestamp 1485859232691
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.avro_avro-1.7.6.jar at http://167.114.241.46:46726/jars/org.apache.avro_avro-1.7.6.jar with timestamp 1485859232692
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-protocol-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-protocol-1.1.2.jar with timestamp 1485859232705
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-procedure-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-procedure-1.1.2.jar with timestamp 1485859232706
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-client-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-client-1.1.2.jar with timestamp 1485859232710
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-httpclient_commons-httpclient-3.1.jar at http://167.114.241.46:46726/jars/commons-httpclient_commons-httpclient-3.1.jar with timestamp 1485859232711
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-codec_commons-codec-1.9.jar at http://167.114.241.46:46726/jars/commons-codec_commons-codec-1.9.jar with timestamp 1485859232712
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-collections_commons-collections-3.2.1.jar at http://167.114.241.46:46726/jars/commons-collections_commons-collections-3.2.1.jar with timestamp 1485859232713
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at http://167.114.241.46:46726/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1485859232714
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.google.guava_guava-12.0.1.jar at http://167.114.241.46:46726/jars/com.google.guava_guava-12.0.1.jar with timestamp 1485859232720
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.google.protobuf_protobuf-java-2.5.0.jar at http://167.114.241.46:46726/jars/com.google.protobuf_protobuf-java-2.5.0.jar with timestamp 1485859232722
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.sun.jersey_jersey-core-1.9.jar at http://167.114.241.46:46726/jars/com.sun.jersey_jersey-core-1.9.jar with timestamp 1485859232723
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.sun.jersey_jersey-server-1.9.jar at http://167.114.241.46:46726/jars/com.sun.jersey_jersey-server-1.9.jar with timestamp 1485859232725
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-cli_commons-cli-1.2.jar at http://167.114.241.46:46726/jars/commons-cli_commons-cli-1.2.jar with timestamp 1485859232726
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-io_commons-io-2.4.jar at http://167.114.241.46:46726/jars/commons-io_commons-io-2.4.jar with timestamp 1485859232726
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-lang_commons-lang-2.6.jar at http://167.114.241.46:46726/jars/commons-lang_commons-lang-2.6.jar with timestamp 1485859232727
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.commons_commons-math-2.2.jar at http://167.114.241.46:46726/jars/org.apache.commons_commons-math-2.2.jar with timestamp 1485859232731
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/log4j_log4j-1.2.17.jar at http://167.114.241.46:46726/jars/log4j_log4j-1.2.17.jar with timestamp 1485859232732
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.zookeeper_zookeeper-3.4.6.jar at http://167.114.241.46:46726/jars/org.apache.zookeeper_zookeeper-3.4.6.jar with timestamp 1485859232735
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_jetty-6.1.26.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_jetty-6.1.26.jar with timestamp 1485859232736
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_jetty-util-6.1.26.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_jetty-util-6.1.26.jar with timestamp 1485859232737
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_jetty-sslengine-6.1.26.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_jetty-sslengine-6.1.26.jar with timestamp 1485859232737
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_jsp-2.1-6.1.14.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_jsp-2.1-6.1.14.jar with timestamp 1485859232740
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_jsp-api-2.1-6.1.14.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_jsp-api-2.1-6.1.14.jar with timestamp 1485859232741
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_servlet-api-2.5-6.1.14.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_servlet-api-2.5-6.1.14.jar with timestamp 1485859232741
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.codehaus.jackson_jackson-core-asl-1.9.13.jar at http://167.114.241.46:46726/jars/org.codehaus.jackson_jackson-core-asl-1.9.13.jar with timestamp 1485859232742
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.codehaus.jackson_jackson-mapper-asl-1.9.13.jar at http://167.114.241.46:46726/jars/org.codehaus.jackson_jackson-mapper-asl-1.9.13.jar with timestamp 1485859232744
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.codehaus.jackson_jackson-jaxrs-1.9.13.jar at http://167.114.241.46:46726/jars/org.codehaus.jackson_jackson-jaxrs-1.9.13.jar with timestamp 1485859232744
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/tomcat_jasper-compiler-5.5.23.jar at http://167.114.241.46:46726/jars/tomcat_jasper-compiler-5.5.23.jar with timestamp 1485859232746
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.jamon_jamon-runtime-2.3.1.jar at http://167.114.241.46:46726/jars/org.jamon_jamon-runtime-2.3.1.jar with timestamp 1485859232746
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/io.netty_netty-all-4.0.23.Final.jar at http://167.114.241.46:46726/jars/io.netty_netty-all-4.0.23.Final.jar with timestamp 1485859232750
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.htrace_htrace-core-3.1.0-incubating.jar at http://167.114.241.46:46726/jars/org.apache.htrace_htrace-core-3.1.0-incubating.jar with timestamp 1485859232754
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.lmax_disruptor-3.3.0.jar at http://167.114.241.46:46726/jars/com.lmax_disruptor-3.3.0.jar with timestamp 1485859232755
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar at http://167.114.241.46:46726/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar with timestamp 1485859232755
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/junit_junit-4.11.jar at http://167.114.241.46:46726/jars/junit_junit-4.11.jar with timestamp 1485859232756
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-annotations-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-annotations-1.1.2.jar with timestamp 1485859232756
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.hamcrest_hamcrest-core-1.3.jar at http://167.114.241.46:46726/jars/org.hamcrest_hamcrest-core-1.3.jar with timestamp 1485859232757
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.google.code.findbugs_jsr305-1.3.9.jar at http://167.114.241.46:46726/jars/com.google.code.findbugs_jsr305-1.3.9.jar with timestamp 1485859232757
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.jruby.jcodings_jcodings-1.0.8.jar at http://167.114.241.46:46726/jars/org.jruby.jcodings_jcodings-1.0.8.jar with timestamp 1485859232761
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.jruby.joni_joni-2.1.2.jar at http://167.114.241.46:46726/jars/org.jruby.joni_joni-2.1.2.jar with timestamp 1485859232761
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.slf4j_slf4j-api-1.7.7.jar at http://167.114.241.46:46726/jars/org.slf4j_slf4j-api-1.7.7.jar with timestamp 1485859232762
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.slf4j_slf4j-log4j12-1.6.1.jar at http://167.114.241.46:46726/jars/org.slf4j_slf4j-log4j12-1.6.1.jar with timestamp 1485859232762
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.hbase_hbase-prefix-tree-1.1.2.jar at http://167.114.241.46:46726/jars/org.apache.hbase_hbase-prefix-tree-1.1.2.jar with timestamp 1485859232762
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/tomcat_jasper-runtime-5.5.23.jar at http://167.114.241.46:46726/jars/tomcat_jasper-runtime-5.5.23.jar with timestamp 1485859232763
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.mortbay.jetty_servlet-api-2.5-20081211.jar at http://167.114.241.46:46726/jars/org.mortbay.jetty_servlet-api-2.5-20081211.jar with timestamp 1485859232763
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/commons-el_commons-el-1.0.jar at http://167.114.241.46:46726/jars/commons-el_commons-el-1.0.jar with timestamp 1485859232764
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/com.thoughtworks.paranamer_paranamer-2.3.jar at http://167.114.241.46:46726/jars/com.thoughtworks.paranamer_paranamer-2.3.jar with timestamp 1485859232764
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.xerial.snappy_snappy-java-1.0.5.jar at http://167.114.241.46:46726/jars/org.xerial.snappy_snappy-java-1.0.5.jar with timestamp 1485859232768
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.apache.commons_commons-compress-1.4.1.jar at http://167.114.241.46:46726/jars/org.apache.commons_commons-compress-1.4.1.jar with timestamp 1485859232769
17/01/31 10:40:32 INFO SparkContext: Added JAR file:/home/spark/.ivy2/jars/org.tukaani_xz-1.0.jar at http://167.114.241.46:46726/jars/org.tukaani_xz-1.0.jar with timestamp 1485859232769
17/01/31 10:40:33 INFO TimelineClientImpl: Timeline service address: http://hdp-node14.affinytix.com:8188/ws/v1/timeline/
17/01/31 10:40:33 INFO RMProxy: Connecting to ResourceManager at hdp-node13.affinytix.com/167.114.241.49:8050
17/01/31 10:40:34 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/01/31 10:40:34 INFO Client: Requesting a new application from cluster with 21 NodeManagers
17/01/31 10:40:34 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (4096 MB per container)
17/01/31 10:40:34 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/01/31 10:40:34 INFO Client: Setting up container launch context for our AM
17/01/31 10:40:34 INFO Client: Setting up the launch environment for our AM container
17/01/31 10:40:34 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://hdp-node10.affinytix.com:8020/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar
17/01/31 10:40:34 INFO Client: Preparing resources for our AM container
17/01/31 10:40:34 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://hdp-node10.affinytix.com:8020/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar
17/01/31 10:40:34 INFO Client: Source and destination file systems are the same. Not copying hdfs://hdp-node10.affinytix.com:8020/hdp/apps/2.4.2.0-258/spark/spark-hdp-assembly.jar
17/01/31 10:40:34 INFO Client: Uploading resource file:/usr/hdp/current/hbase-client/conf/hbase-site.xml -> hdfs://hdp-node10.affinytix.com:8020/user/spark/.sparkStaging/application_1485852877339_0007/hbase-site.xml
17/01/31 10:40:34 INFO Client: Uploading resource file:/usr/hdp/current/hive-client/conf/hive-site.xml -> hdfs://hdp-node10.affinytix.com:8020/user/spark/.sparkStaging/application_1485852877339_0007/hive-site.xml
17/01/31 10:40:35 INFO Client: Uploading resource file:/tmp/spark-c48591a5-750f-4b19-852a-1ece60fa652c/__spark_conf__3792827788098058013.zip -> hdfs://hdp-node10.affinytix.com:8020/user/spark/.sparkStaging/application_1485852877339_0007/__spark_conf__3792827788098058013.zip
17/01/31 10:40:35 INFO SecurityManager: Changing view acls to: spark
17/01/31 10:40:35 INFO SecurityManager: Changing modify acls to: spark
17/01/31 10:40:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
17/01/31 10:40:35 INFO Client: Submitting application 7 to ResourceManager
17/01/31 10:40:35 INFO YarnClientImpl: Submitted application application_1485852877339_0007
17/01/31 10:40:35 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1485852877339_0007 and attemptId None
17/01/31 10:40:36 INFO Client: Application report for application_1485852877339_0007 (state: ACCEPTED)
17/01/31 10:40:36 INFO Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: N/A
    ApplicationMaster RPC port: -1
    queue: default
    start time: 1485859235240
    final status: UNDEFINED
    tracking URL: http://hdp-node13.affinytix.com:8088/proxy/application_1485852877339_0007/
    user: spark
17/01/31 10:40:37 INFO Client: Application report for application_1485852877339_0007 (state: ACCEPTED)
17/01/31 10:40:38 INFO Client: Application report for application_1485852877339_0007 (state: ACCEPTED)
17/01/31 10:40:39 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
17/01/31 10:40:39 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> hdp-node13.affinytix.com, PROXY_URI_BASES -> http://hdp-node13.affinytix.com:8088/proxy/application_1485852877339_0007), /proxy/application_1485852877339_0007
17/01/31 10:40:39 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/01/31 10:40:39 INFO Client: Application report for application_1485852877339_0007 (state: RUNNING)
17/01/31 10:40:39 INFO Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: 167.114.241.4
    ApplicationMaster RPC port: 0
    queue: default
    start time: 1485859235240
    final status: UNDEFINED
    tracking URL: http://hdp-node13.affinytix.com:8088/proxy/application_1485852877339_0007/
    user: spark
17/01/31 10:40:39 INFO YarnClientSchedulerBackend: Application application_1485852877339_0007 has started running.
17/01/31 10:40:39 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42414.
17/01/31 10:40:39 INFO NettyBlockTransferService: Server created on 42414
17/01/31 10:40:39 INFO BlockManagerMaster: Trying to register BlockManager
17/01/31 10:40:39 INFO BlockManagerMasterEndpoint: Registering block manager 167.114.241.46:42414 with 511.1 MB RAM, BlockManagerId(driver, 167.114.241.46, 42414)
17/01/31 10:40:39 INFO BlockManagerMaster: Registered BlockManager
17/01/31 10:40:39 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1485852877339_0007
17/01/31 10:40:43 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdp-node16.affinytix.com:42746) with ID 1
17/01/31 10:40:43 INFO BlockManagerMasterEndpoint: Registering block manager hdp-node16.affinytix.com:34636 with 511.1 MB RAM, BlockManagerId(1, hdp-node16.affinytix.com, 34636)
17/01/31 10:40:44 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (hdp-node9.affinytix.com:45774) with ID 2
17/01/31 10:40:44 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/01/31 10:40:44 INFO SparkILoop: Created spark context..
17/01/31 10:40:44 INFO BlockManagerMasterEndpoint: Registering block manager hdp-node9.affinytix.com:56360 with 511.1 MB RAM, BlockManagerId(2, hdp-node9.affinytix.com, 56360)
17/01/31 10:40:45 INFO HiveContext: Initializing execution hive, version 1.2.1
17/01/31 10:40:45 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.2.0-258
17/01/31 10:40:45 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.2.0-258
17/01/31 10:40:45 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/01/31 10:40:45 INFO ObjectStore: ObjectStore, initialize called
17/01/31 10:40:45 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/01/31 10:40:45 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/01/31 10:40:48 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/01/31 10:40:49 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/31 10:40:49 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/31 10:40:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/01/31 10:40:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/01/31 10:40:52 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/01/31 10:40:52 INFO ObjectStore: Initialized ObjectStore
17/01/31 10:40:52 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/01/31 10:40:52 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/01/31 10:40:52 INFO HiveMetaStore: Added admin role in metastore
17/01/31 10:40:52 INFO HiveMetaStore: Added public role in metastore
17/01/31 10:40:52 INFO HiveMetaStore: No user is added in admin role, since config is empty
17/01/31 10:40:52 INFO HiveMetaStore: 0: get_all_databases
17/01/31 10:40:52 INFO audit: ugi=spark ip=unknown-ip-addr cmd=get_all_databases
17/01/31 10:40:52 INFO HiveMetaStore: 0: get_functions: db=default pat=*
17/01/31 10:40:52 INFO audit: ugi=spark ip=unknown-ip-addr cmd=get_functions: db=default pat=*
17/01/31 10:40:52 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
17/01/31 10:40:53 INFO SessionState: Created local directory: /tmp/97bb9cd6-c2cf-43a4-83d3-342e1e03f2f6_resources
17/01/31 10:40:53 INFO SessionState: Created HDFS directory: /tmp/hive/spark/97bb9cd6-c2cf-43a4-83d3-342e1e03f2f6
17/01/31 10:40:53 INFO SessionState: Created local directory: /tmp/spark/97bb9cd6-c2cf-43a4-83d3-342e1e03f2f6
17/01/31 10:40:53 INFO SessionState: Created HDFS directory: /tmp/hive/spark/97bb9cd6-c2cf-43a4-83d3-342e1e03f2f6/_tmp_space.db
17/01/31 10:40:53 INFO HiveContext: default warehouse location is /user/hive/warehouse
17/01/31 10:40:53 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/01/31 10:40:53 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.2.0-258
17/01/31 10:40:53 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.2.0-258
17/01/31 10:40:53 INFO metastore: Trying to connect to metastore with URI thrift://hdp-node15.affinytix.com:9083
17/01/31 10:40:53 INFO metastore: Connected to metastore.
17/01/31 10:40:53 INFO SessionState: Created local directory: /tmp/2f3d91a2-517a-472b-a4a9-2d7b611b8e10_resources
17/01/31 10:40:54 INFO SessionState: Created HDFS directory: /tmp/hive/spark/2f3d91a2-517a-472b-a4a9-2d7b611b8e10
17/01/31 10:40:54 INFO SessionState: Created local directory: /tmp/spark/2f3d91a2-517a-472b-a4a9-2d7b611b8e10
17/01/31 10:40:54 INFO SessionState: Created HDFS directory: /tmp/hive/spark/2f3d91a2-517a-472b-a4a9-2d7b611b8e10/_tmp_space.db
17/01/31 10:40:54 INFO SparkILoop: Created sql context (with Hive support)..
17/01/31 11:08:29 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 363.8 KB, free 363.8 KB)
17/01/31 11:08:29 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 31.6 KB, free 395.5 KB)
17/01/31 11:08:29 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 167.114.241.46:42414 (size: 31.6 KB, free: 511.1 MB)
17/01/31 11:08:29 INFO SparkContext: Created broadcast 0 from save at <console>:72
17/01/31 11:08:29 INFO RecoverableZooKeeper: Process identifier=hconnection-0x308b8e8f connecting to ZooKeeper ensemble=localhost:2181
17/01/31 11:08:29 INFO ZooKeeper: Client environment:zookeeper.version=3.4.6-258--1, built on 04/25/2016 05:22 GMT
17/01/31 11:08:29 INFO ZooKeeper: Client environment:host.name=hdp-node12
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.version=1.8.0_60
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.home=/usr/jdk64/jdk1.8.0_60/jre
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.class.path=/usr/hdp/current/spark-client/conf/:/usr/hdp/2.4.2.0-258/spark/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/spark/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.4.2.0-258/spark/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.4.2.0-258/spark/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/current/hadoop-client/conf/
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/01/31 11:08:29 INFO ZooKeeper: Client environment:java.compiler=<NA>
17/01/31 11:08:29 INFO ZooKeeper: Client environment:os.name=Linux
17/01/31 11:08:29 INFO ZooKeeper: Client environment:os.arch=amd64
17/01/31 11:08:29 INFO ZooKeeper: Client environment:os.version=3.10.0-229.1.2.el7.x86_64
17/01/31 11:08:29 INFO ZooKeeper: Client environment:user.name=spark
17/01/31 11:08:29 INFO ZooKeeper: Client environment:user.home=/home/spark
17/01/31 11:08:29 INFO ZooKeeper: Client environment:user.dir=/home/spark/samuel
17/01/31 11:08:29 INFO ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x308b8e8f0x0, quorum=localhost:2181, baseZNode=/hbase
17/01/31 11:08:29 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/01/31 11:08:29 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125)
17/01/31 11:08:29 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
17/01/31 11:08:29 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125)
17/01/31 11:08:29 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
17/01/31 11:08:30 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/01/31 11:08:30 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125)
17/01/31 11:08:30 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181.
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:30 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:30 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid 17/01/31 11:08:31 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:31 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:32 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:32 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:33 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:33 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:33 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:33 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:33 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid 17/01/31 11:08:34 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:34 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:34 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:34 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:35 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:35 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:35 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:35 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:36 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:36 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:36 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:36 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:38 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:38 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:38 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:38 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid 17/01/31 11:08:38 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:39 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:39 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:39 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:39 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:40 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:40 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:40 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:40 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:41 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:41 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:41 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:41 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:42 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:42 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:42 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:42 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:44 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:44 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:44 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:44 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:45 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:45 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:45 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:45 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:46 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:46 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:46 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/0:0:0:0:0:0:0:1:2181. 
Will not attempt to authenticate using SASL (unknown error) 17/01/31 11:08:46 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125) 17/01/31 11:08:46 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid 17/01/31 11:08:46 ERROR RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts 17/01/31 11:08:46 WARN ZKUtil: hconnection-0x308b8e8f0x0, quorum=localhost:2181, baseZNode=/hbase Unable to set watcher on znode (/hbase/hbaseid) org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid at org.apache.zookeeper.KeeperException.create(KeeperException.java:99) at org.apache.zookeeper.KeeperException.create(KeeperException.java:51) at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045) at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:221) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:541) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105) at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879) at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.(ConnectionManager.java:635) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:422) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:102) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:102) at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189) at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:102) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.createTable(HBaseRelation.scala:84) at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:59) at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222) at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148) at $line15.$read$$iwC$$iwC.(:72) at $line15.$read$$iwC.(:83) at $line15.$read.(:85) at $line15.$read$.(:89) at $line15.$read$.() at $line15.$eval$.(:7) at 
$line15.$eval$.() at $line15.$eval.$print() at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$pasteCommand(SparkILoop.scala:825) at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$8.apply(SparkILoop.scala:345) at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$8.apply(SparkILoop.scala:345) at scala.tools.nsc.interpreter.LoopCommands$LoopCommand$$anonfun$nullary$1.apply(LoopCommands.scala:65) at scala.tools.nsc.interpreter.LoopCommands$LoopCommand$$anonfun$nullary$1.apply(LoopCommands.scala:65) at scala.tools.nsc.interpreter.LoopCommands$NullaryCmd.apply(LoopCommands.scala:76) at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809) at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657) at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) at org.apache.spark.repl.Main$.main(Main.scala:31) at org.apache.spark.repl.Main.main(Main.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 17/01/31 11:08:46 ERROR ZooKeeperWatcher: hconnection-0x308b8e8f0x0, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid at org.apache.zookeeper.KeeperException.create(KeeperException.java:99) at org.apache.zookeeper.KeeperException.create(KeeperException.java:51) at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045) at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:221) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:541) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105) at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879) at 
org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.(ConnectionManager.java:635) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:422) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$getConnection$1.apply(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:102) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$$anonfun$1.apply(HBaseConnectionCache.scala:102) at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189) at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:102) at org.apache.spark.sql.execution.datasources.hbase.HBaseConnectionCache$.getConnection(HBaseConnectionCache.scala:109) at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.createTable(HBaseRelation.scala:84) at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:59) at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222) at 
org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148) at $line15.$read$$iwC$$iwC.(:72) at $line15.$read$$iwC.(:83) at $line15.$read.(:85) at $line15.$read$.(:89) at $line15.$read$.() at $line15.$eval$.(:7) at $line15.$eval$.() at $line15.$eval.$print() at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$pasteCommand(SparkILoop.scala:825) at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$8.apply(SparkILoop.scala:345) at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$8.apply(SparkILoop.scala:345) at scala.tools.nsc.interpreter.LoopCommands$LoopCommand$$anonfun$nullary$1.apply(LoopCommands.scala:65) at scala.tools.nsc.interpreter.LoopCommands$LoopCommand$$anonfun$nullary$1.apply(LoopCommands.scala:65) at scala.tools.nsc.interpreter.LoopCommands$NullaryCmd.apply(LoopCommands.scala:76) at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809) at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657) at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997) at 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/01/31 11:08:46 WARN ZooKeeperRegistry: Can't retrieve clusterId from Zookeeper
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:221)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:541)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
    ....... (remaining frames erased for resizing reasons; identical to the stack trace above, through org.apache.spark.deploy.SparkSubmit.main)
17/01/31 11:08:47 INFO ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181.
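[Editor's note] The ConnectionLoss on /hbase/hbaseid together with the connection attempt to localhost.localdomain/127.0.0.1:2181 suggests the HBase client never saw the cluster's ZooKeeper quorum and fell back to the default (localhost). A common cause is that hbase-site.xml is not on the driver's classpath. A hedged fix sketch (the file path, quorum hosts, and SHC coordinates below are assumptions, not taken from this log):

```
# Ship the cluster's hbase-site.xml with the job so the HBase client reads
# the real ZooKeeper quorum instead of defaulting to localhost:2181.
spark-shell \
  --packages com.hortonworks:shc-core:1.0.1-1.6-s_2.10 \
  --repositories http://repo.hortonworks.com/content/groups/public/ \
  --files /etc/hbase/conf/hbase-site.xml

# Alternatively, set the quorum explicitly in the client configuration:
#   hbase.zookeeper.quorum=zk1.example.com,zk2.example.com,zk3.example.com
#   hbase.zookeeper.property.clientPort=2181
```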
Will not attempt to authenticate using SASL (unknown error)
17/01/31 11:08:47 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125)
....... (erased similar lines for resizing reasons)
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
17/01/31 11:28:16 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
17/01/31 11:28:16 INFO SparkUI: Stopped Spark web UI at http://167.114.241.46:4040
17/01/31 11:28:16 INFO YarnClientSchedulerBackend: Interrupting monitor thread
17/01/31 11:28:16 INFO YarnClientSchedulerBackend: Shutting down all executors
17/01/31 11:28:16 INFO YarnClientSchedulerBackend: Asking each executor to shut down
17/01/31 11:28:16 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
17/01/31 11:28:16 INFO YarnClientSchedulerBackend: Stopped
17/01/31 11:28:16 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/31 11:28:16 INFO MemoryStore: MemoryStore cleared
17/01/31 11:28:16 INFO BlockManager: BlockManager stopped
17/01/31 11:28:16 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/31 11:28:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/31 11:28:16 INFO SparkContext: Successfully stopped SparkContext
17/01/31 11:28:16 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/01/31 11:28:16 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/01/31 11:28:16 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
17/01/31 11:28:16 INFO ShutdownHookManager: Shutdown hook called
17/01/31 11:28:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-4c10be32-0d10-419b-b1b7-9cb2dc778055
17/01/31 11:28:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-c48591a5-750f-4b19-852a-1ece60fa652c
17/01/31 11:28:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-a5f6fe7f-f0cb-477d-87c9-5209eda05d44
17/01/31 11:28:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-c48591a5-750f-4b19-852a-1ece60fa652c/httpd-642cdc30-5261-4c7d-8c8b-21757cf856db
./spark-shell-shc.sh: line 8: --driver-memory: command not found
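[Editor's note] The last line points at a separate problem in the launcher script itself: bash reported `--driver-memory: command not found`, which happens when the line before it in `spark-shell-shc.sh` does not end with a continuation backslash, so bash runs the option as a standalone command. A minimal reproduction (the `echo launch` stand-in for the real `spark-shell` invocation is an assumption for illustration):

```shell
#!/usr/bin/env bash
# Broken form: without a trailing backslash the second line runs as its own
# command, reproducing the "--driver-memory: command not found" error.
broken=$(bash -c 'echo launch
--driver-memory 4g' 2>&1)
echo "$broken"

# Fixed form: a trailing backslash joins the two lines into one command.
fixed=$(bash -c 'echo launch \
--driver-memory 4g')
echo "$fixed"   # launch --driver-memory 4g
```

The fix is to make sure every option line in the script except the last ends with `\` and has no trailing whitespace after the backslash.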