root@slave0:/tmp# sh -x /root/sparkrun.sh
+ PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
+ date
+ spark-submit --jars /root/ojdbc6.jar /root/stest.py
17/06/05 11:57:53 INFO SparkContext: Running Spark version 1.6.1
17/06/05 11:57:54 INFO SecurityManager: Changing view acls to: root
17/06/05 11:57:54 INFO SecurityManager: Changing modify acls to: root
17/06/05 11:57:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/06/05 11:57:54 INFO Utils: Successfully started service 'sparkDriver' on port 56091.
17/06/05 11:57:54 INFO Slf4jLogger: Slf4jLogger started
17/06/05 11:57:55 INFO Remoting: Starting remoting
17/06/05 11:57:55 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.10.12.4:58272]
17/06/05 11:57:55 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 58272.
17/06/05 11:57:55 INFO SparkEnv: Registering MapOutputTracker
17/06/05 11:57:55 INFO SparkEnv: Registering BlockManagerMaster
17/06/05 11:57:55 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-df5ce14b-43ef-475a-a138-c3576282f671
17/06/05 11:57:55 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
17/06/05 11:57:55 INFO SparkEnv: Registering OutputCommitCoordinator
17/06/05 11:57:55 INFO Server: jetty-8.y.z-SNAPSHOT
17/06/05 11:57:55 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
17/06/05 11:57:55 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/06/05 11:57:55 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.10.12.4:4040
17/06/05 11:57:55 INFO HttpFileServer: HTTP File server directory is /tmp/spark-e895f635-8c32-46fc-b411-af68eacbd03a/httpd-ecface93-ff5d-4482-99d5-ce36fa65a391
17/06/05 11:57:55 INFO HttpServer: Starting HTTP Server
17/06/05 11:57:55 INFO Server: jetty-8.y.z-SNAPSHOT
17/06/05 11:57:55 INFO AbstractConnector: Started SocketConnector@0.0.0.0:48348
17/06/05 11:57:55 INFO Utils: Successfully started service 'HTTP file server' on port 48348.
17/06/05 11:57:55 INFO SparkContext: Added JAR file:/root/ojdbc6.jar at http://10.10.12.4:48348/jars/ojdbc6.jar with timestamp 1496644075628
17/06/05 11:57:55 INFO Utils: Copying /root/stest.py to /tmp/spark-e895f635-8c32-46fc-b411-af68eacbd03a/userFiles-5f4af472-8693-4918-b549-f4e76993be5c/stest.py
17/06/05 11:57:55 INFO SparkContext: Added file file:/root/stest.py at file:/root/stest.py with timestamp 1496644075872
17/06/05 11:57:55 INFO Executor: Starting executor ID driver on host localhost
17/06/05 11:57:55 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57462.
17/06/05 11:57:55 INFO NettyBlockTransferService: Server created on 57462
17/06/05 11:57:55 INFO BlockManagerMaster: Trying to register BlockManager
17/06/05 11:57:55 INFO BlockManagerMasterEndpoint: Registering block manager localhost:57462 with 511.1 MB RAM, BlockManagerId(driver, localhost, 57462)
17/06/05 11:57:55 INFO BlockManagerMaster: Registered BlockManager
17/06/05 11:57:56 INFO EventLoggingListener: Logging events to hdfs:///spark-history/local-1496644075924
Traceback (most recent call last):
  File "/root/stest.py", line 17, in <module>
    df = sqlContext.read.format("jdbc").option("driver", "oracle.jdbc.OracleDriver").option("url","jdbc:oracle:thin:NE/Network_147@10.77.1.147:1521/ELLDEV").option("dbtable","NE.INTER_APP_EVENT").load()
  File "/usr/hdp/2.4.2.0-258/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 139, in load
  File "/usr/hdp/2.4.2.0-258/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
  File "/usr/hdp/2.4.2.0-258/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 45, in deco
  File "/usr/hdp/2.4.2.0-258/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o42.load.
: java.sql.SQLRecoverableException: Io exception: Broken pipe
    at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:101)
    at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:133)
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:199)
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:263)
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:521)
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:418)
    at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:508)
    at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:203)
    at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:33)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:510)
    at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:61)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:52)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:120)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
    at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:57)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketException: Broken pipe
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
    at oracle.net.ns.DataPacket.send(DataPacket.java:150)
    at oracle.net.ns.NetOutputStream.flush(NetOutputStream.java:180)
    at oracle.net.ns.NetInputStream.getNextPacket(NetInputStream.java:169)
    at oracle.net.ns.NetInputStream.read(NetInputStream.java:117)
    at oracle.net.ns.NetInputStream.read(NetInputStream.java:92)
    at oracle.net.ns.NetInputStream.read(NetInputStream.java:77)
    at oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1034)
    at oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java:1010)
    at oracle.jdbc.driver.T4CTTIoauthenticate.receiveOauth(T4CTTIoauthenticate.java:760)
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:368)
    ... 23 more
17/06/05 12:07:21 INFO SparkContext: Invoking stop() from shutdown hook
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
17/06/05 12:07:21 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
17/06/05 12:07:21 INFO SparkUI: Stopped Spark web UI at http://10.10.12.4:4040
17/06/05 12:07:21 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/06/05 12:07:21 INFO MemoryStore: MemoryStore cleared
17/06/05 12:07:21 INFO BlockManager: BlockManager stopped
17/06/05 12:07:21 INFO BlockManagerMaster: BlockManagerMaster stopped
17/06/05 12:07:21 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/06/05 12:07:21 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/06/05 12:07:21 INFO SparkContext: Successfully stopped SparkContext
17/06/05 12:07:21 INFO ShutdownHookManager: Shutdown hook called
17/06/05 12:07:21 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/06/05 12:07:21 INFO ShutdownHookManager: Deleting directory /tmp/spark-e895f635-8c32-46fc-b411-af68eacbd03a/pyspark-f2e4f05f-6dc6-46ad-8065-e6baf223b7a7
17/06/05 12:07:21 INFO ShutdownHookManager: Deleting directory /tmp/spark-e895f635-8c32-46fc-b411-af68eacbd03a
17/06/05 12:07:21 INFO ShutdownHookManager: Deleting directory /tmp/spark-e895f635-8c32-46fc-b411-af68eacbd03a/httpd-ecface93-ff5d-4482-99d5-ce36fa65a391
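
For reference, the part of /root/stest.py that fails (line 17 in the traceback) presumably looks like the sketch below. Only the .load() call is taken verbatim from the log; the SparkContext/SQLContext setup, the app name, and the trailing printSchema() are assumptions added just to make the snippet self-contained.

    # stest.py (sketch) - Spark 1.6 / PySpark, submitted with --jars /root/ojdbc6.jar
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="stest")   # assumption: app name is not visible in the log
    sqlContext = SQLContext(sc)

    # This is the call that dies during the JDBC logon with
    # java.sql.SQLRecoverableException: Io exception: Broken pipe
    df = (sqlContext.read
          .format("jdbc")
          .option("driver", "oracle.jdbc.OracleDriver")
          .option("url", "jdbc:oracle:thin:NE/Network_147@10.77.1.147:1521/ELLDEV")
          .option("dbtable", "NE.INTER_APP_EVENT")
          .load())

    df.printSchema()                     # assumption: whatever follows load() is not shown in the log

The stack trace shows the failure happens inside oracle.jdbc.driver.T4CConnection.logon, i.e. while the driver is still authenticating against 10.77.1.147:1521, before any query runs.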