Member since: 02-08-2016
Posts: 21
Kudos Received: 0
Solutions: 2

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3044 | 02-17-2016 01:25 AM |
| | 6192 | 02-17-2016 01:09 AM |
10-12-2016
01:50 AM
Hello there, can anyone please let me know how I can restart Spark Streaming? I stop the streaming with ssc.stop(stopSparkContext = false, stopGracefully = true), but when I try to start it again it gives me the error below:

java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after stopping a context is not supported
at org.apache.spark.streaming.dstream.DStream.validateAtInit(DStream.scala:224) ~[spark-streaming_2.11-1.4.1.jar:1.4.1]
at org.apache.spark.streaming.dstream.DStream.<init>(DStream.scala:64) ~[spark-streaming_2.11-1.4.1.jar:1.4.1]
at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:41) ~[spark-streaming_2.11-1.4.1.jar:1.4.1]
at org.apache.spark.streaming.dstream.ReceiverInputDStream.<init>(ReceiverInputDStream.scala:41) ~[spark-streaming_2.11-1.4.1.jar:1.4.1]
at org.apache.spark.streaming.twitter.TwitterInputDStream.<init>(TwitterInputDStream.scala:46) ~[spark-streaming-twitter_2.11-1.4.1.jar:1.4.1]
at org.apache.spark.streaming.twitter.TwitterUtils$.createStream(TwitterUtils.scala:44) ~[spark-streaming-twitter_2.11-1.4.1.jar:1.4.1]
at org.sach.KibanaElasticSearch.TwitterTransmitter$.Fetchallinfo(TwitterTransmitter.scala:39) ~[classes/:na]
at org.sach.KibanaElasticSearch.CallServlet.doPost(CallServlet.java:58) ~[classes/:na]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) ~[servlet-api.jar:na]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) ~[servlet-api.jar:na]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292) ~[catalina.jar:8.0.37]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) ~[catalina.jar:8.0.37]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) ~[tomcat-websocket.jar:8.0.37]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240) ~[catalina.jar:8.0.37]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) ~[catalina.jar:8.0.37]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212) ~[catalina.jar:8.0.37]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106) [catalina.jar:8.0.37]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) [catalina.jar:8.0.37]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141) [catalina.jar:8.0.37]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [catalina.jar:8.0.37]
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616) [catalina.jar:8.0.37]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) [catalina.jar:8.0.37]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:528) [catalina.jar:8.0.37]
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1100) [tomcat-coyote.jar:8.0.37]
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:687) [tomcat-coyote.jar:8.0.37]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1520) [tomcat-coyote.jar:8.0.37]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1476) [tomcat-coyote.jar:8.0.37]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_65]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_65]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-util.jar:8.0.37]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_65]
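For reference, the exception above is expected behavior: a stopped StreamingContext can never be started again. The usual workaround is to keep the SparkContext alive and build a brand-new StreamingContext, re-declaring all DStreams, each time. A minimal sketch, assuming the application supplies its own createContext factory (the names here are illustrative, not from the original post):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingRestart {
  // A StreamingContext that has been stopped cannot be started again,
  // so all stream setup lives in a factory that builds a fresh one.
  def createContext(sc: SparkContext): StreamingContext = {
    val ssc = new StreamingContext(sc, Seconds(10))
    // ... declare DStreams and output operations here ...
    ssc
  }

  def restart(sc: SparkContext, old: StreamingContext): StreamingContext = {
    // Stop only the streaming layer; the shared SparkContext stays up.
    old.stop(stopSparkContext = false, stopGracefully = true)
    val fresh = createContext(sc) // new context, new DStream graph
    fresh.start()
    fresh
  }
}
```

The key point is that TwitterUtils.createStream (and any other DStream declaration) must run against the new context inside the factory, never against the stopped one.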
Labels:
- Apache Spark
06-20-2016
04:13 AM
Yeah, it's working. I was able to find its hex value (it is the Ctrl-V + Ctrl-A character) but was not applying the '$' symbol. Thank you so much 🙂, really appreciated.
06-20-2016
04:10 AM
Nope, same error.
06-20-2016
03:41 AM
It is not working. Please find the log below for your reference:

hduser@jgusbihdpmaster:/tmp$ sqoop export --connect jdbc:oracle:thin:olap7964/olap7964@192.168.2.135:1521:ORCLOBIA11G --username XXXXXXX --password XXXXXXX --export-dir /user/hive/warehouse/abc.db/sach_sport --table SACH_SPORT --fields-terminated-by '\^A' -m 1
Warning: /opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/06/20 03:37:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.1
16/06/20 03:37:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
Cannot understand character argument: \^A
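For context, Sqoop does not understand the caret notation '\^A'. The Ctrl-A (SOH, 0x01) byte has to reach Sqoop either as an escape sequence Sqoop parses itself, or as the literal byte produced by the shell. A minimal sketch (the sqoop invocations are shown as comments since they need a live cluster; the '...' stands for the connection arguments):

```shell
# Option 1: let Sqoop parse an octal escape itself. Single quotes keep
# the backslash intact so Sqoop, not bash, interprets it:
#   sqoop export ... --fields-terminated-by '\001' -m 1
# Option 2: have bash emit the literal 0x01 byte via ANSI-C quoting:
#   sqoop export ... --fields-terminated-by $'\x01' -m 1

# Demonstrate that $'\x01' really expands to the single SOH byte:
DELIM=$'\x01'
printf '%s' "$DELIM" | od -An -tx1 | tr -d ' '
```

This matches the '$' symbol mentioned in the accepted fix above: without the leading $, bash passes the four characters \ x 0 1 through literally instead of one SOH byte.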
06-20-2016
03:16 AM
Hi there, my file is "^A"-delimited and I want to export it using Sqoop. Do you know how to handle such a special character?
Labels:
- Apache Sqoop
03-20-2016
09:11 AM
There were some configuration parameters I changed, and then I checked whether a sample MapReduce program was running or not.
02-17-2016
01:25 AM
The CUPS service was running on the Ubuntu server, and it created an error file that grew to more than 200 GB. I stopped it and deleted the log file. Now there is no issue.
02-17-2016
01:09 AM
YARN was not properly configured. I configured it, and now I can execute any MapReduce/Sqoop command.
02-11-2016
10:28 PM
Thanks for the reply. I have done the needful but am still facing the same issue. When I check the ResourceManager log file on the NameNode, I get the following. How can I resolve this issue?

2016-02-11 22:24:54,915 INFO org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger: USER=cloudera-scm IP=192.168.2.156 OPERATION=Submit Application Request TARGET=ClientRMService RESULT=SUCCESS APPID=application_1455257014816_0002
2016-02-11 22:25:44,163 INFO org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8031: readAndProcess from client 192.168.2.100 threw exception [java.io.IOException: Connection reset by peer]
java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:384)
at org.apache.hadoop.ipc.Server.channelRead(Server.java:2641)
at org.apache.hadoop.ipc.Server.access$2800(Server.java:134)
at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1488)
at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:774)
at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:647)
at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:618)
02-11-2016
10:15 PM
Hello, when I execute any Sqoop command it shows that it is running (log below), but when I check it at the given URL it shows the job has been in the ACCEPTED state for more than 2 hours. Please assist.

6/02/11 22:09:34 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(A), MAX(A) FROM SQOOPTBLTEST
16/02/11 22:09:35 INFO mapreduce.JobSubmitter: number of splits:4
16/02/11 22:09:36 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1455257014816_0001
16/02/11 22:09:37 INFO impl.YarnClientImpl: Submitted application application_1455257014816_0001
16/02/11 22:09:37 INFO mapreduce.Job: The url to track the job: http://jgusbihdpmaster.jadeglobal.com:8088/proxy/application_1455257014816_0001/
16/02/11 22:09:37 INFO mapreduce.Job: Running job: job_1455257014816_0001
Labels:
- Apache Sqoop
- MapReduce