Member since: 09-28-2017
Posts: 5
Kudos Received: 0
Solutions: 0
11-17-2017
09:54 PM
Hi, I am importing some data from Oracle using ojdbc6.jar, which I add via the hiveContext. I then do a simple GROUP BY and COUNT aggregation. But when I convert the aggregated Spark DataFrame to an R data.frame, the conversion takes forever, approximately 20 to 30 minutes, even though the result has only 4 to 5 rows.
# Connection to SparkR from RStudio
if (nchar(Sys.getenv("SPARK_HOME")) < 1) {Sys.setenv(SPARK_HOME = "/usr/hdp/current/spark-client")}
library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))
sc <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory="2g"))
sqlContext <- sparkRSQL.init(sc)
hiveContext <- sparkRHive.init(sc)
#Adding the OJDBC6 Jar
df_1 <- sql(hiveContext,"add jar /usr/hdp/2.6.1.0-129/spark/lib/ojdbc6.jar")
# Connecting to Oracle and storing the result in a DataFrame
df <- loadDF(hiveContext, source = "jdbc",
             url = "jdbc:oracle:thin:ks**d/password@135**********",
             dbtable = "(SELECT EXTERNAL_ORDER_NUM, cast(partition_date as Date) as PARTITION_DATE, CHANNEL, ENTERPRISE_TYPE, LOSG_STATUS, LOSG_SUBSTATUS, PARTNER_NAME, SERVICE, PAYMENT_ARRANGEMENT, LNP, ORDER_STATUS, FULFILLMENT_METHOD, CONTRACT_LENGTH, PRODUCT_CATEGORY, BYOD_FLAG from my_table where partition_date > '10-NOV-17')",
             driver = "oracle.jdbc.driver.OracleDriver")
# Registering the data as a temp table
registerTempTable(df, "df11")
#Performing simple GROUP BY AND COUNT
new_df1 <- sql(hiveContext, "SELECT count(EXTERNAL_ORDER_NUM) as COUNT,PARTITION_DATE FROM df11 GROUP BY PARTITION_DATE ORDER BY PARTITION_DATE")
final_frame <- as.data.frame(new_df1) #This final step takes like 30 mins to execute
I am using HDP 2.6.1 on VirtualBox (CentOS 7), Spark version 1.6.3. I think I am either going wrong in how I add the jar file (ojdbc6.jar) to all the nodes from RStudio ==> SparkR, or in the way I am connecting to SparkR. Here is the log when I run final_frame <- as.data.frame(new_df1):
> final_frame <- as.data.frame(new_df1)
17/11/17 21:47:35 INFO SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:-2
17/11/17 21:47:35 INFO DAGScheduler: Registering RDD 3 (dfToCols at NativeMethodAccessorImpl.java:-2)
17/11/17 21:47:35 INFO DAGScheduler: Got job 0 (dfToCols at NativeMethodAccessorImpl.java:-2) with 200 output partitions
17/11/17 21:47:35 INFO DAGScheduler: Final stage: ResultStage 1 (dfToCols at NativeMethodAccessorImpl.java:-2)
17/11/17 21:47:35 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
17/11/17 21:47:35 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
17/11/17 21:47:35 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at dfToCols at NativeMethodAccessorImpl.java:-2), which has no missing parents
17/11/17 21:47:35 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 12.3 KB, free 1247.2 MB)
17/11/17 21:47:35 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 5.7 KB, free 1247.2 MB)
17/11/17 21:47:35 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:38269 (size: 5.7 KB, free: 1247.2 MB)
17/11/17 21:47:35 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1008
17/11/17 21:47:35 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at dfToCols at NativeMethodAccessorImpl.java:-2)
17/11/17 21:47:35 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/11/17 21:47:35 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 1959 bytes)
17/11/17 21:47:35 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/11/17 21:47:35 INFO Executor: Fetching http://localhost:43477/jars/ojdbc6.jar with timestamp 1510955214958
17/11/17 21:47:35 INFO Utils: Fetching http://localhost:43477/jars/ojdbc6.jar to /tmp/spark-e6f37cae-3e7a-4eee-8c63-491b96002ccf/userFiles-63f43854-534b-494a-83a4-c1c0e4ec8113/fetchFileTemp5998330918630913912.tmp
17/11/17 21:47:35 INFO Executor: Adding file:/tmp/spark-e6f37cae-3e7a-4eee-8c63-491b96002ccf/userFiles-63f43854-534b-494a-83a4-c1c0e4ec8113/ojdbc6.jar to class loader
17/11/17 21:47:41 INFO GenerateMutableProjection: Code generated in 89.087265 ms
17/11/17 21:47:41 INFO GenerateUnsafeProjection: Code generated in 8.82499 ms
17/11/17 21:47:41 INFO GenerateMutableProjection: Code generated in 7.437145 ms
17/11/17 21:47:41 INFO GenerateUnsafeRowJoiner: Code generated in 5.154208 ms
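One detail in this log that stands out to me: "Got job 0 ... with 200 output partitions" for a result of only 4 to 5 rows. A tweak I have been considering (my own guess, not a confirmed fix) is lowering Spark SQL's default shuffle-partition count before collecting:
# Guess: shrink the default 200 shuffle partitions for this tiny aggregate
sql(hiveContext, "SET spark.sql.shuffle.partitions=8")
new_df1 <- sql(hiveContext, "SELECT count(EXTERNAL_ORDER_NUM) as COUNT, PARTITION_DATE FROM df11 GROUP BY PARTITION_DATE ORDER BY PARTITION_DATE")
final_frame <- as.data.frame(new_df1)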
I might have made some obvious mistake, as I am new to Hadoop. Please help.
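For completeness, here is the alternative init I have been meaning to try, passing the Oracle driver through sparkR.init's sparkJars argument instead of a runtime "add jar" (a sketch; I have not verified that it changes the timing):
# Sketch: ship ojdbc6.jar at context creation rather than via "add jar"
sc <- sparkR.init(master = "local[*]",
                  sparkEnvir = list(spark.driver.memory = "2g"),
                  sparkJars = "/usr/hdp/2.6.1.0-129/spark/lib/ojdbc6.jar")
hiveContext <- sparkRHive.init(sc)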
10-24-2017
11:41 PM
Hi, following is the order.json file I am trying to load into Hive:
{
"customerOrderNumber" : "xxxxxxxxxxxxxx",
"orderType" : "CREATE",
"createdDate" : 1448386296401,
"submittedDate" : 1448386665566,
"productGroups" : {
"roup" : [ {
"id" : "GROUP_01",
"name" : "xxxx",
"type" : "xx",
"sequence" : 1,
"characteristics" : {
"losgCharacteristics" : {
"losgReferenceId" : "12345",
"losgType" : "UNK",
"productCategory" : "WIRE",
"wirelessLOSCharacteristics" : {
"mobileNumber" : "8654xxxxx"
}
}
}
} ]
}
}
I looked at various other posts that suggested solutions for similar problems. I finally created the Hive table using:
hcat -e "Create Table Order_JSON(
customerOrderNumber string,
orderType string,
createdDate string,
submittedDate string,
productGroups struct<roup:array<struct<id:string,name:string,type:string,sequence:int,characteristics:struct<losgCharacteristics:struct<losgReferenceId:string,losgType:string,productCategory:string,wirelessLOSCharacteristics: struct<mobileNumber:string>>>>>>
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe';"
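For context, the kind of nested-field query I eventually want to run against this schema (my own sketch, assuming the load succeeds) looks like:
SELECT customerOrderNumber,
       productGroups.roup[0].characteristics.losgCharacteristics.losgReferenceId AS losg_ref
FROM Order_JSON;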
The table was created successfully, but when I try to load data from HDFS into the table using Ambari with
LOAD DATA INPATH '/user/maria_dev/data/order.json' INTO TABLE Order_JSON;
I get the following error:
java.lang.Exception: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
at org.apache.ambari.view.hive20.resources.jobs.JobService.getOne(JobService.java:147)
at sun.reflect.GeneratedMethodAccessor428.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1507)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter.doFilter(AmbariAuthorizationFilter.java:287)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authentication.AmbariDelegatingAuthenticationFilter.doFilter(AmbariDelegatingAuthenticationFilter.java:132)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariUserAuthorizationFilter.doFilter(AmbariUserAuthorizationFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.MethodOverrideFilter.doFilter(MethodOverrideFilter.java:72)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.AmbariPersistFilter.doFilter(AmbariPersistFilter.java:47)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.AmbariViewsMDCLoggingFilter.doFilter(AmbariViewsMDCLoggingFilter.java:54)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.ViewThrottleFilter.doFilter(ViewThrottleFilter.java:161)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:125)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:125)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:294)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:499)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:427)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:212)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:201)
at org.apache.ambari.server.controller.AmbariHandlerList.handle(AmbariHandlerList.java:150)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:973)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1035)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:641)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:231)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
at org.apache.hive.jdbc.HiveStatement.waitForOperationToComplete(HiveStatement.java:348)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:251)
at org.apache.ambari.view.hive20.HiveJdbcConnectionDelegate.execute(HiveJdbcConnectionDelegate.java:49)
at org.apache.ambari.view.hive20.actor.StatementExecutor.runStatement(StatementExecutor.java:91)
at org.apache.ambari.view.hive20.actor.StatementExecutor.handleMessage(StatementExecutor.java:72)
at org.apache.ambari.view.hive20.actor.HiveActor.onReceive(HiveActor.java:38)
at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
If I try to query the table with "SELECT * FROM Order_JSON" I get the following error (HTTP 500, "Failed to fetch next batch for the Resultset"):
org.apache.hive.service.cli.HiveSQLException: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 0])
 at [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 5]
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:264)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:250)
at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:373)
at org.apache.ambari.view.hive2.actor.ResultSetIterator.getNext(ResultSetIterator.java:120)
at org.apache.ambari.view.hive2.actor.ResultSetIterator.handleMessage(ResultSetIterator.java:79)
at org.apache.ambari.view.hive2.actor.HiveActor.onReceive(HiveActor.java:38)
at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.hive.service.cli.HiveSQLException: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 0])
 at [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 5]
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:414)
at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:233)
at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:784)
at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:520)
at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:709)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 0])
 at [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 5]
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:520)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:427)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1765)
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:409)
... 13 more
Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 0])
 at [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 5]
at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:179)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:501)
... 17 more
Caused by: java.lang.RuntimeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 0])
 at [Source: java.io.ByteArrayInputStream@706a53c8; line: 1, column: 5]
at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportInvalidEOF(JsonParserMinimalBase.java:454)
at org.codehaus.jackson.impl.JsonParserBase._handleEOF(JsonParserBase.java:473)
at org.codehaus.jackson.impl.Utf8StreamParser._skipWSOrEnd(Utf8StreamParser.java:2327)
at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:444)
at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:172)
... 18 more
Can anyone guide me on how to solve this problem? I am using HDP 2.6.1 on CentOS 7 through VirtualBox. I am new to Hadoop, so please forgive me if this is a silly question, but I do need your help.
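One thing I noticed while re-reading the error: it complains at line 1, column 5 about an "expected close marker for OBJECT", which makes me suspect the JsonSerDe reads one JSON record per line and is choking on my pretty-printed, multi-line file. If that guess is right, the record would need to be collapsed onto a single line, e.g.:
{"customerOrderNumber":"xxxxxxxxxxxxxx","orderType":"CREATE","createdDate":1448386296401,"submittedDate":1448386665566,"productGroups":{"roup":[{"id":"GROUP_01","name":"xxxx","type":"xx","sequence":1,"characteristics":{"losgCharacteristics":{"losgReferenceId":"12345","losgType":"UNK","productCategory":"WIRE","wirelessLOSCharacteristics":{"mobileNumber":"8654xxxxx"}}}}]}}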
10-09-2017
07:56 PM
Awesome, it worked! Thanks @jnarayanan, great help. @Timothy Spann, thanks for trying to help; I appreciate it. I was using 8 GB of RAM; other details are posted in the question.
10-08-2017
09:11 PM
I am following the tutorial at https://hortonworks.com/tutorial/predicting-airline-delays-using-sparkr/#step-2--setup-sparkr-on-rstudio which requires installing RStudio. I can download the file with
wget https://download2.rstudio.org/rstudio-server-rhel-1.0.153-x86_64.rpm
but I get an error while installing:
sudo yum install --nogpgcheck rstudio-server-rhel-1.0.153-x86_64.rpm
I have included an image of the error. Please guide me; I am new to the HDP sandbox and to Hadoop too. I am using HDP 2.6.1 on VirtualBox with CentOS 7.
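In case the image does not come through: my best guess (an assumption, not verified) is a dependency failure, since RStudio Server needs R itself, which on CentOS 7 comes from the EPEL repository. The sequence I am planning to retry:
# Guess at prerequisites: enable EPEL and install R before the RStudio Server rpm
sudo yum install -y epel-release
sudo yum install -y R
sudo yum install --nogpgcheck rstudio-server-rhel-1.0.153-x86_64.rpm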
09-28-2017
08:45 AM
Hello, I am unable to install RStudio on the HDP sandbox 2.6. Please guide me on where I should go and what to check for details. I have tried this link, but I did not succeed: https://community.hortonworks.com/content/kbentry/69424/setting-up-rstudio-on-hortonworks-docker-sandbox-2.html