Member since: 06-18-2015
Posts: 55
Kudos Received: 34
Solutions: 2

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1277 | 03-04-2016 02:39 AM
 | 1764 | 12-29-2015 09:42 AM
04-12-2016 06:47 AM
Hi, I am on HDP 2.3.4 (3-node cluster). My HBase scans are slow after inserting a million rows of data. As I am a newbie to HBase, any suggestions from experts on tuning performance would be much appreciated. Thanks, Divya
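A common first lever for slow full scans is scanner caching (how many rows are fetched per RPC), plus scanning only the column families you need. Below is a minimal sketch against the HBase 1.1.x client API; the table name "mytable" and column family "cf" are placeholders, not names from this cluster:

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Scan}
import org.apache.hadoop.hbase.util.Bytes

val conf = HBaseConfiguration.create()                         // reads hbase-site.xml from the classpath
val connection = ConnectionFactory.createConnection(conf)
val table = connection.getTable(TableName.valueOf("mytable"))  // placeholder table name

val scan = new Scan()
scan.setCaching(500)                  // rows per RPC; the default is low, raising it cuts round trips
scan.addFamily(Bytes.toBytes("cf"))   // placeholder family; scan only the families you need

val scanner = table.getScanner(scan)
try {
  scanner.next(10).foreach(r => println(Bytes.toString(r.getRow)))  // sample the first rows
} finally {
  scanner.close()
  table.close()
  connection.close()
}

Row-key design (e.g. salting sequential keys) also matters for scan throughput, but caching is the cheapest thing to try first.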
Labels:
- Apache HBase
03-17-2016 08:52 AM
As mentioned in my comment, I already tried that, but it didn't work.
03-15-2016 06:13 AM
1 Kudo
I downloaded Hive-Hbase-Handler1.2.jar, renamed it to hive-hbase-1.2.1.2.3.4.0-3485.jar, and uploaded it to /usr/hdp/hive/lib. Now when I try to create a table based on HBase, I get the error "Unable to connect to Thrift Server".
03-15-2016 05:29 AM
1 Kudo
Hi, I mistakenly deleted (permanently) the hive-hbase-handler jar from my HDP 2.3.4 cluster. From where can I download it now? Thanks, Divya
Labels:
- Apache HBase
- Apache Hive
03-04-2016 02:39 AM
2 Kudos
Resolved: one of the required jar files, hbase-hadoop-compat.jar, was missing; adding it fixed the error.
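For anyone hitting the same error, the missing jar can simply be added to the --jars list when launching the shell. A sketch assuming the stock HDP 2.3.4 layout; the versioned filename below is a guess based on the other HBase jars on this cluster and may differ on yours:

spark-shell --master yarn-client --jars /usr/hdp/2.3.4.0-3485/hbase/lib/hbase-hadoop-compat-1.1.2.2.3.4.0-3485.jar,<the other hive/hbase jars>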
03-02-2016 03:09 AM
3 Kudos
Hi, I have registered a Hive external table on an HBase table. When I try to access it through hiveContext, I get the error below:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-172-31-29-201.ap-southeast-1.compute.internal): java.lang.RuntimeException: hbase-default.xml file seems to be for an older version of HBase (null), this version is 1.1.2.2.3.4.0-3485
at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:71)
at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:81)

I have already placed hbase-default.xml and hbase-site.xml in spark/conf, setting the property below to true:

<property>
<name>hbase.defaults.for.version.skip</name>
<value>true</value>
<description>Set to true to skip the 'hbase.defaults.for.version' check.
Setting this to true can be useful in contexts other than
the other side of a maven generation; i.e. running in an
IDE. You'll want to set this boolean to true to avoid
seeing the RuntimeException complaint: "hbase-default.xml file
seems to be for and old version of HBase (\${hbase.version}), this
version is X.X.X-SNAPSHOT"</description>
</property>

Spark code:

import org.apache.spark.sql.hive.HiveContext
val hiveContext = new HiveContext(sc)
val df = hiveContext.sql("select * from test")
df.show

Jars added when starting the Spark shell:

/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler-1.2.1.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hive/lib/htrace-core-3.1.0-incubating.jar
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-client-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-protocol-1.1.2.2.3.4.0-3485.jar
/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-server-1.1.2.2.3.4.0-3485.jar
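Note that --jars takes a single comma-separated list, so (assuming a yarn-client launch, as used elsewhere in this thread) the full command looks like this sketch:

spark-shell --master yarn-client --jars /usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler-1.2.1.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hive/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-client-1.1.2.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common-1.1.2.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-protocol-1.1.2.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/hbase-server-1.1.2.2.3.4.0-3485.jar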
Labels:
- Apache HBase
- Apache Hive
- Apache Spark
02-19-2016 02:40 AM
1 Kudo
@asinghal Why doesn't it throw this error when I run the same command on the HDP 2.3.2 sandbox? In the sandbox it works fine, and I don't see the Jackson dependency-conflict error.
02-18-2016 09:09 AM
2 Kudos
Hi, I am getting the following error while starting the Spark shell with the Phoenix client:

spark-shell --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --driver-class-path /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --master yarn-client

Stack trace:

INFO TimelineClientImpl: Timeline service address: http://ip-xxx-xx-xx-xxx.ap-southeast-1.compute.internal:8188/ws/v1/timeline/
java.lang.NoSuchMethodError: org.codehaus.jackson.map.ObjectMapper.setSerializationInclusion(Lorg/codehaus/jackson/map/annotate/JsonSerialize$Inclusion;)Lorg/codehaus/jackson/map/ObjectMapper;
at org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider.configObjectMapper(YarnJacksonJaxbJsonProvider.java:59)
at org.apache.hadoop.yarn.util.timeline.TimelineUtils.<clinit>(TimelineUtils.java:50)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:172)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:108)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $iwC$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:685)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.lang.NullPointerException
at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:685)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
I googled and found that the required Jackson dependency is not available for the Hadoop 2.x version (SPARK-5108).
Are the above errors related to that issue? Thanks,
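One quick way to see which Jackson copy actually wins on the classpath (and hence whether the Phoenix client jar is shadowing Hadoop's) is to ask the JVM where it loaded ObjectMapper from. A diagnostic sketch to paste into a working spark-shell:

// Print the jar that org.codehaus.jackson's ObjectMapper was loaded from
val src = classOf[org.codehaus.jackson.map.ObjectMapper].getProtectionDomain.getCodeSource
println(if (src == null) "bootstrap classpath" else src.getLocation.toString)

If this points into the Phoenix fat jar rather than the Hadoop libraries, the NoSuchMethodError above is consistent with a bundled, older Jackson shadowing the one YARN expects.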
Labels:
- Apache Phoenix
- Apache Spark
02-18-2016 08:43 AM
1 Kudo
The Spark UI was bound to the internal IP of the EC2 instance, which is why I wasn't able to view the currently running jobs. To resolve the issue, I had to configure an SSH tunnel.
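For anyone with the same EC2 setup, a local port forward to the driver's UI port is enough. A sketch with placeholder hostnames and the default Spark UI port 4040 (adjust to your driver's actual port):

ssh -L 4040:<ec2-internal-ip>:4040 <user>@<ec2-public-dns>

After that, the UI is reachable at http://localhost:4040 on the local machine.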
02-18-2016 08:40 AM
1 Kudo
@Artem Ervits: I had to configure an SSH tunnel as my cluster was running on EC2. Thanks a lot.