
java.lang.RuntimeException: Exception while committing to database.


I am trying to write data into HBase through Phoenix, but I am getting a DB commit exception.


WARN TaskSetManager: Lost task 7.0 in stage 2.0 (TID 14, ): java.lang.RuntimeException: Exception while committing to database.
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:87)
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:39)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1036)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1034)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1034)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1042)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1014)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Suppressed: java.lang.RuntimeException: org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 2000 actions: IOException: 2000 times,
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.close(PhoenixRecordWriter.java:62)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$5.apply$mcV$sp(PairRDDFunctions.scala:1043)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1215)
... 8 more
Caused by: org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 2000 actions: IOException: 2000 times,
at org.apache.phoenix.execute.MutationState.commit(MutationState.java:444)
at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:461)
at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:458)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:458)
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.close(PhoenixRecordWriter.java:59)
... 10 more
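For context on why the message says "Failed 2000 actions": PhoenixRecordWriter buffers upserts on the client and commits them in batches, so when one commit fails, the whole buffered batch fails together. Here is a minimal, illustrative sketch of that buffering pattern (the class and names below are hypothetical, not Phoenix's actual API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of batched writing as done by PhoenixRecordWriter: records accumulate
// client-side and are committed in batches. One failing commit therefore
// surfaces as "Failed N actions" for the entire buffered batch.
public class BatchWriter<T> {
    private final int batchSize;
    private final List<T> buffer = new ArrayList<>();
    private final Consumer<List<T>> committer; // stands in for connection.commit()

    public BatchWriter(int batchSize, Consumer<List<T>> committer) {
        this.batchSize = batchSize;
        this.committer = committer;
    }

    public void write(T record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // PhoenixRecordWriter.close() likewise flushes any remaining rows,
    // which is why the Suppressed exception above also comes from close().
    public void close() {
        if (!buffer.isEmpty()) {
            flush();
        }
    }

    private void flush() {
        // If the commit throws here, every record in the buffer fails at once.
        committer.accept(new ArrayList<>(buffer));
        buffer.clear();
    }

    public static void main(String[] args) {
        List<List<String>> commits = new ArrayList<>();
        BatchWriter<String> w = new BatchWriter<>(3, commits::add);
        for (int i = 0; i < 7; i++) {
            w.write("row-" + i);
        }
        w.close();
        System.out.println(commits.size()); // 3 commits: batches of 3, 3, 1
    }
}
```

The point is that the "2000 times" in the exception reflects the batch size that was in flight, not 2000 independent problems; a single underlying IOException on the region server side fails the whole batch.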


Re: java.lang.RuntimeException: Exception while committing to database.


@Saurabh any background? What is the Phoenix version, and does the sqlline client work well for similar tasks?


Re: java.lang.RuntimeException: Exception while committing to database.

@Sergey Soldatov

Actually, I am trying to create a table in HBase from a Hive table, also using Spark.


spark-submit --verbose \
  --master yarn --deploy-mode client \
  --conf spark.yarn.executor.memoryOverhead=600 \
  --num-executors 50 \
  --conf spark.executor.memory=30G \
  --conf spark.yarn.queue=default \
  --conf spark.driver.memory=30G \
  --conf spark.kryoserializer.buffer.max=512M \
  --conf spark.driver.cores=6 \
  --conf spark.executor.cores=6 \
  --conf spark.executor.instances=6 \
  --conf spark.driver.maxResultSize=20G \
  --conf spark.shuffle.memory.fraction=0.6 \
  --conf spark.storage.memory.fraction=0.4 \
  --conf spark.sql.shuffle.partitions=500 \
  --conf "spark.executor.extraJavaOptions=-XX:MaxPermSize=1024m -XX:PermSize=256m" \
  --jars $(echo /usr/hdp/2.3.4.0-3485/hive/lib/*.jar | tr ' ' ','),/usr/hdp/2.3.4.0-3485/phoenix/phoenix-4.4.0.2.3.4.0-3485-client.jar,/usr/hdp/2.3.4.0-3485/phoenix/lib/phoenix-spark-4.4.0.2.3.4.0-3485.jar \
  --class bigdata.HBaseOperations \
  /apps/comp/target/comp-intel-1.0-SNAPSHOT.jar \
  cis raw_final CIS.RAW hdfs://HDPHA \
  <zookeeper server>:2181:/hbase-unsecure \
  jdbc:phoenix:<zookeeper server>:2181:/hbase-unsecure \
  org.apache.phoenix.jdbc.PhoenixDriver insert 2016-12-09

Re: java.lang.RuntimeException: Exception while committing to database.


Hi @Saurabh

We looked into this yesterday and figured out that it is caused by Ranger.

In the region server jstack, we found that the RS RPC handler threads get stuck in getLocalHostName(), which is called by the Ranger audit path: Ranger performs a hostname lookup for every put operation, slowing down the whole write path. This is a known issue, https://issues.apache.org/jira/browse/RANGER-809 . We tested after disabling the hbase-ranger plugin, and the job completed fine. The issue is fixed in HDP 2.4; we will share the patch for your version in the case.

"B.defaultRpcServer.handler=58,queue=4,port=16020" daemon prio=10 tid=0x00007f6b71c2d800 nid=0xb3b6 runnable [0x00007f6b35c8a000]
   java.lang.Thread.State: RUNNABLE
at java.net.Inet4AddressImpl.getLocalHostName(Native Method)
at java.net.InetAddress.getLocalHost(InetAddress.java:1444)
at org.apache.ranger.audit.provider.MiscUtil.getHostname(MiscUtil.java:159)
at org.apache.ranger.plugin.audit.RangerDefaultAuditHandler.populateDefaults(RangerDefaultAuditHandler.java:169)
at org.apache.ranger.plugin.audit.RangerDefaultAuditHandler.getAuthzEvents(RangerDefaultAuditHandler.java:106)
at org.apache.ranger.authorization.hbase.HbaseAuditHandlerImpl.getAuthzEvents(HbaseAuditHandlerImpl.java:45)
at org.apache.ranger.plugin.audit.RangerDefaultAuditHandler.processResult(RangerDefaultAuditHandler.java:50)
at org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl.isAccessAllowed(RangerPolicyEngineImpl.java:142)
at org.apache.ranger.plugin.service.RangerBasePlugin.isAccessAllowed(RangerBasePlugin.java:149)
at org.apache.ranger.authorization.hbase.AuthorizationSession.authorize(AuthorizationSession.java:198)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.evaluateAccess(RangerAuthorizationCoprocessor.java:444)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.requirePermission(RangerAuthorizationCoprocessor.java:528)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.prePut(RangerAuthorizationCoprocessor.java:989)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.prePut(RangerAuthorizationCoprocessor.java:1089)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$30.call(RegionCoprocessorHost.java:902)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1673)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1748)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1705)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.prePut(RegionCoprocessorHost.java:898)
at org.apache.hadoop.hbase.regionserver.HRegion.doPreMutationHook(HRegion.java:2839)
at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2814)
at org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:2760)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:692)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:654)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2032)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32213)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
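The fix for RANGER-809 amounts to resolving the local hostname once and reusing it, instead of calling InetAddress.getLocalHost() on every audit event. A minimal sketch of that caching idea (class and field names here are illustrative, not Ranger's actual MiscUtil code):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

// Illustrative sketch: cache the local hostname so the native
// getLocalHostName() lookup (visible at the top of the jstack above)
// happens at most once, not once per HBase put.
public class HostnameCache {
    private static volatile String cachedHostname;

    public static String getHostname() {
        String h = cachedHostname;
        if (h == null) {
            synchronized (HostnameCache.class) {
                if (cachedHostname == null) {
                    try {
                        // The one-time expensive call; every later call
                        // returns the cached string without a resolver hit.
                        cachedHostname = InetAddress.getLocalHost().getHostName();
                    } catch (UnknownHostException e) {
                        cachedHostname = "unknown";
                    }
                }
                h = cachedHostname;
            }
        }
        return h;
    }

    public static void main(String[] args) {
        // Repeated calls return the same cached value.
        System.out.println(HostnameCache.getHostname().equals(HostnameCache.getHostname()));
    }
}
```

Upgrading to a release that includes the RANGER-809 fix (or applying the backport patch) achieves the same effect without disabling the plugin.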