Member since: 05-18-2016
Posts: 23
Kudos Received: 4
Solutions: 0
07-06-2018
01:05 AM
@Nikhil Silsarma The "scan.setEndRow" API is not available in my HBase version (hbase -version: "Version 1.1.2.2.5.5.0-157"). Will it work with the "scan.setStopRow()" API? Let me try it and I will update.
07-05-2018
09:59 AM
@Nikhil Silsarma Specific records (on a row-prefix value)
07-04-2018
06:24 AM
Hello,

My HBase table row keys are in the following format: vin1ts1, vin1ts2, vin1ts3, vin2ts1, vin2ts2, vin2ts3, vin2ts4, etc. I would like to get the last available key for each row prefix, where ROWPREFIX = {VIN1, VIN2, VIN3, ...}. Example: for RowPrefix = VIN1, it should return "Rowkey=vin1ts3" and its values.

I am trying to use the reverse scan feature as below:

Scan scan = new Scan();
scan.setCaching(1);
FilterList allFilters = new FilterList(FilterList.Operator.MUST_PASS_ALL);
allFilters.addFilter(new PrefixFilter(Bytes.toBytes(prefixFilterValue)));
scan.setFilter(allFilters);
scan.setReversed(true); // read the latest available key and value
scan.setMaxResultSize(1);
ResultScanner scanner = tblConn.getScanner(scan);
Result result = scanner.next();
LOGGER.log(Level.INFO, "Latest Key " + Bytes.toString(result.getRow()));
scanner.close();

The code above works, but it takes around ~40 seconds to retrieve the target row key. Is there a better approach, since 40 seconds does not meet our business requirement? Or do I need to set some scan property to reduce the scan time? Any pointers would be appreciated.

CLUSTER INFO:
HDP: HDP-2.5.5.0
hbase -version: Version 1.1.2.2.5.5.0-157, r002c801447187c620c26ffc130ff17a9b0a62ac1, Fri Apr 21 01:13:10 UTC 2017

Regards,
Revan
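One likely cause of the slowness: a reversed scan with only a PrefixFilter starts from the end of the table and walks backwards, filtering out every non-matching row until it reaches the prefix. A common workaround is to bound the scan yourself: start the reversed scan at the smallest key that sorts after every key with the prefix, and stop at the prefix itself, so the scan seeks directly to the right region. The helper below is a plain-Java sketch (the name `incrementPrefix` is mine, not an HBase API) that computes that exclusive upper bound:

```java
import java.util.Arrays;

public class PrefixBound {
    // Compute the smallest byte[] that sorts after every key starting with
    // `prefix`, by incrementing the last non-0xFF byte and truncating the rest.
    static byte[] incrementPrefix(byte[] prefix) {
        byte[] out = Arrays.copyOf(prefix, prefix.length);
        for (int i = out.length - 1; i >= 0; i--) {
            if (out[i] != (byte) 0xFF) {
                out[i]++;
                return Arrays.copyOf(out, i + 1);
            }
        }
        // All bytes were 0xFF: there is no upper bound; scan to end of table.
        return new byte[0];
    }
}
```

With the HBase 1.1 client you could then, if my reading of the reversed-scan semantics is right (start row is the high end, stop row the low end), call `scan.setStartRow(incrementPrefix(prefixBytes))` together with `scan.setStopRow(prefixBytes)` and `scan.setReversed(true)`, keeping the PrefixFilter as a safety net. This is a sketch to verify against your cluster, not a tested fix.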
Labels:
- Apache Hadoop
- Apache HBase
07-27-2016
06:10 AM
Thank you!
07-27-2016
04:26 AM
oozie job -oozie http://<Oozie_Server_Host>:11000/oozie/ -config cca.properties -submit
Error: E0501: Could not perform authorization operation, Operation category READ is not supported in state standby
    at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
    at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1932)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1313)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3928)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1109)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2206)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2202)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2200)

Does anyone have any idea why we get this error?
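For context, "Operation category READ is not supported in state standby" generally means some client addressed the standby NameNode of an HA pair directly instead of the active one. A common fix is to reference the HDFS HA nameservice ID in the Oozie job properties rather than a single NameNode host, so the client fails over automatically. A sketch (the nameservice name `mycluster`, the placeholder host, and the application path are hypothetical; use the values from your hdfs-site.xml):

```properties
# Use the HA nameservice ID, not a specific NameNode host:port
nameNode=hdfs://mycluster
jobTracker=<ResourceManager_Host>:8050
oozie.wf.application.path=${nameNode}/user/${user.name}/apps/cca
```

If the properties already look like this, the same check applies to the Oozie server's own Hadoop configuration (core-site.xml / hdfs-site.xml visible to Oozie).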
Labels:
- Apache Hadoop
- Apache Oozie
- Apache YARN
07-20-2016
04:50 AM
When I try to access my Hive databases from Hive View, I get the above error. Does anyone have any idea why Hive View returns this error?
Labels:
- Apache Ambari
- Apache Hive
- Apache Ranger
07-11-2016
05:37 AM
Labels:
- Apache Hive
- Apache Spark
- Apache YARN
07-05-2016
04:24 AM
Labels:
- Apache Ambari
- Apache Oozie
- Cloudera Hue
06-23-2016
11:37 PM
I tried this link: http://info.hortonworks.com/SizingGuide.html but it is not working.
06-23-2016
11:35 PM
1 Kudo
Labels:
- Apache Hadoop
- Apache YARN