Member since: 05-18-2016 | Posts: 23 | Kudos Received: 4 | Solutions: 0
07-06-2018
01:05 AM
@Nikhil Silsarma The "scan.setEndRow" API is not available in my HBase version (hbase -version reports "Version 1.1.2.2.5.5.0-157"). Will it work with the "scan.setStopRow()" API? Let me try and I will update.
07-05-2018
09:59 AM
@Nikhil Silsarma Specific records (based on the row-prefix value)
07-04-2018
06:24 AM
Hello,

My HBase table row keys are in the following format:

vin1ts1
vin1ts2
vin1ts3
vin2ts1
vin2ts2
vin2ts3
vin2ts4
etc.

I would like to get the last available key for each row prefix, where the prefixes are {vin1, vin2, vin3, ...}. For example, for the prefix vin1 it should return the row key "vin1ts3" and its values.

I am trying to use the reverse-scan feature as below:

    Scan scan = new Scan();
    scan.setCaching(1);
    FilterList allFilters = new FilterList(FilterList.Operator.MUST_PASS_ALL);
    allFilters.addFilter(new PrefixFilter(Bytes.toBytes(prefixFilterValue)));
    scan.setFilter(allFilters);
    scan.setReversed(true); // read the latest available key and value
    scan.setMaxResultSize(1);
    ResultScanner scanner = tblConn.getScanner(scan);
    Result result = scanner.next();
    LOGGER.log(Level.INFO, "Latest Key " + Bytes.toString(result.getRow()));
    scanner.close();

The code above works, but it takes around 40 seconds to retrieve the target row key. Is there a better approach, since 40 seconds does not meet our business requirement? Or do I need to set some scan property to reduce the scanner time? Any pointers would be appreciated.

CLUSTER INFO:
HDP: HDP-2.5.5.0
hbase -version: Version 1.1.2.2.5.5.0-157, r002c801447187c620c26ffc130ff17a9b0a62ac1, Fri Apr 21 01:13:10 UTC 2017

Regards,
Revan
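Update, for anyone who finds this later: a PrefixFilter on its own does not bound a reversed scan, so the scanner still walks backwards from the end of the table, which would explain the 40 seconds. Bounding the scan with a start row just past the prefix avoids that. A minimal sketch of computing that boundary (the helper name is my own; HBase has a similar routine internally):

```java
import java.util.Arrays;

public class PrefixBound {
    // Compute the smallest row key that sorts after every key with the
    // given prefix: increment the last byte that is not 0xFF and truncate.
    static byte[] prefixSuccessor(byte[] prefix) {
        byte[] copy = Arrays.copyOf(prefix, prefix.length);
        for (int i = copy.length - 1; i >= 0; i--) {
            if (copy[i] != (byte) 0xFF) {
                copy[i]++;
                return Arrays.copyOf(copy, i + 1);
            }
        }
        // All bytes were 0xFF: there is no upper bound, scan to table end.
        return new byte[0];
    }

    public static void main(String[] args) {
        byte[] succ = prefixSuccessor("vin1".getBytes());
        System.out.println(new String(succ)); // prints "vin2"
    }
}
```

With that boundary, the reversed scan can (in the 1.1.x API, as far as I can tell) be restricted to a single region, roughly: scan.setReversed(true); scan.setStartRow(prefixSuccessor(prefix)); scan.setStopRow(prefix); while keeping the PrefixFilter as a guard in case a row equal to the successor key exists.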
10-13-2016
07:59 AM
1 Kudo
Here is describe extended:
hive> describe extended h_estimate_orczlib;
OK
userid      varchar(15)
apptypeid   varchar(15)
ts          string
power       decimal(38,18)
ds          string

# Partition Information
# col_name  data_type   comment
ds          string

Detailed Table Information: Table(tableName:h_estimate_orczlib, dbName:umbrella, owner:Revan, createTime:1476180746, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:userid, type:varchar(15), comment:null), FieldSchema(name:apptypeid, type:varchar(15), comment:null), FieldSchema(name:ts, type:string, comment:null), FieldSchema(name:power, type:decimal(38,18), comment:null), FieldSchema(name:ds, type:string, comment:null)], location:gs://<BucketName>/HiveDB/umbrella/h_estimate_orczlib, inputFormat:org.apache.hadoop.hive.ql.io.orc.OrcInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.ql.io.orc.OrcSerde, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[FieldSchema(name:ds, type:string, comment:null)], parameters:{orc.compress=ZLIB, transient_lastDdlTime=1476180746, comment=this}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE)
Time taken: 0.094 seconds, Fetched: 12 row(s)
10-13-2016
03:04 AM
1 Kudo
Failed with exception java.io.IOException: java.io.IOException: Somehow read -1 bytes trying to skip 6257 more bytes to seek to position 6708, size: 1290047

Does anyone have any idea how to fix this?
07-28-2016
07:50 AM
Error: E0803 : E0803: IO error, java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient

I have added the following properties in the workflow:

<credentials>
  <credential name="hcat" type="hcat">
    <property>
      <name>hcat.metastore.uri</name>
      <value>thrift://<Host_Name>:9083</value>
    </property>
    <property>
      <name>hcat.metastore.principal</name>
      <value>hive/_HOST@EXAMPLE.COM</value>
    </property>
  </credential>
</credentials>

Does anyone have any idea about this error? Any help much appreciated!
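For completeness, the action in my workflow references the credential with a cred attribute, which Oozie requires in addition to the <credentials> block. A sketch (the action name "hive-node" is hypothetical):

```
<action name="hive-node" cred="hcat">
  <hive xmlns="uri:oozie:hive-action:0.2">
    ...
  </hive>
  ...
</action>
```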
07-27-2016
06:10 AM
Thank You!
07-27-2016
04:26 AM
oozie job -oozie http://<Oozie_Server_Host>:11000/oozie/ -config cca.properties -submit
Error: E0501 : E0501: Could not perform authorization operation, Operation category READ is not supported in state standby
    at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
    at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1932)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1313)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3928)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1109)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:851)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2206)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2202)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2200)

Does anyone have any idea why we get this error?
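One possible cause, assuming the cluster runs NameNode HA: the job properties point at a fixed NameNode host that is currently in standby, so its READ operations are rejected. Using the logical HA nameservice instead lets the client fail over to the active NameNode. A sketch ("mycluster" is a hypothetical nameservice name):

```
# job.properties (sketch)
nameNode=hdfs://mycluster
oozie.wf.application.path=${nameNode}/user/${user.name}/app
```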
07-20-2016
04:50 AM
When I try to access my Hive databases from the Hive View, I get the above error. Does anyone have any idea why the Hive View returns this error?
07-12-2016
06:07 AM
We have a 5-node Hadoop cluster in which one NodeManager is failing again and again with the following error. I copied this log from the affected host; the NodeManager log directory is /var/log/hadoop-yarn/yarn/yarn-yarn-nodemanager-SlaveHostname.log

LOCALIZING
2016-07-12 15:01:10,593 INFO containermanager.AuxServices (AuxServices.java:handle(196)) - Got event CONTAINER_INIT for appId application_1467937373251_0002
2016-07-12 15:01:10,594 INFO yarn.YarnShuffleService (YarnShuffleService.java:initializeContainer(183)) - Initializing container container_e16_1467937373251_0002_02_000001
2016-07-12 15:01:10,596 ERROR event.AsyncDispatcher (AsyncDispatcher.java:dispatch(189)) - Error in dispatcher thread
java.lang.NoSuchMethodError: org.apache.hadoop.yarn.util.FSDownload.createStatusCacheLoader(Lorg/apache/hadoop/conf/Configuration;)Lcom/google/common/cache/CacheLoader;
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.handleInitContainerResources(ResourceLocalizationService.java:442)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.handle(ResourceLocalizationService.java:392)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.handle(ResourceLocalizationService.java:140)
at org.apache.hadoop.yarn.event.AsyncDispatcher.dispatch(AsyncDispatcher.java:183)
at org.apache.hadoop.yarn.event.AsyncDispatcher$1.run(AsyncDispatcher.java:109)
at java.lang.Thread.run(Thread.java:745)
2016-07-12 15:01:10,597 INFO event.AsyncDispatcher (AsyncDispatcher.java:run(290)) - Exiting, bbye..
2016-07-12 15:01:10,660 INFO ipc.Server (Server.java:run(949)) - IPC Server Responder: starting

Does anyone have any idea about this error?
06-23-2016
11:37 PM
I tried this link: http://info.hortonworks.com/SizingGuide.html but it is not working.
06-15-2016
01:41 AM
Put simply: which language should we use for Spark ML as a fresher?