Member since: 03-28-2016
Posts: 49
Kudos Received: 4
Solutions: 0
06-24-2019 07:04 PM
Thank you very much for this answer.
05-22-2019 04:18 AM
Hi, we are in the process of creating an HDP 2.6 cluster in which the RHEL OS will be integrated with AD for authentication, and we will be using AD as the KDC. My question is: if we create a local UNIX user called HIVEUSER and use a BI tool to connect to Hive as this user, will the local user be able to authenticate and access Hive tables in the Kerberized cluster, or does HIVEUSER have to exist in AD?
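For context, a minimal sketch of the authentication path in question, assuming HiveServer2 uses Kerberos authentication and AD is the only KDC: the connecting user must obtain a Kerberos ticket first, and a purely local UNIX account with no matching AD principal will fail at that step. The realm, hostname, and principals below are illustrative placeholders.

    # Obtain a TGT; this succeeds only if the principal exists in the KDC (AD here)
    kinit HIVEUSER@EXAMPLE.COM

    # Connect with Beeline, passing the HiveServer2 service principal
    beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM"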
Labels:
- Apache Hive
10-30-2017 03:02 PM
I don't want to add one more column to the actual index columns, but to the INCLUDE() section. Will that have the same effect?
10-24-2017 02:42 PM
I have an existing covered index that I created using a statement like the one below:

    CREATE INDEX my_index ON my_table (v1, v2) INCLUDE (v3)

Now I want to include one more column, v4, as in INCLUDE (v3, v4). The v4 column will not be part of the actual index key, but it will be in the covered part. I am not sure whether there is an ALTER statement to do this. Requesting assistance.
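As far as I know, Phoenix's ALTER INDEX only changes index state (REBUILD, DISABLE, and so on) and cannot add covered columns, so the usual approach is to drop and recreate the index. A sketch using the names from the question:

    DROP INDEX my_index ON my_table;
    CREATE INDEX my_index ON my_table (v1, v2) INCLUDE (v3, v4);

For a large table, appending ASYNC to the CREATE INDEX statement lets the index be populated in the background with Phoenix's IndexTool MapReduce job instead of blocking the client.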
Labels:
- Apache HBase
- Apache Phoenix
07-16-2017 06:36 PM
Thank you Matt, ListHDFS was a good hint. I was able to accomplish my task with your inputs.
07-16-2017 06:33 PM
Hi all, I have taken a snapshot in my Dev environment and exported it to the PreProd environment so that I can clone a new table from the snapshot, or restore it. I exported the snapshot from Dev to PreProd using the command below:

    /usr/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot snapshot_table_name -copy-to hdfs://NAME_NODE:8020/hbase -mappers 5

My problem is that when I try to clone a table from this exported snapshot, I get an "Unknown Snapshot" error, and the exported snapshot does not show up when I run the list_snapshots command in the HBase shell. Since I gave the HDFS path as hdfs://NAME_NODE:8020/hbase, I can see the exported snapshot in HDFS with:

    hadoop fs -ls /hbase

However, all the other snapshots that I created in PreProd are under:

    hadoop fs -ls /apps/hbase/data

How can I resolve this issue? Is there a problem with the way I exported the snapshot? Do I have to export the snapshot with the path hdfs://NAME_NODE:8020/apps/hbase/data? Requesting assistance.
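One plausible fix, given that the PreProd snapshots live under /apps/hbase/data: list_snapshots only sees snapshots under the destination cluster's hbase.rootdir, so ExportSnapshot should copy into that path rather than /hbase. A sketch reusing the command from the question:

    /usr/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
      -snapshot snapshot_table_name \
      -copy-to hdfs://NAME_NODE:8020/apps/hbase/data \
      -mappers 5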
Labels:
- Apache HBase
- Apache Phoenix
07-13-2017 08:38 AM
Hi all, I want to fetch data stored in HDFS using the FetchHDFS processor. The folder structure we use to store our data is /MajorData/Location/Year/Month/Day/file1.txt (e.g., /MajorData/Location/2017/01/01/file1.txt). As the day changes, the path changes to, for example, /MajorData/Location/2017/01/02/file2.txt. How can I write a NiFi expression that will traverse all of these folders and fetch the data into NiFi?
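A common pattern for this, sketched on the assumption that files only ever land under the date-stamped folders: use ListHDFS with a recursive listing so it tracks new files across all subdirectories, then route its output to FetchHDFS, which builds the full path from the listing's path and filename attributes.

    ListHDFS
      Directory: /MajorData/Location
      Recurse Subdirectories: true

    FetchHDFS
      HDFS Filename: ${path}/${filename}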
Labels:
- Apache Hadoop
- Apache NiFi
06-27-2017 11:40 AM
Hi @Wynner, I have about 300 processors and roughly 20 MB of streaming data flowing at all times. What are the optimum values for Maximum Timer Driven Thread Count and Maximum Event Driven Thread Count? How can I decide what values to set for these two parameters?
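For what it's worth, the general guidance I have seen for the timer-driven pool is to start at roughly two to four times the number of cores on the node and raise it only if cores sit idle. A worked example under that assumption, for a hypothetical 8-core node:

    Maximum Timer Driven Thread Count: 8 cores x 2 = 16 (raise toward 32 if CPU is underused)
    Maximum Event Driven Thread Count: leave at the default, since few processors use event-driven scheduling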
06-20-2017 03:02 PM
Hi, I am trying to configure the NiFi ExecuteProcess processor to connect to MSSQL and execute a stored procedure. I followed this link: https://community.hortonworks.com/questions/26170/does-executesql-processor-allow-to-execute-stored.html, where @M. Mashayekhi provided the steps for how he connected to MSSQL and executed a stored procedure. @Mashayekhi, I wanted to know what your ExecuteProcess configuration screen looks like, and please let me know whether any additional client tools need to be installed so that I can execute the stored procedure. Thank you.
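One way this is commonly wired up (not necessarily how @M. Mashayekhi did it): install Microsoft's sqlcmd client on the NiFi host, which is the one additional client tool this approach needs, and invoke it from ExecuteProcess. The host, database, credentials, and procedure name below are placeholders.

    ExecuteProcess
      Command: sqlcmd
      Command Arguments: -S mssql-host -d my_db -U my_user -P my_pass -Q "EXEC dbo.my_stored_proc"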
Labels:
- Apache NiFi
- Apache Spark
06-08-2017 02:03 PM
Hi, I have a flow file like the one below, where fields are separated by the pipe character:

    server|list|number|3|abc|xyz|pqr|2015-06-06 13:00:00

In the record above, the number 3 is followed by three values: abc, xyz, and pqr. My requirement is to split this flow file into records based on that number, so my output should look like this:

    server|list|number|abc|2015-06-06 13:00:00
    server|list|number|xyz|2015-06-06 13:00:00
    server|list|number|pqr|2015-06-06 13:00:00

I have reached the stage where I have converted the flow file to JSON, split the JSON, and captured abc|xyz|pqr in one attribute. I request help on how I can split them further into individual records in NiFi so that I can insert them into HBase.
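For what it's worth, the transformation itself is a straightforward fan-out on the count field; a minimal sketch of the logic in awk, which could be invoked from NiFi via a processor such as ExecuteStreamCommand (the field positions assume exactly the layout in the example):

    echo 'server|list|number|3|abc|xyz|pqr|2015-06-06 13:00:00' | \
      awk -F'|' '{ for (i = 5; i < 5 + $4; i++) print $1 "|" $2 "|" $3 "|" $i "|" $NF }'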
Labels:
- Apache HBase
- Apache NiFi
- Apache Phoenix