Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1496 | 10-11-2018 01:38 AM |
| | 1863 | 09-26-2018 02:24 AM |
| | 1822 | 06-29-2018 02:35 PM |
| | 2414 | 06-29-2018 02:34 PM |
| | 5354 | 06-20-2018 04:30 PM |
06-21-2018
04:53 AM
1 Kudo
I recommend taking a look at building an uber jar. This will let you resolve classpath issues where dependencies are not found at runtime.
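A minimal sketch of how an uber jar can be produced, assuming a Maven build (the plugin version shown is illustrative):

```xml
<!-- pom.xml fragment: the shade plugin bundles all dependencies into one jar -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` emits a single jar containing your classes plus every dependency, so the cluster no longer needs the libraries on its own classpath.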
06-08-2018
03:31 PM
Found the issue: the Phoenix view must have the same name as the HBase table. The Phoenix documentation does not state this.
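For reference, a sketch of what such a view can look like, using the table and column names from this thread; the row-key column name and types are illustrative assumptions:

```sql
-- The view name must match the HBase table name exactly (quotes preserve case)
CREATE VIEW "PUR_ACCT_PHX" (
    pk VARCHAR PRIMARY KEY,          -- maps to the HBase row key
    "cf1"."PUR_ID" VARCHAR,
    "cf1"."PUR_TRANS_DATE" VARCHAR
);
```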
06-08-2018
03:03 PM
Hi Jay, I am not using a package. How do I compile? # more RetriveData.java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class RetriveData {
    public static void main(String[] args) throws IOException {
        // Instantiating Configuration class
        Configuration config = HBaseConfiguration.create();
        // Instantiating HTable class
        HTable table = new HTable(config, "PUR_ACCT_PHX");
        // Instantiating Get class with row key "1001181"
        Get g = new Get(Bytes.toBytes("1001181"));
        // Reading the data
        Result result = table.get(g);
        // Reading values from Result class object
        byte[] value = result.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("PUR_ID"));
        byte[] value1 = result.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("PUR_TRANS_DATE"));
        // Printing the values
        String purchaseID = Bytes.toString(value);
        String purchaseDate = Bytes.toString(value1);
        System.out.println("PURCHASE ID: " + purchaseID + " PUR_TRANS_DATE: " + purchaseDate);
        table.close();
    }
}
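Since the class is in the default package, it can be compiled directly with javac as long as the HBase client jars are on the classpath. A sketch, assuming the `hbase` launcher script is on your PATH (its `classpath` subcommand prints the client classpath):

```shell
# Compile against the HBase client jars
javac -cp "$(hbase classpath)" RetriveData.java

# Run from the same directory; "." adds the compiled class itself
java -cp ".:$(hbase classpath)" RetriveData
```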
05-22-2018
04:22 PM
2 Kudos
@Sami Ahmad Moving NiFi's various repositories is a straightforward process. The nifi.properties file tells NiFi where to find each of the repositories. Specifically, look for these lines:

nifi.flowfile.repository.directory=/<my-original-path>/flowfile_repository
nifi.content.repository.directory.default=/<my-original-path>/content_repository

Each will be pointing to the current directory path where these repositories reside.

1. Stop your NiFi instance(s), then copy/move the "content_repository" directory recursively to the new location.
2. Make sure the user that runs your NiFi process has proper ownership of and permissions on the new directory location.
3. Update your nifi.properties file to point at the new location paths:
nifi.flowfile.repository.directory=/<my-new-path>/flowfile_repository
nifi.content.repository.directory.default=/<my-new-path>/content_repository
4. Start your NiFi instance(s).

Thank you, Matt

If you found this answer addressed your question, please take a moment to log in and click "accept" below the answer.
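The steps above can be sketched as a shell session; the paths, the NiFi install location, and the `nifi` service user here are illustrative assumptions, not values from this thread:

```shell
# Example paths (adjust to your environment)
OLD=/var/lib/nifi/content_repository            # hypothetical current location
NEW=/data/nifi/content_repository               # hypothetical new location
PROPS=/opt/nifi/conf/nifi.properties            # hypothetical conf location

/opt/nifi/bin/nifi.sh stop                      # 1. stop the instance
cp -rp "$OLD" "$(dirname "$NEW")"               # 2. copy recursively, preserving modes
chown -R nifi:nifi "$NEW"                       #    NiFi user must own the new location
# 3. point nifi.properties at the new path
sed -i "s|^nifi.content.repository.directory.default=.*|nifi.content.repository.directory.default=$NEW|" "$PROPS"
/opt/nifi/bin/nifi.sh start                     # 4. start again
```

The same `sed` pattern applies to `nifi.flowfile.repository.directory` when moving the flowfile repository.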
05-02-2018
01:28 PM
Hi Shu, you wrote that, and you are right: it's importing only one record with m1 on each run. I tried 4 Sqoop loads and got 4 versions.
That means a Sqoop import into an HBase table will not store all the versions, but from the HBase shell it will store all the versions. Is this a bug or a feature? Is Hortonworks aware of this, and what is their comment?
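To confirm how many cell versions actually landed, the HBase shell can display them directly; the table name is taken from this thread, and the VERSIONS count shown is illustrative:

```
describe 'PUR_ACCT_PHX'                  # shows the VERSIONS setting per column family
scan 'PUR_ACCT_PHX', {VERSIONS => 4}     # returns up to 4 versions of each cell
```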
04-16-2018
07:39 PM
OK, thanks Josh, I figured out my mistake. I didn't realize that Phoenix automatically finds the primary key, so I don't have to specify the primary key column name explicitly. Thanks for the guidance.
03-27-2018
08:15 PM
OK, I found the issue with my statement: I was using "-" in the table name, which it didn't like. Thanks for your help.