Member since: 02-16-2016
Posts: 176
Kudos Received: 197
Solutions: 17
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 3476 | 11-18-2016 08:48 PM |
| | 6150 | 08-23-2016 04:13 PM |
| | 1661 | 03-26-2016 12:01 PM |
| | 1556 | 03-15-2016 12:12 AM |
| | 16146 | 03-14-2016 10:54 PM |
03-12-2016
01:05 PM
2 Kudos
Hello @Ancil McBarnett, I checked that ranger-admin-site.xml and it was not updated there. I was also wondering: since I am using HDP 2.2.4, why is that file under .../HDP/2.3/...? Anyway, I set the port to 6182, but unfortunately the alert was still there after restarting Ranger. So, because of the HDP 2.2.4 vs. HDP 2.3 mismatch, I searched for other Ranger configs, and in the end it turned out that I had to update the port in /var/lib/ambari-server/resources/common-services/RANGER/0.4.0/configuration/ranger-site.xml. After restarting Ranger, the alert disappeared. Many thanks for the hint!
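For anyone hitting the same alert, a minimal sketch of how to locate and verify the port entry from the shell (6080 is Ranger's default HTTP port; the exact property key in your file may differ, so treat the grep patterns as assumptions):

```bash
# find every Ambari Ranger config that still carries the old default port
grep -rl "6080" /var/lib/ambari-server/resources/common-services/RANGER/

# confirm the edited value in the file that actually fixed the alert
grep -B2 -A2 "6182" \
  /var/lib/ambari-server/resources/common-services/RANGER/0.4.0/configuration/ranger-site.xml
```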
07-11-2017
10:26 PM
1 Kudo
If you are using Hive 2 or later (including Hive LLAP), you no longer need the dummy table; a statement like INSERT INTO TABLE test_array SELECT 1, array('a','b'); works directly.
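A minimal runnable sketch (the test_array schema here is an assumption chosen to match the statement above):

```bash
# On Hive 2+ / LLAP the array literal can be selected directly -- no dummy
# FROM table is needed
hive -e "
CREATE TABLE IF NOT EXISTS test_array (id INT, letters ARRAY<STRING>);
INSERT INTO TABLE test_array SELECT 1, array('a','b');
SELECT * FROM test_array;
"
```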
09-02-2017
05:07 PM
1 Kudo
hadoop jar avro-tools-1.8.2.jar getschema hdfs_archive/mydoc.avro would also have done the job. Instead of java -jar, you can run the tool directly against a file on HDFS: hadoop jar avro-tools-1.8.2.jar getschema hdfsPathToAvroFile.avro
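Side by side, the two invocations (paths and jar version as in the post; the local copy step is only needed for the java -jar form):

```bash
# run avro-tools through the hadoop launcher so HDFS paths resolve directly
hadoop jar avro-tools-1.8.2.jar getschema hdfs_archive/mydoc.avro

# the plain JVM launcher only sees the local filesystem, so the file must be
# pulled out of HDFS first
hadoop fs -get hdfs_archive/mydoc.avro /tmp/mydoc.avro
java -jar avro-tools-1.8.2.jar getschema /tmp/mydoc.avro
```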
03-04-2016
07:10 PM
1 Kudo
@Shishir Saxena: Thanks for the reply. When I tried the location above, it failed as expected:

    [root@m1 ~]# hadoop fs -ls jceks://hdfs/user/
    ls: No FileSystem for scheme: jceks

But when I listed my user directory inside HDFS, the file showed up:

    [root@m1 ~]# hadoop fs -ls /user/root/
    Found 6 items
    drwxr-xr-x   - root hdfs    0 2016-01-25 23:30 /user/root/.hiveJars
    drwx------   - root hdfs    0 2016-02-29 04:31 /user/root/.staging
    drwxr-xr-x   - root hdfs    0 2016-02-24 18:16 /user/root/OozieTest
    -rwxr-xr-x   3 root hdfs 1484 2016-02-03 21:19 /user/root/Output.json
    -rwx------   3 root hdfs  504 2016-03-02 04:14 /user/root/mysql.password.jceks

Catting the keystore only prints the encrypted, non-human-readable content:

    [root@m1 ~]# hadoop fs -cat /user/root/mysql.password.jceks
    (binary/encrypted output, not human-readable)

So that answered my question. Thanks once again.
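For completeness: jceks:// is a credential-provider URI, not a FileSystem scheme, which is why hadoop fs rejects it. The keystore is meant to be read through the credential command instead; a sketch using the path from the listing above:

```bash
# list the aliases stored in the keystore via the credential-provider API;
# the jceks:// scheme is understood here, but not by hadoop fs
hadoop credential list -provider jceks://hdfs/user/root/mysql.password.jceks
```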
03-03-2016
05:51 PM
3 Kudos
HDFS does not have any Thrift services. You can find the HBase Thrift definitions at https://github.com/hortonworks/hbase-release/tree/HDP-2.3.0.0-tag/hbase-thrift/src/main/resources/org/apache/hadoop/hbase
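As an illustration, once you have the IDL you can generate client stubs with the Thrift compiler. A sketch only; the raw-file URL and exact file name below are assumptions based on the repository layout linked above:

```bash
# fetch the HBase Thrift IDL and generate, for example, Python bindings
wget https://raw.githubusercontent.com/hortonworks/hbase-release/HDP-2.3.0.0-tag/hbase-thrift/src/main/resources/org/apache/hadoop/hbase/thrift/Hbase.thrift
thrift --gen py Hbase.thrift
```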
03-01-2016
07:19 PM
Same issue with the Hive action. It is frustrating and inefficient that such a common use case cannot be accommodated.
02-26-2016
01:34 AM
@Shishir Saxena Nice! You can accept the answer, since you answered it already.
07-28-2018
09:33 AM
@Shishir Saxena I also followed your instruction no. 3 [Connecting to Kerberos cluster using keytab] to connect to Phoenix, but it failed. My version is HDP 2.6.4. Can you show more details? Thank you so much.
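For context, the keytab-based connection I am attempting has the general shape below (host names, realm, and keytab path are placeholders, not my real values):

```bash
# sqlline with principal and keytab embedded in the Phoenix connection string:
# <zk-quorum>:<zk-port>:<zk-znode>:<principal>:<keytab>
/usr/hdp/current/phoenix-client/bin/sqlline.py \
  "zk1.example.com:2181:/hbase-secure:user@EXAMPLE.COM:/etc/security/keytabs/user.keytab"
```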
09-05-2017
01:37 PM
@Shishir Saxena, yes, the GetFile processor also worked for the shared drive. My doubt is how to pass credentials to access the network drive. For example, my shared drive prompts for credentials to access the folders inside it. If the shared folder has permissions for everyone, then I am able to access it, but when the drive prompts for credentials, ListFile does not seem to work. Can you suggest a way to access a shared drive with a username and password in the NiFi processors? One possible approach is sketched below.
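(One common workaround, noted here as an assumption rather than a built-in NiFi feature: mount the share at the OS level with credentials, then point ListFile/GetFile at the mount point. Server, share, and account names below are placeholders.)

```bash
# mount the password-protected share via CIFS so NiFi's file processors can
# read it as a local directory
sudo mount -t cifs //fileserver/shared /mnt/shared \
  -o username=myuser,password=mypass,domain=MYDOMAIN

# ListFile / GetFile can then be pointed at Input Directory = /mnt/shared
```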
02-22-2016
05:18 PM
2 Kudos
Thank you @Artem Ervits for pointing me in the right direction. While looking at the policy JSON file, I noticed that it had a null in the path for my new policy; it looks like a null was somehow added to the policy file by some keystroke combination. Once I deleted this policy, policy sync started working correctly. In the policycache directory, the hdfs_<policy>.json file had the following lines for my new policy:

    "resources": {
      "path": {
        "values": [ null