Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 789 | 06-04-2025 11:36 PM |
| | 1370 | 03-23-2025 05:23 AM |
| | 680 | 03-17-2025 10:18 AM |
| | 2465 | 03-05-2025 01:34 PM |
| | 1606 | 03-03-2025 01:09 PM |
08-18-2017
09:36 AM
@Kishore Kumar Have you set the proxyuser properties in the Ambari UI --> HDFS --> Configs --> Advanced? If Ambari is being run as root, set:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
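Once those two properties are saved and HDFS is restarted, a quick sanity check is to read the effective client configuration back; a minimal sketch, assuming Ambari (and therefore the proxy user) really is root:

```
# Print the effective proxyuser settings from the client configuration
hdfs getconf -confKey hadoop.proxyuser.root.hosts
hdfs getconf -confKey hadoop.proxyuser.root.groups
# Both should print "*" once the change has been pushed out and HDFS restarted
```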
08-18-2017
09:21 AM
@Kishore Kumar Good to know, then I deserve the credit ..... just accept my response!
08-18-2017
09:19 AM
@sachin gupta @webb wang All the same, you should have attached your kms-acls.xml so I could visualize it. Having said that, can you add this property to kms-acls.xml:
<property>
  <name>key.acl.key4USER_1.DECRYPT_EEK</name>
  <value>USER_1 GROUP_1</value>
</property>
Keep me posted.
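Once kms-acls.xml is saved (KMS normally hot-reloads that file, so a restart should not be needed), the simplest end-to-end check is a read by USER_1 inside the zone that uses key4USER_1. A sketch only; /zone1/test.txt is a placeholder path, not something from this thread:

```
# As USER_1, try to read a file in the encryption zone protected by key4USER_1
# (/zone1/test.txt is a placeholder -- substitute a real file in your zone)
su - USER_1 -c "hdfs dfs -cat /zone1/test.txt"
# A successful read means DECRYPT_EEK is now authorized for USER_1
```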
08-17-2017
07:59 PM
@sachin gupta KMS has an ACL file named "kms-acls.xml". Can you copy and paste its contents here, or attach it as a file to this thread?
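In case the file's location is not obvious, it depends on how KMS was installed; a harmless way to hunt it down (the paths below are only the usual candidates, not guaranteed for this cluster):

```
# Search the usual configuration trees for the KMS ACL file
find /etc /usr/hdp -name kms-acls.xml 2>/dev/null
# Then dump whichever copy is actually in use, e.g. for Ranger KMS on HDP:
cat /etc/ranger/kms/conf/kms-acls.xml
```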
08-17-2017
05:29 PM
@Kishore Kumar Are you using volume groups? If yes, first you need to see whether you have any free Physical Extents (PEs) in your volume group (in this case vg00). Check the output of:
# vgdisplay vg00
Then, assuming you have some Physical Extents free, you can extend the /usr volume:
# lvextend -L +<x-amount>G /dev/vg00/usr
After that you need to resize the filesystem to reflect the new size of this "partition":
# resize2fs /dev/vg00/usr
resize2fs handles ext2/ext3/ext4 alike (check /etc/fstab if you are unsure which filesystem it is). If you have no PEs available, you could consider either shrinking another volume in the same group (like home in this case) or adding another PV. Regardless, note that having a partition that is 100 GB in size does not mean you have access to 100 GB; since it is a physical volume used for LVM, it only means you have 100 GB worth of Physical Extents.
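Put end to end, the sequence looks roughly like the sketch below; the +10G figure is only an example, and vg00/usr plus the filesystem assumption should be replaced with whatever vgdisplay and /etc/fstab actually show:

```
# 1. Check how many Physical Extents are still free in the volume group
vgdisplay vg00 | grep -i "free"
# 2. Grow the logical volume, here by an example 10 GB
lvextend -L +10G /dev/vg00/usr
# 3. Grow the filesystem to fill the enlarged LV (resize2fs covers ext2/3/4)
resize2fs /dev/vg00/usr
# 4. Confirm the new size is visible
df -h /usr
```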
08-17-2017
09:12 AM
@sachin gupta So it was a permissions issue, as I had stated earlier! Can you validate the same on your cluster?
08-16-2017
09:22 PM
@Sahil Jindal For your info, those are the different kernels you could use to boot up, but the latest should work! Check your network adapters; in my case I am using the bridged adapter and it's working fine. Make sure, though, that you have at least 8 GB of RAM for the sandbox.
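If the sandbox is running under VirtualBox, both the memory allocation and the adapter type can be checked and changed from the command line; a sketch assuming the VM is registered as "Hortonworks Sandbox" (take the real name from the first command) and that eth0 is the host interface to bridge:

```
# List registered VMs to get the exact VM name
VBoxManage list vms
# Inspect the current memory size and first network adapter
VBoxManage showvminfo "Hortonworks Sandbox" | grep -Ei "memory size|nic 1"
# With the VM powered off: give it 8 GB of RAM and a bridged adapter on eth0
VBoxManage modifyvm "Hortonworks Sandbox" --memory 8192 --nic1 bridged --bridgeadapter1 eth0
```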
08-16-2017
12:05 PM
@Luis Ruiz Nice to know it helped; that's the open-source spirit 🙂
08-16-2017
11:21 AM
@Luis Ruiz Install PuTTY to interact with your sandbox from your desktop. While that downloads, which version of the sandbox are you running?
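For reference, once an SSH client is on the desktop, the sandbox is normally reached through the port it forwards to the host; a sketch assuming the default Hortonworks sandbox NAT mapping of SSH to local port 2222:

```
# Open a shell on the sandbox from the desktop over the forwarded SSH port
ssh root@127.0.0.1 -p 2222
```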
08-16-2017
10:14 AM
@Luis Ruiz Try this:
# su - hdfs
$ cd /tmp/data
$ hdfs dfs -copyFromLocal autonomias.txt /apps/hive/warehouse/
$ hdfs dfs -chown -R hive:hive /apps/hive/warehouse/
hive> load data inpath '/apps/hive/warehouse/autonomias.txt' into table comunidades;
Let me know; that should work.
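To confirm the load really landed, something like the check below works; the comunidades table name comes from this thread, while the count query and the ls are just illustrative verification steps:

```
# Count the rows that are now visible in the table
hive -e "SELECT COUNT(*) FROM comunidades;"
# LOAD DATA INPATH moves the source file, so listing the old path should now fail
hdfs dfs -ls /apps/hive/warehouse/autonomias.txt
```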