Member since: 02-09-2016
Posts: 559
Kudos Received: 422
Solutions: 98

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2137 | 03-02-2018 01:19 AM |
| | 3527 | 03-02-2018 01:04 AM |
| | 2370 | 08-02-2017 05:40 PM |
| | 2345 | 07-17-2017 05:35 PM |
| | 1720 | 07-10-2017 02:49 PM |
10-28-2016 07:30 PM
@Ashnee Sharma Can you try using s3n instead of s3a? Support for s3a came into Hadoop with version 2.6.0 (https://issues.apache.org/jira/browse/HADOOP-10400), and HDP 2.5 provides s3a only as a Tech Preview, while HDP 2.1 shipped with Hadoop 2.4.0 (http://hortonworks.com/blog/announcing-hdp-2-1-general-availability/). See this link for using s3n: https://community.hortonworks.com/articles/7296/hdp-22-configuration-required-for-s3.html
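For example (the bucket name and keys below are placeholders for your own), you can do a quick connectivity check from the command line before running any real jobs:

$ hadoop fs -Dfs.s3n.awsAccessKeyId=YOUR_ACCESS_KEY \
      -Dfs.s3n.awsSecretAccessKey=YOUR_SECRET_KEY \
      -ls s3n://your-bucket/

If the listing works, the same credentials (or the equivalent core-site.xml properties) will work for copies.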
10-28-2016 07:21 PM
I don't know if it is in the pipeline. I looked at the Apache JIRAs and didn't see any covering this functionality, so I created one: https://issues.apache.org/jira/browse/RANGER-1195. To manage this in Tableau, you need to use custom SQL (which again requires you to know which columns you are allowed to see): https://onlinehelp.tableau.com/current/pro/desktop/en-us/customsql.html
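As a rough sketch (table and column names here are hypothetical), the custom SQL you paste into Tableau would name only the columns you are authorized for; you can verify the query in beeline first:

# Hypothetical table and columns; run this first to confirm you are authorized
$ beeline -u jdbc:hive2://localhost:10000 -e "SELECT customer_id, order_date FROM orders"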
10-28-2016 06:17 PM
1 Kudo
@Raffi Abberbock You need to configure Knox SSO. Here is the link to the Hortonworks documentation: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/setting_up_knox_sso_for_ambari.html. You may also find this link helpful: https://cwiki.apache.org/confluence/display/KNOX/Ambari+via+KnoxSSO+and+Default+IDP
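At a high level (prompts paraphrased, values below are placeholders for your Knox host and certificate), the linked doc drives the setup through the interactive ambari-server setup-sso command on the Ambari host:

$ ambari-server setup-sso
  Provider URL: https://your-knox-host:8443/gateway/knoxsso/api/v1/websso
  Public Certificate PEM: (paste the Knox gateway certificate here)
$ ambari-server restart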
10-28-2016 06:09 PM
1 Kudo
@Kate Shaw If you attempt to select any columns that you are not authorized to see, you will get that message. If you attempt to describe the table, you will also get that message. This is a deliberate security approach: in many environments, the security team doesn't want you to know that columns exist which you aren't allowed to see. I certainly understand your confusion. I would also expect a "select * ..." to return only the columns I'm allowed to see, and a "describe table" to show only the columns I'm allowed to see. That is currently not the case.
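For illustration, with a hypothetical employees table where Ranger authorizes you only for the name column, the behavior looks roughly like this in beeline:

$ beeline -u jdbc:hive2://localhost:10000
0: jdbc:hive2://localhost:10000> SELECT * FROM employees;
Error: ... Permission denied: user [kate] does not have [SELECT] privilege on [default/employees/salary] ...
0: jdbc:hive2://localhost:10000> SELECT name FROM employees;
-- succeeds, returning only the name column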
10-27-2016 01:44 PM
@Ashnee Sharma Have you seen this HCC link? It may be helpful: https://community.hortonworks.com/questions/7165/how-to-copy-hdfs-file-to-aws-s3-bucket-hadoop-dist.html Also take a look at the official hadoop-aws documentation: https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html
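As a sketch (bucket name and paths below are placeholders), a DistCp copy from HDFS to S3 looks like this; the credentials can also go in core-site.xml instead of on the command line:

$ hadoop distcp \
      -Dfs.s3n.awsAccessKeyId=YOUR_ACCESS_KEY \
      -Dfs.s3n.awsSecretAccessKey=YOUR_SECRET_KEY \
      hdfs:///user/you/data s3n://your-bucket/data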
10-26-2016 01:39 AM
Did you install the Vagrant plugin "vagrant-hostmanager"? It is listed as a requirement at the top of the tutorial.
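You can check for it and install it like this:

$ vagrant plugin list
$ vagrant plugin install vagrant-hostmanager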
10-25-2016 06:56 PM
@Avijeet Dash When you run the docker load command, it copies the container image into the virtual machine that Docker uses to run the containers. As far as I know, you can't change where those images are stored. You can run docker images to see a list of the images that have been loaded. You may see something like this:
$ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
sandbox latest 09252f3bc286 2 weeks ago 13.84 GB
elasticsearch latest 22287ab1f811 3 weeks ago 342.8 MB
kibana latest 67c20a93c5bc 4 weeks ago 297 MB
alpine edge e4c65b272e02 4 weeks ago 4.824 MB
<none> <none> fc813bdc4bdd 4 weeks ago 14.57 GB
hdp/postgres latest 26e6495659e4 5 weeks ago 310.9 MB
centos 6 f07f6ca555a5 6 weeks ago 194.6 MB
postgres latest 6f86882e145d 7 weeks ago 265.9 MB
You can delete any images you don't need with docker rmi <image id>. You can also use docker ps -a to see a list of containers that are lying around. You may see something like this:
$ docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
15411dc968ad fc813bdc4bdd "/usr/sbin/sshd -D" 2 weeks ago Exited (255) 3 hours ago hdp25-atlas-demo
If you have a lot of containers sitting around, they do take up space. You can remove any containers you don't need by using docker rm <container id>.
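For example, to remove every container in the Exited state in one go (review the docker ps -a output first so you don't delete something you still need):

$ docker rm $(docker ps -aq -f status=exited)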
10-25-2016 02:54 PM
@cduby Are you using the native Docker sandbox, or the VirtualBox/VMware Docker sandbox? In either case, you should be able to log into the Docker container via ssh -p 2222 root@localhost; the default password is hadoop. Logs for the various applications and components are under /var/log.
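For example (the sandbox prompt and log path below follow the standard HDP layout), to tail the Ambari server log inside the container:

$ ssh -p 2222 root@localhost
[root@sandbox ~]# tail -f /var/log/ambari-server/ambari-server.log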
10-23-2016 11:58 PM
@Mayank Shekhar VT-x is a hardware feature of most modern Intel CPUs. It can be enabled or disabled in your computer's BIOS. If your CPU does not have the feature, virtualization software (VMware or VirtualBox) won't work. You can read more at this link: http://www.howtogeek.com/213795/how-to-enable-intel-vt-x-in-your-computers-bios-or-uefi-firmware/. I can't give you specific instructions for enabling the feature, as the steps differ for every make and model of computer. You can also read more here: http://www.itworld.com/article/2981515/virtualization/virtualbox-diagnose-and-fix-vt-xamd-v-hardware-acceleration-errors.html
10-18-2016 12:27 AM
I'm glad you got it worked out. I will look into getting the tutorials updated to reflect these details.