Member since: 08-23-2016
Posts: 261
Kudos Received: 201
Solutions: 106
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1721 | 01-26-2018 07:28 PM |
| | 1372 | 11-29-2017 04:02 PM |
| | 35013 | 11-29-2017 03:56 PM |
| | 3374 | 11-28-2017 01:01 AM |
| | 933 | 11-22-2017 04:08 PM |
05-19-2017
10:36 PM
1 Kudo
Hi @Subrahmanya Oruganti I am able to reproduce this and will be reporting it in. I suspect it has to do with the fact that Docker is used within the appliance for the HDF Sandbox. When the Sandbox is abruptly shut down, the Docker container inside doesn't stop cleanly, so when you reboot, Docker sees a conflict because a container with the same name already exists: docker: Error response from daemon: Conflict. The container name "/sandbox-hdf" is already in use by container. Also note that when you use port 12122 you are not SSH'ing into the HDF Sandbox, but into the host OS that the Docker container runs inside of; port 12222 is what SSHes you into the HDF Sandbox itself. If you try to SSH on port 12222, you may find that it does not work because the HDF Sandbox isn't running (the Docker conflict noted above). To resolve the issue from this state, you'll have to remove the Docker container and use the included start scripts to redeploy it (the commands are collected in a short sketch after this list):

1) ssh root@127.0.0.1 -p 12122
2) docker rm "/sandbox-hdf"
3) cd start_scripts
4) ./start_sandbox.sh
5) If it started successfully, you should see this message: Successfully Started HDF Sandbox Container
6) Try hitting the URL again: http://127.0.0.1:18888/

A better way to go with the Sandbox VMs is to choose "save the machine state" rather than "power off" from the VirtualBox shutdown options. This acts more like a suspend/resume and will preserve the Docker container. As always, if you find this post useful, please "accept" the answer.
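For convenience, here is the same recovery as one runnable sketch. It is a sketch only: it assumes the default NAT forward of 12122 to the host OS, the /sandbox-hdf container name, and the start_scripts location mentioned above.

```bash
# Recovery sketch for a stuck HDF Sandbox container after an abrupt VM shutdown.
# Assumes VirtualBox forwards host port 12122 to SSH on the host OS; adjust if yours differ.
ssh root@127.0.0.1 -p 12122 <<'EOF'
docker rm "/sandbox-hdf"   # remove the stale container left behind by the hard power-off
cd start_scripts
./start_sandbox.sh         # redeploy; look for "Successfully Started HDF Sandbox Container"
EOF
```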
05-19-2017
09:52 PM
1 Kudo
@amine adi I'm guessing that the ports are closed. If you are using the HDP Sandbox, the Sandbox is now Dockerized within the VM, so you'll need to add the Cassandra ports you are using to the VirtualBox Port Forwarding section, and you'll also have to edit some of the Docker scripts. Fortunately, there is a very good step-by-step tutorial here: https://community.hortonworks.com/articles/65914/how-to-add-ports-to-the-hdp-25-virtualbox-sandbox.html (a rough sketch of the two pieces follows below). As always, if you find this post useful, don't forget to "accept" the answer.
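The general shape is one forwarding rule on the VirtualBox NAT adapter plus publishing the same port from the Docker container. A minimal sketch, assuming a VM named "Hortonworks Sandbox" and Cassandra's default CQL port 9042 (both are placeholders; the linked tutorial has the authoritative script edits for your Sandbox version):

```bash
# 1) On the host: add a NAT port-forward rule (the VM must be powered off first).
VBoxManage modifyvm "Hortonworks Sandbox" --natpf1 "cassandra,tcp,,9042,,9042"

# 2) Inside the VM: the script that launches the sandbox container must also publish
#    the port, i.e. the docker run line needs an extra "-p 9042:9042" mapping.
```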
05-19-2017
05:50 PM
1 Kudo
Hi @Lucy zhang I'm using VirtualBox, but I think the NAT setup is all that is required; I did not do anything special for my setup. Using the VirtualBox menu, I imported the appliance, which automatically set up the NAT configuration. When the machine finished installing, I started it and saw the Sandbox welcome screen. One thing to note is that the ports on the HDF Sandbox are different from the HDP Sandbox: while 2222 was correct for SSH'ing into the HDP Sandbox, the HDF Sandbox actually uses 12222. I use the following command to SSH into the HDF Sandbox: ssh root@127.0.0.1 -p 12222
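If you want to confirm which forwards the appliance created, you can check from the host before SSH'ing in. A small sketch, assuming the imported VM shows up as "HDF Sandbox" in your VirtualBox library (use whatever name yours actually has):

```bash
# List the NAT port-forwarding rules for the VM; 12222 should map to the sandbox's SSH.
VBoxManage showvminfo "HDF Sandbox" | grep -i "Rule"

# Then SSH into the HDF Sandbox itself.
ssh root@127.0.0.1 -p 12222
```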
05-19-2017
05:29 PM
1 Kudo
@Lucy zhang Is your VM set up to use NAT? I've got the HDF Sandbox running right now, and I reach it at 127.0.0.1.
05-18-2017
09:43 PM
Hi @PJ I can't speak to the network setup specifics in your environment, obviously; that should come from the Hadoop and Cassandra admins. I think the default Cassandra port is 9042, but you can check that with your admin team. If you are using HDF/NiFi, you would specify that port in the QueryCassandra processor. The NiFi nodes will require access to the Cassandra environment over that port, and they will also require access to each node in the Hadoop cluster. If you are using Sqoop, connectivity must be open between the Cassandra environment and each node in the Hadoop cluster on the JDBC port that Cassandra in your environment is configured to use (Sqoop jobs can be initiated from the client node, but will actually instantiate connections from the worker nodes in the cluster). https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html https://community.hortonworks.com/questions/66961/how-sqoop-internally-works.html
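One quick way to sanity-check that connectivity before wiring up NiFi or Sqoop is to probe the port from each node that will open connections. A sketch with placeholder host names and the default 9042 port (substitute your real Cassandra nodes and port):

```bash
# Run from each NiFi node and each Hadoop worker node that needs to reach Cassandra.
# -z: scan only, send no data; -v: report the result; -w 5: five-second timeout.
for host in cassandra-node1 cassandra-node2; do
  nc -zv -w 5 "$host" 9042
done
```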
05-18-2017
03:50 PM
1 Kudo
Hi @Eyad Garelnabi there is a great post about it here: https://community.hortonworks.com/questions/55454/can-nifi-promise-each-of-the-flowfiles-can-be-proc.html
05-17-2017
03:45 PM
2 Kudos
Hi @Abraham Abraham You may want to look at using the new Ambari Workflow Manager tool to help make working with Oozie easier. There is a fantastic series of tutorials here: https://community.hortonworks.com/articles/82964/getting-started-with-apache-ambari-workflow-design.html As always, if you find this post useful, don't forget to accept the answer.
05-16-2017
10:04 PM
1 Kudo
Hi @PJ If you wanted to use Sqoop instead of HDF/NiFi to import tables, you would need to get an adequate JDBC driver for Cassandra. I'm not an expert on it, but I think DataStax provides one for their Enterprise software. I've seen quite a few stories about it not working very well without that JDBC driver, though. I think HDF/NiFi would be the better option.
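If you do go the Sqoop route, the invocation would look roughly like the sketch below. The JDBC URL, driver class, and keyspace/table names are all placeholders that depend entirely on which Cassandra JDBC driver you obtain; I have not validated them:

```bash
# Hypothetical Sqoop import via a Cassandra JDBC driver -- check the driver's own
# documentation for the real connect string and driver class.
sqoop import \
  --connect "jdbc:cassandra://cassandra-host:9042/my_keyspace" \
  --driver com.example.cassandra.jdbc.Driver \
  --table my_table \
  --target-dir /user/hdfs/cassandra_import/my_table \
  -m 1
```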
05-16-2017
08:47 PM
3 Kudos
Hi @Terrie Pugh I just connected Tableau installed on my Mac to Hive on the latest HDP 2.6 Sandbox. Here are the steps I took, with some screenshots, in case they help you:

1) Install the Hortonworks ODBC connector for the OS you are using: https://hortonworks.com/downloads/#addons
2) In Tableau, choose the Hortonworks Hadoop Hive connection and configure it as per the screenshot.
3) maria_dev should be fine if that was the user you loaded your data with.

Once you are in, you can test the connection within Tableau by doing a schema search for "*" to see if it pulls back the databases as shown. As always, if you find this post useful, don't forget to accept the answer.
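If the Tableau connection doesn't come up, it can help to rule out the Sandbox side first by hitting HiveServer2 directly from a shell on the Sandbox. A minimal check, assuming the default HiveServer2 port of 10000 and the maria_dev user (both assumptions; adjust to your setup):

```bash
# Connect to HiveServer2 with beeline and list the databases the ODBC connection should see.
beeline -u "jdbc:hive2://127.0.0.1:10000" -n maria_dev -e "SHOW DATABASES;"
```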