Member since: 09-28-2016
Posts: 7
Kudos Received: 2
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 3299 | 10-08-2016 09:42 PM |
11-15-2016
06:44 PM
Or bridge the two networks on the edge server.
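A minimal sketch of what that could look like on a Linux edge node (the interface names eth0/eth1 and the use of iproute2 are illustrative assumptions, not from the original post):

# create a software bridge and enslave both NICs (names are placeholders)
ip link add name br0 type bridge
ip link set br0 up
ip link set eth0 master br0   # corporate-facing NIC (142.39.41.*)
ip link set eth1 master br0   # cluster-facing NIC (10.1.1.*)
# the edge node's own IP addresses would then need to move onto br0

Note that bridging joins the two segments at layer 2, so the corporate and cluster networks would effectively become one broadcast domain.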
11-15-2016
01:07 PM
Hi all, I have a small cluster (10 machines now): one edge server with two network cards (one on the internal network 142.39.41.*, the other facing the cluster at 10.1.1.*), a management server, and 8 data nodes, all on the 10.1.1.* network. Sqoop is installed on the edge server, but when I try to import a single table from a SQL Server database (on 142.39.41.*):

sqoop import \
--connect 'jdbc:sqlserver://dbserver;DatabaseName=MyDB;user=XXXXXXXXX;password=XXXXXXX;port=1433' \
--table=dbo.Asset \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
-m 1

I get: Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host devsql94, port 1433 has failed. Error: "null. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".

When I try to list the tables in the database using Sqoop:

sqoop list-tables --connect 'jdbc:sqlserver://dbserver;DatabaseName=MyDB;user=XXXXXXXXX;password=XXXXXXX;port=1433' --driver com.microsoft.sqlserver.jdbc.SQLServerDriver

it works fine and lists all my tables, so JDBC access from the edge server to the database works. Wrapping my head around the problem, I started to think that Sqoop might be sending the job to another node to handle the database read. But which node? So I tried forwarding the edge server's local port 1433 to the SQL Server's port 1433 using "nc" as described on this site, but that didn't work either. Can anyone figure this one out? Is my architecture wrong in allowing only the edge server to see the corporate network?
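If the import really does run as a YARN map task on a data node (while list-tables runs locally on the edge server, which would explain why only the latter works), one hedged sketch of the port-forwarding idea, using iptables instead of nc, could look like this. All addresses are illustrative assumptions: 10.1.1.100 stands for the edge server's cluster-side IP and 142.39.41.94 for the SQL Server.

# on the edge server: enable forwarding and DNAT cluster-side connections
# to port 1433 over to the SQL Server on the corporate network
sysctl -w net.ipv4.ip_forward=1
iptables -t nat -A PREROUTING -p tcp -d 10.1.1.100 --dport 1433 \
  -j DNAT --to-destination 142.39.41.94:1433
iptables -t nat -A POSTROUTING -p tcp -d 142.39.41.94 --dport 1433 -j MASQUERADE

# then point the Sqoop connect string at the edge server's cluster-side address,
# so the map tasks on the data nodes can reach it
sqoop import \
  --connect 'jdbc:sqlserver://10.1.1.100;DatabaseName=MyDB;user=XXXXXXXXX;password=XXXXXXX;port=1433' \
  --table=dbo.Asset \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  -m 1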
Labels:
- Apache Sqoop
10-24-2016
03:15 PM
Makes sense. Once I create the two groups, I run the script from the article once per group, I guess. I'll try this sometime this week and let you know how it works out. Thanks.
10-24-2016
12:54 PM
Hi community, after spending weeks setting up an HDP 2.5 lab at work (doing it part-time), I finally got it up and running. My little lab has 4 servers now: 3 x Dell T3500 (6 cores) with 72 GB RAM and two disks (one for the OS, one for apps), and 1 x clone computer (3 cores) with 16 GB RAM, also with two disks. Since the clone computer is a little behind the others, I reserved it as a DataNode-only server. One of the Dells is another DataNode, and the other two are mixed "management" (YARN, NameNode, SNameNode, Hive, HBase, MySQL, etc.) and DataNode servers. After installation I realized that not all the memory is being used: 6-7 GB on each server at most. So I started investigating how I could use the memory more efficiently and potentially improve performance. I found the article "Determine memory", but it seems to assume that all nodes have the same configuration. Can I set my memory parameters as per this article and ignore the fact that one of the servers has only 16 GB? (Two-group approach sketched below.)
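For illustration, the companion script from that article can be run once per hardware profile; a sketch assuming the two-group approach from the replies, with flag values taken from the specs above (the exact script name and flags come from the HDP documentation, not this thread):

# -c cores, -m memory in GB, -d number of disks, -k True if HBase is installed
python hdp-configuration-utils.py -c 6 -m 72 -d 2 -k True   # Dell T3500 group
python hdp-configuration-utils.py -c 3 -m 16 -d 2 -k True   # clone group

Each output set would then be applied through a separate Ambari config group, so the 16 GB node gets its own yarn.nodemanager.resource.memory-mb instead of the value sized for the 72 GB machines.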
10-08-2016
09:42 PM
1 Kudo
Despite the link (which refers to Oozie), the configuration changes must be made for:
hadoop.proxyuser.root.groups
hadoop.proxyuser.root.hosts
Figured it out just after posting. All works fine now.
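For reference, in Ambari these map to two entries under HDFS > Configs > Custom core-site; the wildcard values below are the permissive form (restrict the hosts and groups as needed in a real deployment):

hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.root.groups=*

A restart of HDFS (and the affected views) is needed for the change to take effect.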
10-07-2016
07:41 PM
1 Kudo
Hi everyone, I am new to the community and trying to work with HDP 2.5. I set up a small lab at the office with 4 servers. Installation went well and everything seems to work fine; at least the dashboard is all green now (it took three complete installs to get it working). Servers (Xeon 8 cores with 16 GB of RAM each): manager001: 192.168.2.50, slave001: 192.168.2.51, slave002: 192.168.2.52, and slave003: 192.168.2.53. I distributed the services across the servers to balance memory usage and load. My problem is that when I try to create a table in the Hive view I get the message: "Unauthorized connection for super-user: root from IP 192.168.2.53"
Labels:
- Apache Ambari
- Apache Hive