Member since: 09-21-2017
Posts: 9
Kudos Received: 0
Solutions: 0
10-17-2017
10:24 AM
Hi @Shane Johnston, I was able to launch NiFi using http://hdf:19090/nifi/, where 'hdf' is defined in the hosts file as per my post above. I guess that is what you were trying to say; if you meant something else, please do share. I am new to HDP and HDF and can take any help possible. Regards, Vinay
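For reference, the entry I added to the Windows hosts file looks like this (the IP is the one VMware assigned to my sandbox VM, so yours will differ):

# C:\Windows\System32\drivers\etc\hosts
192.168.137.130    hdf    sandbox-hdf.hortonworks.com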
10-17-2017
10:18 AM
Thank you @Shane Johnston, that worked perfectly.
09-23-2017
09:23 AM
@nyakkanti I do not see any pending jobs in YARN. Only 3 jobs are listed, and their status is finished/succeeded. In Ambari, I can still see the 4 queries running, as mentioned in the original post.
09-23-2017
12:02 AM
Some more info: I am using VMware Workstation on Windows. I had already installed HDP, and have now also imported HDF.
In Ambari for HDP, I see: Hostname: sandbox.hortonworks.com, IP Address: 172.17.0.2
In Ambari for HDF, I see: Hostname: sandbox-hdf.hortonworks.com, IP Address: 172.17.0.2
Why are both IPs set to 172.17.0.2? Can I use both HDF and HDP on the same laptop like this? I can also see that no DataNode service has been started in HDF (it is fine in HDP).
09-22-2017
11:43 PM
Hi
I just installed HDF. I am on Windows using VMware Workstation Player, and this setup is working fine with my earlier install of HDP. For HDF: I am able to open Ambari, but when I click on Files View I get the error:
Failed to transition to undefined sandbox-hdf.hortonworks.com:51070: Connection refused (Connection refused)
I had added an entry in the hosts file as below, where the IP is the one from VMware Player:
192.168.137.130 hdf sandbox-hdf.hortonworks.com
With or without the entry, I get the above error. I thought (from HDP) that you had to map sandbox.hortonworks.com to the HDP IP, and similarly do a mapping for HDF, but to sandbox-hdf.hortonworks.com. What seems to be the problem?
I also cannot open the Web Shell Client from Advanced Quick Links, but I can open the NiFi UI. Do I have to do additional configuration for HDF?
Regards, Vinay
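In case it helps with diagnosis: from the Windows side I can at least verify that the name resolves and check whether port 51070 answers. A quick sketch (the second line assumes PowerShell is available):

ping sandbox-hdf.hortonworks.com
powershell -Command "Test-NetConnection sandbox-hdf.hortonworks.com -Port 51070"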
Labels: Apache Ambari, Cloudera DataFlow (CDF)
09-22-2017
10:24 PM
@nyakkanti Thanks for your response. I am not sure what you are suggesting. My issue was not with SELECT * or CREATE TABLE; those worked fine. The issue is with the INSERT statements, which are still running. The images I attached show the running queries. I don't see anything of use in the YARN service summary or YARN Queue Manager. Please elaborate on where to check for the jobs that are running those queries.
09-21-2017
11:57 PM
Hi
I have a .NET application and am exploring ways to upload data from SQL Server to Hive. I understand I have the options of using Sqoop, NiFi, or SSIS. I was exploring possibilities with the Hive ODBC driver.
My C# code goes something like this:

using System.Data.Odbc;

string hiveConnString = @"DRIVER={Hortonworks Hive ODBC Driver};Host=192.168.137.129;Port=10000;Schema=default;HiveServerType=hiveserver2;";

using (var conn = new OdbcConnection(hiveConnString))
using (var selectCommand = new OdbcCommand())
{
    conn.Open();
    selectCommand.Connection = conn;
    selectCommand.CommandText = ".........."; // the INSERT query
    int result = selectCommand.ExecuteNonQuery(); // this call never returns for INSERTs
}
I am able to perform SELECT statements and retrieve data, and I was also able to create a table in Hive using ExecuteNonQuery. However, when I try to run an INSERT statement, ExecuteNonQuery never returns. Initially I had 190 rows in the source table and waited 20 minutes for the method to return; I then tried with just ONE row, with the same result. Looking at Hive through Ambari, the queries were running. I am attaching an image of the running queries: you can see 4 queries running. The Query column shows the INSERT queries I submitted, and the Application ID column has no ID (it says Not Available). I have also attached an image of one query's details. I cannot understand why the ODBC driver is not able to execute the statement. The table has a BINARY datatype column (you can see System.Byte[] as the value in the query). Could that be the issue? I would assume that a mismatch on any datatype would return an error. Also, I do not see any way to kill the queries. Any help is much appreciated. Regards, Vinay
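One thing I am wondering about the BINARY column: if the byte array is being concatenated into the SQL text, it would appear as the literal string "System.Byte[]". A rough sketch of binding the values as ODBC parameters instead, with a made-up table layout (I have not confirmed the Hive ODBC driver accepts parameter markers on INSERT):

using System;
using System.Data.Odbc;

class ParameterizedInsertSketch
{
    static void Main()
    {
        string hiveConnString = @"DRIVER={Hortonworks Hive ODBC Driver};Host=192.168.137.129;Port=10000;Schema=default;HiveServerType=hiveserver2;";

        using (var conn = new OdbcConnection(hiveConnString))
        // '?' is the positional ODBC parameter marker; my_table and its
        // (id INT, payload BINARY) layout are hypothetical.
        using (var cmd = new OdbcCommand("INSERT INTO my_table VALUES (?, ?)", conn))
        {
            conn.Open();
            cmd.Parameters.Add("@id", OdbcType.Int).Value = 1;
            cmd.Parameters.Add("@payload", OdbcType.Binary).Value = new byte[] { 0x01, 0x02, 0x03 };
            cmd.ExecuteNonQuery();
        }
    }
}

If the driver rejects the markers, that would at least rule parameter binding in or out as a workaround.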
Labels: Apache Hive
09-21-2017
03:25 AM
I am trying to use Sqoop to copy a table from my local SQL Server. I am on a Windows 10 laptop with VMware Workstation and the sandbox installed. Everything works fine from Ambari, so I am sure my sandbox is installed correctly. I am using the web client (my_site:4200) for Sqoop. As a test, the "sqoop version" command works fine.
I then tried to run the 'list databases' command in Sqoop, before doing the import. I get the following error:
could not load db driver class com.microsoft.sqlserver.jdbc.sqlserverdriver
I have looked at the solutions suggested by the community. I downloaded the Microsoft JDBC driver jar ("sqljdbc42.jar"). Using WinSCP, I copied the file to the Linux directory suggested by @Artem Ervits: /usr/hdp/current/sqoop-client/lib. I ran the command again in the shell; same error. I then copied the jar file to the /usr/lib/sqoop/lib folder. Same error.
I understand I have copied the jar file to the Linux file system. Don't I have to copy it to a path that Hadoop understands? Looking at the folder structure in WinSCP, I cannot make out which folders are being used by Hadoop. There was no 'hdp' folder under 'usr', so I created that folder. I am sure I am missing something here. To be clear, I only have this one VM install with the sandbox on my laptop, so WinSCP cannot be showing me a wrong folder tree (I am using the WinSCP user interface to copy the jar). I am attaching an image here showing the WinSCP structure.
Regards, Vinay
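For reference, the list-databases command I am running is roughly the following (host, user, and password are placeholders for my local SQL Server):

sqoop list-databases \
  --connect "jdbc:sqlserver://<windows-host-ip>:1433" \
  --username <user> \
  --password <password>

(The class Sqoop is failing to load, com.microsoft.sqlserver.jdbc.SQLServerDriver, is the one shipped inside sqljdbc42.jar.)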
Labels: Apache Sqoop