Member since: 03-07-2019
Posts: 209
Kudos Received: 17
Solutions: 8
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 971 | 03-27-2019 04:52 AM |
 | 4044 | 11-21-2018 10:21 PM |
 | 9649 | 09-14-2016 07:35 PM |
 | 7816 | 07-01-2016 06:56 PM |
 | 1686 | 06-07-2016 04:22 PM |
04-27-2022
09:37 AM
The root cause of this is PEP 3151, introduced in Python 3.3 (see PEP 3151 – Reworking the OS and IO exception hierarchy, and the Python 3.3 release notes). You can work around the issue with the following change in the file /usr/lib64/python2.7/test/test_support.py.

From:

def _is_ipv6_enabled():
    """Check whether IPv6 is enabled on this host."""
    if socket.has_ipv6:
        sock = None
        try:
            sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
            sock.bind((HOSTv6, 0))
            return True
        except OSError:
            pass
        finally:
            if sock:
                sock.close()
    return False

To:

def _is_ipv6_enabled():
    """Check whether IPv6 is enabled on this host."""
    if socket.has_ipv6:
        sock = None
        try:
            sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
            sock.bind((HOSTv6, 0))
            return True
        except socket.error if sys.version_info < (3, 3) else OSError:  # this is the changed line
            pass
        finally:
            if sock:
                sock.close()
    return False
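As a quick check after editing (a hedged sketch, assuming the stock Python 2.7 layout where the module is importable as test.test_support and the interpreter is available as python2.7):

```
# Hypothetical verification: should print True or False instead of raising
# socket.error on a host where IPv6 is disabled.
python2.7 -c 'from test import test_support; print(test_support._is_ipv6_enabled())'
```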
11-06-2020
08:45 AM
Since no one has responded, I'll answer this myself: this is a defect and is addressed via HUE-9110.
05-01-2019
04:00 PM
@Haijin Li, based on the exception it seems like the folder /user/root may not be present in HDFS. Switch to the hdfs user and run the commands below:

su hdfs   (or an equivalent command)
hdfs dfs -mkdir -p /user/root
hdfs dfs -chown root /user/root
exit

Now launch the Hive CLI; it should start up fine.
03-27-2019
04:52 AM
1 Kudo
@Artur Brandys, just saw this question; not sure whether you have already found the answer. No, you cannot use a different database for the user accounts: those tables are internal to Hue's Django backend, which works with them using the database settings it reads from hue.ini.
03-27-2019
04:49 AM
1 Kudo
You can install Hue, but you would need to get it from gethue.com and set it up the way it is documented there. The Hue release that ships with HDP 2.6.x may not work with HDP 3.x.
02-15-2019
07:55 PM
1 Kudo
@Dinesh Chitlangia On the host where you are seeing the issue, try removing the Oozie RPMs with yum remove oozie_2_6_* and then reinstalling from the command line with yum install oozie_2_6_*.
01-29-2019
11:37 PM
@sheeba ravs, please check this post, which has the information you are looking for: https://community.hortonworks.com/questions/93519/hdp-23-tutorial.html
12-13-2018
11:25 PM
@Antonin Vloki You can run export HBASE_HOME=<path> and then issue the sqoop command. As far as I know, if HBASE_HOME is not set, sqoop generally complains with a warning message that HBASE_HOME is not set. If you want to override an existing value, you can likewise export HBASE_HOME before running the sqoop command; a sketch is below.
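A minimal sketch of that flow (the install path, connection string, and table names here are placeholder assumptions, not values from the original question):

```
# Point sqoop at a specific HBase installation before running the import.
export HBASE_HOME=/usr/hdp/current/hbase-client   # assumed path; adjust for your cluster

# Import a table into HBase; all connection details below are placeholders.
sqoop import \
  --connect "jdbc:mysql://<dbhost>/<dbname>" \
  --username <user> -P \
  --table <table> \
  --hbase-table <hbase_table> \
  --column-family cf \
  --hbase-create-table
```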
12-13-2018
12:59 AM
@MML MAHESH Please provide the full stack trace of the error. The message by itself is pretty vague and does not show what caused the issue; the stack trace should give us something to work from.
12-13-2018
12:55 AM
Hi, MSSQL with AD (Windows Authentication) needs an additional native library that handles passing and validating the credentials, but that library is a DLL, so you cannot use it on a Linux/Unix platform. The next best approach is to use a third-party driver such as DataDirect, which provides access to MSSQL using AD, or alternatively the free jTDS driver (http://jtds.sourceforge.net/), which has an option to connect to MSSQL using AD. Something like this:

sqoop import/export --connect "jdbc:jtds:sqlserver://<hostname>:<portnumber>;databaseName=dbname;useNTLMv2=true;domain=DOMAINNAME"

See also http://jtds.sourceforge.net/faq.html
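A fuller, hedged sketch of a jTDS-based import (the jar version, copy destination, host, and table names are assumptions; only the JDBC URL pattern comes from the post above):

```
# Make the jTDS driver visible to Sqoop (destination path assumed for an HDP client node).
cp jtds-1.3.1.jar /usr/hdp/current/sqoop-client/lib/

# Import a table using NTLMv2 / Active Directory authentication through jTDS.
sqoop import \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --connect "jdbc:jtds:sqlserver://<hostname>:1433;databaseName=<dbname>;useNTLMv2=true;domain=DOMAINNAME" \
  --username <ad_user> -P \
  --table <table> \
  --target-dir /tmp/<table>_import
```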
12-13-2018
12:42 AM
@Swaapnika Guntaka I'm no expert on SAM, but from what I can see in the provided info, either the MySQL driver is not present in the location SAM expects, or, if the driver is present, it does not contain this class.
12-13-2018
12:34 AM
@Aniruddha Ghosh
1. Is the table in question a text-based table, or is it in a different format?
2. Have you tried increasing the number of mappers?
Also, since you are loading the data into Oracle, have you tried enabling --direct? It invokes the Data Connector for Oracle and Hadoop, which does a fast import and export (see the sketch below). If you have not tried these, I would suggest trying these options.
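A hedged sketch of what that command line might look like (connection string, table, directory, and mapper count are placeholder assumptions, and --direct requires the Oracle connector to be installed):

```
# Export to Oracle with more mappers and the Oracle direct connector enabled.
sqoop export \
  --connect "jdbc:oracle:thin:@//<orahost>:1521/<service>" \
  --username <user> -P \
  --table <ORACLE_TABLE> \
  --export-dir /user/<you>/<export_dir> \
  --num-mappers 8 \
  --direct
```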
12-13-2018
12:29 AM
@Navin Agarwala, I think you are hitting the same thing as posted here: https://community.hortonworks.com/questions/214980/sqoop-import-hung-hive-import-hdp-300.html Basically, the Hive CLI is removed in HDP 3.0, so the import goes through Beeline, and Beeline needs a login; the hang you are seeing is it waiting for a username and password. If you do not want to enter credentials interactively, I would suggest using beeline-hs2-connection.xml as described here: https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Usinghive-site.xmltoautomaticallyconnecttoHiveServer2 or modifying your sqoop syntax to use hcatalog. Hope this helps.
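For the hcatalog route, a minimal hedged sketch (connection details, database, and table names are placeholder assumptions):

```
# Write through HCatalog instead of --hive-import, which avoids the Beeline/HS2 login prompt.
sqoop import \
  --connect "jdbc:mysql://<dbhost>/<dbname>" \
  --username <user> -P \
  --table <table> \
  --hcatalog-database default \
  --hcatalog-table <table> \
  --create-hcatalog-table
```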
11-21-2018
10:21 PM
2 Kudos
@Arindam Choudhury, you can use the link below, which describes creating a file called beeline-hs2-connection.xml and supplying the credential information there. https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Usinghive-site.xmltoautomaticallyconnecttoHiveServer2
11-21-2018
06:57 PM
Hi @arjun more Maybe the jars mentioned below are not available to the YARN containers when Hive launches the job. You can try adding them with add jar <path_to_jar>/jarfilename.jar; and see if this helps.
11-16-2018
08:48 PM
@Anurag Mishra
From the exception it seems the class name provided is not correct: as you can see below, it starts with "o." rather than "org.", which is what appears in the jar tvf output above.

18/10/30 15:16:15 ERROR tool.BaseSqoopTool: Got error creating database manager: java.io.IOException: java.lang.ClassNotFoundException: o.apache.sqoop.teradata.TeradataConnManager
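For reference, a hedged sketch with the corrected class name (the JDBC URL, host, and table are placeholder assumptions, not from the original command):

```
# Note "org.apache...", not "o.apache...", in the connection manager class name.
sqoop import \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --connect "jdbc:teradata://<tdhost>/DATABASE=<dbname>" \
  --username <user> -P \
  --table <table>
```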
11-16-2018
08:45 PM
@Anurag Mishra Not sure if you are still seeing this issue, but to understand what happened I would need the complete sqoop log output (verbose if possible) and also the hive.log file from /tmp/<username>/hive.log after the run.
11-16-2018
08:43 PM
@Gourav Gupta Not sure if you are referring to a connector provided by SAP for Hadoop integration. If so, you might need to contact them about it. As far as I know, there is no such connector available on the Hortonworks add-ons page.
11-16-2018
08:12 PM
@Sandeep SIngh It seems you would need to look at the YARN queue this job was launched into and check the RM UI to understand whether the queue was over-utilized. That is the only way to identify whether resource contention kept the app in the ACCEPTED state.
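Alongside the RM UI, a couple of hedged CLI checks (the application ID and queue name below are placeholders):

```
# Shows the application's state, queue, and diagnostics string.
yarn application -status application_1234567890123_0001

# Shows capacity and current usage for the queue the job was submitted to.
yarn queue -status default
```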
11-16-2018
08:09 PM
This seems to be a duplicate of https://community.hortonworks.com/questions/225880/sqoop-incremental-import-last-modified-error.html
11-16-2018
08:09 PM
@Suresh Kumar S, thanks for posting the logs for the exception and for confirming that the LastModified column is a datetime. To me it looks like the space between the date and the time portion is what is causing the issue. What you can try is mapping the lastmodified/check column to String using the --map-column-java parameter.
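A hedged sketch of that suggestion (the connection string, column name, and last-value are placeholder assumptions):

```
# Map the check column to String so the "YYYY-MM-DD HH:MM:SS" value is carried through intact.
sqoop import \
  --connect "jdbc:sqlserver://<host>;databaseName=<db>" \
  --username <user> -P \
  --table <table> \
  --map-column-java LastModified=String \
  --incremental lastmodified \
  --check-column LastModified \
  --last-value "2018-11-01 00:00:00"
```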
11-16-2018
07:45 PM
@vamsi krishna sabbisetty, Sqoop does not support multi-character delimiters, as the field and record delimiters are defined as char type in the code. Ref: https://issues.apache.org/jira/browse/SQOOP-1175
11-16-2018
07:30 PM
@Nikhil Vemula Have you been able to get this working? If you have received the info, you can use something like the script below to pull the data for a date range:

#!/bin/bash
# Takes the date range as arguments; the += string append requires bash rather than plain sh.
mindate=$1
maxdate=$2
# Date literals are quoted for SQL; $CONDITIONS is required by sqoop for --query imports.
querytorun="select * from <TABLENAME> where date >= '$mindate' and date <= '$maxdate'"
querytorun+=" and \$CONDITIONS"
sqoop import --connect jdbc:mysql://<DPIPADDRESS>/<DBNAME> --username <USERNAME> --password <PASSWORD> --query "$querytorun" --split-by "<SPLITBYKEY>" --delete-target-dir ......
11-16-2018
07:00 PM
@Sivakumar Mahalingam, No, there is no option to edit a job once it has been created with the sqoop job --create option. Any modification needs a new job to be created, as there is no edit command (see the sketch below).
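In practice that means dropping and recreating the saved job; a minimal hedged sketch (the job name and import arguments are placeholders):

```
# Saved sqoop jobs cannot be edited in place; delete and recreate with the changed arguments.
sqoop job --delete myjob
sqoop job --create myjob -- import \
  --connect "jdbc:mysql://<dbhost>/<dbname>" \
  --username <user> -P \
  --table <table> \
  --target-dir /user/<you>/<table>
```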
11-16-2018
06:58 PM
@Suresh Kumar S, Can you try removing --driver "com.microsoft.sqlserver.jdbc.SQLServerDriver" from the command and see if this helps? Ref: https://issues.apache.org/jira/browse/SQOOP-2421
11-16-2018
12:00 AM
@Mahendiran Palani Samy You want to check the MapReduce minimum and maximum split sizes. From the message, it seems the minimum split size is larger than the maximum split size.
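For reference, these are the properties usually involved; a hedged sketch of overriding them per job (the jar, class, and byte values are placeholder assumptions, and the -D overrides only apply when the job parses generic options):

```
# The minimum split size must not exceed the maximum split size.
yarn jar <your-job.jar> <MainClass> \
  -Dmapreduce.input.fileinputformat.split.minsize=134217728 \
  -Dmapreduce.input.fileinputformat.split.maxsize=268435456 \
  <other args>
```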
11-15-2018
11:58 PM
@Jack You would want to create a file called beeline-hs2-connection.xml under /etc/hive/conf on the node where you are running the sqoop command, with the content below, so that when the sqoop job reaches the Hive import step through HS2 it uses these credentials to load the data. One thing to ensure is that the hive user has access (read/write/execute) to the temp location from which it will pick up the data and move it to the Hive table location.

<?xml version="1.0"?>
<configuration>
  <property>
    <name>beeline.hs2.connection.user</name>
    <value>hive</value>
  </property>
  <property>
    <name>beeline.hs2.connection.password</name>
    <value>hive</value>
  </property>
</configuration>
10-22-2018
12:40 PM
@Thomas Bazzucchi, I just did a quick test and can see that sqoop does pick up the datetime correctly:

18/03/20 23:36:29 DEBUG manager.SqlManager: Found column create_time of type DATETIME
18/03/20 23:36:29 DEBUG manager.SqlManager: Found column update_time of type DATETIME
18/03/20 23:36:29 DEBUG manager.SqlManager: Found column added_by_id of type BIGINT
18/03/20 23:36:29 DEBUG manager.SqlManager: Found column upd_by_id of type BIGINT

Can you enable verbose output on the sqoop command? That may provide some lead on this. Apart from that, if you can share the DDL along with some sample data, that would also help in narrowing down where the issue lies. Also, out of curiosity, have you tried using --map-column-java fieldname=TIMESTAMP?
09-18-2018
05:56 PM
@vishal rajan Based on the documentation, we do have a manual upgrade path from 2.3 to 2.6.5: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_command-line-upgrade/content/ch_upgrade_2_3.html If you have Ambari, then you can perform an upgrade from HDP 2.2 to the HDP 2.6.5 release: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.2/bk_ambari-upgrade/content/HDP_rolling_upgrade_prerequisites.html
09-18-2018
05:53 PM
@Mahadevan Swamy From the looks of the Hive result and the SQLServer result, it seems the delimiters are not set correctly.
1. You may want to check the delimiter of the Hive table and set that same delimiter in sqoop (see the sketch below).
2. Try getting the data into HDFS first and validate what it looks like before loading it into Hive (if the above does not fit).
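A hedged sketch of pinning the delimiters in sqoop (the Ctrl-A field delimiter, connection string, and table names are assumptions; match them to your actual Hive DDL):

```
# Make sqoop's output delimiters match the Hive table definition before --hive-import.
sqoop import \
  --connect "jdbc:sqlserver://<host>;databaseName=<db>" \
  --username <user> -P \
  --table <table> \
  --fields-terminated-by '\001' \
  --lines-terminated-by '\n' \
  --hive-import \
  --hive-table <hive_table>
```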