Member since: 09-06-2016
Posts: 16
Kudos Received: 1
Solutions: 0
06-14-2017
05:55 PM
@Joshua Adeleke Run the command 'hdfs fsck /path/to/file' and check that none of the file's blocks are corrupted.
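If you need more detail, fsck can also print per-file block information (a sketch; the path is a placeholder):
# list the file's status, its block IDs, and the DataNodes holding each replica
hdfs fsck /path/to/file -files -blocks -locations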
04-18-2017
06:34 PM
@Rajesh Are you able to update the target with the snapshot difference? I tried:
hadoop distcp -diff s1 s2 -update hdfs://secure_cluster:8020/source hdfs://secure_cluster:8020/target
Though it looks syntactically correct, it does not work; I see the error below.
17/04/18 14:29:53 ERROR tools.DistCp: Invalid arguments:
java.lang.IllegalArgumentException: Diff is valid only with update and delete options
at org.apache.hadoop.tools.DistCpOptions.validate(DistCpOptions.java:568)
at org.apache.hadoop.tools.DistCpOptions.setUseDiff(DistCpOptions.java:284)
at org.apache.hadoop.tools.OptionsParser.parse(OptionsParser.java:223)
at org.apache.hadoop.tools.DistCp.run(DistCp.java:115)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.tools.DistCp.main(DistCp.java:430)
Invalid arguments: Diff is valid only with update and delete options
usage: distcp OPTIONS [source_path...] <target_path>
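If it helps, the documented usage places -update ahead of -diff, and with this DistCp version the options appear to be validated in the order they are parsed, so -diff can fail its check before -update has been seen. A reordered attempt (same paths as above; untested here):
hadoop distcp -update -diff s1 s2 hdfs://secure_cluster:8020/source hdfs://secure_cluster:8020/target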
04-11-2017
03:38 AM
@bpreachuk Did the above query delete the partition?
04-07-2017
08:55 PM
@am s The error indicates that the class files needed to run the sqoop command are missing.
cd /usr/hdp/current/sqoop-client/lib
Please make sure the following jar files are present in that location with the required permissions:
-rwxr-xr-x 1 root root 2405 Oct 24 01:14 tdgssconfig.jar
-rwxr-xr-x 1 root root 4230471 Oct 24 01:14 teradata-connector-1.4.1-hadoop2.jar
-rwxr-xr-x 1 root root 963400 Oct 24 01:14 terajdbc4.jar
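If the jars are there but the permissions differ from the listing above, a quick check and fix (a sketch; paths assume the default sqoop-client layout):
cd /usr/hdp/current/sqoop-client/lib
ls -l tdgssconfig.jar teradata-connector-1.4.1-hadoop2.jar terajdbc4.jar
# match the mode shown in the listing above (rwxr-xr-x)
chmod 755 tdgssconfig.jar teradata-connector-1.4.1-hadoop2.jar terajdbc4.jar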
04-03-2017
02:52 PM
You can try the following command as the root user:
curl -i -v -L -u admin:admin -H 'X-Requested-By: ambari' -X DELETE 'http://<ambari-server-host>:8080/api/v1/users'
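Note that this endpoint addresses the users collection; to delete one specific account, the username is appended to the URL (a sketch; <user_name> is a placeholder):
curl -i -u admin:admin -H 'X-Requested-By: ambari' -X DELETE 'http://<ambari-server-host>:8080/api/v1/users/<user_name>'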
03-15-2017
04:49 PM
@Geoffrey Shelton Okot At any point in time you can have only one of default Solr or Ambari Infra Solr, and Hortonworks recommends Ambari Infra Solr for storing the Ranger audit logs. The steps are:
1. Stop the running default Solr instance with the script below:
/opt/lucidworks-hdpsearch/solr/ranger_audit_server/scripts/stop_solr.sh
2. To enable Ranger to store its audits in Ambari Infra Solr, under Audit to Solr, click OFF under SolrCloud to enable SolrCloud. The button label will change to ON, and the SolrCloud configuration settings will be loaded automatically.
Ref: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/ranger_audit_settings.html
Let me know if this works.
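Before flipping the setting in step 2, it is worth confirming the default instance is really down (a sketch using the script path from step 1):
/opt/lucidworks-hdpsearch/solr/ranger_audit_server/scripts/stop_solr.sh
# verify no stray Solr process is still running ([s]olr avoids matching grep itself)
ps -ef | grep -i [s]olr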
03-15-2017
04:53 AM
Hi @Artem Ervits, the date shown by hdfs dfs -ls <directory_location> is actually the date the file was placed in HDFS. Even if the file is updated with an INSERT through Hive, the date does not seem to change. Example: a file placed in HDFS about 10 days ago still shows the original date even though it was altered today.
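For a single file, the recorded modification time can be read directly (a sketch; the path is a placeholder):
# %y prints the modification date HDFS has recorded for the file
hdfs dfs -stat "%y" /path/to/file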
03-13-2017
08:30 PM
If the sqoop command is still not working, you can also try restarting the Sqoop client on the required host via Ambari.
03-07-2017
12:24 PM
This is not working either; the sqoop command freezes and does not generate an application ID.
03-04-2017
11:58 PM
Hi Artem, I tried as you suggested and it shows the following error:
17/03/04 18:54:35 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: -p
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: pdcrdata.DBQLogTbl_Hst_1
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: --map-column-hive
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: EFF_TSP=STRING
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-table
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: pi_teradata_test.teradata_import
17/03/04 18:54:36 ERROR tool.BaseSqoopTool: Unrecognized argument: --create-hive-table
Thanks, Suresh
03-03-2017
03:53 PM
1 Kudo
Hi, referring to the documentation at https://community.hortonworks.com/articles/53531/importing-data-from-teradata-into-hive.html: I have a Teradata table with close to 100 columns, so it is not practically feasible to create the schema manually. I would rather import the table schema automatically using the create-hive-table flag. I followed the syntax below:
sqoop create-hive-table -libjars /usr/hdp/current/sqoop-client/lib/hortonworks-teradata-connector-1.4.1.2.3.2.0-2950.jar --connect jdbc:teradata://txxxxx.xxxx.net/Database=pxxxxx,LOGMECH=LDAP --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username sxxxxx -p --table pdcrdata.DBQLogTbl_Hst_1 --map-column-hive EFF_TSP=STRING --hive-table xx_teradata_test.teradata_import
And it throws the following error:
17/03/03 10:35:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Error parsing arguments for create-hive-table:
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: -p
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pdcrdata.DBQLogTbl_Hst_1
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --map-column-hive
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: EFF_TSP=STRING
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pi_talend_test.teradata_import
Try --help for usage instructions.
Thanks, Suresh Kumar
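One likely cause of the parse failure (an observation, not confirmed in this thread): Sqoop's option for prompting for the password on the console is the capital -P, while lowercase -p is not a recognized argument, so parsing stops there and every following argument is reported as unrecognized. The same command with -P, otherwise unchanged:
sqoop create-hive-table -libjars /usr/hdp/current/sqoop-client/lib/hortonworks-teradata-connector-1.4.1.2.3.2.0-2950.jar --connect jdbc:teradata://txxxxx.xxxx.net/Database=pxxxxx,LOGMECH=LDAP --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username sxxxxx -P --table pdcrdata.DBQLogTbl_Hst_1 --map-column-hive EFF_TSP=STRING --hive-table xx_teradata_test.teradata_import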
Labels:
- Apache Hive
- Apache Sqoop
03-03-2017
03:44 PM
Hi Kit, I have a Teradata table with close to 100 columns. As it is not feasible for us to create the schema manually, we wanted to create the schema using the create-hive-table flag, but it does not seem to be working. I followed your syntax:
sqoop create-hive-table \
-libjars ${LIB_JARS} \
--connect $JDBCURL \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--username $TDUSER \
--password $TDPASS \
--table $TDTABLE \
--map-column-hive EFF_TSP=STRING \
--hive-table ${HIVEDB}.${HIVETABLE}
I see the error below, "Error parsing arguments for create-hive-table":
17/03/03 10:35:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Error parsing arguments for create-hive-table:
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: -p
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pdcrdata.DBQLogTbl_Hst_1
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --map-column-hive
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: EFF_TSP=STRING
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pi_talend_test.teradata_import
Try --help for usage instructions.
Thanks, Suresh
10-17-2016
11:43 PM
@SBandaru To my knowledge, it's not advisable to change the pre-defined Ambari scripts. Such changes might affect a future upgrade of the Ambari server.
A better practice would be to develop a separate maintenance script to change the permissions of the Hive warehouse, as sketched below.
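A minimal sketch of such a script (the warehouse path, owner, and mode are assumptions; adjust to your cluster's policy):
#!/bin/bash
# standalone maintenance script: reset ownership and permissions
# on the Hive warehouse (/apps/hive/warehouse is the HDP default)
hdfs dfs -chown -R hive:hadoop /apps/hive/warehouse
hdfs dfs -chmod -R 771 /apps/hive/warehouse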