Member since: 09-06-2016
Posts: 16
Kudos Received: 1
Solutions: 0
04-18-2017
06:34 PM
@Rajesh Are you able to update the target with a snapshot diff? I tried:

hadoop distcp -diff s1 s2 -update hdfs://secure_cluster:8020/source hdfs://secure_cluster:8020/target

Though the command looks syntactically correct, it doesn't work for me; I see the error below.

17/04/18 14:29:53 ERROR tools.DistCp: Invalid arguments:
java.lang.IllegalArgumentException: Diff is valid only with update and delete options
at org.apache.hadoop.tools.DistCpOptions.validate(DistCpOptions.java:568)
at org.apache.hadoop.tools.DistCpOptions.setUseDiff(DistCpOptions.java:284)
at org.apache.hadoop.tools.OptionsParser.parse(OptionsParser.java:223)
at org.apache.hadoop.tools.DistCp.run(DistCp.java:115)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.tools.DistCp.main(DistCp.java:430)
Invalid arguments: Diff is valid only with update and delete options
usage: distcp OPTIONS [source_path...] <target_path>
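A hedged note for anyone hitting the same error: the stack trace shows the parser validating -diff inside OptionsParser.parse, i.e. before it has necessarily seen a later -update flag, so in the DistCp versions I have used the relative order of the flags matters. Placing -update before -diff (a sketch below, with the paths copied from the question and not verified on any cluster) may avoid the IllegalArgumentException:

```shell
# Sketch, not verified against a live cluster: list -update before -diff so
# the options parser has already registered -update when it validates -diff.
hadoop distcp -update -diff s1 s2 \
  hdfs://secure_cluster:8020/source \
  hdfs://secure_cluster:8020/target
```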
04-11-2017
03:38 AM
@bpreachuk Did the above query delete the partition?
03-15-2017
04:53 AM
Hi @Artem Ervits, the date shown by hdfs dfs -ls <directory_location> is actually the date the file was placed in HDFS. Even if the file is updated with an INSERT via Hive, the date doesn't seem to change. Example: a file was placed in HDFS about 10 days ago and altered today, yet the date shown remains the original one.
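For checking this, a sketch that prints only the modification time of a file rather than the full -ls listing (the path here is a made-up placeholder, not one from the question):

```shell
# Sketch with a hypothetical path: %y in hdfs dfs -stat prints the
# modification time, which is easier to compare across runs than -ls output.
hdfs dfs -stat "%y" /user/hive/warehouse/mydb.db/mytable/000000_0
```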
03-13-2017
08:30 PM
If you still don't see the sqoop command working, you can also try restarting the Sqoop client on the required host via Ambari.
03-03-2017
03:44 PM
Hi Kit, I have a Teradata table with close to 100 columns, so it isn't feasible for us to create the schema manually. We wanted to create the schema using the create-hive-table tool, but it doesn't seem to work. I followed your syntax:

sqoop create-hive-table \
  -libjars ${LIB_JARS} \
  --connect $JDBCURL \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username $TDUSER \
  --password $TDPASS \
  --table $TDTABLE \
  --map-column-hive EFF_TSP=STRING \
  --hive-table ${HIVEDB}.${HIVETABLE}

I see the error "Error parsing arguments for create-hive-table":

17/03/03 10:35:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Error parsing arguments for create-hive-table:
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: -p
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pdcrdata.DBQLogTbl_Hst_1
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --map-column-hive
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: EFF_TSP=STRING
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-table
17/03/03 10:35:48 ERROR tool.BaseSqoopTool: Unrecognized argument: pi_talend_test.teradata_import
Try --help for usage instructions.

Thanks, Suresh
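A hedged observation on the error log: everything from the bare "-p" token onward is reported as unrecognized, which in my experience usually means Sqoop stopped parsing at an unexpected token, often because a shell variable expanded to something other than what was intended. A sketch that fails fast on empty variables before invoking sqoop (the variable names are taken from the command above; the check_vars helper itself is my own assumption, not part of Sqoop):

```shell
#!/usr/bin/env bash
# Sketch: verify that every named environment variable is non-empty before
# invoking sqoop; an empty expansion can leave a stray token on the command
# line (such as a bare -p) that aborts Sqoop's argument parsing.
check_vars() {
  local v
  for v in "$@"; do
    # ${!v} is bash indirect expansion: the value of the variable named in v.
    if [ -z "${!v}" ]; then
      echo "missing or empty variable: $v" >&2
      return 1
    fi
  done
  return 0
}

# Example guard before the sqoop invocation from the post:
# check_vars LIB_JARS JDBCURL TDUSER TDPASS TDTABLE HIVEDB HIVETABLE || exit 1
```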
10-17-2016
11:43 PM
@SBandaru To my knowledge, it's not advisable to change the pre-defined Ambari scripts; such changes might break a future Ambari server upgrade.
A better practice would be to develop a separate maintenance script to change the permissions on the Hive warehouse.