Member since: 04-11-2016
Posts: 535
Kudos Received: 148
Solutions: 77
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 9071 | 09-17-2018 06:33 AM |
 | 2373 | 08-29-2018 07:48 AM |
 | 3366 | 08-28-2018 12:38 PM |
 | 2857 | 08-03-2018 05:42 AM |
 | 2579 | 07-27-2018 04:00 PM |
11-21-2016
06:35 AM
@Sridhar M Check that the entries in your known_hosts file match the case of the host names. Also, verify that ssh works to and from all of the hosts. What does hostname -f return?
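A rough sketch of these checks, to be run on each node (node1.example.com below is a placeholder host name):
hostname -f                            # should print the fully-qualified domain name
grep -i node1 ~/.ssh/known_hosts       # confirm the known_hosts entry matches the case of the host name
ssh node1.example.com 'hostname -f'    # repeat from every host, in both directions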
11-09-2016
05:01 AM
@swathi thukkaraju Try using the --password-file option so that the password does not have to be typed or exposed on the command line. Below is the link for creating the password file (link)
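For reference, a rough sketch of creating and using the password file (the paths, JDBC URL, and table name below are placeholders):
echo -n "secretpassword" > .password
hdfs dfs -put .password /user/someuser/.password
hdfs dfs -chmod 400 /user/someuser/.password
sqoop import --connect jdbc:mysql://dbhost/dbname --username someuser --password-file /user/someuser/.password --table some_table --target-dir /user/someuser/some_table
The -n flag on echo keeps a trailing newline out of the file, since a newline would otherwise become part of the password.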
11-08-2016
12:52 PM
2 Kudos
@younes kafi
The CREATE TABLE statement is incorrect: a table declared as ORC cannot read data that is stored in Avro format. The best approach is to define an Avro-backed table over the existing files and then copy the data into an ORC table:
CREATE TABLE avro_table
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/someuser/avro_folder/'
TBLPROPERTIES ('avro.schema.url'='hdfs:///user/someuser/schema.avsc');
Then create the ORC table and load it from the Avro table:
CREATE TABLE orc_table STORED AS ORC AS SELECT * FROM avro_table;
11-03-2016
05:30 AM
1 Kudo
@Rinku Singh Unfortunately, it is not possible. The definition of an external table itself explains where the data lives: "An EXTERNAL table points to any HDFS location for its storage, rather than being stored in a folder specified by the configuration property hive.metastore.warehouse.dir." https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-ExternalTables
10-17-2016
11:34 AM
3 Kudos
@Gayathri Reddy G The property "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" is required when --split-by is used on a column of text type. The TextSplitter class differs between the Sqoop jars shipped with HDP 2.4 and HDP 2.5, which is why the sqoop command fails without this argument in HDP 2.5.
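For example, the property goes right after the tool name, before the tool-specific arguments (the connection details, table, and column names below are placeholders):
sqoop import "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" --connect jdbc:mysql://dbhost/dbname --username someuser --password-file /user/someuser/.password --table some_table --split-by text_column --target-dir /user/someuser/some_table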
10-11-2016
09:51 AM
9 Kudos
@Kaliyug Antagonist This is actually a known issue, and there is a Jira for a documentation bug to get this fixed in a later HDP release. Sqoop uses Avro 1.8.0 while other Hadoop components use Avro 1.7.5 or 1.7.4.
Please add the following property after 'import': -Dmapreduce.job.user.classpath.first=true
Example:
sqoop import -Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks --connect jdbc:db2://xxx:60000/x2 --username xx -password-alias xx --as-avrodatafile --target-dir xx/data/test --fields-terminated-by '\001' --table xx -m 1
10-10-2016
10:30 AM
3 Kudos
@Sergio Aparicio This is actually a known issue, and there is a Jira for a documentation bug to get this fixed in a later HDP release. Sqoop uses Avro 1.8.0 while other Hadoop components use Avro 1.7.5 or 1.7.4.
Please add the following property after 'import': -Dmapreduce.job.user.classpath.first=true
Example:
sqoop import -Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks --connect jdbc:db2://xxx:60000/VKTXAP02 --username xx -password-alias xx --as-avrodatafile --target-dir xx/data/test --fields-terminated-by '\001' --table xx -m 1
09-29-2016
07:09 AM
@anjul tiwari Thank you for the update. Please mark the helpful answer as accepted to close the discussion.
09-29-2016
06:56 AM
1 Kudo
Hi Mahesh, please refer to the following link for the integration details: https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration Thanks and regards, Sindhu
09-27-2016
09:38 AM
@Mourad Chahri Could you please check in Ambari what the reason is for the unhealthy node?