Member since: 06-06-2016
Posts: 185
Kudos Received: 12
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2705 | 07-20-2016 07:47 AM |
| | 2262 | 07-12-2016 12:59 PM |
11-05-2016
02:45 PM
I am new to the HDInsight Hadoop cluster. When I open Ambari -> Hive View -> Query and execute any Hive query (SHOW TABLES, etc.), I get the error: java.net.UnknownHostException: namenode. Note: I am able to run Hive queries in the Hive CLI. HDP 2.7, Hive 1.2.
10-21-2016
01:52 PM
Thank you @Constantin Stanca. I did the below steps and it works well: 1) I distcp'd the table from Prod to Dev (but the table metadata is not visible in the Dev cluster). 2) I created the same table schema in Dev that we had created in Prod. Then I get the table with its data. Is this process good, or will I face a data-loss problem?
10-20-2016
07:24 AM
Thank you so much @grajagopal. Can you elaborate on step 2 with an example, please? 2) You can just distcp the /user/hive/warehouse from PROD to DEV and generate the 'create table' DDL statement from Hive, change the NN info on the table, and recreate them in DEV. You also need to generate the manual ALTER TABLE ADD PARTITION statements to get the partitions recognized.
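In case it helps others reading this thread, here is a rough sketch of what step 2 could look like. The database, table, and partition names below (db_name.sales, ds) are hypothetical placeholders, as is the dev-nn NameNode address:

```sql
-- On PROD: print the DDL for the table so it can be recreated on DEV
SHOW CREATE TABLE db_name.sales;

-- On DEV: recreate the table from that DDL, changing any LOCATION that
-- points at the PROD NameNode so it points at the DEV NameNode instead.

-- Register each copied partition directory with the DEV metastore:
ALTER TABLE db_name.sales ADD PARTITION (ds='2016-10-01')
LOCATION 'hdfs://dev-nn:8020/apps/hive/warehouse/db_name.db/sales/ds=2016-10-01';

-- Alternatively, let Hive discover all partition directories at once:
MSCK REPAIR TABLE db_name.sales;
```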
10-19-2016
02:00 PM
1 Kudo
Hi all, I have a 4 TB table in the Prod cluster and I want to move it to Dev using distcp, but I have too little space for the export in Prod. So I want to split the table into some chunks. Can anyone help me here? I have tried like this:

export table tablename where count(customer_id) > 100000 to 'hdfs_exports_location';
export table tablename having count(customer_id) > 100000 to 'hdfs_exports_location';
export table db.tablename partition (count(sname) > "2") to 'apps/hive/warehouse';

And finally I tried this, but it is also not working:

export table db.tablename partition (count(sname) = "2") to 'apps/hive/warehouse';

But no use. Please suggest.
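For anyone hitting the same wall: EXPORT TABLE does not accept WHERE or HAVING clauses; it exports whole tables or whole partitions. One possible workaround (a sketch only — table and column names are hypothetical) is to stage filtered copies with CTAS and export each chunk separately:

```sql
-- Stage one chunk of the big table with an ordinary filtered CTAS:
CREATE TABLE db.big_table_chunk1 AS
SELECT * FROM db.big_table
WHERE customer_id BETWEEN 0 AND 99999;

-- EXPORT operates on whole tables, so export the staging copy:
EXPORT TABLE db.big_table_chunk1 TO '/tmp/hdfs_exports_location/chunk1';

-- Drop the staging copy once this chunk has been distcp'd to Dev:
DROP TABLE db.big_table_chunk1;
```

Each staged chunk only needs enough free space for itself, which avoids holding a full 4 TB export at once.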
Labels:
- Apache Hadoop
10-13-2016
06:20 AM
@Ayub Pathan Both are using the same version, HDP 2.1.2. I forgot to mention the port, but it is 8020 on both clusters.

export table db_c720_dcm.network_matchtables_act_ad to 'apps/hive/warehouse/sankar7_dir';

And I can see sankar7_dir at /user/hdfs/apps/hive/warehouse/sankar7_dir in the source cluster...

hadoop distcp hdfs://xx.xx.xx.xx:8020/apps/hive/warehouse/sankar7_dir hdfs://yy.yy.yy.yy:8020/apps/hive/warehouse/sankar7_dir
16/10/13 01:01:05 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hdfs:///xx.xx.xx.xx:8020/apps/hive/warehouse/sankar7_dir], targetPath=hdfs://yy.yy.yy.yy4:8020/apps/hive/warehouse/sankar7_dir}
16/10/13 01:01:05 INFO client.RMProxy: Connecting to ResourceManager at stlts8711/39.0.8.13:8050
16/10/13 01:01:06 ERROR tools.DistCp: Invalid input:
org.apache.hadoop.tools.CopyListing$InvalidInputException: hdfs:///xx.xx.xx.xx:8020/apps/hive/warehouse/sankar7_dir doesn't exist
at org.apache.hadoop.tools.GlobbedCopyListing.doBuildListing(GlobbedCopyListing.java:84)
at org.apache.hadoop.tools.CopyListing.buildListing(CopyListing.java:80)
at org.apache.hadoop.tools.DistCp.createInputFileListing(DistCp.java:327)
at org.apache.hadoop.tools.DistCp.execute(DistCp.java:151)
at org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.tools.DistCp.main(DistCp.java:375)
hdfs:///xx.xx.xx.xx:8020/apps/hive/warehouse/sankar7_dir doesn't exist

I see this error while doing distcp without creating sankar7_dir first. But I did export the table to that directory:

export table db_c720_dcm.network_matchtables_act_ad to 'apps/hive/warehouse/sankar7_dir';
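One likely cause, judging only from the paths shown above (so treat this as an assumption): the EXPORT used a relative path, which Hive resolves under the running user's home directory, so the data landed in /user/hdfs/apps/... while distcp was pointed at the absolute /apps/... path. Also note the source URI in the log has three slashes (hdfs:///host:port/...), which is not a valid authority form. A sketch of the corrected invocation, using the masked IPs from the post as placeholders:

```shell
# The export went to a path relative to /user/hdfs, so point distcp there,
# and use hdfs://host:port/path (two slashes) for both source and target:
hadoop distcp \
  hdfs://xx.xx.xx.xx:8020/user/hdfs/apps/hive/warehouse/sankar7_dir \
  hdfs://yy.yy.yy.yy:8020/user/hdfs/apps/hive/warehouse/sankar7_dir
```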
10-13-2016
05:42 AM
@Ayub Pathan No, I can't see this directory. May I know the reason for this? Please help me get out of this issue.
10-13-2016
04:15 AM
Thank you so much @Ayub Pathan. I have the below information in the user directory:

hdfs@HADOOP:/root> hadoop fs -ls /user/hdfs/apps/hive/warehouse/sankar5_dir
Found 2 items
-rw-r--r--   3 hdfs hdfs   1882 2016-10-12 17:34 /user/hdfs/apps/hive/warehouse/sankar5_dir/_metadata
drwxr-xr-x   - hdfs hdfs      0 2016-10-12 17:34 /user/hdfs/apps/hive/warehouse/sankar5_dir/data

I am able to import in the source cluster, but I could not in the destination cluster after distcp.
10-13-2016
03:16 AM
Hi, I want to migrate some Hive tables from the Prod cluster to the Dev cluster, so I am doing this: 1) export the Hive table to some temp directory, 2) distcp the temp directory to a temp directory in the target cluster, 3) import from the temp directory into the Hive database.

#01 hdfs@HADOOP:/root> hadoop fs -mkdir /apps/hive/warehouse/sankar5_dir
#02 export table db_c720_dcm.network_matchtables_act_creative to 'apps/hive/warehouse/sankar5_dir';
#03 hadoop distcp hdfs://xx.xx.xx.xx:8020/apps/hive/warehouse/sankar5_dir hdfs://xx.xx.xx.xx//apps/hive/warehouse/sankar5_dir

FAILED: SemanticException [Error 10027]: Invalid path at step 3. I can import in the source cluster, but after distcp I cannot import in the destination cluster.
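For comparison, here is a sketch of the same export -> distcp -> import round trip with absolute paths and an explicit port on both URIs (host names, database, and paths are placeholders, not the actual cluster values):

```shell
# 1. On the source cluster, export the table data + _metadata to a staging
#    dir; an absolute path avoids it silently landing under /user/<name>:
hive -e "EXPORT TABLE db.mytable TO '/tmp/hive_exports/mytable';"

# 2. Copy the staging dir to the destination cluster (port on both sides):
hadoop distcp \
  hdfs://prod-nn:8020/tmp/hive_exports/mytable \
  hdfs://dev-nn:8020/tmp/hive_exports/mytable

# 3. On the destination cluster, import from the copied dir; IMPORT reads
#    the _metadata file, so the table schema comes across automatically:
hive -e "IMPORT TABLE db.mytable FROM '/tmp/hive_exports/mytable';"
```

Note the failing command above has no port and a double slash in the target URI, either of which could produce the "Invalid path" error.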
Labels:
- Apache Hadoop
- Apache Hive
09-23-2016
09:20 AM
Thanks @Mats Johansson. I have another basic question: what are these directories for? Each of my nodes has 12 directories; can I increase this, and how are the logs distributed between these 4 nodes / 12 directories?
09-23-2016
08:07 AM
I have container logs configured as below:

yarn.nodemanager.log-dirs: /data1/hadoop/yarn/log,/data2/hadoop/yarn/log,/data3/hadoop/yarn/log,/data4/hadoop/yarn/log,/data5/hadoop/yarn/log,/data6/hadoop/yarn/log,/data7/hadoop/yarn/log,/data8/hadoop/yarn/log,/data9/hadoop/yarn/log,/data10/hadoop/yarn/log,/data11/hadoop/yarn/log,/data12/hadoop/yarn/log

The /data9/hadoop/yarn/log filesystem on one of the data nodes is full, and all the logs there are older than 1 year. Can I delete these logs?
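Assuming log aggregation is disabled and the applications are long finished (both assumptions, please verify first), container log directories that old are generally safe to remove by hand. A sketch, using the /data9 path from the post and an example 365-day cutoff:

```shell
# List container-log dirs under /data9 older than ~1 year; review the
# output before deleting anything:
find /data9/hadoop/yarn/log -maxdepth 1 -type d -mtime +365 -print

# Then delete them:
find /data9/hadoop/yarn/log -maxdepth 1 -type d -mtime +365 -exec rm -rf {} +

# Going forward, yarn.nodemanager.log.retain-seconds controls how long the
# NodeManager keeps local container logs when log aggregation is disabled.
```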
Labels:
- Apache Hadoop
- Apache YARN