Member since: 04-25-2016
Posts: 579
Kudos Received: 609
Solutions: 111
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2020 | 02-12-2020 03:17 PM |
| | 1397 | 08-10-2017 09:42 AM |
| | 10247 | 07-28-2017 03:57 AM |
| | 2221 | 07-19-2017 02:43 AM |
| | 1698 | 07-13-2017 11:42 AM |
12-09-2016
09:59 AM
3 Kudos
@subash sharma could you please set `hadoop.proxyuser.hive.groups` and `hadoop.proxyuser.hive.hosts` to `*` using Ambari, restart HDFS, and see if it helps?
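In Ambari these map to HDFS core-site properties; a sketch of the resulting core-site.xml entries (the `*` wildcard lets the hive service user impersonate any user from any host, so tighten it in production):

```xml
<!-- core-site.xml: allow the hive service user to impersonate end users -->
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
```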
12-08-2016
11:42 AM
3 Kudos
@oula.alshiekh@gmail.com alshiekh add a datanode if you are running out of storage capacity in the cluster; add a compute node when you see a bottleneck in processing. By adding more compute nodes you can launch more MapReduce/Spark tasks. You can also use a node to store data as well as to add processing capacity (in terms of more MapReduce tasks).
12-08-2016
09:02 AM
3 Kudos
Is the Hive table going to be replicated to all cluster data nodes when the replication factor includes all data nodes? An insert into Hive will create a table directory and replicate the data with the configured replication factor. Hive uses Hadoop to store and retrieve the data: a Hive table corresponds to a directory on HDFS, and the underlying data is in files, depending on the table definition. How is a Hive table represented, and how do we find it on HDFS if we didn't specify a location in its create statement? Hive has the terminology of managed tables and external tables to store and govern the data; more on this here: http://stackoverflow.com/questions/17038414/difference-between-hive-internal-tables-and-external-tables
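A minimal sketch of the two table types in HiveQL (table names and the external path are illustrative; the managed-table warehouse path varies with your configuration):

```sql
-- Managed table: Hive owns the data under its warehouse directory,
-- e.g. /apps/hive/warehouse/demo_managed; DROP TABLE deletes the files.
CREATE TABLE demo_managed (id INT, name STRING);

-- External table: Hive only tracks metadata; the files stay at the
-- given LOCATION and survive a DROP TABLE.
CREATE EXTERNAL TABLE demo_external (id INT, name STRING)
LOCATION '/data/demo_external';

-- The actual HDFS location of a table can be inspected with:
DESCRIBE FORMATTED demo_managed;
```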
12-08-2016
08:50 AM
2 Kudos
can you try this query using HTTP mode, by setting `hive.server2.transport.mode` to `http` instead of `binary`?
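After restarting HiveServer2 in HTTP mode, the client connects over the HTTP port rather than the binary one; a sketch of a beeline URL (the hostname is a placeholder, and port 10001 with httpPath `cliservice` are the common defaults, so verify against your hive-site.xml):

```
beeline -u "jdbc:hive2://hiveserver-host:10001/default;transportMode=http;httpPath=cliservice"
```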
12-07-2016
10:34 AM
2 Kudos
@Sampat Budankayala 1. you can take advantage of hadoop-client and write a Java program that writes to HDFS. 2. use NiFi with the PutHDFS processor to achieve it.
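If a full client program is more than you need, the same result can be sketched with the standard HDFS CLI instead (paths here are illustrative):

```
# copy a local file into HDFS
hdfs dfs -mkdir -p /data/incoming
hdfs dfs -put /tmp/events.log /data/incoming/
hdfs dfs -ls /data/incoming
```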
12-07-2016
10:31 AM
after getting these credentials using the above command, just execute hive
12-07-2016
10:21 AM
2 Kudos
try with ambari-qa:

```
klist -kt /etc/security/keytabs/smokeuser.headless.keytab
Keytab name: FILE:/etc/security/keytabs/smokeuser.headless.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   1 11/22/16 11:24:27 ambari-qa-rks242secure@EXAMPLE.COM
   1 11/22/16 11:24:27 ambari-qa-rks242secure@EXAMPLE.COM
   1 11/22/16 11:24:27 ambari-qa-rks242secure@EXAMPLE.COM
   1 11/22/16 11:24:27 ambari-qa-rks242secure@EXAMPLE.COM
   1 11/22/16 11:24:27 ambari-qa-rks242secure@EXAMPLE.COM
```

then obtain a ticket:

```
kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa-rks242secure@EXAMPLE.COM
```
12-07-2016
10:13 AM
1 Kudo
@subash sharma yes, you can access the hive cli after getting a valid Kerberos credential for the non-root user, given that this user belongs to the hdfs group.
12-03-2016
06:21 PM
you need to explicitly load the data using `LOAD DATA INPATH ... INTO TABLE`
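A minimal sketch of the statement (the table name and path are illustrative):

```sql
-- moves the file from the given HDFS path into the table's directory
LOAD DATA INPATH '/user/etl/staging/events.csv' INTO TABLE events;

-- for a file on the local filesystem, add LOCAL:
-- LOAD DATA LOCAL INPATH '/tmp/events.csv' INTO TABLE events;
```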
12-03-2016
05:43 PM
sorry, I could not follow your question; could you please clarify?