Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1511 | 10-11-2018 01:38 AM |
| | 1876 | 09-26-2018 02:24 AM |
| | 1835 | 06-29-2018 02:35 PM |
| | 2436 | 06-29-2018 02:34 PM |
| | 5387 | 06-20-2018 04:30 PM |
09-26-2018 02:24 AM
I was able to resolve this issue by modifying the following parameters:
yarn.scheduler.minimum-allocation-mb=2560
yarn.nodemanager.resource.memory-mb=7680
yarn.app.mapreduce.am.resource.mb=2560
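(For anyone who wants to script the same change: a minimal sketch using Ambari's bundled configs.sh helper, assuming a default Ambari install on localhost and a cluster named "mycluster" — the cluster name and admin credentials are placeholders:)

cd /var/lib/ambari-server/resources/scripts
# yarn-site holds the scheduler and NodeManager memory settings
./configs.sh -u admin -p admin set localhost mycluster yarn-site "yarn.scheduler.minimum-allocation-mb" "2560"
./configs.sh -u admin -p admin set localhost mycluster yarn-site "yarn.nodemanager.resource.memory-mb" "7680"
# the AM container size lives in mapred-site
./configs.sh -u admin -p admin set localhost mycluster mapred-site "yarn.app.mapreduce.am.resource.mb" "2560"
# restart YARN and MapReduce2 from Ambari afterwards for the change to take effect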
09-25-2018 02:31 AM
You were absolutely right: it was the proxy settings causing the problem. I removed the proxy settings from ambari-server and bounced it, and all views are working now. I tried to append -Dhttp.nonProxyHosts=hadoop1|hadoop2|hadoop3 to the proxy settings of Ambari, but it didn't like it. Can you give me the right syntax, please?
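(For anyone hitting the same problem: Java's http.nonProxyHosts takes a pipe-separated host list, and the pipes have to be protected from the shell. A sketch for /var/lib/ambari-server/ambari-env.sh, assuming that default path; the proxy host and port below are placeholders:)

# escape the pipes so they survive when AMBARI_JVM_ARGS is expanded unquoted;
# depending on the Ambari version, quoting the whole value may work instead
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.proxyHost=proxyhost -Dhttp.proxyPort=3128 -Dhttp.nonProxyHosts=hadoop1\|hadoop2\|hadoop3"

Then bounce the server again with: ambari-server restart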
09-22-2018 05:28 AM
@Sami Ahmad The Sqoop output is generating an ORC Snappy file, while the Hive table you have created is an ORC table without any compression. Create the table with the Snappy compression type instead:
CREATE TABLE mytable (...) STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY");
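(If you want to confirm which codec an ORC file actually carries, the ORC file dump utility prints it in its header; a sketch, where the HDFS path is a placeholder for your Sqoop target directory:)

# list the Sqoop output and dump one ORC file's metadata
hdfs dfs -ls /user/sami/orc_output
hive --orcfiledump /user/sami/orc_output/part-m-00000
# the header line "Compression: SNAPPY" (or ZLIB/NONE) shows the codec used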
06-12-2019 09:56 PM
My issue was actually caused by the fact that port 10000 was not working. After installing the Hive client on another data node, where port 10000 is working, I am able to follow the steps above and create the DSN successfully.
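(Two quick checks that would surface this kind of problem early; hostnames below are placeholders:)

# from the client machine: does anything answer on 10000?
nc -zv hiveserver2-host 10000
# on the HiveServer2 host itself: is the port actually bound?
netstat -tlnp | grep 10000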
09-23-2018 02:49 AM
I just noted there is a small note on top saying: "Note: This procedure requires change data capture from the operational database that has a primary key and modified date field where you pulled the records from since the last update." We don't have CDC on our database, so we can't do incremental imports? It should be possible by looking at the date field, as that's constantly increasing?
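(For reference, Sqoop's built-in incremental mode does exactly this date-driven pull without real CDC; a sketch with placeholder connection details, table, and column names:)

sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username myuser -P \
  --table MYTABLE \
  --incremental lastmodified \
  --check-column MODIFIED_DATE \
  --last-value "2018-09-01 00:00:00" \
  --merge-key ID \
  --target-dir /user/sami/mytable_inc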
06-29-2018 06:11 PM
Oh, the following syntax worked:
[root@hadoop1 ~]# curl --negotiate -i -u : -X GET -H "Accept: text" http://$(hostname):17001/
HTTP/1.1 401 Authentication required
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Type: text/html; charset=iso-8859-1
Cache-Control: must-revalidate,no-cache,no-store
Content-Length: 1393

HTTP/1.1 200 OK
Set-Cookie: hadoop.auth="u=hbase&p=hbase/hadoop1.xxx.com@XXX.US&t=kerberos&e=1530331783162&s=Ypuvww45JSzCbQwTbc5ysWmaSfI="; Path=/; HttpOnly
Content-Type: text/plain
Cache-Control: no-cache
Content-Length: 18

UFM
WZ
state_code
06-29-2018 02:35 PM
Resolved. The following missing property was causing the issue:
hbase.rest.authentication.kerberos.principal=HTTP/_HOST@
06-29-2018 02:34 PM
Found it: this parameter was missing in the HBase config:
hbase.rest.authentication.kerberos.principal=HTTP/_HOST@
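(For completeness, this property usually goes together with a keytab setting in hbase-site; a sketch via Ambari's configs.sh, where EXAMPLE.COM stands in for the real Kerberos realm and the cluster name and keytab path are placeholders:)

cd /var/lib/ambari-server/resources/scripts
./configs.sh -u admin -p admin set localhost mycluster hbase-site \
    "hbase.rest.authentication.kerberos.principal" "HTTP/_HOST@EXAMPLE.COM"
./configs.sh -u admin -p admin set localhost mycluster hbase-site \
    "hbase.rest.authentication.kerberos.keytab" "/etc/security/keytabs/spnego.service.keytab"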
06-28-2018 03:11 PM
@Sami Ahmad Yes, it's possible to read HBase data using the REST API. You need to start the REST API server first:
$ hbase rest start
By default this will listen on port 8080. The URL used should be similar to this:
http://<myhost>:8080/<mytable>/<rowkey1>/<cf:q>/
Documentation at: https://hbase.apache.org/1.2/apidocs/org/apache/hadoop/hbase/rest/package-summary.html
HTH
*** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
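(A concrete fetch along those lines; the table, row key, and column below are placeholders:)

# read a single cell; JSON responses carry base64-encoded row keys and values
curl -H "Accept: application/json" "http://myhost:8080/mytable/rowkey1/cf:q"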
06-28-2018 05:18 PM
@Sami Ahmad Yes, you can use the HBase REST API to manipulate data in HDP. By default the HBase REST server is not started, so you need to start it first.
To start the REST server in the foreground:
# su hbase
# hbase rest start -p {port to start the server}
To start it in the background:
# su hbase
# /usr/hdp/current/hbase-master/bin/hbase-daemon.sh start rest -p {port to start the server}
Ref:
http://hbase.apache.org/book.html#_rest
http://blog.cloudera.com/blog/2013/03/how-to-use-the-apache-hbase-rest-interface-part-1/
You can also use the Phoenix Query Server to do the manipulation over HTTP using sqlline.py, but that interface is not REST. Phoenix is built on Apache Calcite. http://phoenix.apache.org/server.html
Please "Accept" the answer if this helps. -Aditya
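(Once the server is up, a quick smoke test against whatever port you passed with -p; 8080 below is a placeholder:)

# the REST server's /version endpoint answers if it is running
curl http://localhost:8080/version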