Member since: 04-22-2016 | Posts: 931 | Kudos Received: 46 | Solutions: 26
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1170 | 10-11-2018 01:38 AM
 | 1573 | 09-26-2018 02:24 AM
 | 1444 | 06-29-2018 02:35 PM
 | 1955 | 06-29-2018 02:34 PM
 | 4644 | 06-20-2018 04:30 PM
06-28-2018 09:06 PM
I followed the steps in the link below to enable HBase REST with Kerberos, but when I issue a REST command it throws an error. I can see that the REST server is up: https://community.hortonworks.com/articles/91425/howto-start-and-test-hbase-rest-server-in-a-kerber.html/ I added the following to Custom hbase-site.xml and restarted HDFS, HBase, and the other affected services:
hbase.rest.authentication.type=kerberos
hbase.master.kerberos.principal=hbase/_HOST@xxx.com
hbase.master.keytab.file=/etc/security/keytabs/hbase.service.keytab
hadoop.proxyuser.HTTP.groups=*
hadoop.proxyuser.HTTP.hosts=*
hbase.security.authorization=true
hbase.rest.authentication.kerberos.keytab=/etc/security/keytabs/spnego.service.keytab
hbase.rest.authentication.kerberos.principal=HTTP/_HOST@xxx.com
hbase.security.authentication=kerberos
hbase.rest.kerberos.principal=hbase/_HOST@xxx.com
hbase.rest.keytab.file=/etc/security/keytabs/hbase.service.keytab
hbase.rest.info.port=17050
hbase.rest.port=17000
Normal way to start REST:
[hbase@hadoop1 ~]$ /usr/hdp/current/hbase-master/bin/hbase-daemon.sh stop rest
stopping rest..
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$ netstat -a | grep 17000
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$ /usr/hdp/current/hbase-master/bin/hbase-daemon.sh start rest
starting rest, logging to /var/log/hbase/hbase-hbase-rest-hadoop1.out
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$ netstat -a | grep 17000
tcp 0 0 *:17000 *:* LISTEN
[hbase@hadoop1 ~]$
[hbase@hadoop1 ~]$ curl --negotiate -u : 'http://hadoop1:17000/status/cluster'
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 404 NOT_FOUND</title>
</head>
<body>
<h2>HTTP ERROR: 404</h2>
<p>Problem accessing /status/cluster. Reason:
<pre> NOT_FOUND</pre></p>
<hr /><i><small>Powered by Jetty://</small></i>
06-28-2018 01:24 PM
Can I read/write/manipulate HBase data via the REST API in Hortonworks? If yes, can you point me to some documentation and examples? I also have Phoenix; is it better to read HBase data via Phoenix views? Again, I want to read it via the REST API.
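For context, the HBase REST server exchanges rows as JSON (or XML) in which the row key, column name, and cell value are all base64-encoded. A minimal sketch of building and decoding such a payload with only the Python standard library (the table name, row key, and endpoint below are placeholders, not taken from a real cluster):

```python
import base64
import json

def encode(s: str) -> str:
    # HBase REST requires base64 for keys, columns, and values.
    return base64.b64encode(s.encode("utf-8")).decode("ascii")

def decode(s: str) -> str:
    return base64.b64decode(s).decode("utf-8")

def build_put_payload(row_key: str, column: str, value: str) -> str:
    """Build the JSON body for PUT /<table>/<row> on the HBase REST server."""
    doc = {"Row": [{"key": encode(row_key),
                    "Cell": [{"column": encode(column), "$": encode(value)}]}]}
    return json.dumps(doc)

def read_first_cell(payload: str) -> tuple[str, str, str]:
    """Decode (row key, column, value) from a GET /<table>/<row> response."""
    row = json.loads(payload)["Row"][0]
    cell = row["Cell"][0]
    return decode(row["key"]), decode(cell["column"]), decode(cell["$"])

payload = build_put_payload("row1", "F1:INSERT_TIME", "2017-02-05 15:36:31.0")
print(read_first_cell(payload))
# -> ('row1', 'F1:INSERT_TIME', '2017-02-05 15:36:31.0')

# On a Kerberized cluster the payload would be sent with something like:
#   curl --negotiate -u : -X PUT -H "Content-Type: application/json" \
#        -d "$payload" 'http://hadoop1:17000/UFM/row1'
```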
06-27-2018 04:03 PM
But doesn't this import create files somewhere in HDFS, or is the data moved directly into HBase?
06-27-2018 02:21 PM
I am getting this error at the end, but it still loads the data into HBase fine, so what does 'cannot append files to target directory' mean for this load?
Sqoop command:
sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON --incremental append --check-column INSERT_TIME --table PATRON.UFM_VIEW -split-by UFM_VIEW.UFMID --target-dir /user/root/_sqoop --hbase-table UFM --column-family F1 --hbase-row-key "UFMID" --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME"
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=0
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 147.9979 seconds (0 bytes/sec)
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Retrieved 999 records.
18/06/27 10:05:48 WARN util.AppendUtils: Cannot append files to target dir; no such directory: _sqoop/c81a737093c64d4492c58671affe31fe_PATRON.UFM_VIEW
18/06/27 10:05:48 INFO tool.ImportTool: Saving incremental import state to the metastore
18/06/27 10:05:49 INFO tool.ImportTool: Updated data for job: incjob
[hdfs@hadoop1 ~]$
HBase gets the data fine:
hbase(main):001:0>
hbase(main):002:0* count 'UFM',INTERVAL => 20000
999 row(s) in 0.3320 seconds
=> 999
hbase(main):003:0>
06-26-2018 04:12 PM
HBase has the correct time, but when I query the data from the Phoenix view it shows a totally wrong date value.
HBase info:
10017175360300 column=F1:INSERT_TIME, timestamp=1530027357550, value=2017-02-05 15:36:31.0
Phoenix info:
0: jdbc:phoenix:dotstohdamn201.tolls.dot.stat> select ufmid,insert_time from "Sami:UFM2" where ufmid='10017175360300';
+-----------------+-------------------------------+
| UFMID | INSERT_TIME |
+-----------------+-------------------------------+
| 10017175360300 | 177670840-06-01 22:47:08.077 |
+-----------------+-------------------------------+
Phoenix view:
CREATE VIEW "Sami:UFM2" (
UFMID VARCHAR PRIMARY KEY,
"EXLIST"."LIST1" varchar,
"EXLIST"."LIST2" varchar,
"EXLIST"."LIST3" varchar,
"EXLIST"."LIST4" varchar,
"EXLIST"."LIST5" varchar,
"EXLIST"."LIST6" varchar,
"EXLIST"."LIST7" varchar,
"EXLIST"."LIST8" varchar,
"EXLIST"."LIST9" varchar,
"EXLIST"."LIST10" varchar,
"EXLIST"."LIST11" varchar,
"EXLIST"."LIST12" varchar,
"EXLIST"."LIST13" varchar,
"EXLIST"."LIST14" varchar,
"EXLIST"."LIST15" varchar,
"EXLIST"."LIST16" varchar,
"EXLIST"."LIST17" varchar,
"EXLIST"."LIST18" varchar,
"EXLIST"."LIST19" varchar,
"EXLIST"."LIST20" varchar,
"EXLIST"."LIST21" varchar,
"F1"."INSERT_TIME" date,
"F1"."LANEID" varchar,
"F1"."LANEUFMSEQNO" varchar,
"F1"."PLAZAID" varchar,
"F1"."TIPUFMSEQ" varchar,
"F1"."TIP_ID" varchar,
"F1"."TXNTM" date) SALT_BUCKETS=10 ;
06-26-2018 02:21 PM
I narrowed the error down to the boundary query; if I remove it, the sqoop command works.
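That fits how Sqoop uses the boundary query (a simplified model of the mechanism): the two values it returns become the bounds of the WHERE predicate on the --split-by column. With --split-by UFMID (a NUMBER) and a boundary query returning MIN/MAX of INSERT_TIME (a TIMESTAMP), Oracle is asked to compare a NUMBER with TIMESTAMP values, hence ORA-00932. A sketch of the predicate Sqoop effectively generates (the numeric bounds below are made up for illustration):

```python
# Simplified model of how Sqoop turns boundary-query results into
# per-mapper split predicates on the --split-by column.

def split_predicate(split_col: str, lo, hi) -> str:
    # Each mapper roughly gets: split_col >= lo AND split_col < hi
    return f"{split_col} >= {lo!r} AND {split_col} < {hi!r}"

# Boundary query returned TIMESTAMP bounds, but the split column is the
# NUMBER column UFMID -> Oracle rejects the comparison with ORA-00932.
bad = split_predicate("UFMID", "24-JAN-2017", "25-JAN-2017")
print(bad)

# Type-consistent alternatives: split by INSERT_TIME instead, or make the
# boundary query return MIN/MAX of UFMID for the desired time window.
good = split_predicate("UFMID", 10017175000000, 10017175999999)
print(good)
```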
06-26-2018 02:03 PM
I am getting an inconsistent datatypes error in a Sqoop import. Below are my source and target tables, the Sqoop command, and the error:
18/06/26 09:54:27 INFO mapreduce.Job: map 0% reduce 0%
18/06/26 09:54:33 INFO mapreduce.Job: Task Id : attempt_1526339051549_1088_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00932: inconsistent datatypes: expected NUMBER got TIMESTAMP
Sqoop command:
sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=dotstodb070)(port=1527))(connect_data=(service_name=trhprod)))" --username TRANHOST --incremental append --check-column INSERT_TIME --table TRANHOST.UFM_VIEW -split-by UFM_VIEW.UFMID --target-dir /user/root/_sqoop --hbase-table 'Sami:UFM2' --column-family F1 --hbase-row-key "UFMID" --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME" --boundary-query "SELECT MIN(\"INSERT_TIME\"), MAX(\"INSERT_TIME\") FROM \"UFM_VIEW\" WHERE ( \"INSERT_TIME\" > TO_DATE('24-JAN-2017', 'DD-MON-YYYY') AND \"INSERT_TIME\" <= TO_DATE('25-JAN-2017', 'DD-MON-YYYY'))"
Source table structure:
SQL> desc ufm_view
Name Null? Type
----------------------------------------- -------- ----------------------------
UFMID NOT NULL NUMBER(19)
LANEUFMSEQNO NOT NULL NUMBER(10)
LANEID NOT NULL NUMBER(2)
PLAZAID NOT NULL VARCHAR2(6)
TXNTM NOT NULL TIMESTAMP(6)
TIP_ID NOT NULL VARCHAR2(255)
TIPUFMSEQ NOT NULL NUMBER(10)
CCTVETM TIMESTAMP(6)
CCTVSTM TIMESTAMP(6)
COLID VARCHAR2(4)
LIST1 VARCHAR2(160)
LIST2 VARCHAR2(160)
LIST3 VARCHAR2(160)
LIST4 VARCHAR2(160)
LIST5 VARCHAR2(160)
LIST6 VARCHAR2(160)
LIST7 VARCHAR2(160)
LIST8 VARCHAR2(160)
LIST9 VARCHAR2(160)
LIST10 VARCHAR2(160)
LIST11 VARCHAR2(160)
LIST12 VARCHAR2(160)
LIST13 VARCHAR2(160)
LIST14 VARCHAR2(160)
LIST15 VARCHAR2(160)
LIST16 VARCHAR2(160)
LIST17 VARCHAR2(160)
LIST18 VARCHAR2(160)
LIST19 VARCHAR2(160)
LIST20 VARCHAR2(160)
LIST21 VARCHAR2(160)
LANEMODE VARCHAR2(5)
LANEPOS VARCHAR2(2)
LANESTATE VARCHAR2(7)
LANETYPE VARCHAR2(5)
LEGACY NUMBER(1)
POSLSTTM TIMESTAMP(6)
RATETBLTM TIMESTAMP(6)
TAGPLTTBLTM TIMESTAMP(6)
TIPADJ NUMBER(1)
TOURID VARCHAR2(4)
TXNURL VARCHAR2(255)
LEVOFSVC NUMBER(2)
MAINTTYPE VARCHAR2(6)
MAINTID VARCHAR2(4)
CALCLASS NUMBER(2)
CALCLASSRULE VARCHAR2(6)
CALPVPLZID VARCHAR2(6)
CALPVPLZRULE VARCHAR2(6)
PLTACCTID NUMBER(8)
PREPAIDPASS VARCHAR2(7)
INDREV NUMBER(6,2)
FULFAREREV NUMBER(6,2)
DISFAREREV NUMBER(6,2)
ETCREV NUMBER(6,2)
ETCDISREV NUMBER(6,2)
ACTREV NUMBER(6,2)
PAYMETHOD VARCHAR2(8)
TMDAYCSHMULT NUMBER(6,2)
TMDAYETCMULT NUMBER(6,2)
LVLSVCCSHMULT NUMBER(6,2)
LVLSVCETCMULT NUMBER(6,2)
DAYOFWKMULT NUMBER(6,2)
HOLIDAYMULT NUMBER(6,2)
TXNREC1LANEID NUMBER(2)
TXNREC1LNTOK NUMBER(19)
TXNREC2LANEID NUMBER(2)
TXNREC2LNTOK NUMBER(19)
TXNREC3LANEID NUMBER(2)
TXNREC3LNTOK NUMBER(19)
TXNREC4LANEID NUMBER(2)
TXNREC4LNTOK NUMBER(19)
TXNREC5LANEID NUMBER(2)
TXNREC5LNTOK NUMBER(19)
TXNREC6LANEID NUMBER(2)
TXNREC6LNTOK NUMBER(19)
APMCASHEQSTAT VARCHAR2(7)
DISPENSER1ID VARCHAR2(7)
DISPENSER1TYPE CHAR(3)
DISPENSER1COUNT NUMBER(2)
DISPENSER2ID VARCHAR2(7)
DISPENSER2TYPE CHAR(3)
DISPENSER2COUNT NUMBER(2)
DISPENSER3ID VARCHAR2(7)
DISPENSER3TYPE CHAR(3)
DISPENSER3COUNT NUMBER(2)
VAULT1ID VARCHAR2(7)
VAULT1CURRENCYCTS CHAR(78)
VAULT2ID VARCHAR2(7)
VAULT2CURRENCYCTS CHAR(78)
VAULT3ID VARCHAR2(7)
VAULT3CURRENCYCTS CHAR(78)
VAULT4ID VARCHAR2(7)
VAULT4CURRENCYCTS CHAR(78)
APMCREQSTAT VARCHAR2(7)
APMCRAUTHCODE NUMBER(6)
APMCRAUTHSTAT VARCHAR2(7)
APMCRCCTXNNO CHAR(28)
APMCRCARDTYPE VARCHAR2(5)
APMCREXPDATE VARCHAR2(6)
APMCRGATEWAY NUMBER(1)
APMCRLASTFOUR VARCHAR2(4)
APMCRNRCID VARCHAR2(15)
APMCRNRCVALID NUMBER(1)
AVIEQSTAT VARCHAR2(7)
AGENCY NUMBER(2)
INTERNAL NUMBER(19)
LOWBAT NUMBER(1)
PASBAK NUMBER(1)
AVIPREVLANE NUMBER(2)
AVIPREVPLZID VARCHAR2(6)
AVIPREVTKID NUMBER(4)
AVITXNID NUMBER(4)
AVIPREVTM TIMESTAMP(6)
REGION NUMBER(2)
REVTYPE VARCHAR2(10)
TPDRID VARCHAR2(8)
TPDRSTAT VARCHAR2(11)
CANOPYLTEQSTAT VARCHAR2(7)
GATEEQSTAT VARCHAR2(7)
ITLEQSTAT VARCHAR2(7)
OCREQSTAT VARCHAR2(7)
GCKEYCONF NUMBER(4)
HPRDCONF NUMBER(4)
HORPOS NUMBER(4)
HORSIZE NUMBER(4)
JURIS VARCHAR2(3)
JURISCONF NUMBER(4)
MATCHFPID NUMBER(10)
OCRIMGURL VARCHAR2(255)
PLATETYPE NUMBER(2)
PLTAREACONF NUMBER(4)
PLTMATCHCONF NUMBER(4)
PLTRDCONF NUMBER(4)
PLTREAD VARCHAR2(16)
FEATMATCHCONF NUMBER(4)
VEHMATCHCONF NUMBER(4)
VERTPOS NUMBER(4)
VERTSIZE NUMBER(4)
POSTAVCEQSTAT VARCHAR2(7)
POSTAVCLOOPCLASS NUMBER(2)
POSTAVCLOOPSPEED NUMBER(3)
POSTAVCREVAXLECT NUMBER(2)
PREAVCEQSTAT VARCHAR2(7)
PREAVCLOOPCLASS NUMBER(2)
PREAVCLOOPSPEED NUMBER(3)
PREAVCREVAXLECT NUMBER(2)
PTDEQSTAT VARCHAR2(7)
RECEIPTPRTEQSTAT VARCHAR2(7)
VESEQSTAT VARCHAR2(7)
IMG1RANK NUMBER(1)
IMG1URL VARCHAR2(255)
IMG1TYPE CHAR(1)
IMG2RANK NUMBER(1)
IMG2URL VARCHAR2(255)
IMG2TYPE CHAR(1)
IMG3RANK NUMBER(1)
IMG3URL VARCHAR2(255)
IMG3TYPE CHAR(1)
IMG4RANK NUMBER(1)
IMG4URL VARCHAR2(255)
IMG4TYPE CHAR(1)
IMG5RANK NUMBER(1)
IMG5URL VARCHAR2(255)
IMG5TYPE CHAR(1)
IMG6RANK NUMBER(1)
IMG6URL VARCHAR2(255)
IMG6TYPE CHAR(1)
MLTEQSTAT VARCHAR2(7)
MLTCOLCMT VARCHAR2(50)
MLTCOLVEHCLASS NUMBER(2)
MLTCONVOY NUMBER(1)
MLTINSUFFUNDS NUMBER(1)
MLTNRCID VARCHAR2(15)
MLTNRCVALID NUMBER(1)
MLTPREVLANE NUMBER(2)
MLTPREVPLZID VARCHAR2(6)
MLTPREVTKID NUMBER(4)
MLTPREVTKVOID NUMBER(1)
MLTPREVTM TIMESTAMP(6)
MLTRUNTHRU NUMBER(1)
MLTCREQSTAT VARCHAR2(7)
MLTCRAUTHCODE NUMBER(6)
MLTCRAUTHSTAT VARCHAR2(7)
MLTCRCCTXNNO CHAR(28)
MLTCRCARDTYPE VARCHAR2(5)
MLTCREXPDATE VARCHAR2(6)
MLTCRGATEWAY NUMBER(1)
MLTCRLASTFOUR VARCHAR2(4)
MLTCRNRCID VARCHAR2(15)
MLTCRNRCVALID NUMBER(1)
ATIMEQSTAT VARCHAR2(7)
TKID NUMBER(4)
TKTAKEN NUMBER(1)
TKRDREQSTAT VARCHAR2(7)
TKRDRPREVCLASS NUMBER(2)
TKRDRPREVPLZID VARCHAR2(6)
TKRDRPREVTKID NUMBER(4)
TKRDRPREVTM TIMESTAMP(6)
TKRDRPREVLANE NUMBER(2)
VDACEQSTAT VARCHAR2(7)
VDACCLASS NUMBER(2)
VDACHEIGHT NUMBER(3)
VDACLEN NUMBER(3)
VDACWIDTH NUMBER(2)
INSERT_TIME NOT NULL TIMESTAMP(6)
CMPSITMULT NUMBER(5,2)
LANETMHZN TIMESTAMP(6)
PREVTKID NUMBER(4)
SQL>
Target HBase table:
hbase(main):002:0> describe 'Sami:UFM2'
Table Sami:UFM2 is ENABLED
Sami:UFM2
COLUMN FAMILIES DESCRIPTION
{NAME => 'EXLIST', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NO
NE', TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
{NAME => 'F1', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE',
TTL => 'FOREVER', COMPRESSION => 'NONE', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
2 row(s) in 6.9330 seconds
hbase(main):003:0>
06-22-2018 05:23 PM
With ktutil you have to type in the user's password to create a keytab file for them. But how can system admins create these keytab files for LDAP users when they don't know their passwords?
ktutil: add_entry -password -p serverdb_user -k 0 -e rc4-hmac-nt
Labels: Hortonworks Data Platform (HDP)