05-22-2019
07:21 PM
While installing Hive, the DB connection test fails if I use the MySQL driver that came with HDP, but it succeeds if I use an older driver I have (I don't know its version).

[root@hadoop1 resources]# pwd
/var/lib/ambari-server/resources
[root@hadoop1 resources]# ls -al mysql-connector-java.jar
lrwxrwxrwx. 1 root root 64 May 22 14:56 mysql-connector-java.jar -> /var/lib/ambari-server/resources/mysql-connector-java-8.0.15.jar
[root@hadoop1 resources]#
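A likely cause: Connector/J 8.x renamed the driver class from `com.mysql.jdbc.Driver` to `com.mysql.cj.jdbc.Driver`, which the Hive/Ambari configs on HDP 2.6 do not expect. A minimal sketch of pointing Ambari back at a 5.x jar (the 5.1.40 filename below is an assumption, substitute whatever 5.x jar you actually have):

```shell
# Assumption: a Connector/J 5.x jar exists at this path; adjust to your jar.
cd /var/lib/ambari-server/resources
ln -sfn mysql-connector-java-5.1.40.jar mysql-connector-java.jar

# Re-register the JDBC driver with Ambari, then restart it
ambari-server setup --jdbc-db=mysql \
  --jdbc-driver=/var/lib/ambari-server/resources/mysql-connector-java-5.1.40.jar
ambari-server restart
```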
Labels: Apache Hive
02-28-2019
07:29 PM
I am loading a Phoenix table, but in HBase the row key of this table is shown as hex values. How can I correct this so I see the ASCII values, i.e. the integer value?
sqlline version 1.1.8
0: jdbc:phoenix:localhost:2181:/hbase-unsecur> SELECT * FROM EXAMPLE;
+--------+-------------+------------+
| MY_PK | FIRST_NAME | LAST_NAME |
+--------+-------------+------------+
| 12345 | John | Doe |
| 67890 | Mary | Jane |
+--------+-------------+------------+
2 rows selected (0.053 seconds)
Version 1.1.2.2.6.5.0-292, r897822d4dd5956ca186974c10382e9094683fa29, Fri May 11 08:01:42 UTC 2018
hbase(main):001:0> scan 'EXAMPLE'
ROW                              COLUMN+CELL
 \x80\x00\x00\x00\x00\x0009      column=M:FIRST_NAME, timestamp=1551381906272, value=John
 \x80\x00\x00\x00\x00\x0009      column=M:LAST_NAME, timestamp=1551381906272, value=Doe
 \x80\x00\x00\x00\x00\x0009      column=M:_0, timestamp=1551381906272, value=x
 \x80\x00\x00\x00\x00\x01\x092   column=M:FIRST_NAME, timestamp=1551381906272, value=Mary
 \x80\x00\x00\x00\x00\x01\x092   column=M:LAST_NAME, timestamp=1551381906272, value=Jane
 \x80\x00\x00\x00\x00\x01\x092   column=M:_0, timestamp=1551381906272, value=x
2 row(s) in 0.2030 seconds
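This is expected: Phoenix serializes numeric primary keys (a BIGINT here, judging by the 8-byte keys) as big-endian bytes with the sign bit flipped, so the HBase shell can only print the raw bytes. Querying through Phoenix is the supported way to see the values; as a sketch, the same decoding can be done by hand (assumes the key really is a Phoenix BIGINT):

```python
def decode_phoenix_bigint(key: bytes) -> int:
    """Decode a Phoenix BIGINT row key: big-endian with the sign bit flipped."""
    if len(key) != 8:
        raise ValueError("expected an 8-byte BIGINT key")
    # Undo the sign-bit flip on the first byte, then read as signed big-endian
    return int.from_bytes(bytes([key[0] ^ 0x80]) + key[1:], "big", signed=True)

# The two row keys from the scan above ('0' is 0x30, '9' is 0x39, '2' is 0x32)
print(decode_phoenix_bigint(b"\x80\x00\x00\x00\x00\x00\x30\x39"))  # 12345
print(decode_phoenix_bigint(b"\x80\x00\x00\x00\x00\x01\x09\x32"))  # 67890
```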
02-25-2019
08:55 PM
I have the Hadoop classpath defined, but when I use it as per the documentation I get errors (the HBase classes are not found):

[root@hadoop1 conf]# hadoop classpath
/usr/hdp/2.6.5.0-292/hadoop/conf:/usr/hdp/2.6.5.0-292/hadoop/lib/*:/usr/hdp/2.6.5.0-292/hadoop/.//*:/usr/hdp/2.6.5.0-292/hadoop-hdfs/./:/usr/hdp/2.6.5.0-292/hadoop-hdfs/lib/*:/usr/hdp/2.6.5.0-292/hadoop-hdfs/.//*:/usr/hdp/2.6.5.0-292/hadoop-yarn/lib/*:/usr/hdp/2.6.5.0-292/hadoop-yarn/.//*:/usr/hdp/2.6.5.0-292/hadoop-mapreduce/lib/*:/usr/hdp/2.6.5.0-292/hadoop-mapreduce/.//*:/usr/jdk64/jdk1.8.0_112/lib/tools.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:ojdbc6.jar:/usr/hdp/2.6.5.0-292/tez/*:/usr/hdp/2.6.5.0-292/tez/lib/*:/usr/hdp/2.6.5.0-292/tez/conf
[root@hadoop1 conf]#
[root@hadoop1 conf]#
[root@hadoop1 conf]# echo $JAVA_HOME
/usr/jdk64/jdk1.8.0_112
[root@hadoop1 conf]#
[root@hadoop1 conf]# javac -cp `hadoop classpath` TestHbaseTable.java
TestHbaseTable.java:3: error: package org.apache.hadoop.hbase does not exist
import org.apache.hadoop.hbase.HBaseConfiguration;
^
TestHbaseTable.java:4: error: package org.apache.hadoop.hbase does not exist
import org.apache.hadoop.hbase.HColumnDescriptor;
^
TestHbaseTable.java:5: error: package org.apache.hadoop.hbase does not exist
import org.apache.hadoop.hbase.HTableDescriptor;
^
TestHbaseTable.java:6: error: package org.apache.hadoop.hbase.client does not exist
import org.apache.hadoop.hbase.client.HBaseAdmin;
^
TestHbaseTable.java:12: error: cannot find symbol
HBaseConfiguration hconfig = new HBaseConfiguration(new Configuration());
^
symbol: class HBaseConfiguration
location: class TestHbaseTable
TestHbaseTable.java:12: error: cannot find symbol
HBaseConfiguration hconfig = new HBaseConfiguration(new Configuration());
^
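The classpath printed above contains Hadoop, Tez, and JDBC jars but no HBase jars, which is why every `org.apache.hadoop.hbase.*` import fails. A sketch of the usual fix, assuming the HBase client is installed on this node so the `hbase` command is available:

```shell
# `hadoop classpath` does not include the HBase client jars;
# append `hbase classpath` so the org.apache.hadoop.hbase.* imports resolve.
javac -cp "$(hadoop classpath):$(hbase classpath)" TestHbaseTable.java
```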
Labels: Hortonworks Data Platform (HDP)
02-25-2019
07:19 PM
I am using HDP 2.6, HBase 1.1.2, Python 3.6.

[root@hadoop1 conf]# /usr/hdp/current/hbase-master/bin/hbase-daemon.sh start thrift -p 9090 --infoport 9091
starting thrift, logging to /var/log/hbase/hbase-root-thrift-hadoop1.out
>>> import happybase
>>> c = happybase.Connection('hadoop1',9090, autoconnect=False)
>>> c.open()
>>> print(c.tables())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.6/site-packages/happybase/connection.py", line 242, in tables
    names = self.client.getTableNames()
  File "/usr/local/lib/python3.6/site-packages/thriftpy/thrift.py", line 198, in _req
    return self._recv(_api)
  File "/usr/local/lib/python3.6/site-packages/thriftpy/thrift.py", line 210, in _recv
    fname, mtype, rseqid = self._iprot.read_message_begin()
  File "thriftpy/protocol/cybin/cybin.pyx", line 439, in cybin.TCyBinaryProtocol.read_message_begin (thriftpy/protocol/cybin/cybin.c:6470)
cybin.ProtocolError: No protocol version header
>>>
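"No protocol version header" typically means the client and the Thrift server disagree on transport or protocol. HappyBase defaults to a buffered transport and the binary protocol; if hbase-site.xml sets `hbase.regionserver.thrift.framed` and/or `hbase.regionserver.thrift.compact` to true, the client must match. A sketch, where the framed/compact values are assumptions to check against your hbase-site.xml:

```python
import happybase

# Assumption: the Thrift server runs with framed transport and compact
# protocol; if both server flags are false, keep HappyBase's defaults
# (transport='buffered', protocol='binary').
c = happybase.Connection(
    'hadoop1', 9090,
    transport='framed',
    protocol='compact',
    autoconnect=False,
)
c.open()
print(c.tables())
```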
02-25-2019
04:52 PM
Hi Josh, how can I do it via the 'get' command? We will be using Java to fetch these rows.
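In the Java client the equivalent of a ranged get is `Get.setTimeRange`. A sketch against the HBase 1.1.x client API, using the table, row, and timestamps from the question (note the upper bound of the range is exclusive):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class GetVersionRange {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("PERSON"))) {
            Get get = new Get(Bytes.toBytes("r1"));
            get.addColumn(Bytes.toBytes("student"), Bytes.toBytes("student_name"));
            // setTimeRange is [min, max): add 1 to include 1551108073557 itself
            get.setTimeRange(1551108018724L, 1551108073557L + 1);
            get.setMaxVersions(5);
            Result result = table.get(get);
            for (Cell cell : result.rawCells()) {
                System.out.println(cell.getTimestamp() + " -> "
                        + Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
    }
}
```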
02-25-2019
04:05 PM
I can select a particular version from a table, but how can I select a range of versions? E.g., say I want to select all the rows from timestamp 1551108018724 to 1551108073557.

hbase(main):005:0> scan 'PERSON',{NAME => 'student',VERSIONS => 5}
ROW    COLUMN+CELL
 r1    column=student:student_name, timestamp=1551108086986, value=John
 r1    column=student:student_name, timestamp=1551108073557, value=Mary
 r1    column=student:student_name, timestamp=1551108037609, value=Nancy
 r1    column=student:student_name, timestamp=1551108018724, value=Ram
 r1    column=student:student_name, timestamp=1551108002231, value=Sam
1 row(s) in 0.0190 seconds
hbase(main):006:0> get 'PERSON','r1',{COLUMN => 'student', TIMESTAMP => 1551108018724}
COLUMN                 CELL
 student:student_name  timestamp=1551108018724, value=Ram
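The HBase shell has a TIMERANGE scan option for exactly this. A sketch using the timestamps above; the upper bound is exclusive, so it is bumped by 1 to include the last cell:

```shell
# TIMERANGE is [min, max) -- max is exclusive, hence 1551108073557 + 1.
echo "scan 'PERSON', {COLUMN => 'student:student_name', \
  TIMERANGE => [1551108018724, 1551108073558], VERSIONS => 5}" | hbase shell
```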
11-27-2018
04:08 PM
After starting the LDAP server for Knox I am getting a different error now; also, I had to replace the cluster name "ftehdp" with "default".

curl -i -k -u admin:admin-password -X GET 'https://localhost:8443/gateway/default/webhdfs/v1/tmp/uname.txt?op=OPEN'
HTTP/1.1 307 Temporary Redirect
Date: Tue, 27 Nov 2018 16:21:44 GMT
Set-Cookie: JSESSIONID=1219u2f8zreb11eu9fuxlggxhq;Path=/gateway/default;Secure;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Set-Cookie: rememberMe=deleteMe; Path=/gateway/default; Max-Age=0; Expires=Mon, 26-Nov-2018 16:21:44 GMT
Cache-Control: no-cache
Expires: Tue, 27 Nov 2018 16:21:44 GMT
Date: Tue, 27 Nov 2018 16:21:44 GMT
Pragma: no-cache
Expires: Tue, 27 Nov 2018 16:21:44 GMT
Date: Tue, 27 Nov 2018 16:21:44 GMT
Pragma: no-cache
X-FRAME-OPTIONS: SAMEORIGIN
Location: https://hadoop1:8443/gateway/default/webhdfs/data/v1/webhdfs/v1/tmp/uname.txt?_=AAAACAAAABAAAACgLvtILkFAljr5PIP7MVSOAump8j0kSwFCPdGCP2R_b1tCZ0V2KGOQuiRiI4_IU7GDG6NqRtK2Vu7DOZeOhbuQUaP1FYtD_-IV3P-VXMbOFbPfbwpNseAuN-RyQduRm5S1mrk0GVbYKQg4NscgsoF0GGsvqKDyPtECwhwkX96E37Jc5_yCnlkw3LVKUY41Hg6LOt96W8-3rTmnrbo7o26dOcpPv1_uv4Q1F18b4yk5N5BNf6HTZdVZ6Q
Content-Type: application/octet-stream
Server: Jetty(6.1.26.hwx)
Content-Length: 0
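The 307 is not an error: WebHDFS OPEN is a two-step operation, and the gateway is handing back a Location header pointing at the data endpoint. A sketch of letting curl follow the redirect (same URL and credentials as above):

```shell
# -L makes curl follow the 307 to the Location header, which
# returns the actual file contents.
curl -k -L -u admin:admin-password -X GET \
  'https://localhost:8443/gateway/default/webhdfs/v1/tmp/uname.txt?op=OPEN'
```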
11-27-2018
03:35 PM
I am trying to create a file in HDFS via Knox, but I am getting an error:

$ curl -i -k -u admin:adminxxxx -X PUT 'https://localhost:8443/gateway/ftehdp/namenode/api/v1/tmp/README?op=CREATE'
HTTP/1.1 404 Not Found
Date: Tue, 27 Nov 2018 15:33:47 GMT
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html; charset=ISO-8859-1
Content-Length: 319
Server: Jetty(9.2.15.v20160210)
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 404 </title>
</head>
<body>
<h2>HTTP ERROR: 404</h2>
<p>Problem accessing /gateway/ftehdp/namenode/api/v1/tmp/README. Reason:
<pre> Not Found</pre></p>
<hr /><i><small>Powered by Jetty://</small></i>
</body>
</html>
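The 404 points at the URL path: Knox exposes WebHDFS under `/gateway/{topology}/webhdfs/v1`, not `/namenode/api/v1`. A sketch of the two-step CREATE against that path (topology name, credentials, and file are taken from the command above; adjust to your topology):

```shell
# Step 1: ask the gateway where to write; it answers 307 with a Location header
curl -i -k -u admin:adminxxxx -X PUT \
  'https://localhost:8443/gateway/ftehdp/webhdfs/v1/tmp/README?op=CREATE'

# Step 2: upload the file body, letting curl follow the redirect
curl -i -k -u admin:adminxxxx -L -T README \
  'https://localhost:8443/gateway/ftehdp/webhdfs/v1/tmp/README?op=CREATE'
```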
10-19-2018
08:24 PM
package com.demo;

import org.apache.hadoop.hive.ql.exec.UDF;
import java.sql.*;
import oracle.jdbc.*;
import org.apache.hadoop.io.Text;

public class queryp extends UDF {
    public Text evaluate(final Text s) throws SQLException {
        // ----- the code below works fine as a separate class till EOF -----
        System.setProperty("oracle.net.tns_admin", "/u01/oracle/admin");
        System.setProperty("oracle.net.wallet_location", "/u01/oracle/admin");
        String url = "jdbc:oracle:thin:/@secure";
        DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
        Connection conn = DriverManager.getConnection(url);
        Statement stmt = conn.createStatement();
        ResultSet rset = stmt.executeQuery("select HOST_NAME,INSTANCE_NAME FROM V$INSTANCE");
        while (rset.next())
            System.out.println(rset.getString(1));
        // -------------------------- EOF ------------------------------
        return null;
    }
}
> add jar /tmp/hive-udf-0.1-SNAPSHOT.jar;
Added [/tmp/hive-udf-0.1-SNAPSHOT.jar] to class path
Added resources: [/tmp/hive-udf-0.1-SNAPSHOT.jar]
hive> create temporary function qp as 'com.demo.queryp';
OK
Time taken: 0.356 seconds
hive> select qp();
FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments 'qp': No matching method for class com.fdot.demo.querypatron with (). Possible choices: _FUNC_(string)
hive>
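The SemanticException is an arity mismatch, not a classpath problem: `evaluate(Text)` requires one string argument, but the query calls `qp()` with none. Either invoke it with an argument, e.g. `select qp('x');`, or give the class a zero-argument overload. A sketch of the overload, to be added inside the `queryp` class above:

```java
// Zero-argument overload so `select qp();` resolves; it simply
// delegates to the existing one-argument evaluate.
public Text evaluate() throws SQLException {
    return evaluate(null);
}
```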