Member since: 04-03-2019
Posts: 97
Kudos Received: 7
Solutions: 6
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| | 1065 | 01-13-2025 11:17 AM |
| | 4942 | 01-21-2022 04:31 PM |
| | 6982 | 02-25-2020 10:02 AM |
| | 4858 | 02-19-2020 01:29 PM |
| | 3302 | 09-17-2019 06:33 AM |
06-14-2019
10:04 PM
@Geoffrey Shelton Okot My intention is to use "su hive", not "su - hive". Below is the resource showing the difference. https://www.tecmint.com/difference-between-su-and-su-commands-in-linux/ Thanks.
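For anyone skimming this later, a minimal illustration of the distinction the linked article makes; the commands below are generic examples, not taken from this cluster:

```bash
# "su hive" switches the user but keeps the caller's working directory and most of its
# environment; "su - hive" starts a full login shell for hive (hive's own HOME, PATH,
# and home directory). Run as root or a user allowed to su.
su hive -c 'whoami; pwd'     # still in the caller's current directory
su - hive -c 'whoami; pwd'   # lands in hive's home directory
```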
06-14-2019
09:57 PM
@Gulshad Ansari Thanks. You are right. The execute permission is required for entering the directory.
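A small, generic demonstration of that rule (not the directory from this thread), runnable as a non-root user:

```bash
# On a directory, the read bit (r) lets you list names, while the execute bit (x) is
# what allows entering or traversing it. Without x, even the owner cannot cd into it.
mkdir perm_demo
chmod 600 perm_demo   # rw-------  : readable, but no execute bit
cd perm_demo          # fails with "Permission denied"
chmod 700 perm_demo   # rwx------  : add the execute bit
cd perm_demo          # now succeeds
```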
06-12-2019
09:56 PM
My issue is actually caused by the fact that port 10000 is not working. After installing the Hive client on another data node, where port 10000 is working, I am able to follow the steps above and create the DSN successfully.
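For anyone hitting the same thing, a quick way to check whether a node is actually listening on HiveServer2's default port 10000 before building the DSN; the hostname below is a placeholder, not one from this cluster:

```bash
# Simple reachability check of the HiveServer2 binary port (10000 by default).
nc -zv datanode2.example.com 10000
# End-to-end check with beeline against the same host and port.
beeline -u "jdbc:hive2://datanode2.example.com:10000"
```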
06-03-2019
04:14 AM
@Shu Your code put me on the right track. Thanks again. However, I got back some strange records. Here are my guesses about the possible causes:
1. The row delimiter is CR+LF, which is not the default row delimiter for the regex SerDe. How can I specify the row delimiter?
2. There are two strange characters in the first column of the first row. This might be related to the row delimiter too.
Below is the create-table script.
Create External Table slog(LogTime string, LogSource string, LogMessage string) ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe' WITH SERDEPROPERTIES ("input.regex" = "(.{46})(.{24})(.*)") LOCATION '/path/to/slog/';
I attached a screenshot of the log file (in Notepad++) and the Hive query result.
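One possible diagnostic, offered as an assumption rather than anything confirmed in the thread: the capture widths (46 and 24) are exactly double the declared 23- and 12-character fields, and SQL Server error logs are often UTF-16 encoded, which would also explain the two odd leading characters (a byte-order mark). Dumping the first bytes of one file shows the real encoding and row delimiter; ERRORLOG is a hypothetical file name under the placeholder path from the post.

```bash
# Inspect the first bytes of one log file to see the encoding and the row delimiter.
hdfs dfs -cat /path/to/slog/ERRORLOG | head -c 200 | od -c | head
# If the dump starts with 377 376 (a UTF-16LE BOM) and shows \0 bytes between characters,
# one option is to convert the file to UTF-8 before loading it into the external table:
#   iconv -f UTF-16LE -t UTF-8 ERRORLOG > ERRORLOG.utf8
```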
06-02-2019
03:22 AM
@Shu Thanks for your help. I am currently on the road and will test your solution once I am back in the office.
05-31-2019
02:39 PM
Basically, I am trying to analyze SQL Server log files using Hive. The layout of the log file is:
- char(23) - timestamp
- char(1) - space
- char(12) - source
- the rest of the row - the message, whose length varies
The row delimiter is CR+LF. Below are some entries in the log.
2019-05-28 07:29:55.03 Server UTC adjustment: -7:00
2019-05-28 07:29:55.03 Server (c) Microsoft Corporation.
2019-05-28 07:29:55.03 Server All rights reserved.
2019-05-28 07:29:55.03 Server Server process ID is 3368.
There are several posts here regarding fixed-width column layouts, but in my case the last column is identified not by its width but by the row delimiter.
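In case it helps others with the same layout, here is a sketch of one possible approach under the widths stated above (an assumption, not the solution that was eventually used in this thread): load each line as a single string column and slice the fixed-width prefix with substr(); the variable-length message is simply everything after position 36. Table and column names and the location are hypothetical placeholders.

```sql
-- One raw line per row; the default text format puts the whole line in a single column.
CREATE EXTERNAL TABLE slog_raw (line string)
LOCATION '/path/to/slog/';

-- char(23) timestamp, char(1) space, char(12) source, then the rest of the row.
SELECT substr(line, 1, 23)  AS LogTime,
       substr(line, 25, 12) AS LogSource,
       substr(line, 37)     AS LogMessage
FROM slog_raw;
```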
Labels:
- Apache Hive
05-28-2019
05:13 PM
@Vinay You are right. The issue is resolved. Thank you!
05-28-2019
03:28 AM
Content of the stock2.json file:
{"myid":"0001","mytype":"donut"}
The stock2.json file is in the path '/warehouse/tablespace/external/hive/haijintest.db/stock2'. Below is the table creation script.
CREATE EXTERNAL TABLE stock4_json (myjson struct<myid:string, mytype:string>) ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe' LOCATION '/warehouse/tablespace/external/hive/haijintest.db/stock2';
Below is the query that returned a NULL value.
0: jdbc:hive2://> select myjson.myid from stock4_json;
19/05/27 20:20:18 [103d2343-eb59-4072-8a07-5469b2112093 main]: WARN optimizer.SimpleFetchOptimizer: Table haijintest@stock4_json is external table, falling back to filesystem scan.
OK
+-------+
| myid |
+-------+
| NULL |
+-------+
1 row selected (0.232 seconds)
0: jdbc:hive2://> select * from stock4_json;
OK
+---------------------+
| stock4_json.myjson |
+---------------------+
| NULL |
+---------------------+
1 row selected (0.242 seconds)
Any suggestions?
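A guess at the cause, written as an assumption rather than the confirmed answer from this thread: JsonSerDe maps top-level JSON keys to table columns by name, so a single struct column named myjson never matches the keys myid/mytype in the file, and both queries come back NULL. A sketch with the columns declared after the keys themselves (stock5_json is a hypothetical table name):

```sql
-- Declare one column per top-level JSON key instead of wrapping them in a struct.
CREATE EXTERNAL TABLE stock5_json (
  myid   string,
  mytype string)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/warehouse/tablespace/external/hive/haijintest.db/stock2';

SELECT myid, mytype FROM stock5_json;
```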
Labels:
- Apache Hive
05-16-2019
06:50 PM
It is strange to me that the port number is 2181; I thought the port number should be 10000. However, I tried the following command
beeline -u "jdbc:hive2://name1.iehp.local:10000"
and got the error below.
WARN jdbc.HiveConnection: Failed to connect to name1.abc.local:10000 Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status. Error: Could not open client transport with JDBC Uri: jdbc:hive2://name1.abc.local:10000: java.net.UnknownHostException: name1.iehp.local (state=08S01,code=0)
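For reference, port 2181 is ZooKeeper, so the copied URL probably came from HiveServer2's ZooKeeper service-discovery string rather than a direct host:port. A discovery-style connection looks like the sketch below; the zk* hostnames and the namespace are placeholders, not values from this cluster. The UnknownHostException also suggests the client simply cannot resolve the hostname, which is worth checking independently of the port.

```bash
# Connect via ZooKeeper service discovery instead of a fixed HiveServer2 host:port.
beeline -u "jdbc:hive2://zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
# Check that the direct hostname resolves at all before worrying about port 10000.
nslookup name1.iehp.local
```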
05-16-2019
03:36 AM
I ran into the same issue and below is the failure message.
FAILED!
[Hortonworks][DriverSupport] (1110) Unexpected response received from server. Please ensure the server host and port specified for the connection are correct and confirm if SSL should be enabled for the connection.
@Chiran Ravani My HDP box does not require SSL. I understand and followed step 1 and step 3, but I do not know what to do for step 2. Could you provide more details on step 2? By the way, I checked all listening ports on my box and did not see port 10000. The firewall on my box has been disabled; I am not sure whether this matters. Here is my command to check the listening ports.
sudo nmap -sTU -O localhost
Thanks.
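A couple of alternative checks for the HiveServer2 port that may be easier to read than a full nmap scan; they assume the commands are run on the HDP node itself:

```bash
# List TCP listeners and filter for HiveServer2's default port.
sudo netstat -tlnp | grep 10000     # or: ss -tlnp | grep 10000
# Test whether anything accepts connections on 10000 from this box.
nc -zv localhost 10000
```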