Member since: 03-23-2016
Posts: 21
Kudos Received: 10
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
| 4187 | 05-19-2016 07:08 PM
| 4505 | 05-16-2016 10:10 PM
| 1432 | 04-15-2016 04:27 PM
05-23-2016 08:44 PM
@ashu Thanks. Resolved it.
05-23-2016 08:25 PM
Trying to copy a column of data from one Hive-HBase table to a second Hive-HBase table. Getting the "HBase row key cannot be NULL" error even though there is a row key. Steps to reproduce:

HBase DDL (HBase shell):

```
create 'TRIAL_SRC', {NAME => 'd'}
create 'TRIAL_DEST', {NAME => 'd'}
```

Hive DDL, table 1 (Hive over HBase):

```sql
create external table if not exists DCHANDRA.TRIAL_SRC (key string, pat_id string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key, d:ref_val ')
tblproperties ('hbase.table.name' = 'TRIAL_SRC');
```

Hive DDL, table 2 (Hive over HBase):

```sql
create external table if not exists DCHANDRA.TRIAL_DEST (key string, pat_id string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key, d:ref_val ')
tblproperties ('hbase.table.name' = 'TRIAL_DEST');
```

DML for table 1:

```sql
insert into DCHANDRA.TRIAL_SRC (key, pat_id) values (1, 101);
insert into DCHANDRA.TRIAL_SRC (key, pat_id) values (2, 102);
insert into DCHANDRA.TRIAL_SRC (key, pat_id) values (3, 103);
insert into DCHANDRA.TRIAL_SRC (key, pat_id) values (4, 104);
insert into DCHANDRA.TRIAL_SRC (key, pat_id) values (5, 105);
```

DML for table 2:

```sql
insert into DCHANDRA.TRIAL_DEST (key, pat_id) values (1, 10);
```

Upsert into table 2 (TRIAL_DEST) from table 1 (TRIAL_SRC):

```sql
insert into dchandra.trial_dest (pat_id)
select src.pat_id
from dchandra.trial_src src
join dchandra.trial_dest dest on src.key = dest.key;
```

This fails with the error:

```
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1463698008506_2409_9_01, diagnostics=[Task failed, taskId=task_1463698008506_2409_9_01_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"key":"1","pat_id":"101"}
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:171)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:137)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:344)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"key":"1","pat_id":"101"}
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:91)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:68)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:310)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:148)
    ... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"key":"1","pat_id":"101"}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:545)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
    ... 17 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unexpected exception: org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: HBase row key cannot be NULL
    at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:426)
```

-Datta
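For reference: the failing INSERT above supplies only pat_id, so Hive has no value for the :key column and the HBase storage handler sees a NULL row key. One likely fix (a sketch, not the confirmed resolution from this thread) is to select the matching key as well:

```sql
-- Sketch of a likely fix: include the row key in the insert so the
-- HBase storage handler has a non-NULL :key value to write.
insert into dchandra.trial_dest (key, pat_id)
select src.key, src.pat_id
from dchandra.trial_src src
join dchandra.trial_dest dest on src.key = dest.key;
```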
05-19-2016 07:08 PM
I was able to resolve the issue after following the instructions in this link: https://hortonworks.my.salesforce.com/kA1E0000000fyL5?lang=en_US&popup=true&caseId=500E000000Z4YaH -Datta
05-19-2016 04:23 PM
Hi, we are trying to use SQuirreL as a SQL client against Hive on a secured HDP 2.3.2 cluster, but the connection fails with the following error: java.lang.RuntimeException: Illegal Hadoop Version: Unknown (expected A.B.* format)
Detailed log below:
2016-05-19 10:59:34,902 [pool-5-thread-1] INFO org.apache.hive.jdbc.Utils - Supplied authorities: <<server_name>>:10000
2016-05-19 10:59:34,902 [pool-5-thread-1] INFO org.apache.hive.jdbc.Utils - Resolved authority: <<server_name>>:10000
2016-05-19 10:59:34,915 [pool-5-thread-1] INFO net.sourceforge.squirrel_sql.fw.util.log.SystemOutToLog - SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
2016-05-19 10:59:34,915 [pool-5-thread-1] INFO net.sourceforge.squirrel_sql.fw.util.log.SystemOutToLog - SLF4J: Defaulting to no-operation (NOP) logger implementation
2016-05-19 10:59:34,915 [pool-5-thread-1] INFO net.sourceforge.squirrel_sql.fw.util.log.SystemOutToLog - SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2016-05-19 10:59:34,922 [pool-5-thread-1] WARN org.apache.hadoop.util.VersionInfo - Could not read 'common-version-info.properties', java.io.IOException: Resource not found
java.io.IOException: Resource not found
at org.apache.hadoop.util.VersionInfo.<init>(VersionInfo.java:49)
at org.apache.hadoop.util.VersionInfo.<clinit>(VersionInfo.java:99)
at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:160)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:139)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopThriftAuthBridge(ShimLoader.java:125)
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:54)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:451)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:207)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:180)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at net.sourceforge.squirrel_sql.fw.sql.SQLDriverManager.getConnection(SQLDriverManager.java:133)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:167)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$000(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$1.run(OpenConnectionCommand.java:104)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-05-19 10:59:34,924 [AWT-EventQueue-1] DEBUG net.sourceforge.squirrel_sql.client.gui.db.ConnectToAliasCallBack - java.util.concurrent.ExecutionException
2016-05-19 10:59:34,924 [AWT-EventQueue-1] ERROR net.sourceforge.squirrel_sql.client.gui.db.ConnectToAliasCallBack - Unexpected Error occurred attempting to open an SQL connection.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: Illegal Hadoop Version: Unknown (expected A.B.* format)
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.awaitConnection(OpenConnectionCommand.java:132)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$100(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$2.run(OpenConnectionCommand.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Illegal Hadoop Version: Unknown (expected A.B.* format)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:175)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$000(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$1.run(OpenConnectionCommand.java:104)
... 5 more
Caused by: java.lang.RuntimeException: Illegal Hadoop Version: Unknown (expected A.B.* format)
at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:164)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:139)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopThriftAuthBridge(ShimLoader.java:125)
at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:54)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:451)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:207)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:180)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at net.sourceforge.squirrel_sql.fw.sql.SQLDriverManager.getConnection(SQLDriverManager.java:133)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:167)
... 7 more

Here is our JDBC connection string:

jdbc:hive2://servername:10000/default;principal=hive/servername.com@HADOOPPROD.LOCAL;auth-kerberos

Here are the jars in our extra classpath section:

hadoop-common-2.7.1.2.3.2.0-2950.jar
hive-jdbc-1.2.1.2.3.2.0-2950-standalone.jar

We get the same error if we do the same with DbVisualizer. Thanks
05-16-2016 10:10 PM
3 Kudos
Hi, this has fixed the issue: https://hortonworks.my.salesforce.com/kA2E0000000LZQ5?srPos=0&srKp=ka2&lang=en_US

ROOT CAUSE:
A 4000-character limit on the PARAM_VALUE field of the SERDE_PARAMS table in the Hive metastore is the root cause of this issue. The limit prevents Hive from creating tables with a high column count, eventually causing desc <table name> or select * from <table name> to fail with the error above.

WORKAROUND:
Widen the column in the Hive metastore database:

```sql
-- log into the Hive metastore DB, then run:
alter table SERDE_PARAMS MODIFY PARAM_VALUE VARCHAR(400000000);
```

Thanks. -Datta
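A quick way to confirm that this truncation is the culprit (a sketch, assuming a MySQL-backed metastore; adjust for other backends): check the stored length of each table's mapping string. A value pinned at exactly 4000 characters suggests it was cut off.

```sql
-- List the longest stored hbase.columns.mapping values in the
-- metastore; lengths stuck at the 4000-character cap indicate
-- truncation, which produces the element-count mismatch error.
SELECT SERDE_ID, PARAM_KEY, LENGTH(PARAM_VALUE) AS len
FROM SERDE_PARAMS
WHERE PARAM_KEY = 'hbase.columns.mapping'
ORDER BY len DESC
LIMIT 5;
```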
05-16-2016 07:23 PM
Hi @Predrag Minovic, thanks for your response. I have the same number of columns on both the Hive and HBase side in my DDL, and I don't see any whitespace. I am also attaching the two DDLs.

(1) working_ver, which works with a column count of up to 207. After executing this script, the table looks fine from either Hive or HBase:

hive> select * from PATIENT_NRICHED;
OK
Time taken: 0.447 seconds

(2) not_working_ver, where I added one new column, "xyz string", on the Hive side and "v:xyz" on the HBase side. The script still creates the table, but running "select * from CLARITY_RT.PATIENT_NRICHED" produces the following error:

hive> select * from PATIENT_NRICHED;
FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 208 elements while hbase.columns.mapping has 207 elements (counting the key if implicit)).

Thanks working-ver.txt not-working-ver.txt Datta
05-13-2016 10:10 PM
Hi, I have been trying to create a bunch of Hive-over-HBase tables. The table gets created on both the Hive and the HBase side, but when I query the Hive table I get the following error once the number of columns reaches the 200 mark or so. I verified that the column counts in the Hive-over-HBase DDL match, and there are no special characters or anything of that sort. The same tables work fine with a smaller set of columns (under roughly 200). Not sure whether we are missing something in the configuration; we are on HDP 2.3.2. I would appreciate any suggestions for debugging this issue.

hive> select * from CLARITY_RT.PATIENT_TRIAL;
FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 248 elements while hbase.columns.mapping has 207 elements (counting the key if implicit))

hive> select * from CLARITY_RT.ORDER_MED_TRIAL;
FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 205 elements while hbase.columns.mapping has 198 elements (counting the key if implicit))

hive> select * from CLARITY_RT.DM_CF;
FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 544 elements while hbase.columns.mapping has 225 elements (counting the key if implicit))
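For context, a minimal sketch of the invariant the error message is checking (table and column names here are hypothetical): the Hive column list and the hbase.columns.mapping property must contain the same number of entries, with :key counting as the mapping for the row-key column.

```sql
-- Hypothetical minimal example: three Hive columns, three mapping
-- entries (:key maps the first column; d:c1 and d:c2 map the rest).
create external table if not exists demo_hbase_tbl (
  key string,
  c1  string,
  c2  string
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:c1,d:c2')
tblproperties ('hbase.table.name' = 'DEMO_HBASE_TBL');
```

The mismatch reported here (e.g. 544 columns vs. 225 mapping entries) means the stored mapping string is shorter than the column list, which is consistent with it having been truncated somewhere.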
04-15-2016 04:27 PM
1 Kudo
Thanks @asinghal & @Josh Elser. The following query fixed it (the explicit '#############' format keeps TO_CHAR from inserting grouping separators, so the string comparison against the key works): select count(*) from CDC WHERE "key" > TO_CHAR(TO_NUMBER(NOW())-600000,'#############'); Appreciate your quick help. -Datta
04-15-2016 04:05 PM
Hi Josh,
Thanks for the response. The function works fine in a select query; I verified it by putting it in the select list, like below:
0: jdbc:phoenix:lnxhdpdp07.smrcy.com,lnxhdpdp> select "key", TO_CHAR(TO_NUMBER(NOW())-600000) as TM from CDC LIMIT 5;
+------------------------------------------+-------------------+
| key | TM |
+------------------------------------------+-------------------+
| 1460067042710,EPT,26|T|Z978926||1 | 1,460,734,960,474 |
| 1460067042710,EPT,26|T|Z978926||8 | 1,460,734,960,474 |
| 1460067042711,EPT,26|T|Z978926||1 | 1,460,734,960,474 |
| 1460067042711,EPT,26|T|Z978926||8 | 1,460,734,960,474 |
| 1460067042712,EPT,26|T|Z978926||1 | 1,460,734,960,474 |
+------------------------------------------+-------------------+

It also works fine if I put a hardcoded value in that part of the where clause:
0: jdbc:phoenix:lnxhdpdp07.smrcy.com,lnxhdpdp> SELECT COUNT(*) FROM CDC WHERE "key" > '1460734960474';
+------------------------------------------+
| COUNT(1) |
+------------------------------------------+
| 539753 |
+------------------------------------------+

I also tried putting it in the where clause as "where (regexp_split(CDC."key",',')[1]) > TO_CHAR((TO_NUMBER(NOW())-60000)", which didn't work.
The DDL is as below:
DROP VIEW IF EXISTS CDC;
CREATE VIEW CDC
(
"key" VARCHAR primary key
)
default_column_family='d';
04-15-2016 03:15 PM
We have a Phoenix table whose "key" is a time in milliseconds. I am trying to count the records from the last 10 minutes using Phoenix functions in the where clause, but I am not getting anywhere; it just returns a count of zero. Has anyone tried functions in the where clause, something like the below? select count(*) from CDC where CDC."key" > TO_CHAR((TO_NUMBER(NOW())-600000); -Datta
Labels: Apache Phoenix