Member since: 11-04-2015
Posts: 5
Kudos Received: 0
Solutions: 0
06-10-2019
12:48 AM
Hi, Thank you for your reply. We changed the data type from timestamp to text to overcome this issue. Regards, Sandeep
04-29-2019
10:41 PM
Hi All,

We are generating a Parquet file using the Python pandas library from a text file. The text file has a field value '2019-04-01 00:00:00.000', which is converted to the format '2019-04-01 00:00:00+00:00' with data type 'datetime64[ns, UTC]'. The Parquet file conversion is successful; however, firing a SELECT query on the Hive external table against this specific column throws an error: 'Bad status for request TFetchResultsReq(fetchType=0, operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='|\xc0[7\x07*O%\xa9P\xde\xb3\x9a\x0c[s', guid='\xf6\x17\xb7\x1e\x15\xbaC\xeb\x9c*\x8e\xf7e<e}')), orientation=4, maxRows=100): TFetchResultsResp(status=TStatus(errorCode=0, errorMessage='java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Cannot inspect org.apache.hadoop.io.LongWritable', sqlState=None, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Cannot inspect org.apache.hadoop.io.LongWritable:14:13', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:463',.

In Impala, the same query fails with: incompatible Parquet schema for column type: TIMESTAMP, Parquet schema: optional int64 [i:0 d:1 r:0].

Could you please guide us on what the possible reason could be? We don't want the data type for this column to be STRING, as partial data will be sqooped from an RDBMS and later sent in Parquet format weekly/monthly/quarterly/yearly.

Thanks, ispirit
Labels:
- Apache Hive
- Apache Impala
04-18-2019
01:20 AM
Hi All, We have created a Hive database with some external tables over files placed in an S3 bucket. Our application accesses this data using Impala. The tables are not partitioned and the files are in text/CSV format. How should we collect stats: using Impala or Hive? Is it possible to collect stats for Hive tables using Impala, or should stats for Hive tables only be collected through Hive? Appreciate your help. Regards, Sandeep Suman
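Since Hive and Impala share the same metastore, statistics computed by Impala's COMPUTE STATS are visible to Hive as well, so a separate Hive run is not required for tables queried through Impala. A minimal sketch, assuming the impyla DB-API client is installed; the host, port, and table name are placeholders, not values from the post:

```python
# SQL templates for gathering stats in each engine.
IMPALA_STATS_SQL = "COMPUTE STATS {table}"
HIVE_STATS_SQL = "ANALYZE TABLE {table} COMPUTE STATISTICS"

def compute_stats_via_impala(host: str, table: str, port: int = 21050) -> None:
    """Issue COMPUTE STATS for one table over the Impala DB-API.

    COMPUTE STATS writes table- and column-level statistics into the
    shared Hive metastore, where both engines can use them.
    """
    from impala.dbapi import connect  # deferred import: optional dependency
    conn = connect(host=host, port=port)
    try:
        cur = conn.cursor()
        cur.execute(IMPALA_STATS_SQL.format(table=table))
        cur.close()
    finally:
        conn.close()
```

The Hive-side equivalent (`ANALYZE TABLE ... COMPUTE STATISTICS`, plus `FOR COLUMNS` for column stats) can be used instead when the tables are only queried through Hive.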
Labels:
- Apache Hive
- Apache Impala
03-05-2019
07:15 PM
Hi Jerry, Thanks for your help. A few columns had different data types, and the table was created with ',' as the delimiter while some column values themselves contained ','. Regards, Sandeep Suman
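The comma-in-data problem described above can be reproduced with the standard library alone (the names and values here are made up for illustration):

```python
import csv
import io

row = ["1", "Smith, John", "2019-03-01"]

# Naive join: the embedded comma shifts columns when Hive splits on ','.
naive = ",".join(row)  # -> "1,Smith, John,2019-03-01" (4 fields, not 3)

# csv.writer quotes fields containing the delimiter, but Hive's default
# LazySimpleSerDe does not honor quotes; to read such files correctly,
# use a SerDe that understands quoting (e.g. OpenCSVSerde) or pick a
# delimiter that never occurs in the data.
buf = io.StringIO()
csv.writer(buf).writerow(row)  # -> 1,"Smith, John",2019-03-01
```

This is why the record counts can match while individual column values land in the wrong fields.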
03-03-2019
11:57 PM
Hi All,
We have sqooped data from MS-SQL 2008 into HDFS and created external tables. The record counts match between the MS-SQL tables and the Hive tables. However, if I fire the same query on an MS-SQL table and the corresponding Hive table, the output is different. The query is fired through Hue (and was tested through Impala as well). Could you please help, in case I missed something?
Regards.
Sandeep Suman