Member since: 10-16-2014
Posts: 11
Kudos Received: 1
Solutions: 0
10-26-2022 07:51 AM
Our vulnerability scanning found two vulnerabilities on our CDP Private Cloud 7.1.7-SP1: CVE-2022-22970 and CVE-2022-22971. There are several versions of spring-core in the parcel, none of which is the recommended version:

./jars/spring-core-4.3.29.RELEASE.jar
./jars/spring-core-5.2.18.RELEASE.jar
./jars/spring-core-5.3.10.jar
./jars/spring-core-5.3.12.jar
./jars/spring-core-5.3.13.jar
./jars/spring-core-5.3.4.jar

Is CDP vulnerable to these vulnerabilities?

https://tanzu.vmware.com/security/cve-2022-22970
https://tanzu.vmware.com/security/cve-2022-22971
Labels:
- Apache Hive
05-02-2017 11:10 AM
1 Kudo
I have a table with a complex column that is an array of structs. I would like to run a query against the main table (not the complex fields) with a LIMIT clause that restricts it to the top N rows, and then join against the complex column, so the results will have more than N rows. In the example below, I want to retrieve the 2 most valuable entries from the main table and also get back the full part list for each of those entries. Is this possible? Is there a way to do it efficiently (a single scan)?

In Hive:

hive> create external table main (id BIGINT, value BIGINT, summary STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
hive> select * from main;
1    5    part 1
2    10   part 2
3    8    part 3
hive> create external table parts (id BIGINT, part_id BIGINT, count BIGINT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
hive> select * from parts;
OK
1    11   5
1    13   6
2    11   4
2    15   9
2    7    4
3    11   1
hive> insert overwrite table example
    > select main.id, value, summary, collect_set(named_struct('part_id',part_id,'count',count))
    > from main
    > join parts on main.id = parts.id
    > group by main.id, value, summary;
hive> select * from example;
1    5    part 1    [{"part_id":11,"count":5},{"part_id":13,"count":6}]
2    10   part 2    [{"part_id":11,"count":4},{"part_id":15,"count":9},{"part_id":7,"count":4}]
3    8    part 3    [{"part_id":11,"count":1}]

In Impala, show that the table is accessible:

Query: select * from example, example.parts
+----+-------+---------+---------+-------+
| id | value | summary | part_id | count |
+----+-------+---------+---------+-------+
| 1  | 5     | part 1  | 11      | 5     |
| 1  | 5     | part 1  | 13      | 6     |
| 2  | 10    | part 2  | 11      | 4     |
| 2  | 10    | part 2  | 15      | 9     |
| 2  | 10    | part 2  | 7       | 4     |
| 3  | 8     | part 3  | 11      | 1     |
+----+-------+---------+---------+-------+

In Impala, this query performs the limit after the join, which is not what I want: I want my query to return 'part 2' and 'part 3' and all their subparts, i.e. the last 4 rows of the table above.

Query: select * from example, example.parts order by value limit 2
+----+-------+---------+---------+-------+
| id | value | summary | part_id | count |
+----+-------+---------+---------+-------+
| 1  | 5     | part 1  | 13      | 6     |
| 1  | 5     | part 1  | 11      | 5     |
+----+-------+---------+---------+-------+

Some failed attempts in Impala:

[i-pvd1c1r2data01:21000] > select * from (select * from example ex order by value desc limit 2) sub, ex.parts;
ERROR: AnalysisException: Could not resolve table reference: 'ex.parts'

[i-pvd1c1r2data01:21000] > select * from (select * from example ex order by value desc limit 2) sub, sub.parts;
ERROR: AnalysisException: Could not resolve table reference: 'sub.parts'
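One possible workaround, sketched here but not verified on this Impala version: apply the LIMIT in a subquery that selects only the ids, then join back to the base table so that the relative reference e.parts resolves against a real table reference in the same FROM clause. Whether the planner can still do this in a single scan is a separate question.

-- Hedged sketch: 'top' picks the 2 most valuable base rows;
-- joining back to example makes e a real table reference,
-- so e.parts can expand the array of structs afterwards.
select e.id, e.value, e.summary, p.part_id, p.count
from (select id from example order by value desc limit 2) top
join example e on e.id = top.id, e.parts p;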
Labels:
- Apache Impala
04-20-2017 01:33 PM
Thank you for your quick response. It's a bit disappointing that Impala differs from Hive:

hive> select from_unixtime(1492677561,'yyyy-MM-dd\'H\'HH');
OK
_c0
2017-04-20H08

I've created https://issues.apache.org/jira/browse/IMPALA-5237
04-13-2017 10:47 AM
I would like to use from_unixtime to convert a unix_timestamp to a string in the format 2017-04-13H08, where the 08 is the hour of day. I haven't figured out how to do that; I've tried several approaches with backslashes and quotes, without success.

Here is an example that would produce 2017-04-13T08:

[i-pvd1c1mgr-vip.vldb-bo.secureworkslab.com:21001] > select from_unixtime(unix_timestamp('2017-04-13 08:10:11'),'yyyy-MM-ddTHH');
+-----------------------------------------------------------------------+
| from_unixtime(unix_timestamp('2017-04-13 08:10:11'), 'yyyy-mm-ddthh') |
+-----------------------------------------------------------------------+
| 2017-04-13T08                                                         |
+-----------------------------------------------------------------------+
Fetched 1 row(s) in 0.01s

Here are some of my attempts:

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-ddHHH');
2017-04-08009

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-dd\\HHH');
2017-04-08\009

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-dd\'H\'HH');
2017-04-08'9'09

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-dd\\'H\\'HH');
ERROR: AnalysisException: Syntax error in line 1: ... 09:10:11'),'yyyy-MM-dd\\'H\\'HH')
Encountered: IDENTIFIER
Expected: AND, BETWEEN, DIV, HAVING, ILIKE, IN, IREGEXP, IS, LIKE, LIMIT, NOT, OFFSET, OR, ORDER, RANGE, REGEXP, RLIKE, ROWS, UNION, COMMA

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-dd\\\'H\\\'HH');
2017-04-08\'9\'09

> select from_unixtime(unix_timestamp('2017-04-08 09:10:11'),'yyyy-MM-dd''H''HH');
ERROR: AnalysisException: Syntax error in line 1: ...08 09:10:11'),'yyyy-MM-dd''H''HH')
Encountered: STRING LITERAL
Expected: AND, BETWEEN, DIV, HAVING, ILIKE, IN, IREGEXP, IS, LIKE, LIMIT, NOT, OFFSET, OR, ORDER, RANGE, REGEXP, RLIKE, ROWS, UNION, COMMA
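One possible workaround while the literal-quoting behavior is unresolved, a hedged sketch rather than a confirmed fix: skip quoting inside the format string entirely and splice the literal H in with concat(), formatting the date part and the hour part separately.

-- Sketch of a workaround: format date and hour separately,
-- then concatenate the literal 'H' between them.
select concat(
         from_unixtime(unix_timestamp('2017-04-13 08:10:11'), 'yyyy-MM-dd'),
         'H',
         from_unixtime(unix_timestamp('2017-04-13 08:10:11'), 'HH')
       ) as ts;

This should produce 2017-04-13H08 without relying on how the format pattern escapes quotes.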
Labels:
- Apache Impala