Member since: 09-21-2022
Posts: 9
Kudos Received: 0
Solutions: 0
01-16-2024
01:49 PM
1 Kudo
There is a workaround for this. It is not a definitive solution, but it can help. The final result looks like the output of the last select below.

First, I created a table like your example (I used ";" as the separator):

insert overwrite t_1
select 'Asia' as cont, 'Japan;China;Singapore;' Country_list
union
select 'Europe' as cont, 'UK;Spain;Italy;German;Norway;' Country_list;

Next, I created an external table. It must be stored as textfile:

CREATE EXTERNAL TABLE IF NOT EXISTS t_transpose (
  field_transpose string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ";"
STORED AS TEXTFILE;

Then insert into that table like this:

insert overwrite t_transpose
select REGEXP_REPLACE(Country_list, ';', concat("|", cont, '\n')) as transpose
from t_1;

Afterwards, you can select as in my example above:

select split_part(field_transpose, "|", 1), split_part(field_transpose, "|", 2)
from t_transpose;

PS: The final result may contain some blank lines; just filter them out or ignore them. Note that I also appended one extra ";" to each country list compared with the example you provided.
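To deal with the blank lines mentioned in the PS, here is a minimal sketch of the final query with a filter added. It assumes the same t_transpose table and column from the workaround above; the aliases are only illustrative:

select split_part(field_transpose, "|", 1) as country,
       split_part(field_transpose, "|", 2) as cont
from t_transpose
where field_transpose is not null
  and field_transpose != '';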
10-23-2023
04:27 PM
jdbc:hive2://<host>:10000 - Error: Could not open client transport with JDBC Uri: .... Invalid status 21

If I use just the 'beeline' command, I am able to connect to Hive. However, if I use the command below, I get an exception:

beeline -u jdbc:hive2://<host>:10000/default -n user -p pwd

23/10/23 23:22:31 [main]: WARN transport.TSaslTransport: Could not send failure response
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
23/10/23 23:22:31 [main]: WARN jdbc.HiveConnection: Failed to connect to host:10000
log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to com.cloudera.hive.jdbc42.internal.apache.log4j.Appender
    at com.cloudera.hive.jdbc42.internal.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:248)
    at com.cloudera.hive.jdbc42.internal.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
log4j:WARN No appenders could be found for logger (com.cloudera.hive.jdbc42.internal.apache.thrift.transport.TSaslTransport).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://host:10000/default: Invalid status 21
Also, could not send response: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset (state=08S01,code=0)
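For context, "Invalid status 21" from HiveServer2 is typically a transport mismatch, for example connecting without SSL to an SSL-enabled HS2. Whether SSL is enabled on this cluster is an assumption, and the truststore path and password below are placeholders, but a connection string of this shape is one thing to try:

beeline -u "jdbc:hive2://<host>:10000/default;ssl=true;sslTrustStore=/path/to/truststore.jks;trustStorePassword=<password>" -n user -p pwd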
06-16-2023
12:04 AM
Consider using a Hive managed (ACID) table. With managed tables you won't need a separate merge job, as Hive compaction takes care of it by default when compaction is enabled. You can access managed tables from Spark through HWC (Hive Warehouse Connector). A minimal example is sketched below.
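As a rough sketch (the table name and columns are illustrative, and this assumes ACID and automatic compaction are enabled on the cluster), a managed transactional table can be declared like this, with a manual major compaction shown for completeness:

-- illustrative managed (ACID) table; compaction then runs automatically when enabled
CREATE TABLE t_events (
  id BIGINT,
  payload STRING
)
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- a major compaction can also be requested manually if needed
ALTER TABLE t_events COMPACT 'major';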
02-14-2023
01:53 AM
As of now, and I believe in the future as well, Phoenix will remain a packaged component. AFAIK, a separate license for Phoenix was needed in CDH, but from CDP onwards it is tied to HBase. Was your question answered? Please take some time to click on "Accept as Solution" below this post.
12-21-2022
07:04 AM
1 Kudo
Hello @SagarCapG, confirmed that Phoenix v5.1.0 has the fix for "!primarykeys" to show the primary key linked with a Phoenix table. Per our product documentation, CDP v7.1.6 introduces Phoenix v5.1.0 [1]. As such, I am surprised your team has Phoenix v5.0.0 with CDP v7.1.7, since the official v7.1.7 documentation [2] says Phoenix v5.1.1.7.1.7.0-551 is used.

Since the issue is fixed in Phoenix v5.1.x and CDP v7.1.6 onwards ships Phoenix v5.1.x, kindly engage Cloudera Support so they can review your cluster and identify why CDP v7.1.7 is reporting Phoenix v5.0.0. Alternatively, upgrade to Phoenix v5.1.x (if your team is managing Phoenix outside of CDP) to use the "!primarykeys" functionality.

Regards, Smarak

[1] What's New in Apache Phoenix | CDP Private Cloud (cloudera.com)
[2] Cloudera Runtime component versions | CDP Private Cloud
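For reference, once on Phoenix v5.1.x the command can be run from the sqlline prompt against your own table (MY_TABLE below is only a placeholder):

0: jdbc:phoenix:> !primarykeys MY_TABLE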