Member since: 03-06-2020
Posts: 398
Kudos Received: 54
Solutions: 35

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 95 | 11-21-2024 10:12 PM |
| | 829 | 07-23-2024 10:52 PM |
| | 1078 | 05-16-2024 12:27 AM |
| | 3022 | 05-01-2024 04:50 AM |
| | 1337 | 03-19-2024 09:23 AM |
12-20-2024
01:40 AM
1 Kudo
I think the problem partly has to do with our Python 3.8 installation. We had installed it via Anaconda, but Cloudera recommends using yum to install rh-python38 on RHEL/OL 7, as I mentioned in the previous message. Documentation is here: Installing Python 3.8 standard package on RHEL 7 | CDP Private Cloud. That installation resolved most of the Web Server issue (a rough sketch of the yum-based install follows below).

The Impala Web Server issue is not only about the Python installation but also about the Web Server username and password. Below are the actions performed to resolve the Impala Web Server issue after enabling hadoop_secure_web_ui.

WORK PERFORMED:

Removed the below configurations from the CM UI:
- Impala > Configuration > Catalog Server > Web Server Username
- Impala > Configuration > Catalog Server > Web Server User Password
- Impala > Configuration > Impala Daemon > Web Server Username
- Impala > Configuration > Impala Daemon > Web Server User Password
- Impala > Configuration > Statestore > Web Server Username
- Impala > Configuration > Statestore > Web Server User Password

Enabled "Enable Kerberos Authentication for HTTP Web-Consoles" under CM UI > Impala > Configuration, then restarted the Impala service.

Also, regarding Impala, this Cloudera documentation was quite helpful: Configuring Impala Web UI | CDP Public Cloud. The issue is now resolved by following the instructions in that documentation.
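For reference, a rough sketch of the yum-based install on RHEL/OL 7; enabling the Software Collections repository first and the scl invocation are assumptions on my part, so follow the linked Cloudera documentation for the authoritative steps:

```bash
# Assumes the Software Collections (SCL) repository is already enabled on RHEL/OL 7;
# see "Installing Python 3.8 standard package on RHEL 7 | CDP Private Cloud" for details.
sudo yum install -y rh-python38

# Run a command inside the rh-python38 collection to confirm the interpreter version.
scl enable rh-python38 "python3 --version"
```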
11-22-2024
04:56 AM
1 Kudo
If you suspect that TEZ-4032 is the cause, consider upgrading your cluster to CDP 7 and testing again, since the fix for TEZ-4032 has been backported into CDP 7.
11-21-2024
10:40 PM
1 Kudo
Hi @mrblack,

To avoid a full table scan, you can follow these tips:
1. Ensure proper partition pruning: https://impala.apache.org/docs/build/html/topics/impala_partitioning.html#:~:text=the%20impalad%20daemon.-,Partition%20Pruning%20for%20Queries,-Partition%20pruning%20refers
2. Rewrite the query with subqueries.
3. Add explicit hints for join behaviour. Impala supports join hints like BROADCAST and SHUFFLE that can influence query planning.

After optimising, check the EXPLAIN plan (a small sketch follows below).

Regards,
Chethan YM
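A rough impala-shell sketch of both checks; the table and column names (sales, customers, sale_date, customer_id) are hypothetical placeholders:

```bash
# Hypothetical table/column names, shown only to illustrate the workflow.
# 1. Check partition pruning: with a filter on the partition key, EXPLAIN
#    should report far fewer partitions scanned than the table contains.
impala-shell -q "EXPLAIN SELECT COUNT(*) FROM sales WHERE sale_date = '2024-11-01'"

# 2. Try an explicit join hint (BROADCAST or SHUFFLE) and compare the plans.
impala-shell -q "EXPLAIN SELECT s.*, c.name
                 FROM sales s JOIN /* +SHUFFLE */ customers c ON s.customer_id = c.id"
```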
11-21-2024
10:29 PM
Hi @xiaohai,

> What is the error you are seeing?
> Can you use this delimiter?

impala-shell -B --output_delimiter='|' -q 'SELECT * FROM your_table'

Regards,
Chethan YM
11-21-2024
10:12 PM
1 Kudo
Hi @pravin_speaks,

Can you export the below before running the sqoop command and see if it helps?

export HADOOP_CLIENT_OPTS="-Dsqoop.oracle.escaping.disabled=false -Djava.security.egd=file:///dev/../dev/urandom"

Regards,
Chethan YM
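For context, a sketch of how the export would be combined with a Sqoop run; the Oracle connection string, user, table, and target directory are hypothetical placeholders:

```bash
# Only the exported HADOOP_CLIENT_OPTS values come from this thread; the rest are placeholders.
export HADOOP_CLIENT_OPTS="-Dsqoop.oracle.escaping.disabled=false -Djava.security.egd=file:///dev/../dev/urandom"

sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host.example.com:1521/ORCLPDB \
  --username SCOTT -P \
  --table EMPLOYEES \
  --target-dir /user/example/employees
```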
11-15-2024
01:55 AM
1 Kudo
@luffy07 The given error message is generic when you use the JDBC driver to connect to Impala, and it does not point to a specific cause. Verify that your Impala JDBC connection string is correct and that the host and port are reachable. Check that the Impala server you are trying to connect to is up and running fine, and paste the memory errors here so we can understand what you are seeing in the logs.

Also, you can append the below to the JDBC connection string and reproduce the issue; it will generate driver DEBUG logs and may give some more details about the issue:

LogLevel=6;LogPath=/tmp/jdbclog

And try to use the latest Impala JDBC driver that is available: https://www.cloudera.com/downloads/connectors/impala/jdbc/2-6-34.html

Regards,
Chethan YM
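A minimal reachability check plus an example URL with logging enabled; the host name is a placeholder, and 21050 is the default port the Impala JDBC driver connects to:

```bash
# Placeholder host; 21050 is the default Impala JDBC (HiveServer2-protocol) port.
nc -vz impala-host.example.com 21050

# Example connection URL with driver DEBUG logging appended, as mentioned above:
# jdbc:impala://impala-host.example.com:21050/default;LogLevel=6;LogPath=/tmp/jdbclog
```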
09-18-2024
09:19 PM
1 Kudo
This solution worked for eliminating the error, but data is not being fetched from the table; an empty data frame is shown.
09-05-2024
07:18 PM
1 Kudo
```java
private static String JDBC_DRIVER = "com.cloudera.impala.jdbc.Driver";
private static String CONNECTION_URL = "jdbc:impala://127.0.0.1:21050/;LogLevel=5;UseNativeQuery=1";
```

Found in the logs:

```log
Adding THandleIdentifier(guid:52 BF 8E A3 7D E8 47 B6 00 00 00 00 3B FB 5C 30, secret:29 1B CE 80 42 7C 45 91 93 B8 4E 77 56 C3 0E D6) to the heart beat operation handle list.
```

The guid corresponds to the query id shown in the profile: Query (id=b647e87da38ebf52:305cfb3b00000000).
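As a side note, a small sketch that rebuilds the query id from the handle guid; the byte layout (two little-endian 8-byte halves) is an assumption that happens to match the example values above:

```bash
# Assumption: the 16-byte THandleIdentifier guid holds the Impala query id as two
# little-endian 8-byte halves (this matches the guid/query-id pair quoted above).
guid="52 BF 8E A3 7D E8 47 B6 00 00 00 00 3B FB 5C 30"

hi=$(echo "$guid" | awk '{ for (i = 8;  i >= 1; i--) printf "%s", tolower($i) }')
lo=$(echo "$guid" | awk '{ for (i = 16; i >= 9; i--) printf "%s", tolower($i) }')

echo "query id = ${hi}:${lo}"   # prints b647e87da38ebf52:305cfb3b00000000
```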
08-13-2024
10:04 PM
Thanks @ggangadharan. As far as I can see, HBase is up and running, but I found something in the HBase log:

2024-08-13 21:53:30,583 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hive/HOST@REALM (auth:KERBEROS)
2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection from xx.xxx.xx.xxx:55106, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hive/HOST@REALM (auth:KERBEROS), service=ClientService
2024-08-13 21:53:30,584 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hive/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface
2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39718
2024-08-13 21:53:38,853 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39718
2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39720
2024-08-13 21:53:39,056 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39720
2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39722
2024-08-13 21:53:39,361 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39722
2024-08-13 21:53:39,869 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39724
2024-08-13 21:53:39,870 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39724
2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39726
2024-08-13 21:53:40,877 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39726
2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39728
2024-08-13 21:53:42,882 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39728
2024-08-13 21:53:46,219 INFO org.apache.hadoop.hbase.io.hfile.LruBlockCache: totalSize=9.18 MB, freeSize=12.20 GB, max=12.21 GB, blockCount=5, accesses=7481, hits=7461, hitRatio=99.73%, , cachingAccesses=7469, cachingHits=7461, cachingHitsRatio=99.89%, evictions=2009, evicted=0, evictedPerRun=0.0
2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730
2024-08-13 21:53:46,914 WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x0A\x04hi from xx.xxx.xx.xxx:39730
2024-08-13 21:53:50,477 INFO org.apache.hadoop.hbase.ScheduledChore: CompactionThroughputTuner average execution time: 8653 ns.
2024-08-13 21:53:50,572 INFO org.apache.hadoop.hbase.replication.regionserver.Replication: Global stats: WAL Edits Buffer Used=0B, Limit=268435456B
2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for hbase/HOST@REALM (auth:KERBEROS)
2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.hbase.Server: Connection from xx.xxx.xx.xxx:55174, version=2.2.3.7.1.7.0-551, sasl=true, ugi=hbase/HOST@REALM (auth:KERBEROS), service=ClientService
2024-08-13 21:53:55,216 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for hbase/HOST@REALM (auth:KERBEROS) for protocol=interface org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingInterface
2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping HBase metrics system...
2024-08-13 21:53:56,136 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system stopped.
2024-08-13 21:53:56,638 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2024-08-13 21:53:56,641 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: HBase metrics system started

This warning (WARN org.apache.hadoop.hbase.ipc.RpcServer: Expected HEADER=HBas but received HEADER=\x00\x00\x013 from xx.xxx.xx.xxx:39730) only appears for the statement:

insert overwrite table managed_ml select key, cf1_id, cf1_name from c_0external_ml;

Other statements, like insert into c_0external_ml values (1,2,3);, run perfectly. Does this error sound familiar to you?
08-07-2024
06:33 AM
After adding a ) to both lines (which Hive didn't like not having), it works for some of the currencies, but others are not adding up properly; i.e., they double the negative values (when present).