Member since: 03-06-2020
Posts: 406
Kudos Received: 56
Solutions: 37
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1088 | 08-29-2025 12:27 AM |
| | 1629 | 11-21-2024 10:40 PM |
| | 1539 | 11-21-2024 10:12 PM |
| | 5279 | 07-23-2024 10:52 PM |
| | 3017 | 05-16-2024 12:27 AM |
10-03-2021
08:50 PM
Hi, Have you tried using the --hs2-url option, with a command like the one below? sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table "employees_test" --target-dir "/user/username/importdir/employees_test" --delete-target-dir --hive-import --hs2-url "jdbc:hive2://hs2host:10000/default;principal=hive/hs2host@DOMAIN.COM;ssl=true;sslTrustStore=/etc/cdep-ssl-conf/CA_STANDARD/truststore.jks;trustStorePassword=password" --hs2-user username --hs2-keytab "/path/to/sqooptestkeytab" Can you provide your Sqoop command along with its full error output?
09-28-2021
06:26 PM
Hi, You can check the active HiveServer2 sessions by navigating to the HS2 Web UI; here is a screenshot for your reference. Regards, Chethan YM
09-15-2021
07:17 PM
Hi, As per the application log, this appears to be a permissions issue. Have you tried running the job as a different user, or granting the "googleops" user access to the path? Check which users can currently write to this path, and then apply the appropriate ownership or permissions for the "googleops" user (for example with hdfs dfs -chown or hdfs dfs -chmod on the target directory). {"Event":"SparkListenerTaskEnd","Stage ID":11,"Stage Attempt ID":0,"Task Type":"ResultTask","Task End Reason":{"Reason":"ExceptionFailure","Class Name":"org.apache.hadoop.security.AccessControlException","Description":"Permission denied: user=googleops, access=WRITE, inode=\"/user/yarn/google-links/google-book-feed/_temporary/0\":yarn:yarn:drwxr-xr-x\n\tat org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:504)\n\tat org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:336)\n\tat org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)\n\tat Thank you, Chethan YM
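To quickly confirm which user, access type, and path were denied, you can pull those fields out of the exception message in the log. The helper below is a purely illustrative sketch (parse_permission_denied is not part of Spark or Hadoop), matched against the message format shown above:

```python
import re

# Hypothetical helper: extract the denied user, access type, and inode
# path from a Hadoop AccessControlException message.
def parse_permission_denied(message):
    m = re.search(r'user=(\S+), access=(\w+), inode="([^"]+)"', message)
    if not m:
        return None
    return {"user": m.group(1), "access": m.group(2), "inode": m.group(3)}

# Message copied from the application log excerpt above.
msg = ('Permission denied: user=googleops, access=WRITE, '
       'inode="/user/yarn/google-links/google-book-feed/_temporary/0"'
       ':yarn:yarn:drwxr-xr-x')
print(parse_permission_denied(msg))
# → {'user': 'googleops', 'access': 'WRITE',
#    'inode': '/user/yarn/google-links/google-book-feed/_temporary/0'}
```

The output tells you exactly which directory needs its ownership or mode changed for the failing user.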
08-29-2021
06:13 AM
Hi, Can you explain the issue in brief? 1. What error are you seeing while opening the UI? 2. Was it accessible earlier? 3. Please attach a screenshot of the error/exception you are seeing. 4. What is the CDH version? Regards, Chethan YM
08-12-2021
05:57 PM
Hi @lalala , This behavior is caused by the Python csv module, which impala-shell uses to export the data: # csv.writer expects a file handle as the input. # cStringIO is used as the temporary buffer. temp_buffer = StringIO() writer = csv.writer(temp_buffer, delimiter=self.field_delim, lineterminator='\n', quoting=csv.QUOTE_MINIMAL) writer.writerows(rows) It does not seem possible to change this without modifying the code itself. [1]. https://github.com/apache/impala/blob/014c455aaaa38010ae706228f7b439c080c0bc7d/shell/shell_output.py#L64
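You can reproduce the quoting behavior outside impala-shell with a few lines of standalone Python. This is a minimal sketch (the comma delimiter and sample rows are assumptions, not taken from your data): csv.QUOTE_MINIMAL only quotes a field when it contains the delimiter, the quote character, or a line terminator, and it doubles any embedded quote character.

```python
import csv
from io import StringIO

# Mimic impala-shell's exporter: write rows through csv.writer with
# QUOTE_MINIMAL into an in-memory buffer.
rows = [["plain", "has,comma", 'has"quote'], ["1", "2", "3"]]

buf = StringIO()
writer = csv.writer(buf, delimiter=",", lineterminator="\n",
                    quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
print(buf.getvalue())
# → plain,"has,comma","has""quote"
#   1,2,3
```

Only the fields that conflict with the delimiter or quote character get wrapped in quotes, which is why some exported values show quotation marks and others do not.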
08-12-2021
05:21 PM
Hi, Can you provide a sample of the output where you are seeing the double quotation marks? Regards, Chethan YM
08-09-2021
05:40 PM
Hi, What query are you using to read the data from the table? Can you attach its query profile and the coordinator logs so we can have a look? Regards, Chethan YM
08-08-2021
09:47 AM
Hi, Can you provide the details below to help us understand the issue: 1. Where did you initially create the table, in Hive or in Impala? 2. Provide the DDL of the table that you are trying to query from Impala: # show create table <table-name>; 3. List the files from the HDFS path (you will see the table path in the DDL definition): # hdfs dfs -ls /table/path/ 4. Are you able to access the same table data from Hive? (if you are facing the issue from Impala, and vice versa) 5. Go to the HS2/coordinator logs, copy the entire error stack trace for the failing query, and paste it here. These are the basic details needed for analysing the issue. Regards, Chethan YM
07-29-2021
07:09 AM
Hi, I do not think there is a single command to check the stats of all tables in one go. If you would like to update the stats on all tables, you should prepare a custom script: -> List all the tables with the "show tables" command. -> Run "show table stats <table-name>" on each of those tables. -> Then run "compute stats <table-name>" on each table. You can write a custom shell script with a for loop to run compute stats on all the tables. All of this should be done from the terminal.
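The loop above can be sketched in Python as well. This is a minimal sketch, not a supported tool: the host name "impalad-host", the database "default", and the assumption that impala-shell is on PATH are all placeholders you would replace with your own values.

```python
import subprocess

def compute_stats_commands(tables, host="impalad-host", db="default"):
    """Build one impala-shell invocation per table (hypothetical helper)."""
    return [
        ["impala-shell", "-i", host, "-q",
         "compute stats {}.{}".format(db, t)]
        for t in tables
    ]

def run_all(tables):
    # Each invocation would contact the cluster; run only where
    # impala-shell is installed and the coordinator is reachable.
    for cmd in compute_stats_commands(tables):
        subprocess.run(cmd, check=True)

print(compute_stats_commands(["employees", "sales"])[0])
# → ['impala-shell', '-i', 'impalad-host', '-q',
#    'compute stats default.employees']
```

You would first collect the table names (e.g. from the output of impala-shell -B -q "show tables") and then pass that list to run_all.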
07-05-2021
12:16 AM
Hello, Have you tried a query like this -> select <column-name> from <table-name>; If this is not what you expected, let us know the requirement in more detail. Regards, Chethan YM