Member since: 09-02-2016
523 Posts
89 Kudos Received
42 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1619 | 08-28-2018 02:00 AM |
| | 1227 | 07-31-2018 06:55 AM |
| | 3134 | 07-26-2018 03:02 AM |
| | 1335 | 07-19-2018 02:30 AM |
| | 3669 | 05-21-2018 03:42 AM |
04-20-2022
01:14 AM
You can use one of these clauses to reduce the number of files produced by an insert query, which will increase the resulting file size: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+SortBy#LanguageManualSortBy-SyntaxofClusterByandDistributeBy
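For example, a sketch of the DISTRIBUTE BY approach (table and column names here are placeholders): rows with the same distribution key go to the same reducer, so fewer, larger files are written.

```sql
-- All rows sharing a part_col value land on one reducer,
-- so roughly one output file is written per partition value.
INSERT OVERWRITE TABLE target PARTITION (part_col)
SELECT col_a, col_b, part_col
FROM source
DISTRIBUTE BY part_col;
```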
09-20-2021
04:30 AM
To append data frames in R, use the rbind() function. rbind() is a built-in R function that can combine several vectors, matrices, and/or data frames by rows. When it comes to appending data frames, the rbind() and cbind() functions come to mind because they concatenate data frames vertically and horizontally, respectively. In this example, we will see how to use the rbind() function to append data frames.
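A minimal sketch of the idea (df1 and df2 are illustrative data frames defined here, not from any real dataset):

```r
# Two example data frames with matching column names
df1 <- data.frame(id = c(1, 2), name = c("a", "b"))
df2 <- data.frame(id = c(3, 4), name = c("c", "d"))

# rbind() stacks rows: the result has 4 rows and the same 2 columns
combined <- rbind(df1, df2)
nrow(combined)  # 4
```

Note that rbind() requires both data frames to have the same column names.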
03-19-2021
12:49 PM
That worked, but when I tried to run a command as the admin user (commands like hdfs dfs -cp file /user/admin or hdfs dfs -ls /user/), it is not allowing me and gives the error below:

WARN security.UserGroupInformation: PriviledgedActionException as:admin (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
03-10-2021
12:00 AM
@Venkat_ as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
12-09-2020
09:16 AM
Hi @Broche, you can use this script (for Hue). For the id column, 2 is the default value. You don't have to define a default value; it's optional.
10-12-2020
05:37 AM
When you get the error below while doing kinit with a keytab file:

klist: Unsupported key table format version number while starting keytab scan

make sure the keytab file is not zero bytes. For example, this is a zero-byte keytab file, and you will get the above error when trying to kinit with it:

-rw------- 1 cloudera-scm cloudera-scm 0 Aug 30 12:15 ./32-cloudera-mgmt-SERVICEMONITOR/cmon.keytab

A good keytab file will have a non-zero size, e.g. 778 bytes for the file below:

-rw------- 1 cloudera-scm cloudera-scm 778 Oct 12 05:21 ./150-cloudera-mgmt-SERVICEMONITOR/cmon.keytab
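A small pre-flight sketch of that check: it simulates the zero-byte keytab from the listing above and tests it before you would run kinit (cmon.keytab here is a placeholder name, not your real keytab).

```shell
# Simulate the zero-byte keytab from the listing above
: > cmon.keytab

# [ -s FILE ] is true only when FILE exists and is non-empty
if [ -s cmon.keytab ]; then
    echo "keytab OK"
else
    echo "keytab is zero bytes: regenerate it before kinit"
fi
```

Running this as-is prints the error branch, since the file just created is empty.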
09-08-2020
10:33 AM
This is not working. Please let me know how to use the full path.
07-02-2020
11:57 PM
Try with this SQL statement (scm is the Cloudera Manager database in this example):

select VALUE from scm.CONFIGS where ATTR="kdc_admin_user";
01-31-2020
02:58 AM
1 Kudo
Hello, You can follow the steps outlined here. This is for CDH 6.3: https://docs.cloudera.com/documentation/enterprise/6/6.3/topics/cm_mc_adding_hosts.html#cmug_topic_7_5_1__title_215 Regards, Steve
10-30-2019
07:16 PM
Did you resolve this problem?
10-28-2019
10:40 AM
Since Hadoop 2.8, it is possible to make a directory protected so that its files cannot be deleted, using the fs.protected.directories property. From the documentation: "A comma-separated list of directories which cannot be deleted even by the superuser unless they are empty. This setting can be used to guard important system directories against accidental deletion due to administrator error." It does not exactly answer the question, but it is a possibility.
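A sketch of the corresponding hdfs-site.xml entry (the directory list here is just an example, not a recommendation):

```xml
<property>
  <name>fs.protected.directories</name>
  <value>/user,/warehouse</value>
  <description>Comma-separated list of directories that cannot be
  deleted, even by the superuser, unless they are empty.</description>
</property>
```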
09-18-2019
12:01 AM
You can do the alter as I mentioned before:

ALTER TABLE test CHANGE col1 col1 int COMMENT 'test comment';

But I do not think you can remove the comment; you can only set it to an empty string. Cheers, Eric
09-03-2019
08:16 PM
Could you share the error? Do you have the Sqoop client installed on the node? What does your MySQL cnf file look like? If it has skip-networking, just comment it out and restart MySQL; I assume this is your POC box.
08-28-2019
05:50 AM
Hi, can you let me know how to replace a set of special characters in the Sqoop import query? I need to replace the column value if it contains any of the special characters (|, ", ^, $, % etc.). Thank you, Vijay
07-01-2019
07:12 PM
Hi, did you fix this problem? I have the same one too.
05-31-2019
10:29 AM
Not helpful yet, but promising... the PIVOT keyword is reserved for future use! https://www.cloudera.com/documentation/enterprise/6/6.2/topics/impala_reserved_words.html
05-16-2019
01:46 PM
This helped, thanks!
05-10-2019
02:02 PM
By default, in Hive, Parquet files are not written with compression enabled. https://issues.apache.org/jira/browse/HIVE-11912 However, writing files with Impala into a Parquet table will create files with internal Snappy compression (by default).
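If you want Snappy-compressed Parquet from Hive as well, compression can be requested explicitly. A hedged sketch (the table name is a placeholder; verify the property against your Hive version):

```sql
-- Option 1: session-level setting before the insert
SET parquet.compression=SNAPPY;

-- Option 2: per-table property at creation time
CREATE TABLE my_parquet_table (id INT, name STRING)
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression'='SNAPPY');
```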
05-06-2019
08:14 AM
Worked for me too. Thank you.
04-25-2019
04:24 AM
Can you provide a solution for this in Spark 2+? The following code works in Spark 1.6 but not in Spark 2.3.0:

Class.forName(impalaJdbcDriver).newInstance
UserGroupInformation.getLoginUser.doAs(
  new PrivilegedAction[Connection] {
    override def run(): Connection = DriverManager.getConnection(impalaJdbcUrl)
  })

We are getting the following exception:

User class threw exception: java.security.PrivilegedActionException: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://XXx:21050/;principal=impala/XXXX: GSS initiate failed
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:694)
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://XXX:21050/;principal=impala/XXXX@XXX.com: GSS initiate failed
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
.....
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed

Thanks
04-10-2019
04:24 AM
Can you please post the code showing how you are specifying the default delimiters? There is an additional parameter you need to specify along with --hive-drop-import-delims. Once we see the code, we can provide a solution.
03-08-2019
05:10 AM
I am facing the same issue; can anyone please suggest how to resolve this? On running two Spark applications, one remains in the accepted state while the other is running. What configuration needs to be done for this to work? The following is the dynamic resource pool configuration: Please help!
02-27-2019
07:42 PM
Any thoughts on my previous query? Can someone let me know how to upgrade an unmanaged cluster from 5.16 to 6.1?
02-27-2019
02:53 AM
A late reply, though. It is indeed possible to export the underlying Parquet files of the table, with these limitations: 1. As with the other file format support, BlobRef is not supported. 2. Files are read via the Kite SDK; currently Kite requires .metadata to be present. https://issues.apache.org/jira/browse/SQOOP-1394
02-25-2019
01:10 PM
Hi all, I am facing the error below while running this Hive command:

MSCK REPAIR TABLE flexdto_standin_avro;

Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Your input is much appreciated. Thanks, Yasmin
02-23-2019
02:35 AM
Hi, I am also facing the same issue: not able to load a Hive table into Spark. I tried to copy the XML files into the Spark conf folder, but permission was denied. I also tried to change the permissions on the folder; that is not working either. Using Cloudera VM 5.12.
02-18-2019
05:36 AM
Hello, any update on this? I see that the issue is still open. So does that mean there is currently no way to increase the dashboard download limit?
01-22-2019
04:38 AM
I had the same situation, @HMC. I created the admin user again with:

hue createsuperuser --cm-managed

Without the --cm-managed flag it didn't work.
12-31-2018
05:10 AM
Setting a quota will work; queries will then fail with quota errors.
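For reference, a sketch using the standard HDFS admin commands (the path and size here are placeholders, not a recommendation for your cluster):

```shell
# Cap /user/someuser at roughly 10 GB of raw space; writes beyond
# this will fail with a quota-exceeded error.
hdfs dfsadmin -setSpaceQuota 10g /user/someuser

# Inspect the quota and current usage
hdfs dfs -count -q -h /user/someuser

# Remove the quota later if needed
hdfs dfsadmin -clrSpaceQuota /user/someuser
```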