Member since: 09-17-2014
Posts: 93
Kudos Received: 5
Solutions: 6

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
| | 32094 | 03-02-2015 12:47 PM |
| | 2512 | 02-03-2015 01:24 PM |
| | 4292 | 12-12-2014 08:19 AM |
| | 4285 | 11-07-2014 01:55 PM |
| | 2717 | 10-13-2014 06:47 PM |
07-20-2015
12:12 PM
Hi, this is a Spotfire issue. I had numerous calls with the TIBCO folks and they said the problem is on their side. Tableau, by contrast, works fine in this scenario.
03-02-2015
12:47 PM
1 Kudo
I was able to solve the problem: instead of keeping the script file in /user/<home-directory>, I put it in /user/<home-directory>/oozie-oozi and it worked.
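For anyone hitting the same error, a rough sketch of what that looks like from the command line; test.sh is the script name from my error message, and <home-directory> should be replaced with your own HDFS home directory:

# Copy the script into the workflow directory on HDFS so the Oozie
# launcher can localize it into the YARN container at run time.
hdfs dfs -put -f test.sh /user/<home-directory>/oozie-oozi/

# Confirm the script is where the shell action expects it.
hdfs dfs -ls /user/<home-directory>/oozie-oozi/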
03-02-2015
11:50 AM
Hi, I am using CDH 5.2 on RHEL 6.3. I want to run a shell script with Oozie from Hue, but I am getting an error like this:
java.io.IOException: Cannot run program "test.sh" (in directory "/apps/yarn/nm/usercache/tsingh12/appcache/application_1425085556881_0042/container_1425085556881_0042_01_000002"): error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at org.apache.oozie.action.hadoop.ShellMain.execute(ShellMain.java:93)
at org.apache.oozie.action.hadoop.ShellMain.run(ShellMain.java:55)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
at org.apache.oozie.action.hadoop.ShellMain.main(ShellMain.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:227)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 17 more
02-03-2015
01:24 PM
I was able to resolve this. I was getting an error while obtaining the initial KDC credentials, so I changed the dns_lookup_kdc property in krb5.conf from false to true. After that I was able to install Spark on the secured (Kerberos plus Sentry) cluster.
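For reference, the setting lives in the [libdefaults] section of /etc/krb5.conf on the cluster hosts; the realm below is just the one from my logs, so substitute your own:

[libdefaults]
    default_realm = CDH5.xxx.COM
    # Resolve the KDC address through DNS SRV records instead of relying
    # only on explicit kdc entries in the [realms] section.
    dns_lookup_kdc = true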
02-03-2015
07:46 AM
Grant the required permissions using the SQL GRANT syntax on the role that the user belongs to. After that you should be able to create the table.
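A minimal sketch of what that can look like with Sentry, run as a user with admin privileges; the role, group, and database names here are placeholders:

-- Create a role and tie it to the user's group.
CREATE ROLE analyst_role;
GRANT ROLE analyst_role TO GROUP analysts;
-- Give the role full privileges on the target database so tables can be created in it.
GRANT ALL ON DATABASE my_database TO ROLE analyst_role;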
01-30-2015
09:04 AM
I got the same error. Restart your Cloudera Management Service and it should start working again.
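The Cloudera Management Service can be restarted from the Cloudera Manager UI; if the Cloudera Manager server or agent processes themselves need a restart, a command-line sketch for a package-based install (an assumption about your setup) is:

# On the Cloudera Manager server host:
sudo service cloudera-scm-server restart
# On the managed hosts, if the agents need a restart as well:
sudo service cloudera-scm-agent restart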
01-22-2015
07:59 AM
Hi, I am using CDH 5.2 on RHEL 6.5 and I am trying to install Spark in YARN mode in a Kerberized environment. It fails on the third step, when it tries to upload the jars after creating the history server and user directories:

+ echo 'Using /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand as conf dir'
+ echo 'Using scripts/control.sh as process script'
+ export COMMON_SCRIPT=/usr/lib64/cmf/service/common/cloudera-config.sh
+ COMMON_SCRIPT=/usr/lib64/cmf/service/common/cloudera-config.sh
+ chmod u+x /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/scripts/control.sh
+ exec /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/scripts/control.sh upload_jar
Thu Jan 22 10:41:50 EST 2015
Thu Jan 22 10:41:50 EST 2015: Detected CDH_VERSION of [5]
Thu Jan 22 10:41:50 EST 2015: Uploading Spark assembly jar to '/user/spark/share/lib/spark-assembly.jar' on CDH 5 cluster
+ export SCM_KERBEROS_PRINCIPAL=spark/itsusmpl00512.xxx.com@CDH5.xxx.COM
+ SCM_KERBEROS_PRINCIPAL=spark/itsusmpl00512.xxx.com@CDH5.xxx.COM
+ acquire_kerberos_tgt spark_on_yarn.keytab
+ '[' -z spark_on_yarn.keytab ']'
+ '[' -n spark/itsusmpl00512.xxx.com@CDH5.xxx.COM ']'
+ '[' -d /usr/kerberos/bin ']'
+ which kinit
+ '[' 0 -ne 0 ']'
++ id -u
+ export KRB5CCNAME=/var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/krb5cc_481
+ KRB5CCNAME=/var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/krb5cc_481
+ echo 'using spark/itsusmpl00512.jnj.com@CDH5.JNJ.COM as Kerberos principal'
+ echo 'using /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/krb5cc_481 as Kerberos ticket cache'
+ kinit -c /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/krb5cc_481 -kt /var/run/cloudera-scm-agent/process/295-spark_on_yarn-SPARK_YARN_HISTORY_SERVER-SparkUploadJarCommand/spark_on_yarn.keytab spark/itsusmpl00512.xxx.com@CDH5.xxx.COM
kinit: Cannot resolve network address for KDC in realm "CDH5.xxx.COM" while getting initial credentials
+ '[' 1 -ne 0 ']'
+ echo 'kinit was not successful.'
+ exit 1
12-12-2014
12:57 PM
What are you trying to achieve? Can you elaborate a little? You do not need to supply a path for the keytab file, because that is taken care of internally.
12-12-2014
12:54 PM
Hi, I have been trying to connect Spotfire to Impala using the Cloudera Impala ODBC connector 2.5.22. I am using CDH 5.2 on RHEL 6.5, and security is enabled on the Cloudera cluster: Kerberos, LDAP, and Sentry. Sentry works fine in Hue as well as through the impala-shell and Beeline clients, and I am able to see the authorized tables.

Now I want to connect Spotfire through the Cloudera Impala connector. I installed the 64-bit ODBC connector and tested the connection with the SASL username/password mechanism, giving the hostname and port number and leaving the database field empty. The test succeeds, but the log files show that even though the database field is left empty (or I give the name of any other database), it uses the default database. With Sentry I only have access to one database, "sentry_test", and not even to default.

When I connect to my tables from Spotfire, it connects but then abruptly gives an error like this:

Could not connect due to an unknown error. External error: ERROR [HY000] [Cloudera][ImpalaODBC] (110) Error while executing a query in Impala: [HY000] : AuthorizationException: User 'txxxxx' does not have privileges to access: default.*

Even if I try to connect to a database I have privileges on, it still gives the same error. Any idea why it is redirecting to the default database? The log file of the Cloudera Impala connector shows:

Dec 12 14:16:04 INFO 3432 StatementState::InternalPrepare: Preparing query: use sentry_test
Dec 12 14:16:04 INFO 3432 ImpalaDataEngine::Prepare: Trying to parse query: use sentry_test
Dec 12 14:16:04 INFO 3432 ImpalaDataEngine::Prepare: [Cloudera][SQLEngine] (31480) syntax error near 'use<<< ??? >>> sentry_test'.
Dec 12 14:16:04 INFO 3432 ImpalaNativeQueryExecutor::ExecuteQuery: use sentry_test
Dec 12 14:16:04 INFO 3432 Connection::SQLGetInfoW: InfoType: 6
Dec 12 14:16:04 INFO 3432 Connection::SQLGetInfoW: InfoType: 18
Dec 12 14:16:04 INFO 3432 Statement::SQLGetStmtAttrW: Attribute: 10010
Dec 12 14:16:04 INFO 3432 Statement::SQLGetStmtAttrW: Attribute: 10011
Dec 12 14:16:04 INFO 3432 Statement::SQLGetStmtAttrW: Attribute: 10012
Dec 12 14:16:04 INFO 3432 Statement::SQLGetStmtAttrW: Attribute: 10013
Dec 12 14:16:04 INFO 3432 Statement::SQLSetStmtAttrW: Attribute: 0
Dec 12 14:16:04 INFO 3432 Statement::SQLSetStmtAttrW: Attribute: 1228
Dec 12 14:16:04 INFO 3432 StatementAttributes::SetAttribute: Invalid attribute: 1228
Dec 12 14:16:04 ERROR 3432 Statement::SQLSetStmtAttrW: [Cloudera][ODBC] (10210) Attribute identifier invalid or not supported: 1228
Dec 12 14:16:04 INFO 3432 Statement::SQLSetStmtAttrW: Attribute: 1227
Dec 12 14:16:04 INFO 3432 StatementAttributes::SetAttribute: Invalid attribute: 1227
Dec 12 14:16:04 ERROR 3432 Statement::SQLSetStmtAttrW: [Cloudera][ODBC] (10210) Attribute identifier invalid or not supported: 1227
Dec 12 14:16:04 INFO 3432 Connection::ExecuteCatalogFunction: SQLTables
Dec 12 14:16:05 ERROR 3432 Statement::ExecuteCatalogFunction: [Cloudera][ImpalaODBC] (110) Error while executing a query in Impala: [HY000] : AuthorizationException: User 'tsingh12' does not have privileges to access: default.*
Dec 12 14:16:05 INFO 3432 Connection::SQLGetInfoW: InfoType: 6
Dec 12 14:16:05 INFO 3432 Connection::SQLGetConnectAttr: Attribute: 1209
Dec 12 14:16:05 INFO 3432 Connection::SQLGetInfoW: InfoType: 1
Dec 12 14:16:38 INFO 11980 Environment::SQLSetEnvAttr: Attribute: 200
Dec 12 14:16:39 WARN 11980 ImpalaClient::OpenSession: The Database was not valid. Using the default database instead.
Dec 12 14:17:13 INFO 11980 Environment::SQLSetEnvAttr: Attribute: 200
12-12-2014
08:19 AM
This issue got resolved by adding --ldap_domain to the Impala daemon's command line argument advanced configuration snippet. After restarting the service, running impala-shell -l in a Linux terminal prompted for the LDAP user's password and connected successfully.
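Roughly what that looked like; the domain and host names below are placeholders for the real values:

# Added to the Impala Daemon command line argument advanced configuration snippet in Cloudera Manager:
--ldap_domain=example.com

# Quick check from a Linux terminal: -l turns on LDAP authentication and prompts for the password.
impala-shell -l -u tsingh12 -i <impalad-host>:21000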