Member since: 03-18-2018
Posts: 33
Kudos Received: 0
Solutions: 0
10-09-2018
11:27 PM
After restarting the services I am getting an error on all data nodes: "Ambari Monitor not running on ". Should I start ambari-agent on all data nodes? Also, all HBase region servers are down with the error "Connection failed: [Errno 111] Connection refused to ctcl-hdpdata1.com:16030"
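If the agents stopped heartbeating after the restart, checking and starting them on each affected host is a reasonable first step. A minimal sketch (assumes a standard HDP install; run as root on each data node):

```shell
# Check whether the Ambari agent is alive on this node
ambari-agent status

# Start it if it is not running; the host alerts should clear
# once the agent heartbeats back to the Ambari server
ambari-agent start
```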
10-09-2018
11:07 AM
Hi, how do we find out why all services went down? We have HDP 2.6 and all services went down. Do we just restart all the services? We are able to connect to Ambari.
08-30-2018
12:51 PM
Thanks. Yes, we do have a Postgres DB, which is used for Ambari and the Hive metastore.
08-30-2018
12:44 PM
All nodes on our cluster require an outage for maintenance. I am planning to stop all services in Ambari and then stop the Ambari server. Are these steps correct and sufficient?
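A minimal sketch of the shutdown order, assuming all services have first been stopped from the Ambari UI (if the external Postgres database lives on a cluster node, stop it last):

```shell
# On the Ambari server host, after stopping all services in the UI:
sudo ambari-server stop

# On every cluster host, stop the agent before the outage:
sudo ambari-agent stop
```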
08-02-2018
12:48 PM
We have HDP 2.6.4. The Ambari URL is not loading.
sudo ambari-server status
Using python /usr/bin/python
Ambari-server status
Ambari Server running
Found Ambari Server PID: 14654 at: /var/ambari-server/ambari-server.pid
01 Aug 2018 19:08:53,866 ERROR [ambari-client-thread-1088676] MetricsRequestHelper:115 - Error getting timeline metrics : Connection refused (Connection refused)
01 Aug 2018 19:08:53,867 ERROR [ambari-client-thread-1088676] MetricsRequestHelper:122 - Cannot connect to collector: SocketTimeoutException for ctcl-hdpedge1.msoit.com
01 Aug 2018 19:08:46,391 ERROR [ambari-client-thread-1088525] MetricsRequestHelper:122 - Cannot connect to collector: SocketTimeoutException for ctcl-hdpedge1.msoit.com
Labels: Apache Ambari
06-21-2018
03:35 PM
You mean create the user on all data nodes? We have the user created on the 3 master nodes and the edge node.
06-21-2018
02:52 PM
We have HDP 2.6.4 and a user janu. When she runs the first SQL statement it works fine; the second one gives the error below. We have Kerberos enabled and users were created using LDAP.
1) select * from db.tbl
2) select col1 from db.tbl group by col1
AM Container for appattempt_1523646395030_2519_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://ctcl-hdpmaster3.com:8088/cluster/app/application_1523646395030_2519 Then click on links to logs of each attempt.
Diagnostics: Application application_1523646395030_2519 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is janu
main : requested yarn user is janu
User janu not found
Failing this attempt
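The "User janu not found" line comes from container launch on a NodeManager host. With Kerberos and the LinuxContainerExecutor, the submitting user must resolve as a local OS user on every node that can run containers, not only the masters and edge node. A sketch for verifying this (the user name is from the post; the LDAP/SSSD details of the environment are assumptions):

```shell
# On each data node: does the user resolve (via /etc/passwd or SSSD/LDAP)?
id janu

# If not, extend the LDAP/SSSD integration to the data nodes,
# or create the account locally as a stopgap:
sudo useradd janu
```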
Labels: Apache Hive, Apache YARN
05-29-2018
02:19 PM
We are getting the below error randomly; it lasts for a few minutes and then goes away. It is coming from PutHiveQL.
2018-05-29 01:01:07,279 INFO [Timer-Driven Process Thread-95] org.apache.hive.jdbc.HiveConnection Will try to open client transport with JDBC Uri: jdbc:hive2://hdpmaster2.msoit.com:10000/default;principal=hive/_HOST@IT.COM
2018-05-29 01:01:07,281 ERROR [Timer-Driven Process Thread-95] o.apache.thrift.transport.TSaslTransport SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
    at org.apache.nifi.dbcp.hive.HiveConnectionPool.lambda$getConnection$0(HiveConnectionPool.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.nifi.dbcp.hive.HiveConnectionPool.getConnection(HiveConnectionPool.java:355)
    at sun.reflect.GeneratedMethodAccessor393.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:89)
    at com.sun.proxy.$Proxy97.getConnection(Unknown Source)
    at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$1(PutHiveQL.java:191)
    at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:96)
    at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:274)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:274)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1147)
    at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:175)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192
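The root cause ("Failed to find any Kerberos tgt") usually means the ticket the NiFi JVM relies on has expired between renewals. A sketch for checking and re-obtaining the ticket; the keytab path and principal below are illustrative assumptions, not actual cluster settings:

```shell
# Inspect the current ticket cache for the user running NiFi
klist

# Re-obtain a TGT from the keytab (path and principal are examples)
kinit -kt /etc/security/keytabs/nifi.headless.keytab nifi@IT.COM
klist
```

If the failure recurs on a regular cycle, comparing the cycle length against the maximum ticket lifetime configured in the KDC is a useful next check.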
Labels: Apache Hive, Apache NiFi
05-23-2018
04:39 PM
None of the above worked. It is working after adding the configuration below: set("spark.jars","/usr/hdp/current/spark2-client/jars/sqljdbc42.jar")
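For reference, the same setting can be passed on the command line instead of inside the script; a sketch using the jar path from the post (the script name is assumed to be the one from the earlier command):

```shell
# Command-line equivalent of set("spark.jars", ...) in the SparkConf;
# the jar path must exist on the host that launches the driver.
spark-submit --master yarn --deploy-mode cluster \
  --conf spark.jars=/usr/hdp/current/spark2-client/jars/sqljdbc42.jar \
  sqlserver.py
```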
05-23-2018
12:08 PM
Thanks Sridhar. I already tried --jars as well as spark.driver.extraClassPath & spark.executor.extraClassPath; they don't work. What do you mean by "Directory expansion does not work with --jars"?
05-23-2018
01:03 AM
spark-submit --master yarn --deploy-mode cluster sqlserver.py --jars sqljdbc42.jar
I get the error: java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerDriver
Everything works when I use --deploy-mode client and copy the jar file to /usr/hdp/current/spark2-client/jars/sqljdbc42.jar. Should I copy sqljdbc42.jar to /usr/hdp/current/hadoop-yarn-client/lib/ on all data nodes?
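One thing worth noting about the command above: spark-submit stops parsing its own options at the application file, so anything placed after sqlserver.py is handed to the script as an application argument rather than consumed by spark-submit. Moving --jars before the script may be enough:

```shell
# Options must precede the application file; here --jars is actually applied
spark-submit --master yarn --deploy-mode cluster \
  --jars sqljdbc42.jar \
  sqlserver.py
```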
Labels: Apache Spark, Apache YARN
05-03-2018
01:09 AM
Any update? It's 2018 and I am getting the same error: Error: Error while processing statement: FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException(Query with transform clause is disallowed in current configuration.) (state=08S01,code=12)
05-02-2018
07:18 PM
I am trying to use a Python UDF in Hive, but I get the below error: Error: Error while processing statement: FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException(Query with transform clause is disallowed in current configuration.) (state=08S01,code=12)
Labels: Apache Hive, Apache Ranger
04-14-2018
02:54 PM
I am also facing the same issue; did you find a resolution?
04-11-2018
05:27 PM
Ranger plugins for Atlas and Hive are enabled.
Automatic sync will happen only if atlas.hook.hive.synchronous is set to true under Advanced hive-atlas-application.properties in Ambari.
atlas.hook.hive.synchronous - boolean; true runs the hook synchronously. Default is false, and it is recommended to keep it false to avoid delays in Hive query completion.
04-11-2018
04:31 PM
Thanks Geoffrey. It works after I ran the script /usr/hdp/current/atlas-server/hook-bin/import-hive.sh
04-06-2018
06:36 PM
It seems I have to run the script /usr/hdp/current/atlas-server/hook-bin/import-hive.sh
04-06-2018
03:17 PM
I have an external table already created in Hive, and I run alter table to add a comment: alter table tblstat_tmp change id id int COMMENT 'Unique ID';
04-06-2018
01:38 PM
I want to see the table & column comments in the Atlas Hive table schema. Yes, Ranger plugins are enabled for Hive & Atlas.
04-06-2018
01:21 PM
We have HDP 2.6. When I added table & column comments using alter table, the comments are not reflected in Atlas. I thought this would be automated.
Labels: Apache Atlas
04-04-2018
04:53 PM
This is resolved. Because Hive is not configured with impersonation, table creation uses the hive user to create tables and landing directories in HDFS. Added the hive user to the Ranger policy.
04-03-2018
02:32 PM
I am trying to create an external table in Hive and I get the below error. I have defined the policy in Ranger; in the Hive policy I have given the user all permissions on the database (all tables & all columns). I am connecting to Hive using beeline. Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [hdp_ingest] does not have [ALL] privilege on [hdfs://hdpmaster1.m.com:8020/datalake/first/landing/tbltest] (state=42000,code=40000)
Labels: Apache Hive, Apache Ranger
03-22-2018
06:10 PM
I have downloaded NiFi Registry from the Hortonworks website and installed it on Windows 10. I am getting the below error when trying to execute run-nifi-registry.bat
2018-03-22 13:51:26,814 INFO [main] org.apache.nifi.registry.NiFiRegistry Launching NiFi Registry...
2018-03-22 13:51:26,814 INFO [main] org.apache.nifi.registry.NiFiRegistry Read property protection key from conf/bootstrap.conf
2018-03-22 13:51:26,861 INFO [main] o.a.n.r.security.crypto.CryptoKeyLoader No encryption key present in the bootstrap.conf file at C:\Users\Vdutt\Downloads\nifi_regis\nifi-registry-0.1.0-bin.tar\nifi-registry-0.1.0-bin\nifi-registry-0.1.0\conf\bootstrap.conf
2018-03-22 13:51:26,877 INFO [main] o.a.n.r.p.NiFiRegistryPropertiesLoader Loaded 26 properties from C:\Users\Vdutt\Downloads\nifi_regis\nifi-registry-0.1.0-bin.tar\nifi-registry-0.1.0-bin\nifi-registry-0.1.0\conf\nifi-registry.properties
2018-03-22 13:51:26,885 INFO [main] org.apache.nifi.registry.NiFiRegistry Loaded 26 properties
2018-03-22 13:51:26,886 INFO [main] org.apache.nifi.registry.NiFiRegistry NiFi Registry started without Bootstrap Port information provided; will not listen for requests from Bootstrap
2018-03-22 13:51:26,889 ERROR [main] org.apache.nifi.registry.NiFiRegistry Failure to launch NiFi Registry due to java.lang.NoClassDefFoundError: org/apache/nifi/registry/util/FileUtils
java.lang.NoClassDefFoundError: org/apache/nifi/registry/util/FileUtils
at org.apache.nifi.registry.NiFiRegistry.<init>(NiFiRegistry.java:97) ~[nifi-registry-runtime-0.1.0.jar:0.1.0]
at org.apache.nifi.registry.NiFiRegistry.main(NiFiRegistry.java:158) ~[nifi-registry-runtime-0.1.0.jar:0.1.0]
Caused by: java.lang.ClassNotFoundException: org.apache.nifi.registry.util.FileUtils
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_161]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_161]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338) ~[na:1.8.0_161]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_161]
... 2 common frames omitted
2018-03-22 13:51:26,891 INFO [Thread-1] org.apache.nifi.registry.NiFiRegistry Initiating shutdown of Jetty web server...
2018-03-22 13:51:26,891 INFO [Thread-1] org.apache.nifi.registry.NiFiRegistry Jetty web server shutdown completed (nicely or otherwise).
Labels: Apache NiFi
03-19-2018
06:34 PM
I have installed NiFi 1.5 on a Windows machine and I am trying to use PutHDFS. I have core-site.xml and hdfs-site.xml copied to C:\nifi-1.5.0-bin\nifi-1.5.0\lib\. In Hadoop Configuration Resources I have given C:\nifi-1.5.0-bin\nifi-1.5.0\lib\core-site.xml,C:\nifi-1.5.0-bin\nifi-1.5.0\lib\hdfs-site.xml. What should I specify in the Kerberos Principal & Kerberos Keytab properties?
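For reference, PutHDFS expects a principal that exists in the cluster's KDC and a keytab file readable by the NiFi process; a sketch of the relevant settings (the principal name, realm, and paths below are illustrative assumptions, not actual cluster values):

```
# PutHDFS processor properties (example values)
Kerberos Principal:  nifi_user@EXAMPLE.COM
Kerberos Keytab:     C:\nifi-1.5.0-bin\nifi-1.5.0\conf\nifi_user.keytab

# nifi.properties must also point at the cluster's krb5.conf
nifi.kerberos.krb5.file=C:\nifi-1.5.0-bin\nifi-1.5.0\conf\krb5.conf
```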
Labels: Apache NiFi
03-19-2018
05:04 PM
We have an HDP license, but a complete NiFi install (on the cluster) would require an HDF license. We have to ingest files into HDFS and incrementally pull data from SQL Server and load it into Hive. Currently we are using custom Sqoop and Hive SQL; I wanted to look at the option of using NiFi standalone on a Windows server to perform the data ingestion & scheduling.
03-19-2018
01:18 PM
We are currently using Sqoop and Hive SQL to ingest data. Can we use NiFi to ingest data into HDFS and load data into Hive tables? Is it possible to install NiFi on a Windows server and schedule jobs to load data into HDFS and Hive?
Labels: Apache NiFi