Member since
09-06-2016
35
Posts
2
Kudos Received
2
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
| 839 | 03-06-2017 12:52 PM
| 592 | 03-01-2017 09:46 PM
10-15-2018
02:54 PM
Hi, we are facing a performance problem with the NiFi UI. Sometimes it takes around 6-14 seconds to move a processor in the UI.
Questions: 1) How can we debug the slow UI? 2) Is the flood of cancelled-task messages normal when the log level is set to DEBUG?
SLOW
2018-10-15 11:16:14,645 INFO [NiFi Web Server-729217] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for () PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 1.1.1.1)
2018-10-15 11:16:14,681 INFO [NiFi Web Server-728820] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
2018-10-15 11:16:20,811 INFO [NiFi Web Server-729013] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
FAST
2018-10-15 11:17:11,931 INFO [NiFi Web Server-728355] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for () PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 1.1.1.1)
2018-10-15 11:17:11,966 INFO [NiFi Web Server-729217] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
2018-10-15 11:17:11,985 INFO [NiFi Web Server-729215] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
When org.apache.nifi is changed to log level="DEBUG", nifi-app.log gets flooded with the following. Not sure if that is an issue.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-1] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7501e256' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-43] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@706658f2' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-33] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@2272bf22' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-43] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@21fdf62a' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-64] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@28592572' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-11] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@af48695' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-45] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@fe17bc0' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-7] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@18db7949' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-44] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@f91f797' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-3] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a9bbd7f' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-54] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5ed37c9a' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-44] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@43972b17' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-51] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@3300301' has been cancelled.
... View more
Labels:
10-15-2018
02:51 PM
Hi, we are facing a performance problem with the NiFi UI. Sometimes it takes around 6-14 seconds to move a processor in the UI.
Questions: 1) How can we debug the slow UI? 2) Is the flood of cancelled-task messages normal when the log level is set to DEBUG?
SLOW
2018-10-15 11:16:14,645 INFO [NiFi Web Server-729217] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<JWT token>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 10.129.40.59)
2018-10-15 11:16:14,681 INFO [NiFi Web Server-728820] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_><NIFI_URL_01><CN=NIFI_URL_02, OU=Internal PKI, O=COMPANY, C=DK>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
2018-10-15 11:16:20,811 INFO [NiFi Web Server-729013] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_><NIFI_URL_01><CN=NIFI_URL_02, OU=Internal PKI, O=COMPANY, C=DK>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
FAST
2018-10-15 11:17:11,931 INFO [NiFi Web Server-728355] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<JWT token>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 10.129.40.59)
2018-10-15 11:17:11,966 INFO [NiFi Web Server-729217] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_><NIFI_URL_01><CN=NIFI_URL_02, OU=Internal PKI, O=COMPANY, C=DK>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
2018-10-15 11:17:11,985 INFO [NiFi Web Server-729215] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<_USER_><NIFI_URL_01><CN=NIFI_URL_02, OU=Internal PKI, O=COMPANY, C=DK>) PUT https://NIFI_URL_01:9091/nifi-api/processors/be6b3028-e778-1f93-ae64-9a6552036e77 (source ip: 0.0.0.0)
When org.apache.nifi is changed to log level="DEBUG", nifi-app.log gets flooded with the following. Not sure if that is an issue.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-1] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7501e256' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-43] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@706658f2' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-33] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@2272bf22' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-43] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@21fdf62a' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-64] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@28592572' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-11] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@af48695' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-45] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@fe17bc0' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-7] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@18db7949' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-44] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@f91f797' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-3] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a9bbd7f' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-54] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5ed37c9a' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-44] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@43972b17' has been cancelled.
2018-10-15 14:04:56,165 DEBUG [Timer-Driven Process Thread-51] org.apache.nifi.engine.FlowEngine A flow controller execution task 'java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@3300301' has been cancelled.
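As an aside on question 2: those cancelled-task messages come from org.apache.nifi.engine.FlowEngine, so a sketch of a dedicated logger override in NiFi's conf/logback.xml (the standard NiFi logging configuration; verify the file location for your HDF install) could keep the rest of the framework at DEBUG while silencing that one class:

```xml
<!-- In conf/logback.xml: framework stays at DEBUG for troubleshooting, -->
<!-- but the FlowEngine logger is pinned back to INFO so the -->
<!-- cancelled-task messages no longer flood nifi-app.log. -->
<logger name="org.apache.nifi" level="DEBUG"/>
<logger name="org.apache.nifi.engine.FlowEngine" level="INFO"/>
```

NiFi picks up logback.xml changes without a restart after a short delay, so this is a low-risk change to try.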
... View more
Labels:
09-18-2018
05:37 AM
None of us is using HDP 3.0; what about HDP 2.6? ACID has always been "off" for the last half year, and 100 Hive tables (text, Avro, ORC) have been created. If we change to ACID as the default, what effects other than performance do we need to consider? Are old tables converted to ACID tables? Will old tables still work as expected?
... View more
09-17-2018
12:15 PM
We are looking into the same scenario. What happens with existing non-transactional tables?
... View more
08-19-2018
11:46 AM
Hi, we have created an ODBC connection from Informatica to Hive, but special characters such as привет are converted to □. The Hive table data is shown fine in Beeline, Ambari Hive View, and the 64-bit JDBC driver, but not the 64-bit ODBC driver. The documentation link, page 18, says it is possible to add server-side properties to the ODBC connection and that these options can be shown by typing set -v. I cannot figure out how to run the command to see the properties. What we are looking for is whether any encoding/code page can be set to solve the encoding problem. Questions: 1) Has anyone else had problems getting UTF-8 characters like привет to display correctly with the Hortonworks 64-bit ODBC driver? 2) Any guidance on how the server-side options can be displayed? From the documentation:
1. To create a server-side property, click Add, then type appropriate values in the Key and Value fields, and then click OK.
Note: For a list of all Hadoop and Hive server-side properties that your implementation supports, type set -v at the Hive CLI command line or Beeline. You can also execute the set -v query after connecting using the driver.
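For question 2, a sketch of how the server-side properties could be dumped and filtered with Beeline. The connection URL is a placeholder, and the sample output below is fabricated for illustration only (real `set -v` output is much longer):

```shell
# Hypothetical URL -- run this against your own HiveServer2:
#   beeline -u "jdbc:hive2://HOST:10000/default;principal=hive/_HOST@HOST.com" \
#           -e "set -v" > hive-props.txt
# A small fabricated sample stands in for the real dump here:
cat > hive-props.txt <<'EOF'
hive.server2.thrift.port=10000
system:file.encoding=UTF-8
system:sun.jnu.encoding=UTF-8
EOF
# Filter the dump for encoding-related settings:
grep -i encoding hive-props.txt
```

Filtering the dump this way quickly shows whether the server-side JVM is already running with UTF-8 defaults, which narrows the problem down to the ODBC client side.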
... View more
Labels:
07-13-2018
06:48 PM
I am having the same error on my macOS High Sierra
... View more
07-04-2018
03:29 PM
@Mayur manurkar did you find any solution for this?
... View more
11-17-2017
02:11 PM
For me, /usr/hdf/current/nifi/lib did not work. HDF version 2.1.1, Apache NiFi version 1.1.0.2.1.2.0-10.
... View more
11-16-2017
08:34 PM
I solved this issue by copying /etc/hadoop/conf/core-site.xml and /etc/hadoop/conf/hdfs-site.xml from HDP to /etc/nifi-resources/archive/ on HDF.
... View more
10-23-2017
05:07 PM
Thanks for the answer. We are using an older version today. As shown, only some of the fields support Expression Language. HDF version 2.1.1 - Powered by Apache NiFi - Version 1.1.0.2.1.2.0-10.
... View more
10-23-2017
03:37 PM
We are trying to make our processors environment-independent by using custom properties. In this case we are using a SelectHiveQL processor, which works if the Database Connection URL is hardcoded. If the Database Connection URL is a reference, as is done with Kerberos Principal and Kerberos Keytab, it returns an error when SelectHiveQL runs. The pictures show the two scenarios. Our Hive URL is as follows: jdbc:hive2://HOST:10000/default;principal=hive/_HOST@HOST.com I have tried adding the property. How is it possible to use a custom property reference in the Database Connection URL?
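A sketch of the custom-properties approach, assuming a NiFi version whose Database Connection URL field supports Expression Language (the file path and property name below are made up for illustration):

```
# nifi.properties -- point the variable registry at a custom properties file:
nifi.variable.registry.properties=/etc/nifi/custom.properties

# /etc/nifi/custom.properties -- one entry per environment-specific value:
hive.connect.url=jdbc:hive2://HOST:10000/default;principal=hive/_HOST@HOST.com

# The property can then be referenced as:
#   Database Connection URL = ${hive.connect.url}
```

Whether `${...}` is honored depends on the specific property supporting Expression Language in that NiFi release, which may be the limitation being hit here.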
... View more
Labels:
10-17-2017
07:14 AM
Hi @Rishi, the interpreter settings are as follows:
livy.spark.driver.cores = 1
livy.spark.driver.memory = 12G
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout
livy.spark.dynamicAllocation.enabled
livy.spark.dynamicAllocation.initialExecutors
livy.spark.dynamicAllocation.maxExecutors
livy.spark.dynamicAllocation.minExecutors
livy.spark.executor.cores = 4
livy.spark.executor.instances = 11
livy.spark.executor.memory = 12G
livy.spark.master = yarn-cluster
spark.driver.maxResultSize = 120G
zeppelin.interpreter.localRepo = (HIDDEN)
zeppelin.livy.concurrentSQL = false
zeppelin.livy.create.session.retries = 240
zeppelin.livy.keytab = (HIDDEN)
zeppelin.livy.principal = (HIDDEN)
zeppelin.livy.spark.sql.maxResult = 100000
zeppelin.livy.url = http://HOST:8998
It is very clear that the Y job says PENDING, and when X finishes, Y starts.
... View more
10-16-2017
09:45 AM
Scenario: User X runs a %livy.pyspark job in notebook AnalysisX. Five seconds later, user Y runs a %livy.pyspark job in notebook AnalysisY. Y has to wait for X's Spark job to finish, which is not efficient. How is it possible in HDP 2.5, through Livy with impersonation, to run multiple Spark jobs from Zeppelin at the same time?
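A sketch of the settings commonly adjusted for this (property name from stock Zeppelin's Livy interpreter; verify against your HDP 2.5 build):

```
# In the Livy interpreter settings (Interpreter menu):
zeppelin.livy.concurrentSQL = true
# And set interpreter instantiation to "Per User" in "scoped" mode,
# so each user gets a separate Livy session instead of queueing on one.
```

Without per-user instantiation, all notebooks share a single Livy session, which matches the queueing behavior described above.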
... View more
Labels:
09-06-2017
06:43 AM
I have tried @Andrew Grande's suggestion, but it doesn't work for me. I added the following to Advanced nifi-bootstrap-env in Ambari:
java.arg.18=-Dhttp.proxyHost=http://proxy.XXX.X
java.arg.19=-Dhttp.proxyPort=8080
java.arg.20=-Dhttps.proxyHost=http://proxy.XXX.XX
java.arg.21=-Dhttps.proxyPort=8080
NiFi seems to start up as expected:
nifi 5059 1.7 12.5 8831536 4106968 ? Sl Sep05 15:04 /usr/jdk64/jdk1.8.0_112/bin/java -classpath /usr/hdf/current/nifi/conf:/usr/hdf/current/nifi/lib/javax.servlet-api-3.1.0.jar:/usr/hdf/current/nifi/lib/jcl-over-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/jetty-schemas-3.1.jar:/usr/hdf/current/nifi/lib/jul-to-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/log4j-over-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/logback-classic-1.2.3.jar:/usr/hdf/current/nifi/lib/logback-core-1.2.3.jar:/usr/hdf/current/nifi/lib/nifi-api-1.2.0.3.0.1.1-5.jar:/usr/hdf/current/nifi/lib/nifi-nar-utils-1.2.0.3.0.1.1-5.jar:/usr/hdf/current/nifi/lib/nifi-properties-1.2.0.3.0.1.1-5.jar:/usr/hdf/current/nifi/lib/nifi-framework-api-1.2.0.3.0.1.1-5.jar:/usr/hdf/current/nifi/lib/nifi-runtime-1.2.0.3.0.1.1-5.jar:/usr/hdf/current/nifi/lib/slf4j-api-1.7.25.jar -Dhttps.proxyPort=8080 -Dhttps.proxyHost=http://proxy.XXX.XX -Dorg.apache.jasper.compiler.disablejsr199=true -Djava.security.auth.login.config=/usr/hdf/current/nifi/conf/nifi_jaas.conf -Xmx4g -Xms4g -Dhttp.proxyPort=8080 -Dambari.application.id=nifi -Dambari.metrics.collector.url=http://XXX.XXX.dk:6188/ws/v1/timeline/metrics -Dhttp.proxyHost=http://proxy.XXX.XX -Djava.security.egd=file:/dev/urandom -Dsun.net.http.allowRestrictedHeaders=true -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -XX:+UseG1GC -Djava.protocol.handler.pkgs=sun.net.www.protocol -Dnifi.properties.file.path=/usr/hdf/current/nifi/conf/nifi.properties -Dnifi.bootstrap.listen.port=42442 -Dapp=NiFi -Dorg.apache.nifi.bootstrap.config.log.dir=/var/log/nifi org.apache.nifi.NiFi -K /usr/hdf/current/nifi/conf/sensitive.key
Error from HDF:
08:35:33 CEST XX.XXX.dk:9091 - ERROR GetHTTP[id=51827f4f-015e-1000-0000-] Failed to process session due to org.apache.nifi.processor.exception.ProcessException: org.apache.http.conn.ConnectTimeoutException: Connect to distribution.virk.dk:80 [distribution.virk.dk/34.250.45.23, distribution.virk.dk/52.209.102.247, distribution.virk.dk/54.72.195.115] failed: connect timed out: org.apache.nifi.processor.exception.ProcessException: org.apache.http.conn.ConnectTimeoutException: Connect to distribution.virk.dk:80 [distribution.virk.dk/34.250.45.23, distribution.virk.dk/52.209.102.247, distribution.virk.dk/54.72.195.115] failed: connect timed out
08:36:55 CEST XX.XXX.dk:9091 - ERROR ScrollElasticsearchHttp[id=32M379d-C115e-1000-929e-] Failed to read from Elasticsearch due to http://proxy.XXX.XX , this may indicate an error in configuration (hosts, username/password, etc.): java.net.UnknownHostException: http://proxy.XXX.XX
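One observation worth adding (not from the thread itself): the JVM's http.proxyHost/https.proxyHost properties expect a bare hostname, not a URL, which would explain the UnknownHostException on "http://proxy.XXX.XX". A sketch of the bootstrap args without the scheme, reusing the thread's redacted proxy host:

```
# Advanced nifi-bootstrap-env -- hostname only, no "http://" prefix:
java.arg.18=-Dhttp.proxyHost=proxy.XXX.XX
java.arg.19=-Dhttp.proxyPort=8080
java.arg.20=-Dhttps.proxyHost=proxy.XXX.XX
java.arg.21=-Dhttps.proxyPort=8080
# Optionally bypass the proxy for in-cluster hosts (hypothetical pattern):
java.arg.22=-Dhttp.nonProxyHosts=localhost|*.XXX.dk
```

Note also that not all NiFi processors honor the JVM proxy properties; some (e.g. the HTTP and Elasticsearch processors) have their own proxy settings that must be configured per processor.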
... View more
04-12-2017
10:40 AM
@Rafael Coss and @Saumitra Buragohain, is there any news about the sandbox version of HDP 2.6?
... View more
03-28-2017
11:10 AM
There is a problem with both XSLT schemas. The example below is not converted correctly. <?xml version="1.0" encoding="UTF-8"?>
<note>
<country>
<city>Stockholm</city>
</country>
<country>
<code>FR</code>
<city>Paris</city>
<streets>
<road>Paris</road>
</streets>
</country>
<country>
<code>DK</code>
<city>Copenhagen</city>
<streets>
<road>Paris</road>
<no>1</no>
</streets>
</country>
</note>
... View more
03-06-2017
12:52 PM
Problem solved by using Hive2.
... View more
03-06-2017
12:52 PM
Thanks for your reply. I solved the problem by converting the Oozie script to run Hive2.
... View more
03-01-2017
09:46 PM
Thanks for the replies. The problem was solved by using Hive2 (Beeline) instead of Hive1.
... View more
02-27-2017
01:34 PM
I want to run a very simple Oozie workflow with a Hive action on a kerberized cluster. The problem is that Hive is using my credentials and not the hive user, as it does through Hive View.
If I change my access in Ranger for "/apps/..." then the Oozie workflow works fine.
But we don't want personal accounts to have access to the "/apps/..." folder.
How is it possible to do a Hive action without having access to the "/apps/..." folder on HDFS?
== WORKFLOW.XML ==
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="oozie_hive_kerberos_test">
<credentials>
<credential name="hcat" type="hcat">
<property>
<name>hcat.metastore.principal</name>
<value>hive/_HOST@<host>.com</value>
</property>
<property>
<name>hcat.metastore.uri</name>
<value>thrift://<host>.com:9083</value>
</property>
</credential>
</credentials>
<start to="hive"/>
<action cred="hcat" name="hive">
<hive xmlns="uri:oozie:hive-action:0.6">
<job-tracker>${resourceManager}</job-tracker>
<name-node>${nameNode}</name-node>
<query>
use XXXXX;
drop table if exists YYYY.ZZZZ;
</query>
</hive>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>${wf:errorMessage(wf:lastErrorNode())}</message>
</kill>
<end name="end"/>
</workflow-app>
== ERROR MESSAGE == SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in /data/hadoop/yarn/local/usercache/MY_USER_NAME/appcache/application_1487006380071_0351/container_e94_1487006380071_0351_01_000002/hive-log4j.properties
FAILED: SemanticException MetaException(message:org.apache.hadoop.security.AccessControlException: Permission denied: user=MY_USER_NAME, access=EXECUTE, inode="/apps/hive/warehouse/DATABASE.db":hdfs:hdfs:d---------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:307)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
... View more
Labels:
02-23-2017
02:09 PM
Thanks for your reply. I am using HDP 2.5 out of the box, which I guess is Ambari 2.4.2. I am having a hard time finding out what kinds of upgrades are included in Ambari 2.5. Do you have a link, and an idea of when Ambari 2.5 is going to be released?
... View more
02-23-2017
01:13 PM
Thanks for some nice articles.
At the moment I am using the Oozie View from HDP 2.5. There are many things that don't work, which is very frustrating. Your screenshots indicate that you are using Hue or something else. Could you tell me which tool you are using? https://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html
... View more
02-09-2017
04:34 PM
I am unable to connect with a JDBC driver from a Windows PC to Hive with Kerberos. Everything works fine with an ODBC connection, but that is not an option in this case. The connection string is jdbc:hive2://XXX.YYY.com:10000/default;principal=hive/XXX.YYY.com@YYYY.com;saslQop=auth-conf and the error received in Hive's log is: 2017-02-09 16:12:21,254 ERROR [HiveServer2-Handler-Pool: Thread-151963]: server.TThreadPoolServer (TThreadPoolServer.java:run(297)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:609)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:606)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1704)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:606)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
... 10 more I guess this is something to do with a kerberos ticket which is not recive by Hive. Link to JDBC file https://github.com/timveil/hive-jdbc-uber-jar
... View more
Labels:
10-27-2016
08:04 PM
The link doesn't exist. In Hue you grant another user "reader" access and then everything works. I just want to know how I can do something similar in Hortonworks' Oozie.
... View more
10-27-2016
07:23 PM
@Geoffrey Shelton Okot, this doesn't change anything. Is it possible to create a workflow and add other users on creation?
... View more
10-27-2016
11:27 AM
Hi, I am having a problem understanding what to change so that another user can run a workflow. User A submits a workflow in Oozie. A is able to run it and it works fine. Now user B wants to run the same workflow, but is not authorized. oozie-error-message: E0508: User [XXXX] not authorized for WF job [0000012-161010094223089-oozie-oozi-W] What we want is for a developer to create a workflow, and for this workflow to be accessible through the Oozie API by an external service.
Questions: 1) Where is this authorization controlled in Oozie? 2) How can authorization be changed so that other users/service accounts can run workflows?
Error:
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Path=/; HttpOnly
Content-Type: text/html;charset=utf-8
Content-Length: 997
Date: Wed, 26 Oct 2016 13:03:51 GMT
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
WWW-Authenticate: Negotiate YGYGCSqGSIb3EgECAgIAb1cwVaADAgEFoQMCAQ+iSTBHoAMCAReiQAQ+Ar4Y2B5Cx+YJTHB3R7olNUPQNMZTqZfdTAoO0RRLuA20m9LgfB3LpyRaGwuPRF3tio3FREDxUJ7TQoZPdm8=
Set-Cookie: hadoop.auth="u=XXXX&p=XXXX@XX.XX&t=kerberos&e=1477523031321&s=OKOKcrY3HZXdjhNQKpEr4FXiLSQ="; Path=/; HttpOnly
oozie-error-code: E0508
oozie-error-message: E0508: User [XXXX] not authorized for WF job [0000012-161010094223089-oozie-oozi-W]
Content-Type: text/html;charset=utf-8
Content-Length: 951
Date: Wed, 26 Oct 2016 13:03:51 GMT
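A sketch of the two knobs usually involved here (property names from stock Oozie; verify against your version's documentation before relying on them):

```
# oozie-site.xml -- E0508 is only raised when authorization is enforced:
#   oozie.service.AuthorizationService.security.enabled = true

# job.properties -- a per-job ACL granting other users/groups rights on
# this job (the successor to the older 'group.name' mechanism):
oozie.job.acl=service_account,other_user
```

If the ACL route is available, the developer submits with the ACL set and the external service account can then operate on the job through the Oozie API.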
... View more
Labels:
10-24-2016
01:13 PM
Okay, I was hoping this feature could be or would become available in resource-based policies. One case could be data in HDFS that should only be accessible based on location or a time period.
... View more
10-24-2016
09:49 AM
Thanks @Terry Stebbens, would this also enable the "Policy Conditions" option?
... View more
10-24-2016
08:39 AM
Hi, when I log in to the Sandbox 2.5 (VMware), Ranger doesn't contain any option for "Deny" or "Policy Condition" except through "Tag Based" policies. In the documentation, a screenshot and description are shown with Hive and a "Deny" condition. Link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_security/content/about_ranger_policies.html Questions: 1) Is there anything that needs to be enabled to get this to work? 2) Is "Policy Condition" possible in a Resource-Based Policy, or only in "Tag Based" policies? / Anders
... View more
10-20-2016
03:30 PM
2 Kudos
I had a hard time finding a way to add a tag/trait to an entity in Atlas using the REST API.
Here is a solution:
POST http://{YOUR IP ADDRESS}:21000/api/atlas/entities/{GUID FOR ENTITY}/traits/
BODY {"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct","typeName":"PII","values":{}}
curl -X POST -H "Content-Type: application/json" -H "Authorization: Basic YWRtaW46YWRtaW4=" -H "Cache-Control: no-cache" -d '{"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct","typeName":"PII","values":{}}' "http://192.168.255.128:21000/api/atlas/entities/d5dcb483-d2fc-4544-8368-6ef56321efdb/traits/"
... View more
Labels: