Member since: 10-20-2016
Posts: 106
Kudos Received: 0
Solutions: 0
02-24-2022
09:18 PM
Hi Team, I have recently migrated Kerberos principals from one KDC to another using the command below. Post-migration, kinit is not working and throws an error, whereas the same identity works on the original KDC. Can you please help us identify the error? Did I make any mistakes while migrating the principals?

Command used (on the source KDC):
kdb5_util dump -verbose dumpfile
Then I logged in to the other KDC and executed the restore:
kdb5_util restore -verbose /tmp/dumpfile

Error:
KRB5_TRACE=/dev/stdout kinit testuser
[8962] 1645765308.654184: Getting initial credentials for testuser@EXAMPLE.COM
[8962] 1645765308.654186: Sending unauthenticated request
[8962] 1645765308.654187: Sending request (181 bytes) to EXAMPLE.COM
[8962] 1645765308.654188: Resolving hostname stg-hdplucykrb101.phonepe.nb6
[8962] 1645765308.654189: Sending initial UDP request to dgram 10.57.55.228:88
[8962] 1645765308.654190: Received answer (163 bytes) from dgram 10.57.55.228:88
[8962] 1645765308.654188: Resolving hostname kdc.example.com
[8962] 1645765308.654191: Sending DNS URI query for _kerberos.EXAMPLE.COM.
[8962] 1645765308.654192: No URI records found
[8962] 1645765308.654193: Sending DNS SRV query for _kerberos-master._udp.EXAMPLE.COM.
[8962] 1645765308.654194: Sending DNS SRV query for _kerberos-master._tcp.EXAMPLE.COM.
[8962] 1645765308.654195: No SRV records found
[8962] 1645765308.654196: Response was not from master KDC
[8962] 1645765308.654197: Received error from KDC: -1765328353/Decrypt integrity check failed
[8962] 1645765308.654198: Retrying AS request with master KDC
[8962] 1645765308.654199: Getting initial credentials for testuser@EXAMPLE.COM
[8962] 1645765308.654201: Sending unauthenticated request
[8962] 1645765308.654202: Sending request (181 bytes) to EXAMPLE.COM (master)
[8962] 1645765308.654203: Sending DNS URI query for _kerberos.EXAMPLE.COM.
[8962] 1645765308.654204: No URI records found
[8962] 1645765308.654205: Sending DNS SRV query for _kerberos-master._udp.EXAMPLE.COM.
[8962] 1645765308.654206: Sending DNS SRV query for _kerberos-master._tcp.EXAMPLE.COM.
[8962] 1645765308.654207: No SRV records found
kinit: Password incorrect while getting initial credentials
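For reference, this is the sequence I now believe is needed (a sketch only, assuming MIT Kerberos with RHEL-style paths; the stash file location and the host name new-kdc are illustrative, not our actual values). It also carries the master key across, since my understanding is that "Decrypt integrity check failed" on the destination usually means its master key does not match the one the dumped database was encrypted with:

# On the source KDC: dump the full principal database
kdb5_util dump -verbose /tmp/dumpfile

# Copy the dump and, critically, the master key stash file to the destination KDC
# (the stash file path below is a typical RHEL location and is an assumption)
scp /tmp/dumpfile new-kdc:/tmp/dumpfile
scp /var/kerberos/krb5kdc/.k5.EXAMPLE.COM new-kdc:/var/kerberos/krb5kdc/

# On the destination KDC: load the dump and restart the KDC services
kdb5_util load -verbose /tmp/dumpfile
systemctl restart krb5kdc kadmin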
Labels:
- Kerberos
01-05-2021
05:13 AM
Hi Team, I am getting the below error while doing a DB test connection in Ranger (using MySQL as the backend). Please help me fix this.

2021-01-05 03:44:52,150 - Host checks started.
2021-01-05 03:44:52,151 - Check execute list: db_connection_check
2021-01-05 03:44:52,151 - DB connection check started.
WARNING: File /var/lib/ambari-agent/cache/DBConnectionVerification.jar already exists, assuming it was downloaded before
WARNING: File /var/lib/ambari-agent/cache/mysql-connector-java.jar already exists, assuming it was downloaded before
2021-01-05 03:44:52,155 - call['/usr/jdk64/jdk1.8.0_112/bin/java -cp /var/lib/ambari-agent/cache/DBConnectionVerification.jar:/var/lib/ambari-agent/cache/mysql-connector-java.jar -Djava.library.path=/var/lib/ambari-agent/cache org.apache.ambari.server.DBConnectionVerification "jdbc:mysql://master.hadoop:3306/ranger" "rangeradmin" [PROTECTED] com.mysql.jdbc.Driver'] {}
2021-01-05 03:44:53,464 - call returned (1, "Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.\nERROR: Unable to connect to the DB. Please check DB connection properties.\ncom.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.")
2021-01-05 03:44:53,465 - DB connection check completed.
2021-01-05 03:44:53,467 - Host checks completed.
2021-01-05 03:44:53,467 - Check db_connection_check was unsuccessful. Exit code: 1. Message: Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
ERROR: Unable to connect to the DB. Please check DB connection properties.
com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
Command failed after 1 tries
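For what it's worth, these are the basic connectivity checks I would run from the Ranger Admin host (a sketch; the host master.hadoop, port 3306 and the rangeradmin user are taken from the log above, and the my.cnf locations are just common defaults):

# Is the MySQL port reachable from this host at all?
nc -vz master.hadoop 3306

# Can the rangeradmin account log in remotely? (prompts for the password)
mysql -u rangeradmin -p -h master.hadoop -e "SELECT 1;"

# On the MySQL server, check whether it is bound to localhost only
grep -ri bind-address /etc/my.cnf /etc/my.cnf.d/ 2>/dev/null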
Labels:
- Apache Ranger
09-04-2020
02:13 AM
Hi, could any of you please tell me how to remove a background operation entry from the Ambari database? Attaching a screenshot.
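In case it helps, the approach I have seen suggested for clearing a stuck background operation (a sketch only, assuming Ambari's default MySQL schema with the host_role_command table; please take a database backup first) is:

# Back up the Ambari database before changing anything
mysqldump -u ambari -p ambari > /tmp/ambari-db-backup.sql

# Mark the stuck tasks as aborted so the background operation stops showing as running
mysql -u ambari -p ambari -e \
  "UPDATE host_role_command SET status='ABORTED' WHERE status IN ('PENDING','QUEUED','IN_PROGRESS');"

# Restart Ambari server so the UI picks up the change
ambari-server restart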
Labels:
- Ambari Blueprints
08-18-2020
04:36 AM
@stevenmatison @mattw Do I need to set this property: nifi.state.management.embedded.zookeeper.start=false? After changing it, NiFi is not able to run and keeps failing with the error below.

Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flowService': FactoryBean threw exception on object creation; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flowController': FactoryBean threw exception on object creation; nested exception is java.lang.ArrayIndexOutOfBoundsException: 1
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:175)
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:103)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1634)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:317)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1084)
    at org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:55)
    ... 33 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flowController': FactoryBean threw exception on object creation; nested exception is java.lang.ArrayIndexOutOfBoundsException: 1
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:175)
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:103)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1634)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:317)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1084)
    at org.apache.nifi.spring.StandardFlowServiceFactoryBean.getObject(StandardFlowServiceFactoryBean.java:48)
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:168)
    ... 39 common frames omitted
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.zookeeper.server.quorum.QuorumPeerConfig.parseProperties(QuorumPeerConfig.java:188)
    at org.apache.nifi.controller.state.server.ZooKeeperStateServer.<init>(ZooKeeperStateServer.java:55)
    at org.apache.nifi.controller.state.server.ZooKeeperStateServer.create(ZooKeeperStateServer.java:189)
    at org.apache.nifi.controller.FlowController.<init>(FlowController.java:620)
    at org.apache.nifi.controller.FlowController.createClusteredInstance(FlowController.java:452)
    at org.apache.nifi.spring.FlowControllerFactoryBean.getObject(FlowControllerFactoryBean.java:63)
    at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:168)
    ... 46 common frames omitted
2020-08-18 07:32:30,284 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...
2020-08-18 07:32:30,291 INFO [Thread-1] o.eclipse.jetty.server.AbstractConnector Stopped ServerConnector@5ec10ab1{HTTP/1.1,[http/1.1]}{w0lxqhdp04.:9090}
2020-08-18 07:32:30,292 INFO [Thread-1] org.eclipse.jetty.server.session Stopped scavenging
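The ArrayIndexOutOfBoundsException comes from QuorumPeerConfig.parseProperties, which, as far as I understand, only runs when the embedded ZooKeeper is being started and its zookeeper.properties is parsed. So this is roughly what I am checking (the path is the one from our nifi.properties; the expected server-line format in the comment is an assumption based on standard ZooKeeper configuration):

# Inspect the server entries the embedded ZooKeeper tries to parse
grep '^server\.' /usr/hdf/current/nifi/conf/zookeeper.properties

# Each entry is expected to look like server.N=host:peerPort:electionPort, e.g.
#   server.1=w0lxqhdp01.:2888:3888
# An entry without the two ports (server.1=w0lxqhdp01.) would fail to parse.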
08-18-2020
04:07 AM
@mattw Attaching the nifi.properties file for your reference.

cat nifi.properties
# Generated by Apache Ambari. Tue Aug 18 06:44:23 2020
nifi.administrative.yield.duration=30 sec
nifi.authorizer.configuration.file=/usr/hdf/current/nifi/conf/authorizers.xml
nifi.bored.yield.duration=10 millis
nifi.cluster.flow.election.max.candidates=1
nifi.cluster.flow.election.max.wait.time=5 mins
nifi.cluster.is.node=true
nifi.cluster.node.address=w0lxqhdp04.
nifi.cluster.node.connection.timeout=30 sec
nifi.cluster.node.event.history.size=25
nifi.cluster.node.max.concurrent.requests=100
nifi.cluster.node.protocol.max.threads=
nifi.cluster.node.protocol.port=9088
nifi.cluster.node.protocol.threads=50
nifi.cluster.node.read.timeout=30 sec
nifi.cluster.protocol.heartbeat.interval=5 sec
nifi.cluster.protocol.is.secure=True
nifi.components.status.repository.buffer.size=1440
nifi.components.status.repository.implementation=org.apache.nifi.controller.status.history.VolatileComponentStatusRepository
nifi.components.status.snapshot.frequency=1 min
nifi.content.claim.max.appendable.size=1 MB
nifi.content.claim.max.flow.files=20
nifi.content.repository.always.sync=false
nifi.content.repository.archive.enabled=false
nifi.content.repository.archive.max.retention.period=6 hours
nifi.content.repository.archive.max.usage.percentage=25%
nifi.content.repository.directory.default=/data/content_repository
nifi.content.repository.implementation=org.apache.nifi.controller.repository.FileSystemRepository
nifi.content.viewer.url=../nifi-content-viewer/
nifi.database.directory=/var/lib/nifi/database_repository
nifi.documentation.working.directory=/var/lib/nifi/work/docs/components
nifi.flow.configuration.archive.dir=/var/lib/nifi/archive/
nifi.flow.configuration.archive.enabled=true
nifi.flow.configuration.archive.max.count=
nifi.flow.configuration.archive.max.storage=500 MB
nifi.flow.configuration.archive.max.time=30 days
nifi.flow.configuration.file=/var/lib/nifi/conf/flow.xml.gz
nifi.flowcontroller.autoResumeState=true
nifi.flowcontroller.graceful.shutdown.period=10 sec
nifi.flowfile.repository.always.sync=false
nifi.flowfile.repository.checkpoint.interval=2 mins
nifi.flowfile.repository.directory=/var/lib/nifi/flowfile_repository
nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.WriteAheadFlowFileRepository
nifi.flowfile.repository.partitions=256
nifi.flowfile.repository.wal.implementation=org.apache.nifi.wali.SequentialAccessWriteAheadLog
nifi.flowservice.writedelay.interval=500 ms
nifi.h2.url.append=;LOCK_TIMEOUT=25000;WRITE_DELAY=0;AUTO_SERVER=FALSE
nifi.kerberos.krb5.file=
nifi.kerberos.service.keytab.location=
nifi.kerberos.service.principal=
nifi.kerberos.spnego.authentication.expiration=12 hours
nifi.kerberos.spnego.keytab.location=
nifi.kerberos.spnego.principal=
nifi.login.identity.provider.configuration.file=/usr/hdf/current/nifi/conf/login-identity-providers.xml
nifi.nar.library.directory=/usr/hdf/current/nifi/lib
nifi.nar.working.directory=/var/lib/nifi/work/nar
nifi.provenance.repository.always.sync=false
nifi.provenance.repository.buffer.size=100000
nifi.provenance.repository.compress.on.rollover=true
nifi.provenance.repository.concurrent.merge.threads=2
nifi.provenance.repository.debug.frequency=1_000_000
nifi.provenance.repository.directory.default=/var/lib/nifi/provenance_repository
nifi.provenance.repository.encryption.key=
nifi.provenance.repository.encryption.key.id=
nifi.provenance.repository.encryption.key.provider.implementation=
nifi.provenance.repository.encryption.key.provider.location=
nifi.provenance.repository.implementation=org.apache.nifi.provenance.WriteAheadProvenanceRepository
nifi.provenance.repository.index.shard.size=500 MB
nifi.provenance.repository.index.threads=1
nifi.provenance.repository.indexed.attributes=
nifi.provenance.repository.indexed.fields=EventType, FlowFileUUID, Filename, ProcessorID, Relationship
nifi.provenance.repository.journal.count=16
nifi.provenance.repository.max.attribute.length=65536
nifi.provenance.repository.max.storage.size=1 GB
nifi.provenance.repository.max.storage.time=24 hours
nifi.provenance.repository.query.threads=2
nifi.provenance.repository.rollover.size=100 MB
nifi.provenance.repository.rollover.time=30 secs
nifi.provenance.repository.warm.cache.frequency=1 hour
nifi.queue.backpressure.count=10000
nifi.queue.backpressure.size=1 GB
nifi.queue.swap.threshold=20000
nifi.remote.contents.cache.expiration=30 secs
nifi.remote.input.host=
nifi.remote.input.http.enabled=true
nifi.remote.input.http.transaction.ttl=30 sec
nifi.remote.input.secure=True
nifi.remote.input.socket.port=
nifi.security.group.mapping.pattern.anygroup=
nifi.security.group.mapping.transform.anygroup=
nifi.security.group.mapping.value.anygroup=
nifi.security.identity.mapping.pattern.dn=
nifi.security.identity.mapping.pattern.kerb=
nifi.security.identity.mapping.transform.dn=
nifi.security.identity.mapping.transform.kerb=
nifi.security.identity.mapping.value.dn=
nifi.security.identity.mapping.value.kerb=
nifi.security.keyPasswd=lzvbGvxTu2jqYoiW||kF3EEKAgEgICkGdNcwI1ea7OwJ7M9zzIzYhXTGlUyBPt
nifi.security.keyPasswd.protected=aes/gcm/256
nifi.security.keystore=/usr/hdf/current/nifi/conf/keystore.jks
nifi.security.keystorePasswd=o6guFiFvjHZkd1sv||/PW2IpW+3uHDLfdYEYlGjJ5AwKfsY/JYlglLMUvzm9LB
nifi.security.keystorePasswd.protected=aes/gcm/256
nifi.security.keystoreType=jks
nifi.security.needClientAuth=False
nifi.security.ocsp.responder.certificate=
nifi.security.ocsp.responder.url=
nifi.security.truststore=/usr/hdf/current/nifi/conf/truststore.jks
nifi.security.truststorePasswd=GPdsLL2nuVPDna/8||D8l1yCf8Y2u0iijKajs+ujpJ0k8b6+9BYe9fwwXG3Fke
nifi.security.truststorePasswd.protected=aes/gcm/256
nifi.security.truststoreType=jks
nifi.security.user.authorizer=ranger-provider
nifi.security.user.knox.audiences=
nifi.security.user.knox.cookieName=hadoop-jwt
nifi.security.user.knox.publicKey=
nifi.security.user.knox.url=
nifi.security.user.login.identity.provider=
nifi.security.user.oidc.client.id=
nifi.security.user.oidc.client.secret=
nifi.security.user.oidc.connect.timeout=5 secs
nifi.security.user.oidc.discovery.url=
nifi.security.user.oidc.preferred.jwsalgorithm=
nifi.security.user.oidc.read.timeout=5 secs
nifi.sensitive.props.additional.keys=
nifi.sensitive.props.algorithm=PBEWITHMD5AND256BITAES-CBC-OPENSSL
nifi.sensitive.props.key=+uS9mR1+fHmc7lmU||ZeyYH+ZsLF7S9y1MjrOz4D4zwcZQs3SeFsfz8kv2aUyPBxPz
nifi.sensitive.props.key.protected=aes/gcm/256
nifi.sensitive.props.provider=BC
nifi.state.management.configuration.file=/usr/hdf/current/nifi/conf/state-management.xml
nifi.state.management.embedded.zookeeper.properties=/usr/hdf/current/nifi/conf/zookeeper.properties
nifi.state.management.embedded.zookeeper.start=false
nifi.state.management.provider.cluster=zk-provider
nifi.state.management.provider.local=local-provider
nifi.swap.in.period=5 sec
nifi.swap.in.threads=1
nifi.swap.manager.implementation=org.apache.nifi.controller.FileSystemSwapManager
nifi.swap.out.period=5 sec
nifi.swap.out.threads=4
nifi.templates.directory=/var/lib/nifi/templates
nifi.ui.autorefresh.interval=30 sec
nifi.ui.banner.text=
nifi.variable.registry.properties=
nifi.version=1.7.0.3.2.0.0-520
nifi.web.http.host=
nifi.web.http.network.interface.default=
nifi.web.http.port=
nifi.web.https.host=w0lxqhdp04.
nifi.web.https.network.interface.default=
nifi.web.https.port=9091
nifi.web.jetty.threads=200
nifi.web.jetty.working.directory=/var/lib/nifi/work/jetty
nifi.web.max.header.size=16 KB
nifi.web.proxy.context.path=
nifi.web.proxy.host=
nifi.web.war.directory=/usr/hdf/current/nifi/lib
nifi.zookeeper.connect.string=w0lxqhdp03.:2181,w0lxqhdp01.:2181,w0lxqhdp02.:2181
nifi.zookeeper.connect.timeout=60 secs
nifi.zookeeper.root.node=/nifi
nifi.zookeeper.session.timeout=60 secs
#nifi.security.ambari.hash.kspwd=60748b13f122c465a5b6f373cd82751fb803044db848ebb1d801281e3b48c154
#nifi.security.ambari.hash.kpwd=60748b13f122c465a5b6f373cd82751fb803044db848ebb1d801281e3b48c154
#nifi.security.ambari.hash.tspwd=60748b13f122c465a5b6f373cd82751fb803044db848ebb1d801281e3b48c154
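The settings I am focusing on for the clustering and state-management side can be pulled out like this (just a convenience sketch against the file above):

# Show only the cluster, state-management and ZooKeeper related settings
grep -E '^nifi\.(cluster|state\.management|zookeeper)\.' /usr/hdf/current/nifi/conf/nifi.properties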
08-18-2020
04:04 AM
@mattw Please find the error trace from nifi-app.log:

2020-08-18 07:00:34,997 ERROR [Timer-Driven Process Thread-2] o.a.n.p.standard.GenerateTableFetch GenerateTableFetch[id=bd177184-822e-32b9-983f-b76ae3bb613d] Failed to retrieve observed maximum values from the State Manager. Will not perform query until this is accomplished.: java.io.IOException: Failed to obtain value from ZooKeeper for component with ID bd177184-822e-32b9-983f-b76ae3bb613d with exception code CONNECTIONLOSS
java.io.IOException: Failed to obtain value from ZooKeeper for component with ID bd177184-822e-32b9-983f-b76ae3bb613d with exception code CONNECTIONLOSS
    at org.apache.nifi.controller.state.providers.zookeeper.ZooKeeperStateProvider.getState(ZooKeeperStateProvider.java:420)
    at org.apache.nifi.controller.state.manager.StandardStateManagerProvider$1.getState(StandardStateManagerProvider.java:278)
    at org.apache.nifi.controller.state.StandardStateManager.getState(StandardStateManager.java:63)
    at org.apache.nifi.controller.lifecycle.TaskTerminationAwareStateManager.getState(TaskTerminationAwareStateManager.java:52)
    at org.apache.nifi.processors.standard.GenerateTableFetch.onTrigger(GenerateTableFetch.java:247)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
    at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /nifi/components/bd177184-822e-32b9-983f-b76ae3bb613d
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.getData(ZooKeeper.java:1155)
    at org.apache.zookeeper.ZooKeeper.getData(ZooKeeper.java:1184)
    at org.apache.nifi.controller.state.providers.zookeeper.ZooKeeperStateProvider.getState(ZooKeeperStateProvider.java:403)
    ... 14 common frames omitted
2020-08-18 07:00:34,965 INFO [Heartbeat Monitor Thread-1] o.a.n.c.c.node.NodeClusterCoordinator Status of w0lxqhdp04.:9091 changed from NodeConnectionStatus[nodeId=w0lxqhdp04.:9091, state=CONNECTED, updateId=10] to NodeConnectionStatus[nodeId=w0lxqhdp04.:9091, state=DISCONNECTED, Disconnect Code=Lack of Heartbeat, Disconnect Reason=Have not received a heartbeat from node in 70 seconds, updateId=11]
2020-08-18 07:00:58,612 WARN [Timer-Driven Process Thread-10] org.apache.hadoop.hdfs.DataStreamer Slow waitForAckedSeqno took 70145ms (threshold=30000ms). File being written: /data/operations/asop/psr_mongo_collab/psr_mongo_collab_2020_08_17, block: BP-1869526027-10.49.70.13-1580132125118:blk_1076124283_64574827, Write pipeline datanodes: [DatanodeInfoWithStorage[10.49.70.16:50010,DS-8d8421c2-b322-48e1-9922-5bdd4614667b,DISK], DatanodeInfoWithStorage[10.49.70.19:50010,DS-7791c66a-538f-4bb9-88e2-95f46ef522c2,DISK], DatanodeInfoWithStorage[10.49.70.13:50010,DS-35a98ef4-1d66-47f9-b29b-594b2a2fb308,DISK]].
2020-08-18 07:00:58,612 WARN [Timer-Driven Process Thread-9] org.apache.hadoop.hdfs.DataStreamer Slow waitForAckedSeqno took 47248ms (threshold=30000ms). File being written: /data/operations/asop/terminatn_status/terminatn_status_2020_08_17, block: BP-1869526027-10.49.70.13-1580132125118:blk_1076124281_64574829, Write pipeline datanodes: [DatanodeInfoWithStorage[10.49.70.16:50010,DS-8d8421c2-b322-48e1-9922-5bdd4614667b,DISK], DatanodeInfoWithStorage[10.49.70.17:50010,DS-d37465a6-e98d-495a-aa2e-d1bac42906d2,DISK], DatanodeInfoWithStorage[10.49.70.18:50010,DS-f28e02f8-bf18-421f-9c24-391f7a2ad881,DISK]].
2020-08-18 07:00:58,617 WARN [Heartbeat Monitor Thread-1] o.a.n.c.c.node.NodeClusterCoordinator Event Reported for w0lxqhdp04.:9091 -- Node disconnected from cluster due to Have not received a heartbeat from node in 70 seconds

Bootstrap log:
Bootstrap Config File: /usr/hdf/current/nifi/conf/bootstrap.conf
2020-08-18 06:44:16,813 ERROR [main] org.apache.nifi.bootstrap.Command Failed to send shutdown command to port 38531 due to java.net.SocketTimeoutException: Read timed out. Will kill the NiFi Process with PID 28486.
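Since the errors point at lost ZooKeeper connections, I am also verifying each quorum member from this NiFi node (hosts and port taken from nifi.zookeeper.connect.string above; 'ruok' is the standard ZooKeeper four-letter health check, though on newer ZooKeeper versions it may need to be whitelisted):

# Check that every ZooKeeper node in the connect string answers from this host
for zk in w0lxqhdp01. w0lxqhdp02. w0lxqhdp03.; do
  echo -n "$zk: "
  echo ruok | nc -w 5 "$zk" 2181 || echo "no response"
done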
08-18-2020
04:00 AM
Hi Team,
We can see that our NiFi has not been working for the last 2 days. We have restarted the NiFi services multiple times, but nothing has worked; the cluster is SSL enabled. Could you please help us fix this?
@mattw
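So far this is what I have been checking (a rough sketch; the log locations assume a default HDF layout, the keystore path is from our nifi.properties, and the certificate-expiry check is only a guess at a possible cause):

# Look at the most recent startup errors on the failing node
tail -n 200 /var/log/nifi/nifi-app.log
tail -n 100 /var/log/nifi/nifi-bootstrap.log

# Since the cluster is SSL enabled, confirm the node certificate has not expired
keytool -list -v -keystore /usr/hdf/current/nifi/conf/keystore.jks | grep -i 'valid from'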
Labels:
- Apache NiFi
- Apache Zookeeper
- NiFi Registry
03-16-2020
01:21 AM
Hi Team,
I can see that the NiFi content storage has reached its maximum threshold. The current storage capacity is 195 GB and it keeps hitting maximum capacity frequently. How can we fix this space issue? Do I need to change any properties in NiFi?
[nifi@w0lxdhdp01 conf]$ du -sh /var/lib/nifi/content_repository/
146G    /var/lib/nifi/content_repository/
[nifi@w0lxdhdp01 conf]$ du -sh /var/lib/nifi/database_repository/
15M     /var/lib/nifi/database_repository/
[nifi@w0lxdhdp01 conf]$ du -sh /var/lib/nifi/flowfile_repository/
103M    /var/lib/nifi/flowfile_repository/
[nifi@w0lxdhdp01 conf]$ du -sh /var/lib/nifi/provenance_repository/
[nifi@w0lxdhdp01 conf]$ cat nifi.properties | grep content
nifi.content.claim.max.appendable.size=1 MB
nifi.content.claim.max.flow.files=100
nifi.content.repository.always.sync=false
nifi.content.repository.archive.enabled=true
nifi.content.repository.archive.max.retention.period=4 hours
nifi.content.repository.archive.max.usage.percentage=30%
nifi.content.repository.directory.default=/var/lib/nifi/content_repository
nifi.content.repository.implementation=org.apache.nifi.controller.repository.FileSystemRepository
nifi.content.viewer.url=../nifi-content-viewer/
nifi.remote.contents.cache.expiration=30 secs
@MattWho
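For reference, this is how I am checking whether the space is held by archived claims (which the archive settings above control) or by live content still queued in the flow (a sketch; if it is live content, lowering the archive retention will not help):

# Total size of the content repository vs. how much of it is archive
du -sh /var/lib/nifi/content_repository
find /var/lib/nifi/content_repository -type d -name archive -exec du -ch {} + | tail -n 1

# The archive behaviour is governed by these properties (current values shown above):
#   nifi.content.repository.archive.max.retention.period
#   nifi.content.repository.archive.max.usage.percentage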
Labels:
- Apache NiFi
03-04-2020
06:25 AM
By "hung", you mean the processor shows active thread(s) in the upper right corner continuously and never produce any ERROR or WARN log output? Active threads show as a small number in parenthesis (1). @MattWho exactly we are getting the same, we do not know for which process that we need to take thread dump and inspect it. Could you please guide us to fix this issue.