Member since
04-06-2019
125
Posts
2
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 847 | 03-07-2017 04:15 PM
12-01-2019
05:45 PM
Hi Rehman, May I know how many blocks you have in your cluster? It looks like an issue with the heap size for the NameNode service. Please refer to the NameNode heap calculation below and configure it accordingly.

One NameNode object uses about 150 bytes to store metadata. Assume a 128 MB block size; you should increase the block size if you have a lot of data (PB scale, or even 500+ TB in some cases). Assume a file size of 150 MB. The file will be split into two blocks: the first of 128 MB and the second of 22 MB. For this file the NameNode stores 1 file inode and 2 blocks, i.e. 3 NameNode objects, which take about 450 bytes.

By contrast, at a 1 MB block size the same file would have 150 blocks. One inode plus 150 blocks gives 151 NameNode objects for the same data: 151 x 150 bytes = 22,650 bytes. Even worse would be 150 files of 1 MB each: 150 inodes and 150 blocks = 300 x 150 bytes = 45,000 bytes. See how this all changes; that's why we don't recommend small files for Hadoop. Assuming 128 MB blocks, on average 1 GB of memory is required per 1 million blocks.

Now let's do this calculation at PB scale. Assume 6000 TB of data (that's a lot) and 30 TB of capacity per node, which requires 200 nodes, with a 128 MB block size and a replication factor of 3. Cluster capacity in MB = 30 x 1000 (convert to GB) x 1000 (convert to MB) x 200 nodes = 6,000,000,000 MB (6000 TB). How many blocks can we store in this cluster? 6,000,000,000 MB / 128 MB = 46,875,000 (about 47 million blocks). At 1 GB of memory per million blocks, you need a mere 46,875,000 blocks / 1,000,000 blocks per GB ≈ 47 GB of memory. NameNodes with 64-128 GB of memory are quite common.
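A minimal Python sketch of the same rule of thumb, using the approximate constants from the calculation above (these are planning figures, not exact Cloudera sizing numbers):

```python
# Rough NameNode heap estimate based on the ~1 GB per 1 million blocks rule of thumb.

GB_PER_MILLION_BLOCKS = 1  # approximation used in the calculation above

def estimate_namenode_heap_gb(total_data_tb: float, block_size_mb: int = 128) -> float:
    """Return an approximate NameNode heap requirement in GB."""
    total_data_mb = total_data_tb * 1000 * 1000          # TB -> MB (decimal, as above)
    blocks = total_data_mb / block_size_mb               # number of blocks to track
    return blocks / 1_000_000 * GB_PER_MILLION_BLOCKS    # ~1 GB per million blocks

if __name__ == "__main__":
    # The 6000 TB / 128 MB example from the post: ~46.9 million blocks, ~47 GB heap.
    print(f"{estimate_namenode_heap_gb(6000):.1f} GB")
```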
... View more
04-06-2019
03:09 AM
Can you share a few more details so we can help further: the Cloudera bundle version, the ZooKeeper version, and the complete logs so we can cross-check the error messages? With Regards, Anishkumar Valsalam
... View more
04-06-2018
05:27 PM
@Matt Clarke Can you explain how we can confirm that the remaining threads are being used by other components, e.g. controller services / reporting tasks? An explanation would really help to close this thread.
... View more
04-05-2018
05:09 PM
@Matt Clarke How can we confirm that the remaining threads are being used by other components, e.g. controller services / reporting tasks?
... View more
04-05-2018
04:57 PM
@Matt Clarke We haven't done any recent updates, and we have also tried a restart, but we are still seeing a difference between the thread values in the status bar: what we actually configured versus what is actually running. Why such a drastic difference? Please help to close this.
... View more
04-05-2018
03:49 AM
We are running only two process groups, and the overall usage you can see is in those 2 PGs (8 + 4). HDF 3.0.1.0-43. Maximum Timer Driven Thread Count = 100, Max Event Driven Thread Count = 5. We have two nodes, but they are not configured as a cluster.
... View more
04-04-2018
04:51 PM
Hi Team, How are NiFi threads accounted for? The screenshot shows 68 threads in use in total, but the actual usage by the processors is only 12, so how do we find the details of the remaining threads? Kindly help with this.
... View more
Labels:
- Apache NiFi
03-27-2018
06:04 PM
@Matt Clarke Hi Matt, We are having an internal debate on whether it's a good idea to run production NiFi on VMs and need your input on this. What are the pros and cons of each hardware configuration?
... View more
03-27-2018
05:53 PM
Hi Team, Why is a virtual machine not advisable for a NiFi production environment?
... View more
Labels:
- Apache NiFi
02-22-2018
04:29 PM
Hi Team, We are planning to move some non-HDF NiFi instances onto HDF nodes. Before moving, we want to measure the current utilization of the non-HDF NiFi for better planning. How do we get those stats, for example how much memory and how many cores NiFi used over the last day?
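One low-effort way to collect that baseline is sketched below, under the assumption that OS-level numbers are sufficient and that the NiFi JVM can be identified by matching "org.apache.nifi" in the `ps` output; the output path and interval are placeholders to adjust for your environment:

```python
#!/usr/bin/env python3
"""Sample the NiFi JVM's CPU% and resident memory via `ps` and append to a CSV."""
import csv
import subprocess
import time

PATTERN = "org.apache.nifi"     # assumption: how the NiFi JVM shows up in `ps` output
INTERVAL_SECONDS = 60           # one sample per minute
OUTPUT_FILE = "nifi_usage.csv"  # hypothetical output path

def sample():
    """Return (cpu_percent, rss_kb) for the first matching NiFi process, or None."""
    out = subprocess.run(["ps", "-eo", "pid,pcpu,rss,args"],
                         capture_output=True, text=True, check=True).stdout
    for line in out.splitlines()[1:]:
        if PATTERN in line:
            _pid, pcpu, rss = line.split(None, 3)[:3]
            return float(pcpu), int(rss)
    return None

if __name__ == "__main__":
    with open(OUTPUT_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            reading = sample()
            if reading is not None:
                writer.writerow([int(time.time()), reading[0], reading[1]])
                f.flush()
            time.sleep(INTERVAL_SECONDS)
```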
... View more
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)
11-09-2017
03:46 PM
Hi Team, Please advise on the above issue; it will be helpful for others too.
... View more
11-08-2017
04:59 PM
@Matt Clarke Hi Matt, can you guide me to resolve this issue?
... View more
11-08-2017
04:57 PM
Hi Team, Is there any option to allocate resources to a NiFi process group? As you can see in the image below, we set "Maximum Timer Driven Thread Count" to 30 cores, and the processor shown below took all 30 of them by itself, because of which the other PGs are not processing. Is there any option in NiFi to allocate resources to a PG or to set priorities? Please guide me so we can close this issue.
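For illustration only, here is a toy Python analogy (not NiFi code) of a shared fixed-size worker pool like the timer-driven thread pool: when one "processor" is configured to use all 30 workers at once, work from another "process group" has to wait for a free worker, which matches the behaviour described above:

```python
from concurrent.futures import ThreadPoolExecutor
import time

POOL_SIZE = 30  # stands in for "Maximum Timer Driven Thread Count" = 30

def long_task(name):
    time.sleep(2)            # the busy "processor" holds its worker for a while
    return f"{name} done"

def quick_task(name):
    return f"{name} done"    # trivial work from another "process group"

with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
    start = time.time()
    # "Processor A" grabs all 30 workers at once.
    busy = [pool.submit(long_task, f"A-{i}") for i in range(POOL_SIZE)]
    # "Processor B" is submitted next but must wait for a worker to free up.
    waiting = pool.submit(quick_task, "B-0")
    waiting.result()
    print(f"B-0 finished after {time.time() - start:.1f}s despite being trivial")
```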
... View more
Labels:
- Apache NiFi
11-01-2017
06:38 PM
Hi Abdelkrim, Thanks, it's working now. We can do the same in standalone mode as well, right? With the RPG it is also pulling from only one node; how can we process from multiple nodes and balance the load? And why do we need to put the input port outside the process group, at the root level?
... View more
11-01-2017
06:09 PM
@Abdelkrim Hadjidj Thanks. When a new file arrives it is processed and fetched to HDFS, but I am also getting the error below in FetchSFTP.
2017-11-02 01:56:41,942 ERROR [Timer-Driven Process Thread-5] o.a.nifi.processors.standard.FetchSFTP FetchSFTP[id=788c7b35-015f-1000-0000-00000f3979fd] Failed to fetch content for StandardFlowFileRecord[uuid=af918b98-5632-4877-9876-221477204e73,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1509553663114-11, container=default, section=11], offset=677691, length=0],offset=0,name=a.txt,size=0] from filename /home/1548691/*.txt on remote host 10.20.174.137:22 due to java.io.IOException: Failed to obtain file content for /home/1548691/*.txt; routing to failure: java.io.IOException: Failed to obtain file content for /home/1548691/*.txt
2017-11-02 01:56:41,943 ERROR [Timer-Driven Process Thread-5] o.a.nifi.processors.standard.FetchSFTP
java.io.IOException: Failed to obtain file content for /home/1548691/*.txt
at org.apache.nifi.processors.standard.util.SFTPTransfer.getInputStream(SFTPTransfer.java:300) ~[nifi-standard-processors-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.processors.standard.FetchFileTransfer.onTrigger(FetchFileTransfer.java:236) ~[nifi-standard-processors-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1099) [nifi-framework-core-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_60]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_60]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_60]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_60]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]
Caused by: com.jcraft.jsch.SftpException: /home/1548691/*.txt is not unique: [/home/1548691/b.txt, /home/1548691/HKLPATHAS03.txt, /home/1548691/a.txt]
at com.jcraft.jsch.ChannelSftp.isUnique(ChannelSftp.java:2965) ~[jsch-0.1.54.jar:na]
at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:1314) ~[jsch-0.1.54.jar:na]
at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:1290) ~[jsch-0.1.54.jar:na]
at org.apache.nifi.processors.standard.util.SFTPTransfer.getInputStream(SFTPTransfer.java:292) ~[nifi-standard-processors-1.1.0.2.1.1.0-2.jar:1.1.0.2.1.1.0-2]
... 13 common frames omitted
Also, how do we process the whole set of files instead of only *.txt? And it is picking up files from only one node; how can we pick from all 3 nodes?
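For what it's worth, the `is not unique` message above means the wildcard resolved to several remote files, while FetchSFTP expects a single fully qualified filename (typically `${path}/${filename}` supplied by a ListSFTP processor). A minimal sketch of what the pattern actually matches, using paramiko with placeholder credentials (not the real ones), just to illustrate the ambiguity:

```python
#!/usr/bin/env python3
"""List every remote file a wildcard matches; more than one match is why a
single FetchSFTP of /home/1548691/*.txt is ambiguous and routes to failure."""
import fnmatch
import posixpath
import paramiko  # assumed available: pip install paramiko

HOST, PORT = "10.20.174.137", 22             # host/port from the error message
USER, PASSWORD = "someuser", "somepassword"  # placeholders, not real credentials
REMOTE_DIR, PATTERN = "/home/1548691", "*.txt"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, port=PORT, username=USER, password=PASSWORD)
sftp = client.open_sftp()

# Expand the wildcard ourselves to see how many files it really refers to.
matches = [posixpath.join(REMOTE_DIR, name)
           for name in sftp.listdir(REMOTE_DIR)
           if fnmatch.fnmatch(name, PATTERN)]
print(f"{PATTERN!r} matches {len(matches)} file(s): {matches}")

sftp.close()
client.close()
```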
... View more
11-01-2017
05:52 PM
Yes, I have data in there:
[root@HKLPATHAS03 1548691]# ll
total 4
-rw-r--r-- 1 1548691 root 0 Nov 2 00:38 HKLPATHAS03.txt
drwxr-xr-x 9 1548691 domain users 4096 Nov 2 00:27 nifi-toolkit-1.1.2
See, it is not picking that file up to HDFS:
[root@HKLPATHAS03 1548691]# sudo -u hdfs hadoop fs -ls /user/1548691/HKLPATHAS03.txt
ls: `/user/1548691/HKLPATHAS03.txt': No such file or directory
[root@HKLPATHAS03 1548691]#
... View more
11-01-2017
05:45 PM
Hi Team, I am trying to configure an RPG in my cluster environment (3 nodes). How do I distribute the data between the 3 nodes? The processor below is what I am trying with the RPG, but it is not processing at all; the details are attached FYR. Kindly help me to close this thread.
... View more
Labels:
- Apache NiFi
11-01-2017
03:20 PM
@Abdelkrim Hadjidj Thanks, I am able to create the policy now. I had only checked the default flow policy, so it was not enabled; on the input port policy it is enabled. In your flow I can see this is between two clusters: https://community.hortonworks.com/articles/88473/site-to-site-communication-between-secured-https-a.html How can we test the same between 3 instances (a 3-node cluster)? I am looking for sample flows to test the RPG in the cluster. Thanks Matt and Abdelkrim.
... View more
10-31-2017
03:30 PM
Hi @Matt Clarke @Abdelkrim Hadjidj, thanks for the detailed note, but the two options below are still not enabled even though we created the policies (Retrieve Site-To-Site Details) for the nodes: -> receive data via site-to-site -> send data via site-to-site. Also, please advise: on the standalone instance we are facing some load (memory / CPU). In a cluster, would we face the same kind of issue, or would it distribute the data and processing so that the memory/CPU load issue above can be avoided? Is my understanding correct?
... View more
10-21-2017
06:16 PM
@Bryan Bende @Pierre Villard Need your advice on this issue.
... View more
10-21-2017
05:49 PM
@Matt Clarke need your help on this.
... View more
10-21-2017
05:48 PM
Hi Team, After upgrading to HDF 3.0.1, NiFi is not starting and we are getting the error below during startup. Kindly help to fix this error.
/loginIdentityProviders>
2017/10/22 01:42:07 ERROR [main] org.apache.nifi.properties.ConfigEncryptionTool: Encountered an error
javax.crypto.BadPaddingException: pad block corrupted
at org.bouncycastle.jcajce.provider.symmetric.util.BaseBlockCipher$BufferedGenericBlockCipher.doFinal(Unknown Source)
at org.bouncycastle.jcajce.provider.symmetric.util.BaseBlockCipher.engineDoFinal(Unknown Source)
at javax.crypto.Cipher.doFinal(Cipher.java:2165)
at javax.crypto.Cipher$doFinal$2.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at org.apache.nifi.properties.ConfigEncryptionTool.decryptFlowElement(ConfigEncryptionTool.groovy:541)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:384)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:190)
at org.apache.nifi.properties.ConfigEncryptionTool$_migrateFlowXmlContent_closure4.doCall(ConfigEncryptionTool.groovy:636)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at groovy.lang.Closure.call(Closure.java:426)
at groovy.lang.Closure.call(Closure.java:442)
at org.codehaus.groovy.runtime.StringGroovyMethods.getReplacement(StringGroovyMethods.java:1543)
at org.codehaus.groovy.runtime.StringGroovyMethods.replaceAll(StringGroovyMethods.java:2580)
at org.codehaus.groovy.runtime.StringGroovyMethods.replaceAll(StringGroovyMethods.java:2506)
at org.codehaus.groovy.runtime.dgm$1127.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:274)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133)
at org.apache.nifi.properties.ConfigEncryptionTool.migrateFlowXmlContent(ConfigEncryptionTool.groovy:635)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:71)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.apache.nifi.properties.ConfigEncryptionTool.main(ConfigEncryptionTool.groovy:1184)
pad block corrupted
... View more
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)
10-12-2017
04:39 PM
1 Kudo
Hi Team, We are seeing a weird error in HDF 3.0.1.0-43: NiFi is not starting via Ambari, but it starts without any issues from the command line. Something is failing in the encryption step; kindly guide me to resolve this thread.
resource_management.core.exceptions.ExecutionFailed: Execution of 'JAVA_HOME=/usr/jdk64/jdk1.8.0_60 /var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/files/nifi-toolkit-1.2.0.3.0.1.0-43/bin/encrypt-config.sh -v -b /usr/hdf/current/nifi/conf/bootstrap.conf -n /usr/hdf/current/nifi/conf/nifi.properties -f /data/nifi/conf/flow.xml.gz -s '[PROTECTED]' -l /usr/hdf/current/nifi/conf/login-identity-providers.xml -p '[PROTECTED]'' returned 1. stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 309, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 850, in restart
self.start(env, upgrade_type=upgrade_type)
File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 177, in start
self.configure(env, is_starting = True)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 150, in configure
params.nifi_flow_config_dir, params.nifi_sensitive_props_key, is_starting)
File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 304, in encrypt_sensitive_properties
Execute(encrypt_config_script_prefix, user=nifi_user,logoutput=False)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'JAVA_HOME=/usr/jdk64/jdk1.8.0_60 /var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/files/nifi-toolkit-1.2.0.3.0.1.0-43/bin/encrypt-config.sh -v -b /usr/hdf/current/nifi/conf/bootstrap.conf -n /usr/hdf/current/nifi/conf/nifi.properties -f /data/nifi/conf/flow.xml.gz -s '[PROTECTED]' -l /usr/hdf/current/nifi/conf/login-identity-providers.xml -p '[PROTECTED]'' returned 1.
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Handling encryption of login-identity-providers.xml
2017/10/12 23:54:22 WARN [main] org.apache.nifi.properties.ConfigEncryptionTool: The source login-identity-providers.xml and destination login-identity-providers.xml are identical [/usr/hdf/current/nifi/conf/login-identity-providers.xml] so the original will be overwritten
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Handling encryption of nifi.properties
2017/10/12 23:54:22 WARN [main] org.apache.nifi.properties.ConfigEncryptionTool: The source nifi.properties and destination nifi.properties are identical [/usr/hdf/current/nifi/conf/nifi.properties] so the original will be overwritten
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Handling encryption of flow.xml.gz
2017/10/12 23:54:22 WARN [main] org.apache.nifi.properties.ConfigEncryptionTool: The source flow.xml.gz and destination flow.xml.gz are identical [/data/nifi/conf/flow.xml.gz] so the original will be overwritten
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: bootstrap.conf: /usr/hdf/current/nifi/conf/bootstrap.conf
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (src) nifi.properties: /usr/hdf/current/nifi/conf/nifi.properties
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (dest) nifi.properties: /usr/hdf/current/nifi/conf/nifi.properties
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (src) login-identity-providers.xml: /usr/hdf/current/nifi/conf/login-identity-providers.xml
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (dest) login-identity-providers.xml: /usr/hdf/current/nifi/conf/login-identity-providers.xml
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (src) flow.xml.gz: /data/nifi/conf/flow.xml.gz
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: (dest) flow.xml.gz: /data/nifi/conf/flow.xml.gz
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.NiFiPropertiesLoader: Loaded 128 properties from /usr/hdf/current/nifi/conf/nifi.properties
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.NiFiPropertiesLoader: Loaded 128 properties from /usr/hdf/current/nifi/conf/nifi.properties
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Loaded NiFiProperties instance with 128 properties
2017/10/12 23:54:22 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Loaded LoginIdentityProviders content (108 lines)
2017/10/12 23:54:22 WARN [main] org.apache.nifi.properties.AESSensitivePropertyProvider: JCE Unlimited Strength Cryptography Jurisdiction policies are not available, so the max key length is 128 bits
2017/10/12 23:54:22 WARN [main] org.apache.nifi.properties.AESSensitivePropertyProvider: JCE Unlimited Strength Cryptography Jurisdiction policies are not available, so the max key length is 128 bits
2017/10/12 23:54:23 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: No encrypted password property elements found in login-identity-providers.xml
2017/10/12 23:54:23 WARN [main] org.apache.nifi.properties.AESSensitivePropertyProvider: JCE Unlimited Strength Cryptography Jurisdiction policies are not available, so the max key length is 128 bits
2017/10/12 23:54:23 WARN [main] org.apache.nifi.properties.AESSensitivePropertyProvider: JCE Unlimited Strength Cryptography Jurisdiction policies are not available, so the max key length is 128 bits
2017/10/12 23:54:23 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Attempting to encrypt property
2017/10/12 23:54:23 INFO [main] org.apache.nifi.properties.AESSensitivePropertyProvider: AES Sensitive Property Provider encrypted a sensitive value successfully
2017/10/12 23:54:23 INFO [main] org.apache.nifi.properties.ConfigEncryptionTool: Updated XML content: <?xml version="1.0" encoding="UTF-8"?><loginIdentityProviders>
<provider>
<identifier>ldap-provider</identifier>
<class>org.apache.nifi.ldap.LdapProvider</class>
<property name="Identity Strategy">USE_DN</property>
<property name="Authentication Strategy">SIMPLE</property>
<property name="Manager DN">svc.haasdev.001@zone1.scbdev.net</property>
<property name="Manager Password" encryption="aes/gcm/128">cYAgEgKZAsfFiVjR||jB6aUU2eBkjpqUx2q7DvmE3AclRJZPdkKvQ4</property>
<property name="TLS - Keystore"/>
<property name="TLS - Keystore Password"/>
<property name="TLS - Keystore Type"/>
<property name="TLS - Truststore"/>
<property name="TLS - Truststore Password"/>
<property name="TLS - Truststore Type"/>
<property name="TLS - Client Auth"/>
<property name="TLS - Protocol">TLS</property>
<property name="TLS - Shutdown Gracefully"/>
<property name="Referral Strategy">FOLLOW</property>
<property name="Connect Timeout">10 secs</property>
<property name="Read Timeout">10 secs</property>
<property name="Url">ldap://HKWVADIDM05.zone1.scbdev.net:389</property>
<property name="User Search Base">OU=ITSC,dc=zone1,dc=scbdev,dc=net</property>
<property name="Identity Strategy">USE_USERNAME</property>
<property name="User Search Filter">sAMAccountName={0}</property>
<property name="Authentication Expiration">12 hours</property>
</provider>
</loginIdentityProviders>
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:137)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:121)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:569)
at java.lang.StringBuilder.append(StringBuilder.java:190)
at org.apache.commons.io.output.StringBuilderWriter.write(StringBuilderWriter.java:143)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:2370)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:2348)
at org.apache.commons.io.IOUtils.copy(IOUtils.java:2325)
at org.apache.commons.io.IOUtils.copy(IOUtils.java:2273)
at org.apache.commons.io.IOUtils.toString(IOUtils.java:1041)
at org.apache.commons.io.IOUtils$toString.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133)
at org.apache.nifi.properties.ConfigEncryptionTool$_loadFlowXml_closure2$_closure19.doCall(ConfigEncryptionTool.groovy:488)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at groovy.lang.Closure.call(Closure.java:426)
at groovy.lang.Closure.call(Closure.java:442)
at org.codehaus.groovy.runtime.IOGroovyMethods.withCloseable(IOGroovyMethods.java:1622)
at org.codehaus.groovy.runtime.NioGroovyMethods.withCloseable(NioGroovyMethods.java:1754)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
stdout:
2017-10-12 23:53:50,252 - Group['ranger'] {}
2017-10-12 23:53:50,253 - Group['hadoop'] {}
2017-10-12 23:53:50,253 - Group['nifi'] {}
2017-10-12 23:53:50,253 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-12 23:53:50,255 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-12 23:53:50,255 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-12 23:53:50,256 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-12 23:53:50,257 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-10-12 23:53:50,257 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger']}
2017-10-12 23:53:50,258 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'nifi']}
2017-10-12 23:53:50,259 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-12 23:53:50,260 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-10-12 23:53:50,265 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-10-12 23:53:50,278 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-10-12 23:53:50,284 - Skipping Execute[('setenforce', '0')] due to not_if
2017-10-12 23:53:50,501 - Stack Feature Version Info: stack_version=3.0, version=3.0.1.0-43, current_cluster_version=3.0.1.0-43 -> 3.0.1.0-43
2017-10-12 23:53:50,546 - File['/usr/hdf/current/nifi/bin/nifi-env.sh'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0755}
2017-10-12 23:53:50,548 - Execute['export JAVA_HOME=/usr/jdk64/jdk1.8.0_60;/usr/hdf/current/nifi/bin/nifi.sh stop >> /data/nifi/log/nifi-setup.log'] {'user': 'nifi'}
2017-10-12 23:54:20,757 - Pid file /var/run/nifi/nifi.pid is empty or does not exist
2017-10-12 23:54:20,759 - Directory['/var/run/nifi'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,759 - Directory['/data/nifi/log'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,763 - Directory['/data/nifi'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,936 - Directory['/data/nifi/database_repository'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,937 - Directory['/data/nifi/flowfile_repository'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,947 - Directory['/data/nifi/provenance_repository'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,957 - Directory['/usr/hdf/current/nifi/conf'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,958 - Directory['/data/nifi/conf'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,958 - Directory['/data/nifi/state/local'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,959 - Directory['/usr/hdf/current/nifi/lib'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,961 - Directory['{{nifi_content_repo_dir_default}}'] {'owner': 'nifi', 'create_parents': True, 'group': 'nifi', 'recursive_ownership': True}
2017-10-12 23:54:20,962 - Directory['/data/nifi/content_repository'] {'owner': 'nifi', 'group': 'nifi', 'create_parents': True, 'recursive_ownership': True}
2017-10-12 23:54:21,027 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-10-12 23:54:21,030 - File['/etc/security/limits.d/nifi.conf'] {'content': Template('nifi.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2017-10-12 23:54:21,174 - PropertiesFile['/usr/hdf/current/nifi/conf/nifi.properties'] {'owner': 'nifi', 'group': 'nifi', 'mode': 0600, 'properties': ...}
2017-10-12 23:54:21,178 - Generating properties file: /usr/hdf/current/nifi/conf/nifi.properties
2017-10-12 23:54:21,178 - File['/usr/hdf/current/nifi/conf/nifi.properties'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0600}
2017-10-12 23:54:21,258 - Writing File['/usr/hdf/current/nifi/conf/nifi.properties'] because contents don't match
2017-10-12 23:54:21,261 - File['/usr/hdf/current/nifi/conf/bootstrap.conf'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0600}
2017-10-12 23:54:21,265 - File['/usr/hdf/current/nifi/conf/logback.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0400}
2017-10-12 23:54:21,268 - File['/usr/hdf/current/nifi/conf/state-management.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0400}
2017-10-12 23:54:21,273 - File['/usr/hdf/current/nifi/conf/authorizers.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0400}
2017-10-12 23:54:21,277 - File['/usr/hdf/current/nifi/conf/login-identity-providers.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0600}
2017-10-12 23:54:21,278 - File['/usr/hdf/current/nifi/bin/nifi-env.sh'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0755}
2017-10-12 23:54:21,280 - File['/usr/hdf/current/nifi/conf/bootstrap-notification-services.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi', 'mode': 0400}
2017-10-12 23:54:21,281 - Encrypting NiFi sensitive configuration properties
2017-10-12 23:54:21,281 - File['/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/files/nifi-toolkit-1.2.0.3.0.1.0-43/bin/encrypt-config.sh'] {'mode': 0755}
2017-10-12 23:54:21,429 - Execute[(u'JAVA_HOME=/usr/jdk64/jdk1.8.0_60', '/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/files/nifi-toolkit-1.2.0.3.0.1.0-43/bin/encrypt-config.sh', '-v', '-b', u'/usr/hdf/current/nifi/conf/bootstrap.conf', '-n', u'/usr/hdf/current/nifi/conf/nifi.properties', '-f', u'/data/nifi/conf/flow.xml.gz', '-s', [PROTECTED], '-l', u'/usr/hdf/current/nifi/conf/login-identity-providers.xml', '-p', [PROTECTED])] {'logoutput': False, 'user': 'nifi'}
Command failed after 1 tries
... View more
Labels:
- Apache NiFi
- Cloudera DataFlow (CDF)