<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148159#M110688</link>
    <description>&lt;P&gt;Hello everyone, I have been following the same tutorial, but while installing NiFi (adding it as a service in Ambari, Sandbox 2.6) I ran into a problem. Would you mind helping me, please?&lt;/P&gt;&lt;P&gt;The install fails with:&lt;/P&gt;&lt;PRE&gt;resource_management.core.exceptions.ExecutionFailed: Execution of 'wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log' returned 8.
Command failed after 1 tries&lt;/PRE&gt;&lt;P&gt;The full stderr and stdout are in the matching reply in this thread.&lt;/P&gt;</description>
    <pubDate>Fri, 01 Jun 2018 20:40:33 GMT</pubDate>
    <dc:creator>ibrahimadofall</dc:creator>
    <dc:date>2018-06-01T20:40:33Z</dc:date>
    <item>
      <title>issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148152#M110681</link>
      <description>&lt;P&gt;Hi, I've been following this tutorial: &lt;A href="https://community.hortonworks.com/articles/1282/sample-hdfnifi-flow-to-push-tweets-into-solrbanana.html" target="_blank"&gt;https://community.hortonworks.com/articles/1282/sample-hdfnifi-flow-to-push-tweets-into-solrbanana.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;and I ran into some trouble when I tried to create the collection called tweets. I'm getting this error message; any ideas?&lt;/P&gt;&lt;PRE&gt;Exception during parsing file: solrconfig.xml:org.xml.sax.SAXParseException; systemId: solrres:/solrconfig.xml; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.&lt;/PRE&gt;</description>
      <pubDate>Thu, 18 Feb 2016 21:47:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148152#M110681</guid>
      <dc:creator>lubinlemarchand</dc:creator>
      <dc:date>2016-02-18T21:47:21Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148153#M110682</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/2855/lubinlemarchand.html" nodeid="2855"&gt;@Lubin Lemarchand&lt;/A&gt;&lt;P&gt;Try editing the Solr XML file with vi; if you edited it with Notepad, it tends to add stray invisible characters. Just validate your XML file.&lt;/P&gt;</description>
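A minimal sketch of that check, using only coreutils. The file under /tmp is a deliberately broken stand-in, not the real solrconfig.xml; in the octal escapes, \357\273\277 is a UTF-8 byte-order mark (hex ef bb bf) and \074 is the opening XML bracket (hex 3c):

```shell
# Recreate the failure: write a BOM directly in front of the XML declaration.
printf '\357\273\277\074?xml version="1.0" encoding="UTF-8" ?>\n\074config/>\n' > /tmp/solrconfig.xml

# Dump the first bytes. A clean file starts with hex 3c; here the BOM
# shows up first, which is exactly what triggers the SAX error
# "Content is not allowed in prolog" at line 1, column 1.
head -c 3 /tmp/solrconfig.xml | od -An -tx1
```

If libxml2's xmllint is available, running `xmllint --noout` on the real file reports the same prolog problem that Solr's parser does.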
      <pubDate>Thu, 18 Feb 2016 22:15:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148153#M110682</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-02-18T22:15:52Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148154#M110683</link>
      <description>&lt;P&gt;Thank you for your answer. Sorry to bother you with this kind of thing, but when I just type the path of the XML file, I get the message: line 1: syntax error near unexpected token 'newline'.&lt;/P&gt;&lt;P&gt;When I edit the file with vi, the first line is:&lt;/P&gt;&lt;PRE&gt;&amp;lt;?xml version="1.0" encoding="UTF-8" ?&amp;gt;&lt;/PRE&gt;&lt;P&gt;Am I missing something?&lt;/P&gt;</description>
      <pubDate>Thu, 18 Feb 2016 23:00:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148154#M110683</guid>
      <dc:creator>lubinlemarchand</dc:creator>
      <dc:date>2016-02-18T23:00:18Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148155#M110684</link>
      <description>&lt;P&gt;Windows and Linux tend to use different newline characters; you could try dos2unix to clean the file up.&lt;/P&gt;</description>
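dos2unix fixes the CR/LF endings, but depending on the version it may leave a UTF-8 BOM in place, and the BOM alone is enough to cause "Content is not allowed in prolog". A hedged sketch that handles both by hand with coreutils, on a throwaway stand-in file (\357\273\277 is the BOM, \074 is the opening XML bracket, hex 3c):

```shell
# A stand-in file with both problems: a UTF-8 BOM plus CRLF line endings.
printf '\357\273\277\074?xml version="1.0" ?>\r\n\074config/>\r\n' > /tmp/conf.xml

# Strip the BOM if present by dropping the first three bytes.
if [ "$(head -c 3 /tmp/conf.xml | od -An -tx1 | tr -d ' ')" = "efbbbf" ]; then
  tail -c +4 /tmp/conf.xml > /tmp/conf.xml.bom
  mv /tmp/conf.xml.bom /tmp/conf.xml
fi

# Then convert CRLF to LF, which is what dos2unix does.
cat /tmp/conf.xml | tr -d '\r' > /tmp/conf.xml.lf
mv /tmp/conf.xml.lf /tmp/conf.xml

# The file now starts with the plain XML declaration (bytes 3c 3f 78 6d 6c).
head -c 5 /tmp/conf.xml
```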
      <pubDate>Fri, 19 Feb 2016 00:38:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148155#M110684</guid>
      <dc:creator>dchaffey</dc:creator>
      <dc:date>2016-02-19T00:38:40Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148156#M110685</link>
      <description>&lt;P&gt;Alright, so I typed:&lt;/P&gt;&lt;PRE&gt;dos2unix -o /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml&lt;/PRE&gt;&lt;P&gt;which gives me&lt;/P&gt;&lt;PRE&gt;dos2unix: converting [file path] to UNIX format...&lt;/PRE&gt;&lt;P&gt;yet I get the same error message when I type the file path or when I try to create the collection.&lt;/P&gt;</description>
      <pubDate>Fri, 19 Feb 2016 17:15:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148156#M110685</guid>
      <dc:creator>lubinlemarchand</dc:creator>
      <dc:date>2016-02-19T17:15:42Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148157#M110686</link>
      <description>&lt;P&gt;Please paste the snippet of the XML file where you pasted the requested code. &lt;A rel="user" href="https://community.cloudera.com/users/2855/lubinlemarchand.html" nodeid="2855"&gt;@Lubin Lemarchand&lt;/A&gt; you probably didn't close a tag, or there is stray content outside the tags.&lt;/P&gt;</description>
      <pubDate>Fri, 19 Feb 2016 19:42:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148157#M110686</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-02-19T19:42:12Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148158#M110687</link>
      <description>&lt;P&gt;It appears I solved the problem. For those who might one day be in the same situation, I will do my best to explain what happened: there was a mistake in the solrconfig.xml file the first time I tried to create the collection (the '&amp;lt;' was missing on line 1). That broken version of solrconfig.xml had already been loaded into ZooKeeper which, from what I understand, is what SolrCloud actually reads. To solve the problem I had to correct the XML and then &lt;STRONG&gt;push the corrected config into ZooKeeper&lt;/STRONG&gt; using the command:&lt;/P&gt;&lt;PRE&gt;/opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh -zkhost 192.xxx.xx.xx:2181 -cmd upconfig -confname tweets -confdir /opt/lucidworks-hdpsearch/solr/server/solr/configsets/basic_configs/conf&lt;/PRE&gt;</description>
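To confirm the fix actually landed, the same zkcli.sh offers a downconfig command that pulls a config set back out of ZooKeeper, so you can check that the copy SolrCloud will read now begins with the XML declaration. The zkcli.sh path and ZooKeeper host below are the ones from this thread; the scratch directory is mine, and this is an unverified sketch to adapt to your environment:

```shell
# Pull the "tweets" config set back out of ZooKeeper into a scratch dir.
/opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh \
  -zkhost 192.xxx.xx.xx:2181 -cmd downconfig \
  -confname tweets -confdir /tmp/tweets-conf-check

# The downloaded solrconfig.xml should begin with the XML declaration
# (bytes 3c 3f 78 6d 6c), with nothing in front of it.
head -c 5 /tmp/tweets-conf-check/solrconfig.xml
```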
      <pubDate>Mon, 22 Feb 2016 18:50:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148158#M110687</guid>
      <dc:creator>lubinlemarchand</dc:creator>
      <dc:date>2016-02-22T18:50:01Z</dc:date>
    </item>
    <item>
      <title>Re: issue following Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive</title>
      <link>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148159#M110688</link>
      <description>&lt;P&gt;Hello everyone, I have been following the same tutorial, but while installing NiFi (adding it as a service in Ambari, Sandbox 2.6) I ran into this problem. Would you mind helping me, please?&lt;/P&gt;&lt;P&gt;stderr:&lt;/P&gt;&lt;P&gt;
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 131, in &amp;lt;module&amp;gt;
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 40, in install
    Execute('wget '+params.snapshot_package+' -O '+params.temp_file+' -a '  + params.nifi_log_file, user=params.nifi_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log' returned 8.
 stdout:
2018-05-31 17:59:52,083 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -&amp;gt; 2.6
2018-05-31 17:59:52,083 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-31 17:59:52,085 - Group['livy'] {}
2018-05-31 17:59:52,088 - Group['spark'] {}
2018-05-31 17:59:52,089 - Group['ranger'] {}
2018-05-31 17:59:52,089 - Group['hdfs'] {}
2018-05-31 17:59:52,089 - Group['zeppelin'] {}
2018-05-31 17:59:52,089 - Group['hadoop'] {}
2018-05-31 17:59:52,090 - Group['nifi'] {}
2018-05-31 17:59:52,090 - Adding group Group['nifi']
2018-05-31 17:59:52,126 - Group['users'] {}
2018-05-31 17:59:52,130 - Group['knox'] {}
2018-05-31 17:59:52,131 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,132 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,132 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,133 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,134 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,135 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,136 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,137 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger'], 'uid': None}
2018-05-31 17:59:52,138 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,139 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-05-31 17:59:52,140 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,140 - Adding user User['nifi']
2018-05-31 17:59:52,196 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,197 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,198 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,200 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,201 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,203 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-31 17:59:52,204 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,205 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,206 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,208 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,209 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,211 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,212 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,216 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-05-31 17:59:52,243 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-05-31 17:59:52,243 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-05-31 17:59:52,247 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,249 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,250 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-05-31 17:59:52,278 - call returned (0, '1002')
2018-05-31 17:59:52,279 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-05-31 17:59:52,318 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
2018-05-31 17:59:52,319 - Group['hdfs'] {}
2018-05-31 17:59:52,319 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-05-31 17:59:52,320 - FS Type: 
2018-05-31 17:59:52,320 - Directory['/etc/hadoop'] {'mode': 0755}
2018-05-31 17:59:52,336 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-05-31 17:59:52,337 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2018-05-31 17:59:52,338 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-05-31 17:59:52,352 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-31 17:59:52,361 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-05-31 17:59:52,365 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-05-31 17:59:52,365 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos6/2.x/updates/2.6.4.0 is not created due to its tags: set(['GPL'])
2018-05-31 17:59:52,366 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-31 17:59:52,369 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-05-31 17:59:52,369 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-05-31 17:59:52,370 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,503 - Skipping installation of existing package unzip
2018-05-31 17:59:53,503 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,630 - Skipping installation of existing package curl
2018-05-31 17:59:53,630 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,739 - Skipping installation of existing package hdp-select
2018-05-31 17:59:53,740 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-05-31 17:59:53,741 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
2018-05-31 17:59:53,975 - Directory['/var/run/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-05-31 17:59:53,977 - Creating directory Directory['/var/run/nifi'] since it doesn't exist.
2018-05-31 17:59:53,977 - Changing owner for /var/run/nifi from 0 to nifi
2018-05-31 17:59:53,977 - Changing group for /var/run/nifi from 0 to nifi
2018-05-31 17:59:53,977 - Directory['/var/log/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-05-31 17:59:53,978 - Creating directory Directory['/var/log/nifi'] since it doesn't exist.
2018-05-31 17:59:53,978 - Changing owner for /var/log/nifi from 0 to nifi
2018-05-31 17:59:53,978 - Changing group for /var/log/nifi from 0 to nifi
2018-05-31 17:59:53,978 - Execute['touch /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-05-31 17:59:54,074 - Execute['wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-05-31 17:59:55,866 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-05-31 17:59:55,885 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
Command failed after 1 tries&lt;/P&gt;</description>
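wget's documented exit statuses make the "returned 8" in the log meaningful: 8 is "server issued an error response" (typically HTTP 404/403), meaning the sandbox reached public-repo-1.hortonworks.com but the tarball is no longer at that path, whereas a 4 would have indicated a network or proxy failure. A small helper to interpret the status; the function name and wording are mine, the URL is the one from the log above:

```shell
# Map wget's documented exit statuses to a human-readable hint.
explain_wget_status() {
  case "$1" in
    0) echo "download succeeded" ;;
    4) echo "network failure: check DNS/proxy from the sandbox" ;;
    8) echo "server issued an error response: the URL likely returns 404/403" ;;
    *) echo "wget failed with status $1" ;;
  esac
}

# In the Ambari install script the failing call is equivalent to:
#   wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz \
#        -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log
#   explain_wget_status $?
# Here we pass 8 directly, the status from the log.
explain_wget_status 8
```

Since the traceback shows the download URL comes from params.snapshot_package, one common workaround in this situation is pointing that parameter at a mirror that still hosts the tarball.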
      <pubDate>Fri, 01 Jun 2018 20:40:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/issue-following-Sample-HDF-NiFi-flow-to-Push-Tweets-into/m-p/148159#M110688</guid>
      <dc:creator>ibrahimadofall</dc:creator>
      <dc:date>2018-06-01T20:40:33Z</dc:date>
    </item>
  </channel>
</rss>

