Support Questions


Issue following "Sample HDF/NiFi flow to Push Tweets into Solr/Banana, HDFS/Hive"

Expert Contributor

Hi, I've been following this tutorial: https://community.hortonworks.com/articles/1282/sample-hdfnifi-flow-to-push-tweets-into-solrbanana.h...

and I ran into some trouble when I tried to create the collection called tweets. I'm getting this error message; any ideas?

Exception during parsing file: solrconfig.xml:org.xml.sax.SAXParseException; systemId: solrres:/solrconfig.xml; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
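
(Aside: "Content is not allowed in prolog" usually means the parser found bytes before the <?xml ...?> declaration, such as a UTF-8 byte-order mark or a mangled first character. One way to inspect the start of the file, assuming the configset path used later in this thread:

head -c 16 /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml | od -c

A clean file should begin with < ? x m l; anything printed before that is what trips the parser.)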


1 ACCEPTED SOLUTION

Expert Contributor

It appears I solved the problem. For those who might one day be in the same situation, I will try my best to explain what happened: there was a mistake in the solrconfig.xml file the first time I tried to create the collection (the '<' was missing on line 1). That broken version of solrconfig.xml had already been loaded into ZooKeeper, which, from what I understand, is where SolrCloud reads its configuration from. To solve my problem I had to correct the XML and then push the corrected config back into ZooKeeper with the command:

/opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh -zkhost 192.xxx.xx.xx:2181 -cmd upconfig -confname tweets -confdir /opt/lucidworks-hdpsearch/solr/server/solr/configsets/basic_configs/conf
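
As a sanity check after the upload, it may help to pull the config back out of ZooKeeper and reload the collection so Solr picks up the fix. The zkcli.sh path and masked ZooKeeper host below are taken from the command above; the Solr host and port (localhost:8983) and the /tmp download directory are assumptions:

/opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh -zkhost 192.xxx.xx.xx:2181 -cmd downconfig -confname tweets -confdir /tmp/tweets-conf

curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=tweets'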


7 REPLIES

Master Mentor
@Lubin Lemarchand

Try editing the Solr XML file with vi; if you edited it with Notepad, it tends to add stray characters. Just validate your XML file.
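
For example, if xmllint (from libxml2) happens to be installed on the sandbox, it will report the exact line and column of any malformed XML; the path below is the one used later in this thread:

xmllint --noout /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml

No output means the file is well-formed.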

Expert Contributor

Thank you for your answer. Sorry to bother you with this kind of thing, but when I just type the path of the XML file, I get the message: line 1: syntax error near unexpected token 'newline'.

When I edit the file with vi, the first line is:

<?xml version="1.0" encoding="UTF-8" ?>

Am I missing something?
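
(A note on that error: typing the path by itself asks the shell to execute the file as a script, so bash chokes on the < and > characters of the XML prolog. The message comes from bash, not from any XML check, and it appears even when the file is perfectly fine. You can reproduce it by feeding the prolog to bash directly:

bash -c '<?xml version="1.0" encoding="UTF-8" ?>'

To actually validate the file, use an XML parser such as xmllint, as suggested above.)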


Windows and Linux tend to use different newline characters; you could possibly use dos2unix to clean the file up.

Expert Contributor

Alright, so I typed:

dos2unix -o /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml

which gives me:

dos2unix: converting [file path] to UNIX format...

yet I get the same error message when I type the file path or when I try to create the collection.

Master Mentor

Please paste the snippet of the XML file where you added the code the tutorial asked for. @Lubin Lemarchand, you probably didn't close a tag, or placed content outside the tags.


New Contributor

Hello everyone, I have been following the same tutorial, but while installing NiFi (adding it as a service in Ambari, sandbox 2.6) I ran into the problem below. Would you mind helping me, please?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 131, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 40, in install
    Execute('wget '+params.snapshot_package+' -O '+params.temp_file+' -a ' + params.nifi_log_file, user=params.nifi_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log' returned 8.
stdout:

2018-05-31 17:59:52,083 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-31 17:59:52,083 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-31 17:59:52,085 - Group['livy'] {}
2018-05-31 17:59:52,088 - Group['spark'] {}
2018-05-31 17:59:52,089 - Group['ranger'] {}
2018-05-31 17:59:52,089 - Group['hdfs'] {}
2018-05-31 17:59:52,089 - Group['zeppelin'] {}
2018-05-31 17:59:52,089 - Group['hadoop'] {}
2018-05-31 17:59:52,090 - Group['nifi'] {}
2018-05-31 17:59:52,090 - Adding group Group['nifi']
2018-05-31 17:59:52,126 - Group['users'] {}
2018-05-31 17:59:52,130 - Group['knox'] {}
2018-05-31 17:59:52,131 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,132 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,132 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,133 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,134 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,135 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,136 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,137 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger'], 'uid': None}
2018-05-31 17:59:52,138 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,139 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-05-31 17:59:52,140 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,140 - Adding user User['nifi']
2018-05-31 17:59:52,196 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,197 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,198 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-05-31 17:59:52,200 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,201 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,203 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-31 17:59:52,204 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,205 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,206 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,208 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,209 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,211 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-05-31 17:59:52,212 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,216 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-05-31 17:59:52,243 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-05-31 17:59:52,243 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-05-31 17:59:52,247 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,249 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-31 17:59:52,250 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-05-31 17:59:52,278 - call returned (0, '1002')
2018-05-31 17:59:52,279 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-05-31 17:59:52,318 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
2018-05-31 17:59:52,319 - Group['hdfs'] {}
2018-05-31 17:59:52,319 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-05-31 17:59:52,320 - FS Type:
2018-05-31 17:59:52,320 - Directory['/etc/hadoop'] {'mode': 0755}
2018-05-31 17:59:52,336 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-05-31 17:59:52,337 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2018-05-31 17:59:52,338 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-05-31 17:59:52,352 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-31 17:59:52,361 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-05-31 17:59:52,365 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-05-31 17:59:52,365 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos6/2.x/updates/2.6.4.0 is not created due to its tags: set(['GPL'])
2018-05-31 17:59:52,366 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-31 17:59:52,369 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-05-31 17:59:52,369 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-05-31 17:59:52,370 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,503 - Skipping installation of existing package unzip
2018-05-31 17:59:53,503 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,630 - Skipping installation of existing package curl
2018-05-31 17:59:53,630 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-31 17:59:53,739 - Skipping installation of existing package hdp-select
2018-05-31 17:59:53,740 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-05-31 17:59:53,741 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
2018-05-31 17:59:53,975 - Directory['/var/run/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-05-31 17:59:53,977 - Creating directory Directory['/var/run/nifi'] since it doesn't exist.
2018-05-31 17:59:53,977 - Changing owner for /var/run/nifi from 0 to nifi
2018-05-31 17:59:53,977 - Changing group for /var/run/nifi from 0 to nifi
2018-05-31 17:59:53,977 - Directory['/var/log/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-05-31 17:59:53,978 - Creating directory Directory['/var/log/nifi'] since it doesn't exist.
2018-05-31 17:59:53,978 - Changing owner for /var/log/nifi from 0 to nifi
2018-05-31 17:59:53,978 - Changing group for /var/log/nifi from 0 to nifi
2018-05-31 17:59:53,978 - Execute['touch /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-05-31 17:59:54,074 - Execute['wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-05-31 17:59:55,866 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-05-31 17:59:55,885 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
Command failed after 1 tries
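
For what it's worth, the real failure is the last stderr line: the wget command returned exit status 8, which in wget means the server issued an error response (typically HTTP 404). The old HDF 2.1.2.0 tarball URL baked into this NiFi service definition has probably been removed from the repository. A quick way to confirm from the sandbox shell (wget's --spider option checks the URL without downloading, and -S prints the server response):

wget -S --spider https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz

If that reports an HTTP error, point the service definition at a mirror that still hosts the tarball, or download it manually to /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz before retrying the install.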