Member since: 04-04-2016
Posts: 147
Kudos Received: 40
Solutions: 16
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1199 | 07-22-2016 12:37 AM
 | 4417 | 07-21-2016 11:48 PM
 | 1632 | 07-21-2016 11:28 PM
 | 2308 | 07-21-2016 09:53 PM
 | 3430 | 07-08-2016 07:56 PM
03-10-2017
07:22 PM
1 Kudo
Adding TTL on Solr:

Step 1: cd to the configset's conf directory:

   cd /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/

Step 2: vi managed-schema and add these 3 lines:

   <field name="_timestamp_" type="date" indexed="true" stored="true" multiValued="false" />
   <field name="_ttl_" type="string" indexed="true" multiValued="false" stored="true" />
   <field name="_expire_at_" type="date" multiValued="false" indexed="true" stored="true" />

Step 3: vi solrconfig.xml. Replace the below 3 lines:

   <updateRequestProcessorChain name="add-unknown-fields-to-the-schema">
     <!-- UUIDUpdateProcessorFactory will generate an id if none is present in the incoming document -->
     <processor class="solr.UUIDUpdateProcessorFactory"/>

with:

   <updateRequestProcessorChain name="add-unknown-fields-to-the-schema">
     <processor class="solr.TimestampUpdateProcessorFactory">
       <str name="fieldName">_timestamp_</str>
     </processor>
     <processor class="solr.DefaultValueUpdateProcessorFactory">
       <str name="fieldName">_ttl_</str>
       <str name="value">+30SECONDS</str>
     </processor>
     <processor class="solr.processor.DocExpirationUpdateProcessorFactory">
       <str name="ttlFieldName">_ttl_</str>
       <str name="ttlParamName">_ttl_</str>
       <int name="autoDeletePeriodSeconds">30</int>
       <str name="expirationFieldName">_expire_at_</str>
     </processor>
     <processor class="solr.FirstFieldValueUpdateProcessorFactory">
       <str name="fieldName">_expire_at_</str>
     </processor>
     <!-- UUIDUpdateProcessorFactory will generate an id if none is present in the incoming document -->
     <processor class="solr.UUIDUpdateProcessorFactory" />
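To verify the TTL behavior end to end, you can index a throwaway document and query it before and after the expiration sweep. A minimal sketch, assuming the tweets collection created in the commands below and Solr listening on localhost:8983:

   # index a test document; _ttl_ can also be overridden per document
   curl 'http://localhost:8983/solr/tweets/update?commit=true' \
     -H 'Content-Type: application/json' \
     -d '[{"id":"ttl-test-1","_ttl_":"+30SECONDS"}]'

   # visible immediately...
   curl 'http://localhost:8983/solr/tweets/select?q=id:ttl-test-1&wt=json'

   # ...and gone after the TTL plus the 30-second autoDeletePeriodSeconds sweep
   sleep 60
   curl 'http://localhost:8983/solr/tweets/select?q=id:ttl-test-1&wt=json'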
Things that might be useful:

1. Make sure to start Solr like this so that the Solr-related configs go to /solr in ZooKeeper:

   /opt/lucidworks-hdpsearch/solr/bin/solr start -c -z lake1.field.hortonworks.com:2181,lake2.field.hortonworks.com:2181,lake3.field.hortonworks.com:2181/solr

2. Create the collection like this:

   /opt/lucidworks-hdpsearch/solr/bin/solr create -c tweets -d data_driven_schema_configs -s 1 -rf 1

3. To delete the collection:

   http://testdemo.field.hortonworks.com:8983/solr/admin/collections?action=DELETE&name=tweets

4. Also remove its config from zkCli.sh: rmr /solr/configs/tweets

Thanks, Sujitha Sanku. Please ping me or email me at ssanku@hortonworks.com in case of any issues.
11-21-2016
09:37 PM
Special thanks to Michael Young for the help and for being my mentor.

Step 1: cd to the configset's conf directory:

   cd /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/

(screenshot: screen-shot-2016-11-21-at-100524-am.png)

Step 2: vi managed-schema and add these 3 lines:

   <field name="_timestamp_" type="date" indexed="true" stored="true" multiValued="false" />
   <field name="_ttl_" type="string" indexed="true" multiValued="false" stored="true" />
   <field name="_expire_at_" type="date" multiValued="false" indexed="true" stored="true" />

(screenshot: screen-shot-2016-11-21-at-100929-am.png)

Step 3: vi solrconfig.xml in the same directory. Replace the below 3 lines with the lines after them:
   <updateRequestProcessorChain name="add-unknown-fields-to-the-schema">
     <!-- UUIDUpdateProcessorFactory will generate an id if none is present in the incoming document -->
     <processor class="solr.UUIDUpdateProcessorFactory"/>

with:

   <updateRequestProcessorChain name="add-unknown-fields-to-the-schema">
     <processor class="solr.TimestampUpdateProcessorFactory">
       <str name="fieldName">_timestamp_</str>
     </processor>
     <processor class="solr.DefaultValueUpdateProcessorFactory">
       <str name="fieldName">_ttl_</str>
       <str name="value">+30SECONDS</str>
     </processor>
     <processor class="solr.processor.DocExpirationUpdateProcessorFactory">
       <str name="ttlFieldName">_ttl_</str>
       <str name="ttlParamName">_ttl_</str>
       <int name="autoDeletePeriodSeconds">30</int>
       <str name="expirationFieldName">_expire_at_</str>
     </processor>
     <processor class="solr.FirstFieldValueUpdateProcessorFactory">
       <str name="fieldName">_expire_at_</str>
     </processor>
     <!-- UUIDUpdateProcessorFactory will generate an id if none is present in the incoming document -->
     <processor class="solr.UUIDUpdateProcessorFactory" />

(screenshot: screen-shot-2016-11-21-at-101045-am.png)

A quick way to confirm the chain is firing is sketched below. Hope that helps. Thanks, Sujitha
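To confirm the processors are firing, query the generated fields: each document should carry a _timestamp_, the default _ttl_ of +30SECONDS, and an _expire_at_ roughly 30 seconds after the timestamp. A minimal sketch (the collection name tweets is an assumption; substitute your own):

   curl 'http://localhost:8983/solr/tweets/select?q=*:*&fl=id,_timestamp_,_ttl_,_expire_at_&wt=json&indent=true'

Expired documents are then physically removed by the background sweep every autoDeletePeriodSeconds.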
11-03-2016
01:23 AM
@Artem Ervits, this solution still gives me the same issue. Also, I have made these changes on the edge node; that is correct, right? @brahmasree b, did you find a solution to this question? If so, can you please post it?
10-25-2016
12:35 AM
1 Kudo
Hi @Bryan Bende, thanks for the reply. Yes, I realized the error, and I followed these steps: https://community.hortonworks.com/articles/26551/accessing-kerberos-enabled-kafka-topics-using-getk.html I also named my principal "nifi/iotdemo.field.hortonworks.com@LAKE".

Do I also need to mention these lines in my zookeeper.properties ("3. Added 3 additional properties to the bottom of the zookeeper.properties file you have configured per the linked procedure above")?

   authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
   jaasLoginRenew=3600000
   requireClientAuthScheme=sasl

Right now my error is:

   Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user

Please find my PutKafka processor configurations attached (screen-shot-2016-10-24-at-53412-pm.png, screen-shot-2016-10-24-at-53535-pm.png). Any help is highly appreciated. Thanks a lot, Sujitha
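That LoginException usually means the JAAS login is trying to prompt for a password instead of using a keytab. A minimal sketch of the KafkaClient section of the JAAS file I would expect (the keytab path is an assumption; the principal is the one named above):

   KafkaClient {
     com.sun.security.auth.module.Krb5LoginModule required
     useKeyTab=true
     storeKey=true
     keyTab="/etc/security/keytabs/nifi.service.keytab"
     principal="nifi/iotdemo.field.hortonworks.com@LAKE";
   };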
10-24-2016
07:44 PM
Hi, I get an error while trying to stream data using a NiFi flow in a Kerberized environment with LDAP integrated. The error is "failed while waiting for acks from Kafka"; I attached screenshots of the error and the processor properties. By the way, there is a property called "Kerberos Service Name"; could that be the cause? Any help is highly appreciated. Thanks, Sujitha screen-shot-2016-10-24-at-124017-pm.png screen-shot-2016-10-24-at-124006-pm.png
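For context on that property: the Kerberos Service Name in the processor has to match the service part of the principal the Kafka brokers run under. A sketch of the two settings that need to agree (the value kafka is an assumption based on HDP defaults):

   # NiFi PutKafka processor property
   Kerberos Service Name = kafka

   # Kafka broker, server.properties
   sasl.kerberos.service.name=kafka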
Labels:
- Apache Kafka
- Apache NiFi
08-18-2016
05:43 AM
1 Kudo
Solr indexing a MySQL database table on HDP 2.5 Tech Preview (Solr version used: 4.9.0):

Step 1: Download solr-4.9.0.zip from https://archive.apache.org/dist/lucene/solr/4.9.0/

Step 2: Extract the file.

Step 3: Modify solrconfig.xml and schema.xml, and add db-data-config.xml, all in the collection's conf directory.

Step 4: Add the MySQL connector jar (mysql-connector-java-5.0.8-bin.jar) to the lib directory referenced below.

a. vi solrconfig.xml: add these lines between the config tags:

   <lib dir="../../../contrib/dataimporthandler/lib/" regex=".*\.jar" />
   <lib dir="../../../dist/" regex="solr-dataimporthandler-\d.*\.jar" />
   <lib dir="../../../lib/" regex="mysql-connector-java-5.0.8-bin.jar" />

   <requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
     <lst name="defaults">
       <str name="config">db-data-config.xml</str>
     </lst>
   </requestHandler>

b. vi schema.xml: add the line below:

   <dynamicField name="*_name" type="text_general" multiValued="false" indexed="true" stored="true" />
c. Create a file called db-data-config.xml at the same path (later in this post I import an employees database into MySQL) and add:

   <dataConfig>
     <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
                 url="jdbc:mysql://localhost:3306/employees" user="root" password="hadoop" />
     <document>
       <entity name="id" query="select emp_no as 'id', first_name, last_name from employees limit 1000;" />
     </document>
   </dataConfig>

d. After this is complete, start Solr with the command below and check that it is up and running at the URL below (8983 is Solr's default port):

   java -jar start.jar

   http://localhost:8983/solr/#/

e. Select collection1 in the core selector.

f. Click on Data Import, expand Configuration, and check that it points to the db-data-config.xml file we created.

g. After completing Step 5 below, click Execute on that page (or trigger the import from the command line, as sketched below).
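The import can also be triggered and monitored over HTTP instead of the UI; a minimal sketch using the /dataimport handler configured above (core name collection1, as in step e):

   # start a full import
   curl 'http://localhost:8983/solr/collection1/dataimport?command=full-import'

   # poll the status until it reports the number of documents processed
   curl 'http://localhost:8983/solr/collection1/dataimport?command=status'

   # spot-check the indexed rows
   curl 'http://localhost:8983/solr/collection1/select?q=*:*&rows=5&wt=json'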
Step 5: Set up the database by importing an already-available sample database into MySQL (ref: https://dev.mysql.com/doc/employee/en/employees-installation.html):

   shell> tar -xjf employees_db-full-1.0.6.tar.bz2
   shell> cd employees_db/
   shell> mysql -t < employees.sql

With this, the installation of the employees db in MySQL is complete.

Step 6: With this, our indexing using Solr is complete. To do: I will try indexing the MySQL tables using the latest version of Solr.

Reference: http://blog.comperiosearch.com/blog/2014/08/28/indexing-database-using-solr/

Hope this helps. Thanks, Sujitha
08-15-2016
11:03 PM
Hi, I am trying to log in to the MySQL prompt on HDP 2.5, but it doesn't let me log in, and I am not sure what password was set on it. Please find the screenshot attached. Any help is highly appreciated. Thanks, Sujitha
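In case it helps anyone hitting the same wall: if the root password is simply unknown, one generic way to reset it is to restart mysqld with grant checking disabled. A sketch for the MySQL 5.x of that era (service names and paths are assumptions for your install):

   # stop MySQL, then start it without privilege checks
   sudo service mysqld stop
   sudo mysqld_safe --skip-grant-tables &

   # log in without a password and set a new one (pre-5.7 syntax)
   mysql -u root
   mysql> UPDATE mysql.user SET password = PASSWORD('newpassword') WHERE user = 'root';
   mysql> FLUSH PRIVILEGES;

   # restart MySQL normally
   sudo service mysqld restart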
Labels:
- Hortonworks Data Platform (HDP)
07-29-2016
08:29 PM
Hi @milind pandit, thanks for the info. Yes, apart from NiFi, could I focus on something like http://hortonworks.com/partner/sap/ and http://hortonworks.com/partner/informatica/? Would it make sense to add these as examples? Thanks again for the reply. Thanks, Sujitha
07-29-2016
05:59 PM
Hi there, I am looking for a better way of answering this question, ideally with references and documentation: "Is the platform architected for ease of integration with other applications or technologies?" It is from one of my RFP questions. Any help is highly appreciated. Thanks, Sujitha