SOLR Connection Database Configuration


The data to be searched is usually stored in a database, but clients do not search the database directly. Instead, the data that Solr should make searchable is indexed on the search server, and clients then query that index.

One, connecting to the database

1. SQL Server configuration

Taking SQL Server as an example, first download the JDBC driver jar for SQL Server:

http://msdn.microsoft.com/en-us/data/aa937724.aspx

After extracting the download, copy sqljdbc4.jar into webapps\solr\WEB-INF\lib, which in this example is:

D:\apache-tomcat-7.0.57\webapps\solr\WEB-INF\lib

2. Create the data import configuration

Create a new configuration file named data-config.xml under D:\apache-tomcat-7.0.57\webapps\solr\solr_home\collection1\conf.

Then edit solrconfig.xml in the same directory.

Find the existing requestHandler nodes and add the following below them:

<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
        <str name="config">D:\apache-tomcat-7.0.57\webapps\solr\solr_home\collection1\conf\data-config.xml</str>
    </lst>
</requestHandler>

The config value must point to the data-config.xml file you just created.

3. Copy the dist and contrib folders from the downloaded solr-4.10.2 package into the Tomcat root directory.

4. Next, modify the solrconfig.xml you just edited so that it references these two folders, adjusting the lib paths as needed:

Around line 75 there are eight <lib> entries by default, but their relative paths will not necessarily match your layout. Follow each configured path to see whether the folder actually exists, adjust the number of parent directories accordingly, and add an entry for the dataimporthandler jar. The full list is as follows:

  <lib dir="../../../../../contrib/extraction/lib" regex=".*\.jar" />
  <lib dir="../../../../../dist/" regex="solr-cell-\d.*\.jar" />
  <lib dir="../../../../../contrib/clustering/lib/" regex=".*\.jar" />
  <lib dir="../../../../../dist/" regex="solr-clustering-\d.*\.jar" />
  <lib dir="../../../../../contrib/langid/lib/" regex=".*\.jar" />
  <lib dir="../../../../../dist/" regex="solr-langid-\d.*\.jar" />
  <lib dir="../../../../../contrib/velocity/lib" regex=".*\.jar" />
  <lib dir="../../../../../dist/" regex="solr-velocity-\d.*\.jar" />
  <lib dir="../../../../../dist/" regex="solr-dataimporthandler-\d.*\.jar" />

These paths are based on this particular machine; adjust them to match your own setup.

5. Copy the jars under the dist folder into webapps\solr\WEB-INF\lib.

6. Then edit the file D:\apache-tomcat-7.0.57\webapps\solr\solr_home\collection1\conf\data-config.xml. Here the Books table of the local MyBookShop database is used as the example.

The configuration is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<dataConfig>
    <dataSource driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
                url="jdbc:sqlserver://localhost:1433;databaseName=MyBookShop"
                user="sa"
                password="111"/>
    <document name="Info" pk="id">
        <entity name="zpxx" transformer="ClobTransformer" pk="id"
                query="SELECT [Id],[Title] AS name FROM [MyBookShop].[dbo].[Books]"
                deltaImportQuery="SELECT [Id],[Title] AS name FROM [MyBookShop].[dbo].[Books] WHERE [PublishDate] > '${dataimporter.last_index_time}'"
                deltaQuery="SELECT [Id] FROM [MyBookShop].[dbo].[Books] WHERE [PublishDate] > '${dataimporter.last_index_time}'">
            <field column="id" name="id" />
            <field column="name" name="name" />
        </entity>
    </document>
</dataConfig>

This configuration is very important.

The above configuration instructions are as follows:

query is the SQL that retrieves all of the data for a full import (these are the rows Solr pulls from SQL Server); it returns multiple columns.

deltaImportQuery is the SQL used to fetch the incremental data (newly added database rows that should be appended to the Solr index); it also returns multiple columns.

deltaQuery retrieves only the primary keys of the changed rows; it supplies the condition for appending new database data to the Solr index, keyed on Id, where the condition is the time of the last import (${dataimporter.last_index_time}), and it returns a single column.

Make sure SQL Server is configured to accept network (TCP/IP) connections.

At this point the configuration is basically complete.
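Before building the index, it can be worth verifying the dataSource settings with a quick JDBC test outside of Solr. The following is only a minimal sketch, assuming sqljdbc4.jar is on the classpath; the driver, URL, credentials, and SELECT statement are the ones from data-config.xml above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Same driver/URL/credentials as the dataSource element in data-config.xml
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=MyBookShop", "sa", "111");
             Statement stmt = conn.createStatement();
             // Same SELECT that the "query" attribute of the entity uses
             ResultSet rs = stmt.executeQuery(
                 "SELECT [Id],[Title] AS name FROM [MyBookShop].[dbo].[Books]")) {
            int rows = 0;
            while (rs.next()) {
                rows++;
            }
            System.out.println("Rows visible to the import query: " + rows);
        }
    }
}

If this prints the expected row count, the DataImportHandler should be able to read the same data.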

Two, building the index

Start SOLR and delete all index data:

http://localhost:8080/solr/update/?stream.body=<delete><query>*:*</query></delete>&stream.contenttype=text/xml;charset=utf-8&commit=true
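The same clean-up can also be done programmatically. A minimal SolrJ sketch, assuming the solr-solrj 4.x jar and its dependencies are on the classpath and that the core is reachable at http://localhost:8080/solr/collection1:

import org.apache.solr.client.solrj.impl.HttpSolrServer;

public class ClearIndex {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8080/solr/collection1");
        // Equivalent of the <delete><query>*:*</query></delete> request above
        server.deleteByQuery("*:*");
        server.commit();
        server.shutdown();
    }
}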

Stop Solr and check that the index has been emptied:

Now create the new index by executing the following commands in a browser (a programmatic equivalent is sketched after the list):

Abort a running import: http://localhost:8080/solr/collection1/dataimport?command=abort

Start a full import: http://localhost:8080/solr/collection1/dataimport?command=full-import

Incremental (delta) import: http://localhost:8080/solr/collection1/dataimport?command=delta-import
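These handler URLs can also be invoked from code, for example when an application wants to schedule imports. A minimal sketch using only the Java standard library; the URL is the full-import command above, and any of the other commands can be substituted:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class TriggerImport {
    public static void main(String[] args) throws Exception {
        // Any of the dataimport commands above can be placed in this URL
        URL url = new URL("http://localhost:8080/solr/collection1/dataimport?command=full-import");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        System.out.println("HTTP status: " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // the handler returns its status as XML
            }
        }
        conn.disconnect();
    }
}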

While learning, it is better to run the import from the admin UI so you can watch its progress:

Wait for execution to finish; click "Refresh Status" to see the result:

Indexing completed. Added/Updated: 1076 documents. Deleted 0 documents. (Duration: 03s)

This indicates that the import has finished.

Then click Query in the left-hand menu to verify the indexed results:

If query results are returned, the index was created successfully.
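Queries can also be run from client code rather than the admin UI. A minimal SolrJ sketch, assuming solr-solrj 4.x is on the classpath and that the field names id and name are the ones mapped in data-config.xml and defined in schema.xml:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;

public class QueryIndex {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8080/solr/collection1");
        SolrQuery query = new SolrQuery("*:*"); // same as the q parameter in the admin UI
        query.setRows(5);
        QueryResponse response = server.query(query);
        System.out.println("Total hits: " + response.getResults().getNumFound());
        for (SolrDocument doc : response.getResults()) {
            System.out.println(doc.getFieldValue("id") + " : " + doc.getFieldValue("name"));
        }
        server.shutdown();
    }
}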

Three, updating, adding, and deleting index data

1. Update data:

First search for the keyword "potatoes". Since these are all computer books, there should be no matches.

The title of one of the books is then updated to "On the cultivation techniques of potatoes".
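As an illustration, the update might look like the JDBC sketch below. The row Id of 4942 is only an example value borrowed from the insert example further down; also, because the deltaQuery above filters on [PublishDate] > '${dataimporter.last_index_time}', the sketch bumps the publish date so the delta import will actually see the change.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class UpdateBookTitle {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=MyBookShop", "sa", "111");
             Statement stmt = conn.createStatement()) {
            // The Id here is only an example value.
            // PublishDate is bumped because the deltaQuery selects rows with
            // [PublishDate] > '${dataimporter.last_index_time}'; without a newer
            // date the delta import would not pick up this change.
            int updated = stmt.executeUpdate(
                "UPDATE [MyBookShop].[dbo].[Books] " +
                "SET [Title] = 'On the cultivation techniques of potatoes', " +
                "    [PublishDate] = GETDATE() " +
                "WHERE [Id] = 4942");
            System.out.println("Rows updated: " + updated);
        }
    }
}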

Until the index is updated, the new title cannot be found by searching.

To perform an incremental update:

http://localhost:8080/solr/dataimport?command=delta-import&clean=false&commit=true

Then go to query:

The updated record can now be found.

2. Add a new record: "On the Art of Apple"
INSERT INTO [MyBookShop].[dbo].[Books]
    ([Title],[Author],[PublishDate],[WordsCount],[UnitPrice],[ContentDescription],[AurhorDescription],[EditorComment],[TOC])
SELECT 'On the Art of Apple',[Author],[PublishDate],[WordsCount],[UnitPrice],[ContentDescription],[AurhorDescription],[EditorComment],[TOC]
FROM [MyBookShop].[dbo].[Books]
WHERE [Id] = 4942

The new results are as follows:

Then perform the incremental index:

http://localhost:8080/solr/dataimport?command=delta-import&clean=false&commit=true

Re-query:

The new record now appears in the query results.

3. Delete data

After deleting the "Apple" row from the database, it can still be found in Solr, so delete its individual index entry:

http://localhost:8080/solr/update/?stream.body=<delete><id>7168</id></delete>&stream.contenttype=text/xml;charset=utf-8&commit=true

Query again, and the record can no longer be found:
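Both steps, removing the database row and removing the corresponding document from the index, can also be scripted together. A minimal sketch; the document id 7168 comes from the URL above, and the SolrJ and sqljdbc4 jars are assumed to be on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import org.apache.solr.client.solrj.impl.HttpSolrServer;

public class DeleteBook {
    public static void main(String[] args) throws Exception {
        // 1. Remove the row from the database
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=MyBookShop", "sa", "111");
             Statement stmt = conn.createStatement()) {
            stmt.executeUpdate("DELETE FROM [MyBookShop].[dbo].[Books] WHERE [Id] = 7168");
        }
        // 2. Remove the matching document from the Solr index,
        //    equivalent to the <delete><id>7168</id></delete> request above
        HttpSolrServer server = new HttpSolrServer("http://localhost:8080/solr/collection1");
        server.deleteById("7168");
        server.commit();
        server.shutdown();
    }
}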
