I. Installing Solr
Environment: CentOS 7.3, Solr 6.6, ZooKeeper 3.4, Tomcat 8.5, JDK 1.8
For ZooKeeper deployment, see: http://www.cnblogs.com/Sunzz/p/8464284.html
1. Extract Tomcat and Solr under /opt/
[root@solr_1 ~]# tar -xf apache-tomcat-8.5.23.tar.gz -C /opt/
[root@solr_1 ~]# tar -xf solr-6.6.2.tgz -C /opt/
[root@solr_1 ~]# cd /opt
[root@solr_1 opt]# ln -sv apache-tomcat-8.5.23 tomcat
[root@solr_1 opt]# ln -sv solr-6.6.2 solr
2. Copy solr/server/solr-webapp/webapp into tomcat/webapps and rename it to solr
[root@solr_1 ~]# cp -r /opt/solr/server/solr-webapp/webapp/ /opt/tomcat/webapps/
[root@solr_1 ~]# mv /opt/tomcat/webapps/webapp /opt/tomcat/webapps/solr
3. Copy the required jar files into the Solr webapp's lib directory under Tomcat:
① the jars under solr/server/lib/ext,
② the five metrics jars under solr/server/lib (metrics-core-3.2.2.jar, metrics-ganglia-3.2.2.jar, metrics-graphite-3.2.2.jar, metrics-jetty9-3.2.2.jar, metrics-jvm-3.2.2.jar), and
③ the solr-dataimporthandler and solr-dataimporthandler-extras jars under solr/dist/,
all into /opt/tomcat/webapps/solr/WEB-INF/lib:
[root@solr_1 ~]# cp /opt/solr/server/lib/ext/*.jar /opt/solr/server/lib/metrics*.jar /opt/solr/dist/solr-dataimporthandler-*.jar /opt/tomcat/webapps/solr/WEB-INF/lib/
4. Copy the files under solr/server/solr into a newly created solr-home directory
[root@solr_1 ~]# mkdir /opt/solr/solr-home
[root@solr_1 ~]# cp -r /opt/solr/server/solr/* /opt/solr/solr-home/
5. Edit web.xml under /opt/tomcat/webapps/solr/WEB-INF
① Find the <env-entry> block, uncomment it, and set env-entry-value to the solr-home path.
Command:
[root@solr_1 ~]# vim /opt/tomcat/webapps/solr/WEB-INF/web.xml
After the change:
<env-entry>
    <env-entry-name>solr/home</env-entry-name>
    <env-entry-value>/opt/solr/solr-home</env-entry-value>
    <env-entry-type>java.lang.String</env-entry-type>
</env-entry>
② Remove the access restrictions, otherwise accessing Solr returns an authorization error: comment out both security-constraint blocks.
After the change:
<!--
<security-constraint>
    <web-resource-collection>
        <web-resource-name>Disable TRACE</web-resource-name>
        <url-pattern>/</url-pattern>
        <http-method>TRACE</http-method>
    </web-resource-collection>
    <auth-constraint/>
</security-constraint>
<security-constraint>
    <web-resource-collection>
        <web-resource-name>Enable everything but TRACE</web-resource-name>
        <url-pattern>/</url-pattern>
        <http-method-omission>TRACE</http-method-omission>
    </web-resource-collection>
</security-constraint>
-->
6. Create a classes folder under /opt/tomcat/webapps/solr/WEB-INF/
and copy solr/server/resources/log4j.properties into it.
Commands:
[root@solr_1 ~]# cd /opt/tomcat/webapps/solr/WEB-INF/
[root@ WEB-INF]# mkdir classes
[root@ WEB-INF]# cp -rf /opt/solr/server/resources/log4j.properties ./classes/
7. Create a collection1 folder under the solr-home directory,
copy the conf folder from solr/server/solr/configsets/basic_configs into the new collection1 folder, and create a data folder under collection1.
[root@solr_1 ~]# mkdir /opt/solr/solr-home/collection1
[root@solr_1 ~]# cp -r /opt/solr/server/solr/configsets/basic_configs/conf/ /opt/solr/solr-home/collection1/
[root@solr_1 ~]# mkdir /opt/solr/solr-home/collection1/data
Create a core.properties file in collection1 and add the following content:
[root@solr_1 ~]# vim /opt/solr/solr-home/collection1/core.properties
name=collection1
config=solrconfig.xml
schema=managed-schema
dataDir=data
8. Change Solr's port to match Tomcat's
Edit /opt/solr/solr-home/solr.xml and change:
<int name="hostPort">${jetty.port:8080}</int>
9. Start Tomcat
Open the following address in a browser: http://192.168.29.110:8080/solr/index.html
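Tomcat can be started with the scripts in its bin directory; a minimal sketch, assuming the /opt/tomcat symlink created above:
[root@solr_1 ~]# /opt/tomcat/bin/startup.sh
[root@solr_1 ~]# tail -f /opt/tomcat/logs/catalina.out
Watch catalina.out for classpath or missing-jar errors before opening the admin UI.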
II. Configuring IKAnalyzer word segmentation
1. Extract ikanalyzer-solr6.5.zip
[root@solr_1 ~]# unzip ikanalyzer-solr6.5.zip
[root@solr_1 ~]# mv ikanalyzer-solr6.5 /opt/
Copy ext.dic, IKAnalyzer.cfg.xml, and stopword.dic into /opt/tomcat/webapps/solr/WEB-INF/classes:
[root@solr_1 ~]# mkdir /opt/tomcat/webapps/solr/WEB-INF/classes
[root@solr_1 ~]# cp /opt/ikanalyzer-solr6.5/ikanalyzer-solr5/ext.dic /opt/ikanalyzer-solr6.5/ikanalyzer-solr5/IKAnalyzer.cfg.xml /opt/ikanalyzer-solr6.5/ikanalyzer-solr5/stopword.dic /opt/tomcat/webapps/solr/WEB-INF/classes
Copy ik-analyzer-solr5-5.x.jar and solr-analyzer-ik-5.1.0.jar into /opt/tomcat/webapps/solr/WEB-INF/lib:
[root@solr_1 ~]# cp /opt/ikanalyzer-solr6.5/ikanalyzer-solr5/*.jar /opt/tomcat/webapps/solr/WEB-INF/lib/
2. Open the managed-schema file under solr-home/collection1/conf
[root@solr_1 ~]# vim /opt/solr/solr-home/collection1/conf/managed-schema
Add the following configuration before </schema>:
<!-- IK analyzer -->
<fieldType name="text_ik" class="solr.TextField">
    <analyzer type="index">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="false"/>
    </analyzer>
    <analyzer type="query">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="true"/>
    </analyzer>
</fieldType>
3. Restart Tomcat
Open http://192.168.29.110:8080/solr/index.html to confirm the new field type is available.
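Besides the Analysis screen in the admin UI, the tokenization can also be checked from the command line through Solr's field-analysis handler; a sketch, assuming the default /analysis/field handler and the collection1 core created above:
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/analysis/field?analysis.fieldtype=text_ik&analysis.fieldvalue=中華人民共和國&wt=json"
The response lists the tokens produced by the index-time and query-time analyzers of text_ik.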
III. Pinyin configuration
1. Copy the required jar files
Copy pinyinTokenFilter-1.1.0-RELEASE.jar, pinyinAnalyzer4.3.1.jar, and pinyin4j-2.5.0.jar into /opt/tomcat/webapps/solr/WEB-INF/lib:
[root@solr_1 ~]# cp /opt/ikanalyzer-solr6.5/pinyin* /opt/tomcat/webapps/solr/WEB-INF/lib/
2. Edit the managed-schema file under solr-home/collection1/conf
(after the change)
<!-- IK analyzer -->
<fieldType name="text_ik" class="solr.TextField">
    <analyzer type="index">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="false"/>
        <filter class="top.pinyin.index.solr.PinyinTokenFilterFactory" pinyin="true" isFirstChar="true" minTermLenght="2"/>
        <filter class="com.shentong.search.analyzers.PinyinNGramTokenFilterFactory" minGram="2" maxGram="20"/>
    </analyzer>
    <analyzer type="query">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="true"/>
    </analyzer>
</fieldType>
(The two <filter> lines are the newly added content.)
Restart Tomcat and test.
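The same field-analysis check used for IK can confirm the pinyin filters are active; a sketch (host and core as above):
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/analysis/field?analysis.fieldtype=text_ik&analysis.fieldvalue=誅仙&wt=json"
The index-time chain should now also emit pinyin and pinyin n-gram tokens.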
IV. Synonyms
1. Modify the IK field type configuration in managed-schema
<fieldType name="text_ik" class="solr.TextField">
    <analyzer type="index">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="false"/>
        <filter class="top.pinyin.index.solr.PinyinTokenFilterFactory" pinyin="true" isFirstChar="true" minTermLenght="2"/>
        <filter class="com.shentong.search.analyzers.PinyinNGramTokenFilterFactory" minGram="2" maxGram="20"/>
        <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
    <analyzer type="query">
        <tokenizer class="org.apache.lucene.analysis.ik.IKTokenizerFactory" useSmart="true"/>
        <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
        <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
</fieldType>
2. Add synonyms to solr-home/collection1/conf/synonyms.txt
hell,二是
誅仙,誅仙2,夢幻誅仙
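Synonym changes only take effect after the core is reloaded (or Tomcat is restarted). A quick way to reload, using the standard CoreAdmin API:
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/admin/cores?action=RELOAD&core=collection1"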
V. Suggestions (autocomplete)
1. Add the suggestion field
First we need a field to run suggestions against. Suppose we want to suggest on the name field; the configuration (in managed-schema) is as follows:
<field name="name" type="text_ik" multiValued="false" indexed="true" stored="true"/> <field name="suggestion" type="text_suggest" indexed="true" stored="true" multiValued="true" /> <copyField source="name" dest="suggestion"/>
The suggestion field is the field the suggester draws from. Here it is given the type text_suggest, a custom type whose purpose and configuration are covered below, and a copyField copies the name field into it. Why not suggest directly on the name field instead of creating a dedicated field (and even a dedicated field type) and copying name into it? As the configuration shows, the name field is tokenized with IKAnalyzer for Chinese word segmentation; if we suggested directly on name, the suggestions would be the post-segmentation tokens. For example, when the record we hope to suggest is "先吃水果然後吃冰淇淋", what actually comes back might be just "先吃".
2. Configure the suggestion field type
Next, create a dedicated field type for the suggest component to use. Here it is named text_suggest, configured as follows (in managed-schema):
<fieldType name="text_suggest" class="solr.TextField"> <analyzer type="index"> <tokenizer class="solr.KeywordTokenizerFactory"/> <filter class="solr.LowerCaseFilterFactory"/> </analyzer> <analyzer type="query"> <tokenizer class="solr.KeywordTokenizerFactory"/> <filter class="solr.LowerCaseFilterFactory"/> </analyzer> </fieldType>
Because we want to suggest on the whole field value, KeywordTokenizerFactory is used as the tokenizer, and LowerCaseFilterFactory makes matching case-insensitive. You can replace this analyzer with whatever suits your needs.
3. Configuring the suggest component
With the schema in place, we can configure the suggest component.
First add the component itself. Edit solrconfig.xml and add the following:
<searchComponent name="suggest" class="solr.SuggestComponent">
    <lst name="suggester">
        <str name="name">suggest</str>
        <str name="lookupImpl">AnalyzingLookupFactory</str>
        <str name="dictionaryImpl">DocumentDictionaryFactory</str>
        <str name="field">suggestion</str>
        <str name="suggestAnalyzerFieldType">text_suggest</str>
        <str name="buildOnStartup">false</str>
    </lst>
</searchComponent>
Notes on this configuration:
name is the name of this suggester;
lookupImpl is the lookup implementation (the default is JaspellLookupFactory);
dictionaryImpl is the dictionary implementation;
field is the field to suggest on;
suggestAnalyzerFieldType specifies the field type whose analyzer is used for suggestions (required);
buildOnStartup controls whether the suggester index is built at startup.
Full configuration details: https://cwiki.apache.org/confluence/display/solr/Suggester.
4. requestHandler configuration
Next, configure a requestHandler for the suggest component. Edit solrconfig.xml and add:
<requestHandler name="/suggest" class="org.apache.solr.handler.component.SearchHandler"> <lst name="defaults"> <str name="suggest">true</str> <str name="suggest.dictionary">suggest</str> <str name="suggest.count">10</str> </lst> <arr name="components"> <str>suggest</str> </arr> </requestHandler>
Parameters used in this configuration: suggest must be true;
suggest.dictionary is the dictionary used for the suggest operation and should match the name given in the suggest component configuration above;
suggest.count is the number of candidates to return, 10 here.
The full set of options is in the Solr reference guide: https://lucene.apache.org/solr/guide/6_6/suggester.html
5. Build the suggester index
The suggest component is now fully configured. If buildOnStartup is set to false in the component configuration, the suggester index must be built manually once, with a URL of this form:
http://192.168.29.110:8080/solr/collection1/suggest?suggest=true&suggest.dictionary=suggest&wt=json&suggest.q=Ath&suggest.build=true
6. Test
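A quick smoke test of the handler configured above; suggest.q is the prefix to complete (use a prefix that actually occurs in the indexed name field):
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/suggest?suggest=true&suggest.dictionary=suggest&suggest.q=誅仙&wt=json"
The JSON response contains a suggest section listing the matching terms and their weights.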
VI. Integrating with ZooKeeper
This example uses ZooKeeper 3.4.10.
1. Upload the configuration files in solr-home to the ZooKeeper cluster
Upload them with the ZooKeeper client script shipped with Solr:
[root@solr_1 ~]# cd /opt/solr/server/scripts/cloud-scripts/
[root@solr_1 cloud-scripts]# ./zkcli.sh -zkhost 192.168.29.110:2181,192.168.29.120:2181,192.168.29.130:2181 -cmd upconfig -confdir /opt/solr/solr-home/collection1/conf/ -confname myconf
Check that the configuration files were uploaded successfully:
[root@bogon bin]# bash /usr/local/zookeeper/zoo1/zookeeper-3.4.10/bin/zkCli.sh
Connecting to localhost:2181
[zk: localhost:2181(CONNECTED) 0] ls /
[configs, zookeeper]
[zk: localhost:2181(CONNECTED) 1] ls /configs
[myconf]
[zk: localhost:2181(CONNECTED) 2] ls /configs/myconf
[admin-extra.menu-top.html, currency.xml, protwords.txt, mapping-FoldToASCII.txt, _schema_analysis_synonyms_english.json, _rest_managed.json, solrconfig.xml, _schema_analysis_stopwords_english.json, stopwords.txt, lang, spellings.txt, mapping-ISOLatin1Accent.txt, admin-extra.html, xslt, synonyms.txt, scripts.conf, update-script.js, velocity, elevate.xml, admin-extra.menu-bottom.html, clustering, schema.xml]
2. On every Solr node, edit catalina.sh in Tomcat's bin directory
and add a -DzkHost option pointing at the ZooKeeper servers:
JAVA_OPTS="$JAVA_OPTS $JSSE_OPTS"
# Register custom URL handlers
# Do this here so custom URL handles (specifically 'war:...') can be used in the security policy
JAVA_OPTS="$JAVA_OPTS -Djava.protocol.handler.pkgs=org.apache.catalina.webresources"
JAVA_OPTS="$JAVA_OPTS -DzkHost=192.168.29.110:2181,192.168.29.120:2181,192.168.29.130:2181"
(The -DzkHost line is the newly added content.)
3. Restart Tomcat
4. Use the Collections management screen
Add a collection.
Parameter meanings:
config set: the configuration set (the uploaded configuration files) to use
numShards: number of shards
replicationFactor: number of nodes serving each shard (no more than the total number of nodes)
Show advanced: show the advanced settings
maxShardsPerNode: maximum number of shards per node
(Steps 5 and 6 are optional.)
5. Create a collection with two shards, each shard having one leader and one replica.
Open the following URL in a browser:
http://192.168.29.110:8080/solr/admin/collections?action=CREATE&name=collection2&numShards=2&replicationFactor=2
Parts of the URL to adjust:
ip: the server's IP
name: the collection name
numShards: number of shards in the collection
replicationFactor: number of nodes serving each shard (no more than the total number of nodes)
6. Delete collection1.
http://192.168.29.110:8080/solr/admin/collections?action=DELETE&name=collection1
Parts of the URL to adjust:
ip: the server's IP
name: the collection name
VII. Using the Solr cluster
1. Use SolrJ to work with the cluster's indexes
Add the Solr dependency to pom.xml:
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>6.6.0</version>
</dependency>
Code:
package com.demo.util.solr;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrDocumentList;
import org.apache.solr.common.SolrInputDocument;

// SolrCloud index add/search/delete
public class SolrCloudTest {

    private static CloudSolrClient cloudSolrClient;

    private static synchronized CloudSolrClient getCloudSolrClient(final String zkHost) {
        if (cloudSolrClient == null) {
            try {
                cloudSolrClient = new CloudSolrClient(zkHost);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        return cloudSolrClient;
    }

    private static void addIndex(SolrClient solrClient) {
        try {
            SolrInputDocument doc1 = new SolrInputDocument();
            doc1.addField("id", "421245251215121452521251");
            doc1.addField("name", "張三");
            doc1.addField("age", 30);
            doc1.addField("desc", "張三是個農民,勤勞致富,奔小康");

            SolrInputDocument doc2 = new SolrInputDocument();
            doc2.addField("id", "4224558524254245848524243");
            doc2.addField("name", "李四");
            doc2.addField("age", 45);
            doc2.addField("desc", "李四是個企業家,白手起家,致富一方");

            SolrInputDocument doc3 = new SolrInputDocument();
            doc3.addField("id", "2224558524254245848524299");
            doc3.addField("name", "王五");
            doc3.addField("age", 60);
            doc3.addField("desc", "王五好吃懶做,溜須拍馬,跟著李四,也過著小康的日子");

            Collection<SolrInputDocument> docs = new ArrayList<SolrInputDocument>();
            docs.add(doc1);
            docs.add(doc2);
            docs.add(doc3);

            solrClient.add(docs);
            solrClient.commit();
        } catch (SolrServerException e) {
            System.out.println("Add docs Exception !!!");
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (Exception e) {
            System.out.println("Unknown Exception!!!!!");
            e.printStackTrace();
        }
    }

    public static void search(SolrClient solrClient, String queryString) {
        SolrQuery query = new SolrQuery();
        query.setQuery(queryString);
        try {
            QueryResponse response = solrClient.query(query);
            SolrDocumentList docs = response.getResults();
            System.out.println("Documents found: " + docs.getNumFound());
            System.out.println("Query time: " + response.getQTime());
            for (SolrDocument doc : docs) {
                String id = (String) doc.getFieldValue("id");
                String name = (String) doc.getFieldValue("name");
                Integer age = (Integer) doc.getFieldValue("age");
                String desc = (String) doc.getFieldValue("desc");
                System.out.println("id: " + id);
                System.out.println("name: " + name);
                System.out.println("age: " + age);
                System.out.println("desc: " + desc);
                System.out.println();
            }
        } catch (SolrServerException e) {
            e.printStackTrace();
        } catch (Exception e) {
            System.out.println("Unknown Exception!!!!");
            e.printStackTrace();
        }
    }

    public static void deleteAllIndex(SolrClient solrClient) {
        try {
            solrClient.deleteByQuery("*:*"); // delete everything!
            solrClient.commit();
        } catch (SolrServerException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (Exception e) {
            System.out.println("Unknown Exception !!!!");
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws IOException {
        final String zkHost = "192.168.29.110:2181,192.168.29.120:2181,192.168.29.130:2181";
        final String defaultCollection = "collection1";
        final int zkClientTimeout = 20000;
        final int zkConnectTimeout = 1000;

        CloudSolrClient cloudSolrClient = getCloudSolrClient(zkHost);
        System.out.println("The CloudSolrClient instance has been created!");
        cloudSolrClient.setDefaultCollection(defaultCollection);
        cloudSolrClient.setZkClientTimeout(zkClientTimeout);
        cloudSolrClient.setZkConnectTimeout(zkConnectTimeout);
        cloudSolrClient.connect();
        System.out.println("The cloud server has been connected!");

        // Add documents to the index
        SolrCloudTest.addIndex(cloudSolrClient);
        // Query
        SolrCloudTest.search(cloudSolrClient, "name:李四");
        // Delete everything and query again
        SolrCloudTest.deleteAllIndex(cloudSolrClient);
        SolrCloudTest.search(cloudSolrClient, "name:李四");

        cloudSolrClient.close();
    }
}
2. In the Solr admin UI, add the index fields "name", "age", and "desc" to collection1.
Use the IK type text_ik added earlier for the name and desc fields,
and int for the age field.
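If you prefer the command line to the admin UI, the same fields can be added with the Schema API; a sketch, assuming collection1 uses the managed schema configured earlier:
[root@solr_1 ~]# curl -X POST -H 'Content-type:application/json' 'http://192.168.29.110:8080/solr/collection1/schema' -d '{
  "add-field": {"name":"name", "type":"text_ik", "indexed":true, "stored":true},
  "add-field": {"name":"age",  "type":"int",     "indexed":true, "stored":true},
  "add-field": {"name":"desc", "type":"text_ik", "indexed":true, "stored":true}
}'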
VIII. Indexing data from a database
Database host and credentials:
mysql: 192.168.29.100:3306 user:root password:123456
1. Copy the required jar files
Copy the solr-dataimporthandler-6.6.0.jar and solr-dataimporthandler-extras-6.6.0.jar shipped with Solr (already copied in step 3 of part I), together with mysql-connector-java-5.1.44.jar, into the Solr webapp's lib directory under Tomcat.
2. Edit solrconfig.xml
Find “<requestHandler name="/select" class="solr.SearchHandler">” and add the following configuration above it:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
        <str name="config">data-config.xml</str>
    </lst>
</requestHandler>
3. Create data-config.xml in the same directory as solrconfig.xml
Full configuration:
<?xml version="1.0" encoding="UTF-8" ?>
<dataConfig>
    <dataSource name="source1" driver="com.mysql.jdbc.Driver" url="jdbc:mysql://192.168.29.100:3306/test1" user="root" password="123456"/>
    <document name="salesDoc">
        <entity pk="id" dataSource="source1" name="user"
                query="select id,name,sex,age,insertTime from user"
                deltaQuery="select id,name,sex,age,insertTime from user where insertTime >'${dih.last_index_time}'">
            <field name="id" column="id"/>
            <field name="name" column="name"/>
            <field name="sex" column="sex"/>
            <field name="age" column="age"/>
            <field name="insertTime" column="insertTime"/>
        </entity>
    </document>
</dataConfig>
Configuration notes:
dataSource: defines the data source
document: Solr's basic unit of information, a set of data describing something
entity: corresponds to a database table
pk: the table's primary key
dataSource (on the entity): which data source to use
name: the table name
query: the SQL used for a full import
deltaQuery: the SQL used for incremental (delta) imports
${dih.last_index_time}: the time of the last import
field: a table column
4. Create dataimport.properties in the same directory as solrconfig.xml
Contents of dataimport.properties:
#Mon Nov 06 13:03:53 CST 2017
last_index_time=2017-11-06 13\:03\:50
user.last_index_time=2017-11-06 13\:03\:50
user.last_index_time records the last import time for the user table (this per-table form is recommended, because multiple tables can then be updated independently).
5. Edit managed-schema
<field name="id" type="int" indexed="true" stored="true" required="true" multiValued="false" /> <field name="name" type="text_ik" indexed="true" stored="true"/> <field name="sex" type="int" indexed="true" stored="true"/> <field name="age" type="int" indexed="true" stored="true"/> <field name="insertTime" type="int" indexed="true" stored="true"/>
6. Upload the files
If the Solr configuration has already been uploaded to ZooKeeper, repeat step 1 of "VI. Integrating with ZooKeeper" to upload the configuration files again (or follow "Updating the Solr configuration in ZooKeeper" at the end of this post).
7. Restart Tomcat and run the data import
Import options:
full-import: full index
delta-import: incremental index
clean: clear the existing index first
commit: commit after the import
entity: the entity (source table) to import
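The import can also be triggered and monitored from the command line instead of the admin UI; a sketch against the /dataimport handler configured above:
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/dataimport?command=full-import&clean=true&commit=true"
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/dataimport?command=status"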
8. Verify the result
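For example, query the collection and check that numFound matches the number of rows in the user table (adjust host and collection to your setup):
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/select?q=*:*&wt=json&rows=5"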
IX. Scheduled incremental index updates
1. Copy solr-dataimportscheduler-1.1.jar into the Solr webapp's lib directory under Tomcat
2. Edit WEB-INF/web.xml of the Solr webapp under Tomcat
Add the following before the servlet elements:
<listener>
    <listener-class>org.apache.solr.handler.dataimport.scheduler.ApplicationListener</listener-class>
</listener>
3. Create a conf folder under solr-home,
go into conf, and create dataimport.properties inside it.
dataimport.properties configuration:
[root@solr_1 ~]# vim /opt/solr/solr-home/conf/dataimport.properties
#################################################
#                                               #
#       dataimport scheduler properties         #
#                                               #
#################################################
# to sync or not to sync
# 1 - active; anything else - inactive
syncEnabled=1
# which cores to schedule
# in a multi-core environment you can decide which cores you want syncronized
# leave empty or comment it out if using single-core deployment
syncCores=collection1
# solr server name or IP address
# [defaults to localhost if empty]
server=localhost
# solr server port
# [defaults to 80 if empty]
port=8080
# application name/context
# [defaults to current ServletContextListener's context (app) name]
webapp=solr
# URL params [mandatory]
# remainder of URL
# delta import
params=/dataimport?command=delta-import&clean=false&commit=true
# schedule interval
# number of minutes between two runs
# [defaults to 30 if empty]
interval=1
# interval for rebuilding the full index, in minutes; default 7200 (i.e. 5 days)
# empty, 0, or commented out: never rebuild the index
reBuildIndexInterval=7200
# parameters for rebuilding the index
reBuildIndexParams=/dataimport?command=full-import&clean=true&commit=true
# start time for the rebuild interval timer; first actual run = reBuildIndexBeginTime + reBuildIndexInterval*60*1000
# two formats: 2012-04-11 03:10:00 or 03:10:00; with the latter, the date part defaults to the service start date
reBuildIndexBeginTime=03:10:00
4. Restart Tomcat and verify
Insert a row into MySQL, wait one minute, and check in the Solr admin UI whether the new document appears.
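A concrete check might look like this; the column values are placeholders, adjust them to your actual table definition:
mysql> insert into user(id,name,sex,age,insertTime) values(1001,'測試',1,20,now());
[root@solr_1 ~]# curl "http://192.168.29.110:8080/solr/collection1/select?q=id:1001&wt=json"
After at most one minute (interval=1), the new row should appear in the results.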
Updating the Solr configuration in ZooKeeper
After modifying managed-schema or another configuration file, there is no need to log in to ZooKeeper and delete the old files; the upload simply overwrites them. Just upload the configuration again:
[root@solr_1 ~]# cd /opt/solr/server/scripts/cloud-scripts/
[root@ cloud-scripts]# ./zkcli.sh -zkhost 192.168.29.110:2181,192.168.29.120:2181,192.168.29.130:2181 -cmd upconfig -confdir /opt/solr/solr-home/collection1/conf/ -confname myconf
Use this command to push configuration changes after the configuration has already been uploaded to ZooKeeper.