Friday 15 July 2011

solr4 - Solr full-import performance


I have only a small set of documents and entities, yet the full-import performance is still very bad. I just want to know whether there are any tricks or configuration changes I can apply to increase performance?

Note that I have Solr 4.1.

You should try to reduce the number of commits during your import. If you do not commit explicitly while adding documents to Solr, commits will happen according to the autoCommit settings in solrconfig.xml:

  <autoCommit>
    <maxDocs>10000</maxDocs>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>

Increase both maxDocs and maxTime and see if you get better speed. (maxTime is in milliseconds, so the default setting is only 15 seconds, which is too small for bulk imports.)
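For example, a more relaxed configuration for a bulk load might look like the following; the exact values here are only illustrative (they are not from the original answer) and should be tuned against your own heap size and import volume:

  <autoCommit>
    <maxDocs>100000</maxDocs>
    <!-- maxTime is in milliseconds: 600000 = 10 minutes -->
    <maxTime>600000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>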

You can also try disabling autoCommit entirely during your bulk import and issuing a single commit after all documents have been added. If that does not cause an out-of-memory exception in Solr, this is the best speed you can get.
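A minimal sketch of that approach, assuming a stock Solr 4.x setup (the core name is a placeholder): comment out or remove the <autoCommit> block in solrconfig.xml for the duration of the import, then send one explicit commit message to the update handler, for example by POSTing this XML to /solr/<core>/update:

  <!-- single explicit commit, sent once after the bulk import finishes -->
  <commit waitSearcher="true"/>

Alternatively, if you start the import through the DataImportHandler, the full-import command accepts a commit parameter (command=full-import&commit=true), so the handler issues that single commit for you when the import completes.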

If you were importing from an RDBMS, I would have suggested fetching as many fields as possible using joins and reducing the number of sub-entities, because each sub-entity opens a separate connection to the DB. Since you are importing from MongoDB, that does not apply to you. You could experiment with creating a new Mongo collection that contains all the data Solr needs, keeping a single entity in your data import configuration, and seeing whether that improves import speed.
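To make the RDBMS point concrete, here is a hypothetical data-config.xml sketch; the driver, table, and field names are invented for illustration and it does not apply to a MongoDB source. The commented-out variant uses a sub-entity, which runs one extra query per parent row, while the active entity fetches the same fields with a single joined query:

  <dataConfig>
    <dataSource driver="org.postgresql.Driver"
                url="jdbc:postgresql://localhost/shop"
                user="solr" password="secret"/>
    <document>
      <!-- Sub-entity version: one extra query (and connection) per product row.
      <entity name="product" query="SELECT id, name, category_id FROM product">
        <entity name="category"
                query="SELECT name AS category FROM category
                       WHERE id = '${product.category_id}'"/>
      </entity>
      -->
      <!-- Join version: everything in one query. -->
      <entity name="product"
              query="SELECT p.id, p.name, c.name AS category
                     FROM product p JOIN category c ON c.id = p.category_id"/>
    </document>
  </dataConfig>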
