I'm seeing some strange behavior with MaximumNumberOfDocumentsToReturnPerSearch.
Does this value in the configuration table need to be changed prior to a crawl? What is happening is that when I change the value in the database from 200 to 1000, the pager control updates and shows 60+ pages, but any page past 20 displays no results. Only 200 results are ever displayed, no matter what the value is or how many result pages are returned.
Strange... I don't have this line.
I will manually force an update of the code locations that have this reference; hopefully (or not) it's just an SVN hiccup.
This update will come with the fix from the storage requirements thread.
I just flushed a test index locally... I'll start a crawl and let you know.
Cool, let me know what you get. Any page from 21 onwards is just blank for me...
I'm not seeing this.
Yeah, I'm still getting this problem. I updated the configuration table to allow 1000 results to be returned, but every page from 21 onwards is blank. Is there anywhere else I need to update this value other than the configuration table?
Just found the other place. For some reason the last argument was hard-coded to 200; when I updated it to 1000, it worked fine:

SearchResults<Document> searchResults = SearchManager.GetDocuments(
    Global.DefaultQueryParser,
    Global.CustomQueryParser,
    Global.IndexSearcher,
    query,
    (DiscoveryType)Enum.Parse(typeof(DiscoveryType), discoveryType),
    pageNumber,
    10,                         // results per page
    shouldDocumentsBeClustered,
    null,
    200);                       // hard-coded cap -- must match MaximumNumberOfDocumentsToReturnPerSearch
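For anyone hitting the same symptom: the blank pages follow directly from the arithmetic. With 10 results per page and a hard cap of 200 hits, only pages 1 through 20 can ever contain results, regardless of what the pager (driven by the configuration table) advertises. A minimal sketch of that truncation, in Python rather than the project's actual C# code, with a hypothetical `page_of_results` helper standing in for the searcher:

```python
def page_of_results(total_hits, hard_cap, page_number, page_size=10):
    """Return the hit indexes on one page after the hard cap is applied.

    The searcher truncates the hit list at hard_cap, so any page whose
    start offset falls at or beyond the cap comes back empty.
    """
    capped = min(total_hits, hard_cap)
    start = (page_number - 1) * page_size
    return list(range(start, min(start + page_size, capped)))

# 650 hits, cap hard-coded to 200: page 20 is the last non-empty page.
print(len(page_of_results(650, 200, 20)))   # 10
print(len(page_of_results(650, 200, 21)))   # 0  -- the blank page
# Raising the cap to match the configured 1000 restores paging:
print(len(page_of_results(650, 1000, 21)))  # 10
```

This is why the pager showed 60+ pages (it was counting against the configuration value) while the results stopped at page 20 (the searcher was counting against the hard-coded 200).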