arachnode.net
An Open Source C# web crawler with Lucene.NET search using SQL Server 2008/2012/2014/2016/CE, or MongoDB/RavenDB/Hadoop

Completely Open Source @ GitHub


Making the crawler work faster

Answered (Not Verified) This post has 0 verified answers | 26 Replies | 2 Followers

samarhtc posted on Wed, Nov 21 2012 9:12 AM

Hi

I'm in a situation where I have to crawl some very large websites. The resulting database is approximately 120 GB. This is actually OK, but the time it takes to crawl one website is far too long. I'm crawling with 100 threads, but it still takes days.

So I was wondering how to minimize the crawl time. All I need from the crawl is the raw page data, meaning the HTML source only: no images or anything else.
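One way to get an HTML-only crawl is to check the response's Content-Type header and skip everything that isn't an HTML media type. A minimal sketch of such a filter is below; the class and method names are illustrative, not part of arachnode.net's actual API, which has its own configuration for allowed data types.

```csharp
using System;

// Hypothetical helper: decides whether a response should be stored,
// based solely on its Content-Type header. Illustrative only.
static class HtmlOnlyFilter
{
    public static bool IsHtml(string contentType)
    {
        if (string.IsNullOrEmpty(contentType)) return false;
        // Headers often carry parameters, e.g. "text/html; charset=utf-8";
        // compare only the media-type portion before the first ';'.
        var mediaType = contentType.Split(';')[0].Trim();
        return mediaType.Equals("text/html", StringComparison.OrdinalIgnoreCase)
            || mediaType.Equals("application/xhtml+xml", StringComparison.OrdinalIgnoreCase);
    }
}
```

Filtering on the header before downloading the body also saves bandwidth on images and binaries, which by itself can shorten a large crawl noticeably.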

I was thinking that if I could somehow disable the encoding of the Source column in the WebPages table, it might speed up the process: AN would no longer have to encode/decode the web pages, and could simply insert the HTML into the Source column (or a new column next to it, say Source_2) in nvarchar format.
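The idea above could be sketched roughly as follows, assuming a hypothetical `Source_2 nvarchar(max)` column has been added to `dbo.WebPages` (the table, column, and ID names here are assumptions, not arachnode.net's actual schema). A parameterized nvarchar write hands the string to SQL Server as UTF-16 directly, with no encode/decode round trip in application code:

```csharp
using System.Data.SqlClient;

class RawSourceWriter
{
    // Stores the raw HTML for an already-crawled page into a hypothetical
    // Source_2 nvarchar(max) column, bypassing any binary encoding step.
    public static void SaveRawHtml(string connectionString, long webPageId, string html)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "UPDATE dbo.WebPages SET Source_2 = @source WHERE ID = @id", connection))
        {
            command.Parameters.AddWithValue("@id", webPageId);
            command.Parameters.AddWithValue("@source", html);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```

One trade-off worth noting: nvarchar stores two bytes per character, so for mostly-ASCII HTML this can roughly double storage compared with a varbinary/UTF-8 representation, even if it saves CPU on the encode/decode step.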

What do you think? Any suggestions would be appreciated.



copyright 2004-2017, arachnode.net LLC