The request was aborted: The connection was closed unexpectedly.


megetron (Top 10 Contributor, 229 Posts) posted on Fri, Apr 30 2010 6:18 PM

Hello, it seems like something is wrong here...

Error:

The request was aborted: The connection was closed unexpectedly.

Stack trace:

   at System.Net.ConnectStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.Cache.ForwardingReadStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Arachnode.SiteCrawler.Components.WebClient.DownloadData(String absoluteUri)

 

Please see Daroz's comment here:

http://community.salesforce.com/t5/NET-Development/The-connection-was-closed-unexpectedly/m-p/14414

Thanks,

 

 

All Replies

megetron (Top 10 Contributor, 229 Posts)

OK, maybe some of my changes to the plugin system caused the problem?

What I did was create an abstract plugin class:

public abstract class MasterPages : ACrawlAction

And another plugin inherits from it:

public class MasterPagesMusicSela : MasterPages
{
    protected List<Tbh_Categories> Movie_Categories;

    public MasterPagesMusicSela() : base()
    {
        ....
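
For reference, here is a minimal, self-contained sketch of what that plugin hierarchy might look like. It assumes ACrawlAction declares an abstract PerformAction(CrawlRequest crawlRequest) method; that method name, the namespace, and the Tbh_Categories stub are assumptions rather than confirmed arachnode.net API.

using System.Collections.Generic;
using Arachnode.SiteCrawler;   // assumed namespace for ACrawlAction and CrawlRequest

// Stub of the poster's own category entity, included only so the sketch is self-contained.
public class Tbh_Categories { }

// Abstract base plugin shared by the site-specific crawl actions.
public abstract class MasterPages : ACrawlAction
{
}

// Concrete plugin for one site, inheriting the shared base.
public class MasterPagesMusicSela : MasterPages
{
    protected List<Tbh_Categories> Movie_Categories;

    public MasterPagesMusicSela() : base()
    {
        Movie_Categories = new List<Tbh_Categories>();
    }

    // Assumed override point; arachnode.net invokes crawl actions once per CrawlRequest.
    public override void PerformAction(CrawlRequest crawlRequest)
    {
        // Site-specific processing of the downloaded page goes here.
    }
}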

arachnode.net (Top 10 Contributor, 1,905 Posts)

I don't think you have done anything to cause it. From what I know, this error is a server problem: the remote server itself is closing the connections, not arachnode.net.
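
One way to confirm this from the crawler's side is to look at the WebException status when a download fails; ConnectionClosed and KeepAliveFailure both point at the remote end dropping the connection. A minimal sketch follows, where the plain System.Net.WebClient and the URI are stand-ins, not the crawler's own component:

using System;
using System.Net;

class ConnectionClosedCheck
{
    static void Main()
    {
        try
        {
            using (var client = new WebClient())
            {
                client.DownloadData("http://example.com/some-page");   // hypothetical URI
            }
        }
        catch (WebException ex)
        {
            // ConnectionClosed / KeepAliveFailure indicate the remote server dropped
            // the connection, as opposed to a client-side configuration problem.
            Console.WriteLine("Status:  {0}", ex.Status);
            Console.WriteLine("Message: {0}", ex.Message);
        }
    }
}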

For best service when you require assistance:

  1. Check the DisallowedAbsoluteUris and Exceptions tables first.
  2. Cut and paste actual exceptions from the Exceptions table.
  3. Include screenshots.

Skype: arachnodedotnet

megetron (Top 10 Contributor, 229 Posts)

It would be a great idea if we could try to get the page again.

As we agreed, a page that was requested and failed (for any reason) would be fetched again later in the crawl... a nice feature that could help a lot.
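
As a rough illustration only, and not an existing arachnode.net feature, a retry wrapper along the lines below could re-request a URI a few times before giving up. The helper name and retry parameters are hypothetical, and it assumes the download call takes an absolute URI string and returns a byte array, as the DownloadData(String absoluteUri) frame in the stack trace suggests.

using System;
using System.Net;
using System.Threading;

public static class RetryDownloader
{
    // Hypothetical helper: retries a download when the remote server drops the connection.
    public static byte[] DownloadWithRetry(Func<string, byte[]> downloadData,
                                           string absoluteUri,
                                           int maxAttempts,
                                           int delayInMilliseconds)
    {
        int attempt = 0;

        while (true)
        {
            try
            {
                return downloadData(absoluteUri);
            }
            catch (WebException ex)
            {
                attempt++;

                if (attempt >= maxAttempts)
                {
                    throw;  // give up; the exception is reported exactly as it is today
                }

                Console.WriteLine("Attempt {0} failed for {1}: {2}", attempt, absoluteUri, ex.Message);
                Thread.Sleep(delayInMilliseconds);
            }
        }
    }
}

Called as DownloadWithRetry(webClient.DownloadData, absoluteUri, 3, 2000), it would retry twice with a two-second pause before letting the exception surface.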

 

megetron (Top 10 Contributor, 229 Posts)

But why does this error end up in the DisallowedAbsoluteUris table as well? The Exceptions table should be the only place this error is recorded. Please correct me if I have it all wrong.

arachnode.net (Top 10 Contributor, 1,905 Posts)

If an exception is encountered, an entry will also be made in the DisallowedAbsoluteUris table.
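
A quick way to check both tables, as suggested earlier in the thread, is a query along these lines. Only the table names come from this thread; the column names (AbsoluteUri, Message, Reason) and the connection string are assumptions that may need adjusting to the actual schema.

using System;
using System.Data.SqlClient;

class CrawlTableCheck
{
    static void Main()
    {
        // Assumed connection string; point this at your arachnode.net database.
        const string connectionString =
            "Data Source=.;Initial Catalog=arachnode;Integrated Security=True";

        // Table names come from this thread; the column names are assumptions.
        const string sql =
            @"SELECT TOP 20 'Exceptions' AS SourceTable, AbsoluteUri, Message
              FROM dbo.Exceptions
              WHERE Message LIKE '%connection was closed unexpectedly%'
              UNION ALL
              SELECT TOP 20 'DisallowedAbsoluteUris', AbsoluteUri, Reason
              FROM dbo.DisallowedAbsoluteUris";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            connection.Open();

            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1} -> {2}", reader[0], reader[1], reader[2]);
                }
            }
        }
    }
}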

