Date:      Tue, 8 May 2001 18:49:49 +0200
From:      Brad Knowles <brad.knowles@skynet.be>
To:        Mike Meyer <mwm@mired.org>
Cc:        Rakhesh Sasidharan <csu96154@cse.iitd.ernet.in>, freebsd-chat <freebsd-chat@FreeBSD.ORG>
Subject:   Re: Hosting an NNTP server
Message-ID:  <p0510030bb71dd2699d92@[194.78.241.123]>
In-Reply-To: <p05100308b71da9e117e6@[194.78.241.123]>
References:   <Pine.LNX.4.10.10105081647500.1732-100000@deskar.cse.iitd.ernet.in> <15095.58155.785819.365229@guru.mired.org> <p05100304b71d988f061d@[194.78.241.123]> <15095.62470.978304.686969@guru.mired.org> <p05100308b71da9e117e6@[194.78.241.123]>

At 3:55 PM +0200 5/8/01, Brad Knowles wrote:

>  	An absolutely full feed is running over 250GB of traffic per day, and
>  that would require an average of about 24Mbits/sec bandwidth, and
>  probably would have frequent peaks to at least 36Mbits/sec bandwidth.

	Heck, even a full non-binary feed runs 1-2GB/day.  While the 
bandwidth requirements (average ~200Kbits/sec, with probable frequent 
peaks to ~300Kbits/sec) would easily fit within the limits of a 
residential broadband connection, it would still eat a significant 
chunk of the available bandwidth, and if there are daily, weekly, or 
monthly volume limitations, you could bust them just by taking the 
news feed.
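
	For what it's worth, the arithmetic behind those numbers is easy 
to check.  A quick Python sketch (the daily volumes are the figures 
quoted above; the 1.5x peak multiplier is only a rough guess at how 
bursty the traffic is):

# Rough bandwidth estimate for a news feed of a given daily volume.
def feed_bandwidth(gb_per_day, peak_factor=1.5):
    bits_per_day = gb_per_day * 8 * 1000**3   # decimal GB -> bits
    avg_bps = bits_per_day / 86400.0          # seconds in a day
    return avg_bps, avg_bps * peak_factor

for label, gb in (("full feed", 250), ("non-binary feed", 2)):
    avg, peak = feed_bandwidth(gb)
    print("%-16s avg %5.1f Mbit/s, peak ~%5.1f Mbit/s"
          % (label, avg / 1e6, peak / 1e6))

	That works out to roughly 23 Mbit/s average (peaks near 35 Mbit/s) 
for the full feed, and about 0.2 Mbit/s average (peaks near 0.3 
Mbit/s) for the non-binary feed, which is where the figures above 
come from.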


	Much better would be to contract out the news spool services, and 
then run a local news caching solution, so that you only ever pull 
down articles that are actually read, and the local cache ensures 
that you don't pull down the same article more than once.
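
	To make the cache-on-read idea concrete, here's a toy sketch.  This 
is not Diablo, just an illustration of the principle using Python's 
standard nntplib module; the host name and cache directory are made 
up for the example:

import os
from nntplib import NNTP

UPSTREAM = "spool.provider.example"    # contracted-out news spool (assumed)
CACHE_DIR = "/var/spool/newscache"     # local article cache (assumed)

def fetch_article(message_id):
    path = os.path.join(CACHE_DIR, message_id.strip("<>").replace("/", "_"))
    if os.path.exists(path):                 # cache hit: costs no bandwidth
        with open(path, "rb") as f:
            return f.read().split(b"\r\n")
    with NNTP(UPSTREAM) as server:           # cache miss: pull it down once
        _resp, info = server.article(message_id)
    with open(path, "wb") as f:
        f.write(b"\r\n".join(info.lines))    # next reader gets it locally
    return info.lines

	A real caching reader also expires old articles and handles 
concurrent readers, but the point is the same: an article crosses the 
wire at most once.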

	If you contracted out the news spool to a site running Diablo and 
set up a local Diablo caching reader server, they could give you a 
header-only feed for the articles in question (so you could build up 
a complete overview database).  That way you'd have a local server 
with a complete list of what articles are available in which 
newsgroups, and you would pull down only those articles that are 
actually read, caching them locally so they never need to be pulled 
down again.
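
	The overview side can be approximated the same way.  Diablo would 
actually receive the header-only data as a push feed from the 
upstream spool, but from a reader's point of view the NNTP OVER/XOVER 
command gives you the same information: a summary of every article in 
a group, without the bodies.  A rough sketch, again with nntplib and 
a made-up host name:

from nntplib import NNTP

def load_overview(server_name, group):
    with NNTP(server_name) as server:
        _resp, count, first, last, _name = server.group(group)
        _resp, overviews = server.over((first, last))  # headers only, no bodies
    # keep just enough for readers to see what is available
    return [(num, ov.get("message-id", ""), ov.get("subject", ""))
            for num, ov in overviews]

for num, msgid, subject in load_overview("spool.provider.example",
                                         "comp.unix.bsd.freebsd.misc"):
    print(num, msgid, subject)

	Feed that into a local overview database and the reader server can 
show every group and every subject line while having spent bandwidth 
only on headers.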

	IMO, the local Diablo caching reader server is the best combination 
of running your own server and doing local caching to minimize wasted 
bandwidth.

-- 
Brad Knowles, <brad.knowles@skynet.be>

/*        efdtt.c  Author:  Charles M. Hannum <root@ihack.net>          */
/*       Represented as 1045 digit prime number by Phil Carmody         */
/*     Prime as DNS cname chain by Roy Arends and Walter Belgers        */
/*                                                                      */
/*     Usage is:  cat title-key scrambled.vob | efdtt >clear.vob        */
/*   where title-key = "153 2 8 105 225" or other similar 5-byte key    */

dig decss.friet.org|perl -ne'if(/^x/){s/[x.]//g;print pack(H124,$_)}'
