Date:      Sun, 27 Jul 2003 18:26:52 +0200
From:      "Daan Vreeken [PA4DAN]" <Danovitsch@Vitsch.net>
To:        "Dragoncrest" <dragoncrest@voyager.net>
Cc:        FreeBSD-questions@FreeBSD.org
Subject:   Re: Simple cron script to copy remote webpage locally?
Message-ID:  <200307271826.52780.Danovitsch@Vitsch.net>
In-Reply-To: <200307271529.h6RFTVIW035658@mail0.mx.voyager.net>
References:  <200307271529.h6RFTVIW035658@mail0.mx.voyager.net>

On Sunday 27 July 2003 16:51, Dragoncrest wrote:
> I've got a webpage that updates dynamically on one of our servers and
> lists a bunch of statistics about spam and such on our servers.  Problem
> is, the script puts a load on the server if too many people access it
> and it eventually kills the server.  I would like to lower the traffic
> on this server by setting up a script on a remote server that is
> activated every 10 minutes by cron and automatically loads the remote
> script then copies the results to a local file on the new public server
> which people can then view at their leisure without killing our stats
> server.  What is going to be the easiest way to do this?  I'm sure there
> has to be a simple way to do this, but I'm kinda drawing a blank on how.
>  Can anyone help?

Take a look at the "fetch" program.
Basically you just need to supply it a URL and a local file name.

# man fetch
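
For example (an untested sketch -- the URL and output path below are
made up, so adjust them to your own setup), a crontab entry on the
public server could look something like this:

# every 10 minutes, grab the stats page and overwrite the local copy
*/10  *  *  *  *  /usr/bin/fetch -q -o /usr/local/www/data/stats.html http://stats.example.org/stats.cgi

Fetching into a temporary file first and then mv(1)'ing it over the old
copy would avoid serving a half-written page if a fetch happens to run
slowly.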

grtz,
Daan


