[linux-elitists] web server software for tarpitting?
Thu Feb 14 19:27:48 PST 2008
* James Sparenberg <email@example.com> [2008-02-13 17:29-0800]
> On Tuesday 12 February 2008 10:33:17 Gerald Oskoboiny wrote:
> > Yes, but a single IP address re-fetching the same URL thousands
> > or hundreds of thousands of times a day seems excessive.
> Perhaps a mirroring system is needed? Also a change to the standard that
> says straight up that the DTD needs to be brought down local and cached.
> What seems to be missing, if I understand the problem space correctly, is
> a means whereby high-volume users can cache a local copy, yet have a
> reasonable assurance that if the master is updated (by w3c.org in this
> case), the update is propagated in a manner not dissimilar to, oh, a DNS
> update. People could grab and hold a copy of the master, with a TTL of say
> 1 week, dramatically lowering your overhead (or anyone else's) while at
> the same time creating the ability to have master control without
> fragmentation.
HTTP has all this stuff built in already; it's just being
ignored. These resources generally haven't changed in 5+ years,
are unlikely to change in the future, and are served with
explicit expiry times of 90 days to a year. There's no reason
anyone should need to fetch them even once a week, let alone
thousands of times a day.
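(A minimal sketch of the client behavior described above: cache a stable
resource such as a W3C DTD locally and, once the freshness window passes,
revalidate with a conditional request instead of re-downloading. The
helper name, cache path, and 90-day TTL are illustrative assumptions, not
anything specified in this thread.)

```python
import email.utils
import os
import time
import urllib.request
import urllib.error

def fetch_cached(url, cache_path, ttl=90 * 24 * 3600):
    """Return the resource bytes, honoring a local cache.

    While the cached copy is fresh (within ttl seconds), no request is
    made at all. Once stale, send If-Modified-Since so an unchanged
    resource costs only a 304, not a full transfer.
    """
    headers = {}
    if os.path.exists(cache_path):
        mtime = os.path.getmtime(cache_path)
        if time.time() - mtime < ttl:
            # Fresh: serve from disk, zero HTTP traffic.
            with open(cache_path, "rb") as f:
                return f.read()
        # Stale: revalidate rather than blindly re-fetch.
        headers["If-Modified-Since"] = email.utils.formatdate(
            mtime, usegmt=True)
    req = urllib.request.Request(url, headers=headers)
    try:
        with urllib.request.urlopen(req) as resp:
            body = resp.read()
        with open(cache_path, "wb") as f:
            f.write(body)
        return body
    except urllib.error.HTTPError as e:
        if e.code == 304 and os.path.exists(cache_path):
            # Not modified: restart the freshness window and reuse disk copy.
            os.utime(cache_path, None)
            with open(cache_path, "rb") as f:
                return f.read()
        raise
```

With a week-long TTL as James suggests (ttl=7 * 24 * 3600), a high-volume
validator would hit the origin at most once per week per resource instead
of thousands of times a day.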
Gerald Oskoboiny <firstname.lastname@example.org>