20 Oct 2008, 4:33 a.m.
On Tue, 21 Oct 2008, Miskell, Craig wrote:
> We already have, by having a robots.txt file. Shouldn't have to ask twice.
>
> User-agent: *
> Disallow: /recruitment

Which I think highlights the problem. Many people have robots.txt files because they have some content they don't want archived by others; other people have load and bandwidth issues. The National Library really has to ignore the first group, but at the cost of hitting the second group.

--
Simon Lyall | Very Busy | Web: http://www.darkmere.gen.nz/
"To stay awake all night adds a day to your life" - Stilgar | eMT.
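The distinction being drawn here is machine-enforceable, since robots.txt rules are matched per user-agent: a site that only cares about load could admit an archival crawler by name while keeping everyone else out of a path. A minimal sketch using Python's stdlib `urllib.robotparser` (the "NLNZHarvester" agent name and the rules themselves are hypothetical, not anything the Library actually uses):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block all crawlers from /recruitment,
# but give a named archival crawler an empty Disallow (i.e. full access).
rules = """\
User-agent: NLNZHarvester
Disallow:

User-agent: *
Disallow: /recruitment
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A generic crawler matches the "*" group and is kept out of /recruitment...
print(rp.can_fetch("SomeBot", "http://example.org/recruitment/jobs"))
# ...while the named archiver matches its own group and may fetch it.
print(rp.can_fetch("NLNZHarvester", "http://example.org/recruitment/jobs"))
```

Of course, this only helps site owners who know the archiver's user-agent string in advance, which is part of why a blanket Disallow ends up excluding the archive too.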