If you read the FAQ from the National Library, it does say:
"In practical terms, this means webmasters can expect the harvester to work in bursts, taking 100 URLs from each website before moving on the next. Eventually the harvester will cycle back around to collect the next 100 URLs from the site. The exceptions to this are Government, Research, and Maori sites (.govt.nz, .ac.nz, .cri.nz and .maori.nz) where we harvest 500 URLs at a time."
Which means you can expect to see only 100 pages requested at a time, then some time for your 286 to recover before the next 100 requests come along.
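To make the pattern concrete, here's a minimal sketch of the cycling behaviour the FAQ describes. The batch sizes and domain suffixes come straight from the FAQ; the queue structure, function names, and fetch stub are assumptions for illustration, not the harvester's actual code:

```python
from collections import deque

BATCH_DEFAULT = 100   # URLs per visit for ordinary sites (per the FAQ)
BATCH_LARGE = 500     # .govt.nz, .ac.nz, .cri.nz, .maori.nz (per the FAQ)

def batch_size(domain: str) -> int:
    large = (".govt.nz", ".ac.nz", ".cri.nz", ".maori.nz")
    return BATCH_LARGE if domain.endswith(large) else BATCH_DEFAULT

def fetch(url: str) -> None:
    print("GET", url)  # stand-in for the real HTTP request

def harvest(sites: dict[str, deque[str]]) -> None:
    """Cycle over sites, taking at most one batch per site per pass."""
    while any(sites.values()):
        for domain, queue in sites.items():
            for _ in range(min(batch_size(domain), len(queue))):
                fetch(queue.popleft())

# Example: a 250-URL site gets visited in bursts of 100, 100, then 50.
sites = {"example.co.nz": deque(f"http://example.co.nz/page{i}" for i in range(250))}
harvest(sites)
```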
This should resolve any worries about the crawler crapping all over the performance of your site(s).
Cheers, Patrick
That ignores the fact that you may have thousands of domains on one box, as we do: 80% of our domains are accessed once a year, and we have 4 or 5 really active domains on a box. When the crawler comes along and grabs 30 domains on one box, it's suddenly doing 3,000 requests. Oh, and when it says "pages", it means HTML pages; the bot also grabs any images for each page on top of that. Hopefully you can see how this would cause an impact on shared hosting providers (rough numbers in the sketch below).
G
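A quick back-of-envelope calculation for the shared-hosting case G describes. The domain count and per-site batch size come from the post and the FAQ; the images-per-page average is an assumed figure for illustration:

```python
# Rough load on one shared box when the harvester visits many of its
# domains in the same pass.

domains_hit = 30          # domains on this box hit in one pass (from the post)
pages_per_domain = 100    # the FAQ's batch size for ordinary sites
images_per_page = 4       # assumed average; each image is a separate request

page_requests = domains_hit * pages_per_domain          # 3000
total_requests = page_requests * (1 + images_per_page)  # 15000

print(page_requests, "page requests in one burst")
print(total_requests, "total requests once images are counted")
```

So the per-site throttle doesn't help much when the unit that actually takes the load is the box, not the domain.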