I would be curious to know who has actually tracked this traffic (themselves or via their upstream ISP) and come up with a cost to their organisation.

Are they invoicing natlib? If not, why not?
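
For anyone who does want to put a number on it, here is a rough sketch of what I mean in Python. It assumes an Apache/nginx combined-format access log, that the harvester's User-Agent contains "NLNZ", and a made-up per-GB rate; all three are guesses you would swap out for whatever your own logs and hosting contract actually say.

import re
import sys

RATE_PER_GB = 0.15      # assumed data charge in NZD per GB -- substitute your own
CRAWLER_UA = "NLNZ"     # assumed substring of the harvester's User-Agent

# combined log format: ... "request" status bytes "referer" "user-agent"
LOG_RE = re.compile(r'"\S+ \S+ \S+" \d{3} (\d+|-) "[^"]*" "([^"]*)"')

total_bytes = 0
for line in open(sys.argv[1]):
    m = LOG_RE.search(line)
    if not m:
        continue
    size, agent = m.groups()
    if CRAWLER_UA.lower() in agent.lower() and size != "-":
        total_bytes += int(size)

gb = total_bytes / (1024.0 ** 3)
print("crawler traffic: %.2f GB, approx cost: $%.2f" % (gb, gb * RATE_PER_GB))

Run it over however many log files you keep; the point is only that the raw numbers are cheap to get before anyone starts talking about invoices.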

If I were to do this sort of trolling, personally I would resolve the addresses to IPs, work out which are local NZ IPs and which are offshore, and troll from the appropriate source... much as I feel that ignoring robots.txt goes against the goodwill and security of the big bad world :\
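
Something along these lines, assuming you already have a list of hostnames to check; the NZ prefixes below are placeholders only, since in practice you would build that list from the APNIC delegated file or a GeoIP database rather than hard-coding it:

import socket
import ipaddress

# Placeholder prefixes only -- not an authoritative list of NZ address space.
NZ_RANGES = [ipaddress.ip_network(n) for n in (
    "202.36.0.0/15",
    "203.96.0.0/14",
)]

def is_nz(ip):
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in NZ_RANGES)

for host in ("example.co.nz", "example.com"):   # stand-in hostnames
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror:
        print("%s: could not resolve" % host)
        continue
    print("%s -> %s (%s)" % (host, ip, "NZ" if is_nz(ip) else "offshore"))

That gives a rough local/offshore split before deciding which source to trawl from.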

I do find it quite amusing how many people in this country seem to have such a simplistic view of contracts that they think they can randomly invoice anyone who may have caused them trouble for their time and expenses. You can invoice people with whom you have a service or supply contract; for anyone else, your only recourse is court action. While NATLIB's ignoring of the robots.txt file may or may not be questionable, it does not give anyone the right to think they can invoice them for accessing their publicly available website. If YOU as a web host have decided to serve websites, the basis on which you have contracted your bandwidth is YOUR problem. If you want to be protected against unforeseen spikes in traffic, get flat-rate hosting, not data-charged hosting.

Frankly, I have always thought hosting websites on a data-charged basis is a very risky and short-sighted option: anyone in the world is quite entitled to drag whatever traffic they want off your site, as often as they wish, and cost you money. It makes no difference who or what is sucking traffic off your website; if you have chosen to host websites on a data-charged basis, that's your choice and you need to live with the consequences. If you want peace of mind, get a flat-rate option (yes, they are available in NZ); otherwise stop whinging.

My 2c