Going to jump into the middle of this: I have about as much cryptographic knowledge as a French door, but this statement does sound a bit like "the unsinkable ship". Rather than focus on the technical aspects (2048-bit will always be safer than 1280-bit), I would like to ask why. What exactly is the extra hassle/cost in running with a 2048-bit key? Is it really big enough to justify going with a weaker key? I understand that the key lifespan is planned to be shorter, but it does sound like an unnecessary risk.

Cheers

--
Tristram Cheer
Network Architect
Tel. 09 438 5472 Ext 803 | Mobile. 022 412 1985 | tristram.cheer(a)ubergroup.co.nz | www.ubergroup.co.nz
PS: Follow us on Facebook: www.ubergroup.co.nz/fb or Twitter: https://twitter.com/#!/ubergroupltd

-----Original Message-----
From: nznog-bounces(a)list.waikato.ac.nz [mailto:nznog-bounces(a)list.waikato.ac.nz] On Behalf Of Jay Daley
Sent: Thursday, 9 June 2011 3:26 p.m.
To: Ewen McNeill
Cc: nznog(a)list.waikato.ac.nz
Subject: Re: [nznog] I don't trust the NZRS DNSSEC procedures... Yet
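(As a rough way to put numbers on that cost question: RSA private-key operations scale roughly with the cube of the modulus length, so a 2048-bit signature should cost around (2048/1280)^3, or about 4x, what a 1280-bit one does. The sketch below times both sizes; it is only illustrative and assumes the third-party Python 'cryptography' package; the payload and iteration count are arbitrary, so treat the output as indicative only.)

    # Time per-signature cost of RSA-1280 vs RSA-2048.
    # Requires the third-party 'cryptography' package (pip install cryptography).
    import time

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    for bits in (1280, 2048):
        key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
        start = time.perf_counter()
        for _ in range(200):
            # Arbitrary payload; any bytes illustrate the relative cost
            # of the private-key operation at each modulus size.
            key.sign(b"example zone data", padding.PKCS1v15(), hashes.SHA256())
        elapsed = time.perf_counter() - start
        print(f"RSA-{bits}: {elapsed / 200 * 1000:.2f} ms per signature")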
[...] but at the same time we don't over-engineer, as it is clear that that approach introduces as many problems as it solves.
In civil engineering one of the rules of thumb is that things be designed to handle _at minimum_ three times the maximum expected load. That's clearly "over-engineering". But it's accepted Best Practice, because it provides a margin for error in case some of the estimates turn out not to be true (or the future presents something that was not anticipated -- people marching in step on the London Millennium Bridge, for instance).
If you were, e.g., suggesting the KSK be 4096-bit keys, rolled every 2 months, or something else that was orders of magnitude more "paranoid" than common practice, then it would be right to be concerned about "over-engineering". Where you want to engineer something with much less margin for error than common practice, it's reasonable to expect that others will want to look closely at the justifications for "under-specifying" too -- and in particular at whether there is still adequate margin for error throughout the expected deployment lifetime. I, like Dean I believe, remain to be convinced on this point.
Taking your engineering argument as a way forward: the largest RSA key to have been broken so far (that is publicly known) is 1023 bits, and even that was a very special key. A 1280-bit key has a keyspace 2^257, i.e. 231,584,178,474,632,390,847,141,970,017,375,815,706,539,969,331,281,128,078,915,168,015,826,259,279,872, times as large as that. So let's say someone announced today that they could factor a 1024-bit key in just 1 second; a 1280-bit key would then take 2^(1280-1024) = 2^256 seconds, which is 3,671,743,063,080,802,746,815,416,825,491,118,336,290,905,145,409,708,398,004,109,081,935,347 years. We are already "over-engineering", but not "over-over-engineering".

cheers

Jay
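For anyone who wants to check that arithmetic, a short Python sketch (it adopts the argument's own simplifying assumption that difficulty doubles with each extra key bit):

    # Reproduce the figures above, assuming difficulty doubles per extra key bit.
    ratio = 2 ** (1280 - 1023)  # 1280-bit keyspace vs the broken 1023-bit key
    print(f"2^257 = {ratio:,}")

    # If a 1024-bit key fell in 1 second, a 1280-bit key would take
    # 2^(1280 - 1024) seconds; convert to years (365-day year).
    seconds = 2 ** (1280 - 1024)
    print(f"{seconds / (365 * 24 * 60 * 60):.4e} years")  # ~3.6717e69 years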
Ewen
--
Jay Daley
Chief Executive
.nz Registry Services (New Zealand Domain Name Registry Limited)
desk: +64 4 931 6977
mobile: +64 21 678840

_______________________________________________
NZNOG mailing list
NZNOG(a)list.waikato.ac.nz
http://list.waikato.ac.nz/mailman/listinfo/nznog