PeterL: gnupg.org/faq/gnupg-faq.html << "If you need more security than RSA-2048 offers, the way to go would be to switch to elliptical curve cryptography — not to continue using RSA." hmm, interesting
PeterL: "RSA-4096 is not a bad idea: it’s just, generally speaking, unnecessary. You gain very little in the way of additional resistance to brute-forcing and cryptanalysis."
To best understand the lie, and the imposture, let us delve into detail.
Historically, a 768-bit RSA modulus was factored in 2009 by Lenstra et al.[i] They credibly claimed at the time that a "normal" computer (defined as a 2.2 GHz AMD Opteron with 2 GB RAM) would need approximately 1.5 million years to repeat their feat.[ii]
Every additional bit doubles the size of N, and so every two additional bits double the size of both p and q, the secret factors that multiply together to N. To factor a key, one needs to find p[iii] ; and if one is proceeding through brute force then, seeing how there are 168 primes smaller than 1`000 and 1`229 primes smaller than 10`000[iv], whenever p increases tenfold the count of candidate factors to try increases by a shade under tenfold (~9.43-fold at the relevant scales, more or less). So going from 768-bit RSA to 2048-bit RSA takes you from 1.5 million years (of single-Opteron time) to 1.5 * 2^640 million years (of single-Opteron time), which exceeds whatever random "visualization" value one might wish to come up with ("age of the Universe" according to Ted Talks / I Fucking Love A Black Dude With A Moustache And Pictures Of Galaxies / the people that gave you "global warming" is always popular) and so therefore "should be safe". And "you wouldn't need even more".
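The arithmetic of that paragraph can be sanity-checked against the prime number theorem, π(x) ≈ x / ln x — a back-of-envelope sketch, nothing more :

```python
# Back-of-envelope check of the scaling argument above, using the
# prime counting approximation pi(x) ~ x / ln(x).
from math import log

def pi_approx(x: int) -> float:
    """Approximate count of primes below x (Prime Number Theorem)."""
    return x / log(x)

# A tenfold increase of p multiplies the candidate count by a shade
# under ten; at the scales of the footnote (10^17 -> 10^18) the
# approximation lands near the quoted ~9.43:
ratio = pi_approx(10**18) / pi_approx(10**17)
print(round(ratio, 2))  # 9.44

# RSA-768 -> RSA-2048 grows the modulus by 1280 bits, hence p by 640,
# multiplying the brute-force search space by roughly 2^640:
extra_p_bits = (2048 - 768) // 2
print(extra_p_bits)  # 640
```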
There's a number of problems with this nonsense. Let's enumerate :
Problem #1 with Werner Koch's guilty nonsense : 2009 was before Bitcoin. According to exactly the same evaluation procedures, the time it takes for a Bitcoin block to be found is no less than 2.5 * 10^14 seconds, or roughly speaking 7`927`447.99 years.[v] Each. From experience, no block takes more than maybe an hour ; and in many cases blocks succeed each other within seconds.
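The footnoted conversion is plain arithmetic — a sketch, assuming the 365-day year that reproduces the quoted figure :

```python
# Converting the per-block estimate into "normal computer" years.
# Assumes a 365-day year, which is what reproduces the quoted figure.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31_536_000

opteron_seconds_per_block = 2.5e14  # per the same evaluation procedure
years_per_block = opteron_seconds_per_block / SECONDS_PER_YEAR
print(int(years_per_block))  # 7927447 -- i.e. the ~7.9 million years, per block
```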
It is of course up to you to presume that your adversary runs a legacy Opteron lavishly furnished with 2GB of RAM and drawing 85 Watts from the wall in a basement somewhere. Meanwhile in actual practice, the "1.5 million years" happen five times a second all the damned time in the hands of amateurs ; and the NSA is building rather large basements drawing power by the Gigawatt for some reason.
Problem #2 with Werner Koch's guilty nonsense : The "brute force" approach is a gross misrepresentation of the process. In actual practice there are two distinct and recognizable parts to factoring RSA keys, called sieving and linear reduction respectively, for simplicity. Better computers can and do expedite the sieving part ; however, doing linear reduction on arbitrary-length integers is not a trivial matter. It involves coming up with correct parameters ; the methods are extremely intricate ; there are not that many people alive who can productively (ie, usefully) participate in this part of the activity. In any and all practical applications, the linear reduction part is the bottleneck.
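For concreteness, the two-phase shape can be sketched in miniature with Dixon's random-squares method — a much simpler relative of the number field sieve, shown here on the standard textbook example n = 84923 rather than anything resembling a real key ; the values 513 and 537 are the classic worked relations, not anything from the record factorization :

```python
# Toy illustration of the two phases. Phase 1 ("sieving"): screen
# candidate x values for smooth x^2 mod n -- each candidate is
# independent of the others, so hardware parallelizes this trivially.
# Phase 2 ("linear reduction"): combine relations whose exponent
# vectors sum to an all-even vector, i.e. solve a linear system mod 2.
from math import gcd

n = 84923  # textbook example: 84923 = 163 * 521
FACTOR_BASE = [2, 3, 5, 7]

def smooth_exponents(value, base=FACTOR_BASE):
    """Exponent vector of value over base, or None if not smooth."""
    exps = []
    for p in base:
        e = 0
        while value % p == 0:
            value //= p
            e += 1
        exps.append(e)
    return exps if value == 1 else None

# Phase 1: two relations the screening turns up (x^2 mod n = 8400, 33600).
relations = [(x, smooth_exponents(x * x % n)) for x in (513, 537)]

# Phase 2: their exponent vectors sum to an all-even vector, so the
# product of both sides gives a congruence of squares X^2 = Y^2 (mod n).
combined = [a + b for a, b in zip(relations[0][1], relations[1][1])]
assert all(e % 2 == 0 for e in combined)

X = relations[0][0] * relations[1][0] % n
Y = 1
for p, e in zip(FACTOR_BASE, combined):
    Y = Y * pow(p, e // 2, n) % n

print(gcd(X - Y, n), gcd(X + Y, n))  # 163 521 -- the two secret factors
```

In a real factorization the first phase spits out millions of relations and the second is a huge sparse system over GF(2) — which is, per the above, where the intricate parameter work lives.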
Which sets the stage for Koch's shame, because what he (deliberately) omits to mention is that these parameters can't be reused across key sizes. In other words, if your USG masters sink a lot of very valuable mathematician time into figuring out the parameters for 2048-bit keys, they sure as fuck want everyone to use THOSE.
Remember this if you'll remember nothing else - heterogeneity is always important, and always useful, in the fight against the Empire. Always make use of it, for it always was, and will forever remain, your first and your best weapon.
Problem #3 with Werner Koch's guilty nonsense is simply that he was already caught deliberately and significantly subverting the most delicate part of RSA key generation (entropy) in GnuPG. How many of those 2048 bits can you trust were actually, certifiably, correctly and properly derived from an entropic source ? Think again!
The relation is not as simple as "half your so-called entropic bits are predictable, therefore make a key twice as long and you'll be fine", of course. Nevertheless, the shorter the keys the less "slack in the gears". This slack may be by necessity or definition impossible to evaluate ; yet it's there. Why push it ?
There's no real performance gain from downgrading your key from the 4096-bit Republican standard to the 2048-bit Imperial ideal, so why would you ?
And forget about "elliptic curve cryptography" altogether ; it's a red herring in any discussion of cryptography, enjoying all the shelf life of USG crapolade generally.

———
i. This, as far as publicly known, is still the record. [↩]
ii. If by the feat we mean applying the General Number Field Sieve with already known parameters for the linear reduction part - then perhaps. As it stands, nobody knows that anyone knows what parameters to use for a 1024-bit integer (nor does there exist such a thing as "a" integer in this field), let alone 2048 - or for that matter 4096. [↩]
iii. Specifically p, because it is defined as the smaller of the two. [↩]
iv. Or, better yet, 24`739`954`287`740`860 primes smaller than a gazillion (10^18) but only 2`623`557`157`654`233 smaller than a hundred quadrillion (10^17). [↩]
v. This estimate rates Lenstra's Opteron at a very generous 1 KHps - and bear in mind that the intricate calculations involved in producing one Bitcoin hash represent significantly more computing effort than bignum multiplication, in whatever form implemented. [↩]