You know a lot more about this topic than I do, and I'm not an authority on this subject by any means. Having said that, I found these two descriptions of GPG prime generation.

From what I can tell, GPG doesn't *prove* that the numbers it generates are prime; it finds numbers that are "probably" prime. *If* the consideration of larger primes leads to the greater likelihood of a false positive on the primality tests, then it seems like increasing the iterations of the primality tests would maintain the high chance of actually generating a prime. Additionally, it may be prudent to test more low primes as well (maybe primes up to 50k/500k?). Aside from that, it doesn't seem that the prime generating algorithm would have issues with generating arbitrarily large primes.
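To make the "probably prime" idea concrete, here's a generic sketch of the usual trial-division-then-Miller-Rabin approach (this is *not* GPG's actual code; the small-prime list and the round count are just illustrative parameters). Each extra Miller-Rabin round cuts the worst-case false-positive chance by at least a factor of 4, which is why cranking up the iterations compensates for larger candidates:

```python
# A generic sketch of probable-prime testing (not GPG's implementation):
# trial division by small primes first, then `rounds` of Miller-Rabin.
# More rounds shrink the chance of a composite slipping through.
import random

SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

def is_probable_prime(n, rounds=40):
    if n < 2:
        return False
    for p in SMALL_PRIMES:          # cheap trial division first
        if n % p == 0:
            return n == p
    # write n-1 as d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):         # each round: error chance <= 1/4
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # witness found: definitely composite
    return True                     # no witness found: probably prime

print(is_probable_prime(2**127 - 1))   # a known Mersenne prime
print(is_probable_prime(2**128 + 1))   # known composite (F7)
```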

Yes, I am aware that PGP doesn't prove that its primes really are prime. For a 1024 bit key (using 512 bit primes), the chances that a "prime" is actually composite are 10^{-44}, which is very slim indeed. The problem is, are larger primes much more likely to be composite, and if so, are these pseudoprimes worse than smaller primes in terms of ease of factorisation?

...Though I don't know what you mean by primes up to 50k/500k. If you're talking about number of bits, those are *ludicrously* large primes! If you're talking about actual integer values, then they're ludicrously *small* - a 4k RSA key uses primes with roughly 617 decimal digits!

...I'm confused as to what exactly you're worried about as well. Wouldn't an imperfect prime generator simply generate composite numbers on occasion, and thus occasionally produce keypairs that garble almost all of the data they process? It's far from desirable, but the issue is immediately noticeable upon an attempted decryption, and even with a merely adequate prime generator, all you'd have to do is generate another keypair (or if you're *really* unlucky, two). From a usability standpoint, it's horrendous, but it doesn't seem to affect things from a security standpoint when it comes to keys that have been manually verified as working. I feel like I'm missing something important...

You are. Composite numbers will *seem* to work fine, in that you can create a key-pair with them and use it to encrypt and decrypt. But there's a problem. To see why composite numbers are bad, here's how an RSA key is generated, in three easy steps:

Step 1: Create two prime numbers, p and q, and multiply them to create the modulus, n = pq.

Step 2: Create a third number, e, that is relatively prime to (shares no common factors with) the product (p-1)(q-1). This is the public exponent.

Step 3: Calculate an integer d, such that the quotient (ed-1)/((p-1)(q-1)) is an integer. This is the private exponent.
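The three steps above can be sketched in a few lines of Python, using deliberately tiny textbook primes (real keys use primes of 1024 bits or more; the values here are purely illustrative):

```python
# Toy illustration of the three key-generation steps above.
# Tiny textbook primes - never use numbers this small for real keys!
from math import gcd

p, q = 61, 53                 # Step 1: two primes
n = p * q                     # modulus: 3233
phi = (p - 1) * (q - 1)       # (p-1)(q-1) = 3120

e = 17                        # Step 2: must share no factors with phi
assert gcd(e, phi) == 1

d = pow(e, -1, phi)           # Step 3: d such that (e*d - 1)/phi is an integer
assert (e * d - 1) % phi == 0

print(n, e, d)                # public key (n, e), private key (n, d)
```

(`pow(e, -1, phi)` computes the modular inverse directly; it needs Python 3.8+.)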

The public key consists of the modulus and the public exponent (n,e), and the private key consists of the modulus and the private exponent (n,d). For encryption, the ciphertext, C, is created from the plaintext, M, using the public key (n,e), thusly:

C = M^{e} mod n

For decryption, you just repeat using the private key:

M = C^{d} mod n
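The encrypt/decrypt round trip is just two modular exponentiations. Here's a minimal sketch using the toy key (n, e, d) = (3233, 17, 2753) from the standard textbook example (real RSA pads M before exponentiating; raw "textbook RSA" like this is insecure):

```python
# Minimal round trip of the two formulas above, with a toy key.
# Textbook RSA without padding - for illustration only.
n, e, d = 3233, 17, 2753

M = 65                        # plaintext must satisfy 0 <= M < n
C = pow(M, e, n)              # C = M^e mod n
assert pow(C, d, n) == M      # M = C^d mod n recovers the plaintext
print(C)
```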

The plaintext, in most applications (including PGP), is a randomly generated key to a symmetric cipher (such as AES), which is used to encrypt the actual message, because symmetric ciphers are so much faster for the same level of security.

Now for the big problem. Can an attacker calculate the private key (n,d) from the public key (n,e)? (If he can, the key is obviously broken.) All he has to do is perform step 3 above, which requires him to know the values of p and q (which requires that he factorise n, which is widely believed to be computationally infeasible for large n). At least, he does if p and q are primes. It actually works with *any* values of p and q such that pq = n. As long as p and q are prime, they are the only values that work, but if p and q are composite, their factors will work just as well. Or, to put it more simply, factorising a number which is the product of two primes is hard, but factorising a number that is the product of two composites is easy (or at least easier, depending on how big the prime factors of the composites are).
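To illustrate why composites make factorisation easier: if p and q are themselves composite, then n inherits their small prime factors, and even naive trial division finds one almost instantly. The values below are made up for illustration:

```python
# Sketch: composite "primes" give n small prime factors,
# so naive trial division cracks it immediately.

def smallest_factor(n):
    """Return the smallest prime factor of n by trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n

p = 91 * 97        # a composite "p" (91 = 7 * 13)
q = 89 * 101       # a composite "q"
n = p * q

# The attacker finds a factor far smaller than p or q:
print(smallest_factor(n))   # 7
```

With genuinely prime p and q, `smallest_factor` would have to count all the way up to p itself, which is what makes honest RSA moduli hard to factorise this way.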

TL;DR: Composites are bad, in that they make it much easier for someone to break your private key. However, they still encrypt and decrypt normally, so you don't know they're bad until it's too late.

If this does work without any practical or theoretical hitches whatsoever, it makes me question the need for the upper limit in the first place. It might be better to have an upper limit calculated from a cursory glance at the computer's resources, and then initialise the secure memory based on the generated key size...

Theoretical hitches first - an asymmetric key is only as strong as the underlying symmetric cipher and digest algorithms (but see the last paragraph of my first post). Back when 128 bit symmetric ciphers were used, anything more than a 4096 bit asymmetric key was overkill. But now everyone is using (or ought to be using) 256 bit symmetric ciphers, which is ridiculous - it is much, *much* easier to break a 4096 bit asymmetric key than a 256 bit symmetric key, so the 256 bit key doesn't buy you any extra security over a 128 bit key, unless you use a huge asymmetric key.

Practical hitches - huge keys require a correspondingly huge amount of entropy to generate. A 16k key requires more entropy than most computers can generate in a reasonable period of time (generating the test key took 10 minutes on a 3.2GHz Phenom II X6). But that's only to generate the keys - merely *using* huge keys is slower than using smaller keys, but not excessively so. This may be an issue on mobile devices with pathetically slow CPUs, but otherwise it's not a problem. Hmm...