Maybe OT: asymmetrical processing power, encrypter / decrypter?

Darryl Miles

This isn't an OpenSSL-specific question, so please excuse it if you
don't find it interesting or consider it off-topic.

I'm looking for a mathematical algorithm which takes a small block of
data (128 bits to 4 Kb) and performs an encryption transform.  But I
want the encryption process to *require* vast amounts of CPU power
compared to the decryption/verification of the result back into
plaintext.

I also want to be able to turn the imbalance up and down (if possible),
maybe by requiring a number of iterations, with the imbalance curve
working over some useful range (maybe 1000:1 through to
1000000000000:1), so that future advances in computing power can be
thwarted by requiring more iterations during the encryption part,
without vastly increasing the amount of mathematical operations
required to decrypt.  Maybe my ranges are over the top, but any
workable range would do.  In practical terms I'm thinking that a
Pentium 90 MHz could do 1 encryption per second (at the lower end,
1000:1) and a Pentium 4 4.0 GHz could do 1 encryption per second (at
maybe 1000000:1, a hypothetical guessed value), and then I have acres
of room left to exponentially turn things up.  Although theoretically,
if computing power keeps growing at that rate, the scheme would have
less than 12 years of life left.  But I think you get the thought here.

Excuse me for using the term "iterations" loosely without referring to
a specific implementation that works like this; think of "iterations"
as an input value which scales the amount of work required to encrypt
compared to decrypt, as in the sketch below.
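
To make that knob concrete, here is a rough hashcash-style sketch in
Python (illustrative only; the names seal/verify are made up, and as I
note below a bare hash won't do for the real thing).  The sealing side
burns about 2**k hashes on average, while verification is a single
hash, so k tunes the imbalance from roughly 1000:1 at k=10 up to
roughly 1000000000000:1 at k=40, which matches the range I had in mind:

    import hashlib
    import itertools

    def seal(data: bytes, k: int) -> int:
        # Expensive side: brute-force a nonce until the digest ends in
        # k zero bits; average cost is about 2**k hashes.
        mask = (1 << k) - 1
        for nonce in itertools.count():
            digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") & mask == 0:
                return nonce

    def verify(data: bytes, nonce: int, k: int) -> bool:
        # Cheap side: a single hash, whatever the value of k.
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        return int.from_bytes(digest, "big") & ((1 << k) - 1) == 0

    data = b"0123456789abcdef"          # a 128-bit input block
    nonce = seal(data, 20)              # ~2**20 (about a million) hashes
    assert verify(data, nonce, 20)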


I'm ruling out a hash on the basis that its truncation of information
would leave any such algorithm open to flaws.  The input plaintext can
be a fixed size if the algorithm is limited to a block, but it must be
large enough to stop results from being pre-computed and stored.  I've
picked 128 bits as the starting point, as I think UUIDs are that
length, but I would probably prefer something a little bigger to
future-proof it.
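
For scale, in Python the 128-bit starting point and a slightly bigger
future-proofed block would look like this (purely illustrative):

    import secrets
    import uuid

    block_128 = uuid.uuid4().bytes       # UUIDs are 128 bits (16 bytes)
    block_256 = secrets.token_bytes(32)  # something a little bigger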

I notice that RSA keys take a fair amount of CPU horsepower to compute;
does the amount of time scale with the key size, and is there a maximum
limit?  I want the total amount of work the decryption end has to do to
be limited, so it is not an option for the decrypter to also generate a
key, as that ends up back at a 1:1 processing-power balance.
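
In case timing data helps frame that question, a quick and unscientific
benchmark (assuming Python and the pyca "cryptography" package; key
generation involves a randomized prime search, so timings vary from run
to run and should be averaged):

    import time
    from cryptography.hazmat.primitives.asymmetric import rsa

    for bits in (1024, 2048, 3072, 4096):
        start = time.perf_counter()
        rsa.generate_private_key(public_exponent=65537, key_size=bits)
        # The prime search makes this grow faster than linearly in the
        # key size, on average.
        print(bits, "bit keygen:", round(time.perf_counter() - start, 2), "s")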

Your thoughts.


Darryl

______________________________________________________________________
OpenSSL Project                                 http://www.openssl.org
User Support Mailing List                    [hidden email]
Automated List Manager                           [hidden email]