Thanks for helping me understand the whole entropy thing better. I still get the feeling that this is a "best effort" thing and that nobody can actually prove what is correct. I am probably just bringing the math down to my level - sorry.

With that said, for validation I still need to be sure that I return the required entropy from the OpenSSL callback. Since I am not allowed to use a hash with the DRBGs (FIPS lab and SP800-90B section 8.4), can you please confirm that, with a source of raw 2 bits/byte entropy data, I need to return 4 times the requested data from the callback function?
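(My arithmetic, to check I have this right: the DRBG credits the callback's output at a full 8 bits of entropy per byte, so at a raw rate of 2 bits/byte I have to hand back 8/2 = 4 bytes of chip output for every byte of full-entropy data requested, e.g. 128 raw bytes for a 256-bit seed.)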

> -----Original Message-----
> From: openssl-dev [mailto:[hidden email]] On Behalf Of Paul Dale
> Sent: 28 July 2016 02:33 AM
> To: [hidden email]
> Subject: Re: [openssl-dev] DRBG entropy
>
> John's spot on the mark here. Testing gives a maximum entropy, not a
> minimum. While a maximum is certainly useful, it isn't what you really
> need to guarantee your seeding.
>
> A simple example which passes the NIST SP800-90B first draft tests with
> flying colours:
>
>     seed = π - 3
>     for i = 1 to n do
>         seed = frac(exp(1 + 2*seed))
>         entropy[i] = 256 * frac(2^20 * seed)
>
> where frac is the fractional part function and exp is the exponential
> function.
>
> I.e. start with the fractional part of the transcendental π and iterate
> with a simple exponential function. Take bits 21-28 of each iterate as a
> byte of "entropy". Clearly there is really zero entropy present: the
> formula is simple and deterministic; the floating point arithmetic
> operations will all be correctly rounded; the exponential is evaluated in
> a well behaved area of its curve where there will be minimal rounding
> concerns; the bits being extracted are nowhere near where any rounding
> would occur and any rounding errors will likely be deterministic anyway.
>
> Yet this passes the SP800-90B (first draft) tests as IID with 7.89 bits of
> entropy per byte!
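[I transcribed this into Python to see it for myself -- a direct
rendering of Pauli's iteration above; the exact low-order float
behaviour will of course vary a little by platform:]

    import math

    def frac(x):
        """Fractional part of x."""
        return x - math.floor(x)

    def fake_entropy(n):
        """Deterministic 'entropy': zero actual randomness, yet the
        output looks IID to the SP800-90B first-draft tests."""
        seed = math.pi - 3                    # frac(pi)
        out = bytearray()
        for _ in range(n):
            seed = frac(math.exp(1 + 2 * seed))
            out.append(int(256 * frac(2**20 * seed)))  # bits 21-28
        return bytes(out)

    print(fake_entropy(16).hex())   # same bytes on every run, of course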

> IID is a statistical term meaning independent and identically distributed,
> which in turn means that each sample doesn't depend on any of the other
> samples (which is clearly incorrect) and that all samples are collected
> from the same distribution. The 7.89 bits of entropy per byte is pretty
> much as high as the NIST tests will ever say. According to the test
> suite, we've got an "almost perfect" entropy source.
>
> There are other test suites if you've got sufficient data. The Dieharder
> suite is okay, however the TestU01 suite is the most discerning I'm
> currently aware of. Still, neither will provide an entropy estimate for
> you. For either of these you will need a lot of data -- since you've got
> a hardware RNG, this shouldn't be a major issue. Avoid the "ent" program;
> it seems to overestimate the maximum entropy present.
>
> John's suggestion of collecting additional "entropy" and running it
> through a cryptographic hash function is probably the best you'll be able
> to achieve without a deep investigation. As for how much data to collect,
> be conservative. If the estimate of the maximum entropy is 2.35 bits per
> byte, round this down to 2 bits per byte, 1 bit per byte or even ½ bit per
> byte. The lower you go, the more likely you are to be getting the entropy
> you want. The trade-off is the time for the hardware to generate the data
> and for the processor to hash it together.
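[Understood - and although, as I said above, the lab will not let me put
a hash in front of the DRBGs, for reference this is how I read the
collect-and-hash suggestion. A minimal sketch: read_raw() is just a
placeholder for however one pulls bytes from the hardware RNG, and the
1 bit/byte figure is the conservative rounding suggested here:]

    import hashlib

    CONSERVATIVE_BITS_PER_BYTE = 1   # 2.35 measured, rounded down hard

    def read_raw(nbytes):
        """Placeholder: read nbytes of raw output from the hardware RNG."""
        raise NotImplementedError

    def seed_material(entropy_bits):
        """Oversample at the conservative rate, then compress the raw
        bytes into a 32-byte seed with SHA-256."""
        nbytes = -(-entropy_bits // CONSERVATIVE_BITS_PER_BYTE)  # ceiling
        return hashlib.sha256(read_raw(nbytes)).digest()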

> Pauli
> --
> Oracle
> Dr Paul Dale | Cryptographer | Network Security & Encryption
> Phone +61 7 3031 7217
> Oracle Australia
>
> -----Original Message-----
> From: John Denker [mailto:[hidden email]]
> Sent: Wednesday, 27 July 2016 11:40 PM
> To: [hidden email]
> Subject: Re: [openssl-dev] DRBG entropy
>

> On 07/27/2016 05:13 AM, Leon Brits wrote:
> > I have a chip (FDK RPG100) that generates randomness, but the
> > SP800-90B python test suite indicated that the chip only provides
> > 2.35 bits/byte of entropy. According to the FIPS test lab, the lowest
> > value from all the tests is used as the entropy, and 2 is too low. I
> > must however make use of this chip.
>
> That's a problem on several levels.
>
> For starters, keep in mind the following maxim:
>     Testing can certainly show the absence of entropy.
>     Testing can never show the presence of entropy.
>
> That is to say, you have ascertained that 2.35 bits/byte is an /upper
> bound/ on the entropy density coming from the chip. If you care about
> security, you need a lower bound. Despite what FIPS might lead you to
> believe, you cannot obtain this from testing.
> The only way to obtain it is by understanding how the chip works.
> This might require a treeeemendous amount of effort and expertise.
>
> ================
>
> Secondly, entropy is probably not even the correct concept. For any given
> probability distribution P, i.e. for any given ensemble, there are many
> measurable properties (i.e. functionals) you might look at.
> Entropy is just one of them. It measures a certain /average/ property.
> For cryptologic security, depending on your threat model, it is quite
> possible that you ought to be looking at something else. It may help to
> look at this in terms of the Rényi functionals:
>     H_0[P] = multiplicity       = Hartley functional
>     H_1[P] = plain old entropy  = Boltzmann functional
>     H_∞[P] = adamance
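[For my notes: I believe these are all members of the Rényi family

    H_a[P] = (1/(1-a)) * log2( sum_i p_i^a )

where the p_i are the probabilities in P. Taking limits, H_1 is the
familiar -sum_i p_i * log2(p_i), and H_∞ = -log2(max_i p_i), i.e. it is
governed entirely by the attacker's single best guess.]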

> The entropy H_1 may be appropriate if the attacker needs to break all
> messages, or a "typical" subset of messages. The adamance H_∞ may be more
> appropriate if there are many messages and the attacker can win by
> breaking any one of them.
>
> To say the same thing in other words:
>  -- A small multiplicity (H_0) guarantees the problem is easy for the
>     attacker.
>  -- A large adamance (H_∞) guarantees the problem is hard for the
>     attacker.
>
> ================
>
> Now let us fast-forward and suppose, hypothetically, that you have
> obtained a lower bound on what the chip produces.
>
> One way to proceed is to use a hash function. For clarity, let's pick
> SHA-256. Obtain from the chip not just 256 bits of adamance, but 24 bits
> more than that, namely 280 bits. This arrives in the form of a string of
> bytes, possibly hundreds of bytes. Run this through the hash function.
> The output word is 32 bytes, i.e. 256 bits of high-quality randomness. The
> key properties are:
>  a) There will be 255.99 bits of randomness per word, guaranteed
>     with high probability, more than high enough for all practical
>     purposes.
>  b) It will be computationally infeasible to locate or exploit
>     the missing 0.01 bit.
>
> Note that it is not possible to obtain the full 256 bits of randomness in
> a 256-bit word. Downstream applications must be designed so that 255.99
> is good enough.
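[Applying that to concrete numbers, assuming (hypothetically) I could
justify a lower bound of 2 bits/byte of adamance for the chip: 280 bits
needs 280/2 = 140 raw bytes per output word. A sketch in the same shape
as my earlier one, with read_raw again a placeholder for the hardware
read:]

    import hashlib

    # Hypothetical justified lower bound -- NOT the 2.35 bits/byte test
    # figure, which is only an upper bound on the entropy.
    ADAMANCE_BITS_PER_BYTE = 2

    def random_word(read_raw):
        """John's recipe: gather 280 bits of adamance, hash to one
        32-byte (256-bit) output word with SHA-256."""
        raw = read_raw(-(-280 // ADAMANCE_BITS_PER_BYTE))  # 140 bytes
        return hashlib.sha256(raw).digest()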

> ========
>
> As with all of crypto, this requires attention to detail. You need to
> protect the hash inputs, outputs, and all intermediate calculations. For
> example, you don't want such things to get swapped out.
> --
> openssl-dev mailing list
> To unsubscribe: https://mta.openssl.org/mailman/listinfo/openssl-dev